A new competitor has entered the arena!
When we first saw the announcement of the MateBook in Spain back in March, pricing was immediately impressive. The base model of the tablet starts at just $699; $200 less than the lowest-priced Surface Pro 4, with features and performance that pretty closely match one another.
The MateBook only ships with Core m processors, a necessity of the incredibly thin and fanless design that Huawei is using. That will obviously put the MateBook behind other tablets and notebooks that use Core i3/i5/i7 processors, but with a power consumption advantage along the way. Honestly, the performance differences between the Core m3, m5 and m7 parts are pretty small – all share the same 4.5 watt TDP and all have fairly low base clock speeds and high boost clocks. The Core m5-6Y54 that rests in our test sample has a base clock of 1.1 GHz and a maximum Turbo Boost clock of 2.7 GHz. The top end Core m7-6Y75 has a base of 1.2 GHz and a Boost of 3.1 GHz. The secret, of course, is that these processors run at Turbo clocks very infrequently; only during touch interactions and when applications demand performance.
If your workload regularly requires intensive transcoding, video editing or even high-resolution photo manipulation, the Core m parts are going to be slower than the Core i-series options available in other solutions. If you only occasionally need to use an application like Photoshop, the MateBook has no problem doing so.
|Huawei MateBook Tablet PC|
|Screen||12-in 2160x1440 IPS|
|CPU||Intel Core m3 / m5 / m7|
|GPU||Intel HD Graphics 515|
|Network||802.11ac MIMO (2.4 GHz, 5.0 GHz); Gigabit Ethernet (via MateDock)|
|Display Output||HDMI / VGA (through MateDock)|
|Connectivity||USB 3.0 Type-C; USB 3.0 x 2 (via MateDock)|
|Audio||Dual digital mics|
|Weight||640g (1.41 lbs)|
|Dimensions||278.8mm x 194.1mm x 6.9mm (10.9-in x 7.6-in x 0.27-in)|
|Operating System||Windows 10 Home / Pro|
At the base level, both the Surface Pro 4 and the MateBook have identical specs, but the Huawei unit is priced $200 lower. After that, things get more complicated as the Surface Pro 4 moves to Core i5 and Core i7 processors while the MateBook sticks with m5 and m7 parts. Storage capacities and memory size scale though. The lowest entry point for the MateBook to get 256GB of storage and 8GB of memory is $999 and comes with a Core m5 processor; a comparable Surface Pro 4 uses a Core i5 CPU instead but will run you $1199. If you want to move from 256GB to 512GB of storage, Microsoft wants $400 more for your SP4, while Huawei’s price only goes up $200.
27 notebooks can't be wrong
A month or so back, I had a friend come to me asking for advice on which gaming notebook he should purchase. He had specific needs that were tailored to a portable gaming machine: he wanted to have a single machine for home and mobile use, he wanted to be able to game while traveling and he had a pretty reasonable budget. As the "guy that runs the gaming hardware website" I was expected to have an answer...immediately. But I didn't. As it turns out, dissecting and digesting the gaming notebook field is pretty complex.
I sent a note to MSI, offering to build a video and a short story around its products if it sent me one of each line of gaming notebooks it sold. Honestly, I didn't expect them to be able to pull it together, but just a couple of weeks later, a handful of large boxes arrived and we were staring at a set of six powerful gaming notebooks to analyze.
|Model||GE62 Apache Pro-014||GS40 Phantom-001||GS60 Ghost Pro-002||GS72 Stealth Pro 4K-202||GT72S Dominator Pro G-220||GT80S Titan SLI-002|
|Screen||15.6-in 1080p||14-in 1080p||15.6-in 1080p||17.3-in 4K||17.3-in 1080p G-Sync||18.4-in 1080p|
|CPU||Core i7-6700HQ||Core i7-6700HQ||Core i7-6700HQ||Core i7-6700HQ||Core i7-6820HK||Core i7-6820HK|
|GPU||GTX 960M 2GB||GTX 970M 3GB||GTX 970M 6GB||GTX 970M 3GB||GTX 980M 8GB||GTX 980M 8GB SLI|
|Storage||128GB M.2 SATA||128GB PCIe SSD||128GB PCIe SSD||256GB PCIe SSD||256GB PCIe RAID SSD||256GB PCIe RAID SSD|
|Optical||DVD Super-multi||None||None||None||Blu-ray Burner||Blu-ray Burner|
|Display Output||HDMI 1.4 (all models)|
|Connectivity||USB 3.1 Type-C, USB 3.0 x 2, USB 2.0 x 1||USB 3.0 x 2||USB 3.0 x 2||USB 3.1 x 2, USB 3.0 x 2||USB 3.0 x 6||USB 3.0 x 5|
|Dimensions||15.07-in x 10.23-in x 1.06-in||13.58-in x 9.65-in x 0.87-in||15.35-in x 10.47-in x 0.78-in||16.47-in x 11.39-in x 0.78-in||16.85-in x 11.57-in x 1.89-in||17.95-in x 13.02-in x 1.93-in|
|Weight||5.29 pounds||3.75 pounds||4.2 pounds||5.7 pounds||8.4 pounds||9.9 pounds|
MSI sent this collection along as it closely matches the entire range of available options in its own gaming notebook line, without actually sending us ALL 27 OF THE AVAILABLE SKUs! Yes, twenty-seven.
MSI GS40 Phantom
In the video below, I'll walk through each series of notebooks that MSI offers for gamers, the prevailing characteristics of each and what kind of consumer should be most interested in it. I also discuss the specifics of each of the models we received for the project and get into the performance deltas between them.
MSI GS72 Stealth Pro 4K
- MSI GE Series
- The entry level of gaming notebooks, available in both 15.6 and 17.3-in 1080p screens, limited to GTX 970M or GTX 960M GPUs. You still get 16GB of memory, SSDs in MOST systems, Killer Networking hardware, Steel Series keyboards and weights range from 5.29 to 5.95 pounds.
- MSI GS Series
- Varies in screen size from 14-in to 17.3-in but the focus here is on slimmer designs. Both 1080p and 4K screens are available, though you are still maxing out at a GTX 970M graphics solution. 16GB of RAM, NVMe PCIe SSDs are standard, with available models as thin as 0.78-inches and as light as 3.75 pounds.
- MSI GT72 Series
- These focus on performance per dollar, getting maximum single GPU performance in the chassis. They all have 17-in screens with available G-Sync integration, and GPUs from the GTX 970M to the full GTX 980. You get 16-32GB of memory, SSDs in all models, optical drives, Thunderbolt and six USB 3.0 ports, but GT72 systems are bigger and heavier to accommodate all of this.
- MSI GT80 Series
- These are for the crazy enthusiasts only, all of which include SLI configurations of the GTX 970M, 980M or 980. An 18.4-in 1080p screen is the only option for your display, but you get 16-64GB of memory, RAID enabled SSD configurations, Blu-ray burners, Thunderbolt, five USB 3.0 ports and a friggin Cherry MX Brown mechanical keyboard!
After going through this project, here are a few recommendations I would have for users looking to pick up an MSI gaming notebook.
- Best Gaming Value
- GT72 Dominator G-831 - This combines the larger form factor with a GTX 970M GPU, a 17.3-in 1080p screen, 16GB of memory and a 128GB SSD, priced at $1599. I think this is a good balance of cost and GPU horsepower.
- Looking for a Slimmer Design
- GS70 Stealth Pro-006 - For $1699 you lose the optical drive from the above GT72, but get a lighter and thinner design. You have the same technical horsepower (GTX 970M, Core i7 processor, etc.), but the integrated fans will likely be noticeably louder expelling the heat from the narrower chassis.
- If you need more performance
- GT72 Dominator Pro G-034 - With a jump from the $1599 GT72 above to $2099, this model gets you a GTX 980M and a 256GB SSD. Based on the performance metrics I ran, that should net you another 40-50% of GPU horsepower.
Let me know if you have any questions or comments about these machines and I'll do my best to answer them!
Seeing Ryan transition from being a long-time Android user over to iOS late last year has had me thinking. While I've had hands on with flagship phones from many manufacturers since then, I haven't actually carried an Android device with me since the Nexus S (eventually, with the 4.0 Ice Cream Sandwich upgrade). Maybe it was time to go back in order to gain a more informed perspective of the mobile device market as it stands today.
So that's exactly what I did. When we received our Samsung Galaxy S7 review unit (full review coming soon, I promise!), I decided to go ahead and put a real effort forth into using Android for an extended period of time.
Full disclosure: I am still carrying my iPhone with me, since we received a T-Mobile locked unit and my personal number is on Verizon. However, I have been using the S7 for everything but phone calls and the occasional text message to people who only have my iPhone number.
Now, one of the questions you might be asking yourself is why I chose the Galaxy S7, of all devices, to make this transition with. Most Android aficionados would probably insist that I choose a Nexus device to get the best experience, the one that Google intends to provide when developing Android. While these people aren't wrong, I decided that I wanted to go with a more popular device as opposed to the more niche Nexus line.
Whether you like Samsung's approach or not, the fact is that it sells more Android devices than anyone else, and the Galaxy S7 will be its flagship offering for the next year or so.
A new fighter has entered the ring
When EVGA showed me that it was entering the world of gaming notebooks at CES in January, I must admit, I questioned the move. A company that at one point only built and distributed graphics cards based on NVIDIA GeForce GPUs, and had since moved to mice, power supplies, tablets (remember that?) and even cases, was going to get into the cutthroat world of notebooks. But I was promised that EVGA had an angle; it would not be cutting any corners in order to bring a truly competitive and aggressive product to the market.
Just a couple of short months later (seriously, is it the end of March already?) EVGA presented us with a shiny new SC17 Gaming Notebook to review. It’s thinner than you might expect, heavier than I would prefer and packs some impressive compute power, along with unique features and overclocking capability, that will put it on your short list of portable gaming rigs for 2016.
Let’s start with a dive into the spec table and then go from there.
|EVGA SC17 Specifications|
|Processor||Intel Core i7-6820HK|
|Memory||32GB G.Skill DDR4-2666|
|Graphics Card||GeForce GTX 980M 8GB|
|Storage||256GB M.2 NVMe PCIe SSD; 1TB 7200 RPM SATA 6G HDD|
|Display||Sharp 17.3-in UHD 4K with matte finish|
|Connectivity||Intel I219-V Gigabit Ethernet; Intel AC-8260 802.11ac; 2x USB 3.0 Type-A; 1x USB 3.1 Type-C|
|Audio||Realtek ALC 255|
|Video||1x HDMI 1.4; 2x mini DisplayPort (1x with G-Sync support)|
|Dimensions||16-in x 11.6-in x 1.05-in|
|OS||Windows 10 Home|
With a price tag of $2,699, EVGA owes you a lot – and it delivers! The processor of choice is the Intel Core i7-6820HK, an unlocked, quad-core, HyperThreaded processor that brings desktop class computing capability to a notebook. The base clock speed is 2.7 GHz but the Turbo clock reaches as high as 3.6 GHz out of the box, supplying games, rendering programs and video editors plenty of horsepower for production on the go. And don’t forget that this is one of the first unlocked processors from Intel for mobile computing – multipliers and voltages can all be tweaked in the UEFI or through Precision X Mobile software to push it even further.
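Since the 6820HK is multiplier-unlocked, it helps to recall how the final clock is derived: core frequency is (roughly) the 100 MHz base clock (BCLK) multiplied by a per-core ratio. A minimal sketch follows; the 40x target is a purely hypothetical overclock for illustration, not an EVGA-validated figure:

```python
# Core clock = BCLK x multiplier. The 100 MHz BCLK is the Skylake standard;
# the 40x ratio below is a hypothetical overclock for illustration only.
BCLK_MHZ = 100

def core_freq_ghz(multiplier: int, bclk_mhz: float = BCLK_MHZ) -> float:
    """Effective core clock in GHz for a given multiplier."""
    return multiplier * bclk_mhz / 1000

stock_base  = core_freq_ghz(27)  # stock base clock: 2.7 GHz
stock_turbo = core_freq_ghz(36)  # stock max Turbo: 3.6 GHz
overclocked = core_freq_ghz(40)  # hypothetical 40x ratio: 4.0 GHz

print(stock_base, stock_turbo, overclocked)
```

Raising the multiplier is exactly what tools like Precision X Mobile expose; how far you get in practice depends on voltage and cooling headroom.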
Based on EVGA’s relationship with NVIDIA, it should surprise exactly zero people that a mobile GeForce GPU is found inside the SC17. The GTX 980M is based on the Maxwell 2.0 design and falls slightly under the desktop consumer class GeForce GTX 970 card in CUDA core count and clock speed. With 1536 CUDA cores and a 1038 MHz base clock, with boost capability, the discrete graphics will have enough juice for most games at very high image quality settings. EVGA has configured the GPU with 8GB of GDDR5 memory, more than any desktop GTX 970… so there’s that. Obviously, it would have been great to see the full powered GTX 980 in the SC17, but that would have required changes to the thermal design, chassis and power delivery.
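To put those GPU numbers in rough perspective, Maxwell's peak single-precision throughput can be estimated as cores x clock x 2 FLOPs (one fused multiply-add per core per cycle). This is a back-of-the-envelope sketch, not a benchmark:

```python
# Rough peak FP32 estimate for the GTX 980M: each CUDA core retires at
# most one fused multiply-add (2 FLOPs) per clock.
cuda_cores = 1536
base_clock_hz = 1038e6  # 1038 MHz base; Boost raises this further

peak_tflops = cuda_cores * base_clock_hz * 2 / 1e12
print(f"~{peak_tflops:.2f} TFLOPS at base clock")
```

That lands at roughly 3.2 TFLOPS before Boost, which is why the 980M sits just under a desktop GTX 970 in raw shader throughput.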
It's Easier to Be Convincing than Correct
This is a difficult topic to discuss. Some perspectives assume that law enforcement has terrible, Orwellian intentions. Meanwhile, law enforcement officials, with genuinely good intentions, don't understand that the road to Hell is paved with those. Bad things are much more likely to happen when human flaws are justified away, which is easy to do when your job is preventing mass death and destruction. Human beings like to use large pools of evidence to validate assumptions rather than discover truth, often without realizing it.
Ever notice how essays can always find sources, regardless of thesis? With increasing amounts of data, you are progressively more likely to make a convincing argument, but not necessarily a more true one. Mix in good intentions, which promotes complacency, and mistakes can happen.
But this is about Apple. Recently, the FBI demanded that Apple create a version of iOS that can be broken into by law enforcement. Critics frequently use the term “back door,” while the government prefers other terminology. Really, words are words and the only thing that matters is what they describe -- and they describe a mechanism to compromise the device's security in some way.
This introduces several problems.
The common line that I hear is, “I don't care, because I have nothing to hide.” Well... that's wrong in a few ways. First, having nothing to hide is irrelevant if the person who wants access to your data assumes that you have something you want to hide, and is looking for evidence to convince themselves that they're right. Second, you need to consider all the people who want access to this data. The FBI will not be the only one demanding a back door, nor even the United States as a whole. There are a whole lot of nations that trust individuals, including their own respective citizens, less than the United States does. You can expect that each of them would request a back door.
You can also expect each of them, and organized criminals, to want to break into each other's.
Lastly, we've been here before, and what it comes down to is criminalizing math. Encryption is just a mathematical process that is easy to perform, but hard to invert. It all started because it is easy to multiply two numbers together, but hard to factor the product. The most straightforward method is trial division: dividing by every possible number up to the square root of said number. If the two numbers are prime, then you are stuck finding one number out of all those possibilities (the other prime will be greater than the square root). In the 90s, encryption above a certain key size was legally classified as a weapon. That may sound ridiculous, and there would be good reason for that feeling. Either way, it changed; as a result, online banks and retailers thrived.
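To make that asymmetry concrete, here is a small sketch of the brute-force factoring described above. The two primes are arbitrary five-digit examples; real keys use numbers hundreds of digits long, for which this loop would never finish:

```python
def trial_division(n: int) -> list:
    """Factor n by trying every divisor up to sqrt(n) -- the brute-force
    inversion described in the text. Multiplying took one step; undoing
    it takes on the order of sqrt(n) steps."""
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)  # whatever remains is itself prime
    return factors

p, q = 104723, 104729          # two arbitrary five-digit primes
print(trial_division(p * q))   # [104723, 104729]
```

The multiplication `p * q` is instant; recovering `p` and `q` from the product already takes about a hundred thousand loop iterations at this toy scale, and the cost explodes as the numbers grow.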
As Apple put it in its open letter: “While we believe the FBI’s intentions are good, it would be wrong for the government to force us to build a backdoor into our products. And ultimately, we fear that this demand would undermine the very freedoms and liberty our government is meant to protect.”
Good intentions lead to complacency, which is where the road to (metaphorical) Hell starts.
Caught Up to DirectX 12 in a Single Day
I'm not just talking about the specification. Members of the Khronos Group have also released compatible drivers, SDKs and tools to support them, conformance tests, and a proof-of-concept patch for Croteam's The Talos Principle. To reiterate, this is not a soft launch. The API, and its entire ecosystem, is out and ready for the public on Windows (at least 7+ at launch but a surprise Vista or XP announcement is technically possible) and several distributions of Linux. Google will provide an Android SDK in the near future.
I'm going to editorialize for the next two paragraphs. There was a concern that Vulkan would be too late. The thing is, as of today, Vulkan is now just as mature as DirectX 12. Of course, that could change at a moment's notice; we still don't know how the two APIs are being adopted behind the scenes. A few DirectX 12 titles are planned to launch in a few months, but no full, released, non-experimental, non-early-access game currently exists. Each time I say this, someone links the Wikipedia list of DirectX 12 games. If you look at each entry, though, you'll see that all of them are either early access, awaiting an unreleased DirectX 12 patch, or using a third-party engine (like Unreal Engine 4) that only lists DirectX 12 as an experimental preview. Besides, if the latter counts, then you'll need to accept The Talos Principle's proof-of-concept patch, too.
But again, that could change. While today's launch speaks well to the Khronos Group and the API itself, it still needs to be adopted by third party engines, middleware, and software. These partners could, like the Khronos Group before today, be privately supporting Vulkan with the intent to flood out announcements; we won't know until they do... or don't. With the support of popular engines and frameworks, dependent software really just needs to enable it. This has not happened for DirectX 12 yet, and, now, there doesn't seem to be anything keeping it from happening for Vulkan at any moment. With the Game Developers Conference just a month away, we should soon find out.
But back to the announcement.
Vulkan-compatible drivers are launching today across multiple vendors and platforms, but I do not have a complete list. On Windows, I was told to expect drivers from NVIDIA for Windows 7, 8.x, 10 on Kepler and Maxwell GPUs. The standard is compatible with Fermi GPUs, but NVIDIA does not plan on supporting the API for those users due to its low market share. That said, they are paying attention to user feedback and they are not ruling it out, which probably means that they are keeping an open mind in case some piece of software gets popular and depends upon Vulkan. I have not heard from AMD or Intel about Vulkan drivers as of this writing, one way or the other. They could even arrive day one.
On Linux, NVIDIA, Intel, and Imagination Technologies have submitted conformant drivers.
Drivers alone do not make a hard launch, though. SDKs and tools have also arrived, including the LunarG SDK for Windows and Linux. LunarG is a company co-founded by Lens Owen, who had a previous graphics software company that was purchased by VMware. LunarG is backed by Valve, who also backed Vulkan in several other ways. The LunarG SDK helps developers validate their code, inspect what the API is doing, and otherwise debug. Even better, it is also open source, which means that the community can rapidly enhance it, even though it's in a releasable state as it is. RenderDoc, the open-source graphics debugger by Crytek, will also add Vulkan support. ((Update (Feb 16 @ 12:39pm EST): Baldur Karlsson has just emailed me to let me know that it was a personal project at Crytek, not a Crytek project in general, and their GitHub page is much more up-to-date than the linked site.))
The major downside is that Vulkan (like Mantle and DX12) isn't simple.
These APIs are verbose and very different from previous ones, which requires more effort.
Image Credit: NVIDIA
There really isn't much to say about the Vulkan launch beyond this. What graphics APIs really try to accomplish is standardizing signals that enter and leave video cards, such that the GPUs know what to do with them. For the last two decades, we've settled on an arbitrary, single, global object that you attach buffers of data to, in specific formats, and call one of a half-dozen functions to send it.
Compute APIs, like CUDA and OpenCL, decided it was more efficient to handle queues, allowing the application to write commands and send them wherever they need to go. Multiple threads can write commands, and multiple accelerators (GPUs in our case) can be targeted individually. Vulkan, like Mantle and DirectX 12, takes this metaphor and adds graphics-specific instructions to it. Moreover, GPUs can schedule memory, compute, and graphics instructions at the same time, as long as the graphics task has leftover compute and memory resources, and / or the compute task has leftover memory resources.
This is not necessarily a “better” way to do graphics programming... it's different. That said, it has the potential to be much more efficient when dealing with lots of simple tasks that are sent from multiple CPU threads, especially to multiple GPUs (which currently require the driver to figure out how to convert draw calls into separate workloads -- leading to simplifications like mirrored memory and splitting workloads by neighboring frames). Lots of tasks align well with video games, especially ones with lots of simple objects, like strategy games, shooters with lots of debris, or any game with large crowds of people. As the API becomes ubiquitous, we'll see this bottleneck disappear and games will no longer need to be designed around these limitations. It might even be used for drawing with cross-platform 2D APIs, like Qt or even webpages, although those two examples (especially the Web) each have other, higher-priority bottlenecks. There are also other benefits to Vulkan.
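The queue metaphor is easier to see in code. The toy model below is not the Vulkan API (the names are invented for illustration); it only mimics the shape of the workflow: several CPU threads record command buffers independently, then submit them to a queue, with no single global context serializing everything:

```python
# Toy model of command-buffer recording (invented names, not Vulkan calls):
# each worker thread records its own buffer, then the finished buffers are
# submitted to a queue -- no global context is shared between threads.
from concurrent.futures import ThreadPoolExecutor
from queue import Queue

def record_commands(thread_id: int, draw_count: int) -> list:
    """Build a command buffer on one thread, touching no shared state."""
    return [f"draw(thread={thread_id}, object={i})" for i in range(draw_count)]

submit_queue = Queue()

with ThreadPoolExecutor(max_workers=4) as pool:
    # Four threads record four independent command buffers in parallel.
    for buf in pool.map(record_commands, range(4), [3] * 4):
        submit_queue.put(buf)  # the "queue submission" step in this model

print(f"{submit_queue.qsize()} command buffers submitted")
```

Contrast this with the single-global-object model described above, where every thread would have to funnel its draw calls through one context, one at a time.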
The WebGL comparison is probably not as common knowledge as Khronos Group believes.
Still, Khronos Group was criticized when WebGL launched as "it was too tough for Web developers".
It didn't need to be easy. Frameworks arrived and simplified everything. It's now ubiquitous.
In fact, Adobe Animate CC (the successor to Flash Pro) is now a WebGL editor (experimentally).
Open platforms are required for this to become commonplace. Engines will probably target several APIs from their internal management APIs, but you can't target users who don't fit in any bucket. Vulkan brings this capability to basically any platform, as long as it has a compute-capable GPU and a driver developer who cares.
Thankfully, it arrived before any competitor established market share.
Dell has never exactly been a brand that gamers gravitate towards. While we have seen some very high quality products out of Dell in the past few years, including the new XPS 13, and people have loved the UltraSharp monitor line, neither of these targets gamers directly. Dell acquired Alienware in 2006 in order to enter the gaming market and continues to make some great products, but those retain the Alienware branding. It seems to me a gaming-centric notebook with just the Dell brand could be a hard sell.
However, that's exactly what we have today with the Dell Inspiron 15 7000. Equipped with an Intel Core i5-6300HQ and NVIDIA GTX 960M for $799, has Dell created a contender in the entry-level gaming notebook race?
For years, the Inspiron line has been Dell's entry-level option for notebooks and consequently has a questionable reputation as far as quality and lifespan. With the Inspiron 15 7000 being the most expensive product offering in the Inspiron line, though, I was excited to see if it could sway my opinion of the brand.
Fighting for Relevance
AMD is still kicking. While the results of this past year have been forgettable, they have overcome some significant hurdles and look like they are improving their position in terms of cutting costs while extracting as much revenue as possible. There were plenty of ups and downs for this past quarter, but when compared to the rest of 2015 there were some solid steps forward here.
The company reported revenues of $958 million, which is down from $1.06 billion last quarter. It also recorded a $103 million loss, though that is down significantly from the $197 million loss the quarter before (Q3 included a $65 million write-down due to unsold inventory). So while AMD made far less in revenue, it also shored up its losses. The company is still bleeding, but it has enough cash on hand to survive for the next several quarters. In non-GAAP terms, AMD reports a $79 million loss for this past quarter.
For the entire year AMD recorded $3.99 billion in revenue with a net loss of $660 million. This is down from FY 2014 revenues of $5.51 billion and a net loss of $403 million. AMD certainly is trending downwards year over year, but they are hoping to reverse that come 2H 2016.
Graphics continues to be solid for AMD: sales increased from last quarter, though they are down year on year. Holiday sales were brisk, but with only the high-end Fury series as a truly new card this season, its impact was not as great as that of a new mid-range series, like the newly introduced R9 380X. The second half of 2016 will see the introduction of the Polaris based GPUs for both mobile and desktop applications. Until then, AMD will continue to provide the current 28 nm lineup of GPUs to the market. At this point we are under the assumption that AMD and NVIDIA are looking at the same timeframe for introducing their next generation parts due to process technology advances. AMD already has working samples on Samsung's/GLOBALFOUNDRIES' 14nm LPP (Low Power Plus) process, which it showed off at CES 2016.
Design and Compute Performance
I'm going to be honest with you right off the bat: there isn't much more I can say about the MSI GT72S notebook that hasn't already been said either on this website or on the PC Perspective Podcast. Though there are many iterations of this machine, the version we are looking at today is known as the "GT72S Dominator Pro G Dragon-004" and it includes some impressive hardware and design choices. Perhaps you've heard of this processor called "Skylake" and a GPU known as the "GTX 980"?
The GT72S is a gaming notebook in the truest sense of the term. It is big, heavy and bulky, not meant for daily travel or walking around campus for very long distances. It has a 17-in screen, more USB 3.0 ports than most desktop computers and also more gaming horsepower than we've ever seen crammed into that kind of space. That doesn't make it perfect for everyone of course: battery life is poor and you may have to sell one of your kids to be able to afford it. But then, you might be able to afford A LOT if you sold the kids, amiright?
Let's dive into what makes the new MSI GT72S so impressive and why every PC gamer that has a hankering for moving their rig will be drooling.
Design - A Tablet and a Notebook
For the last 30 days or so, I have been using both Microsoft's new Surface Book and Surface Pro 4 as everyday computing devices. The goal was to review these items not from just a handful of days of testing and benchmarking, but with some lengthy time under my belt utilizing both products in a real-world environment. The following is my review with that premise. Enjoy!
A lot has already been said about the design and style of both the updated Surface Pro 4 and the new Surface Book. Let’s start with the Surface Pro 4, as it sees the least dramatic changes from the previous product.
The Surface Pro 4 uses the same kickstand tablet design that made the Surface brand so memorable as well as functional. Many different OEMs are starting to copy the design style because it has a lot of positive merits; for instance, it allows viewing angles from nearly 90 degrees to flat. The Surface Pro 4 is a tablet in its purest form, though. It doesn’t have a keyboard or trackpad standard – you’ll have to purchase the optional Type Cover. It’s only 8.5mm thick and weighs in at 1.73 lbs without the added keyboard.
The kickstand works exceptionally well, with unlimited positions between the start and stop points of the hinge, and it allows smooth movement between them. It’s strong enough to stay put when the tablet is slid around on a table or desk. The biggest concern I have with the kickstand is that using it on your lap (or on an airplane tray table) is difficult to impossible, depending on the exact configuration of your legs / tray. Because the hinged kickstand needs a surface to make contact with, pushing the Surface Pro back on your legs where the hinged portion extends past your knees won’t work.
From a design and style perspective, I still think the Surface products are among the best that exist on the market today. The magnesium body is sleek and the angles are both professional and aggressive. Even when coupled with the magnetic Type Cover, it won’t look like a toy at the office or on the road.
The new Surface Book is a completely different beast – a unique design and a new product. I am sure that there are some people that simply won’t like the way the notebook looks, but I am not one of them. Though it is technically a tablet and a keyboard dock, the Surface Book only ships as a complete unit, so calling this a notebook or a 2-in-1 convertible feels more accurate than calling it a tablet. Its larger, more pronounced 13.5-in screen makes it bigger, heavier and bulkier in your bag than the Pro. The magnesium body shares a lot of design cues with the Pro 4, but it’s the hinge on the Book that really makes it different than any notebook I have used.
Introduction and CPU Performance
We had a chance this week to go hands-on with the Snapdragon 820, the latest flagship SoC from Qualcomm, in a hardware session featuring prototype handsets powered by this new silicon. How did it perform? Read on to find out!
As you would expect from an all-new flagship part, the Snapdragon 820 offers improvements in virtually every category compared to its predecessors. And with the 820, Qualcomm is emphasizing not only performance but also lower power consumption, with claims of anywhere from 20% to 10x better efficiency across the components that make up this new SoC. Part of these power savings will undoubtedly come as the result of Qualcomm’s decision to move to a quad-core design with the 820, rather than the 8-core design of the 810.
So what exactly does comprise a high-end SoC like the Snapdragon 820? Ryan covered the launch in detail back in November (and we introduced aspects of the new SoC in a series of articles leading up to the launch). In brief, the Snapdragon 820 includes a custom quad-core CPU (Kryo), the Adreno 530 GPU, a new DSP (Hexagon 680), a new ISP (Spectra), and a new LTE modem (X12). The previous flagship Snapdragon 810 used stock ARM cores (Cortex-A57, Cortex-A53) in a big.LITTLE configuration, but for various reasons Qualcomm has chosen not to introduce another 8-core SoC with this new product.
The four Kryo CPU cores found in the Snapdragon 820 can operate at speeds of up to 2.2 GHz, and since this is half the core count of the octo-core Snapdragon 810, the IPC (instructions per clock) of this new part will help determine how competitive the SD820's performance will be; but there’s a lot more to the story. This SoC design placed equal emphasis on all of its components, and the strategy with the SD820 seems to be leveraging the advanced signal processing of the Hexagon 680 to offload work, allowing the CPU to operate with greater efficiency and at lower power.
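As a crude model of that trade-off, treat peak multicore throughput as cores x clock x IPC. The numbers below are assumptions chosen for the sketch (a normalized IPC of 1.0 for the 810's big cores), not measured figures:

```python
# Crude throughput model: cores x clock (GHz) x IPC. All IPC values here
# are normalized assumptions for illustration, not measurements.
def throughput(cores: int, clock_ghz: float, ipc: float) -> float:
    return cores * clock_ghz * ipc

sd810_peak = throughput(cores=8, clock_ghz=2.0, ipc=1.0)  # baseline: 16.0

# IPC uplift four Kryo cores at 2.2 GHz would need just to match that peak:
required_ipc = sd810_peak / throughput(cores=4, clock_ghz=2.2, ipc=1.0)
print(f"Kryo would need ~{required_ipc:.2f}x the baseline IPC")
```

In this toy model the four Kryo cores would need roughly 1.8x the per-core IPC to match the 810's theoretical peak, which is exactly why Qualcomm leans on the DSP and other blocks to take work off the CPU entirely.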
Skylake Architecture Comes Through
When Intel finally revealed the details surrounding its latest Skylake architecture design back in August at IDF, we learned for the first time about a new technology called Intel Speed Shift. The feature moves some of the control over CPU clock speed and ramp-up away from the operating system and into hardware, giving more control to the processor itself and making it less dependent on Windows (and presumably, in the future, other operating systems). This allows the clock speed of a Skylake processor to get higher, faster, allowing for better user responsiveness.
It's pretty clear that Intel is targeting this feature addition for tablets and 2-in-1s where the finger/pen to screen interaction is highly reliant on immediate performance to enable improved user experiences. It has long been known that one of the biggest performance deltas between iOS from Apple and Android from Google centers on the ability for the machine to FEEL faster when doing direct interaction, regardless of how fast the background rendering of an application or web browser actually is. Intel has been on a quest to fix this problem for Android for some time, where it has the ability to influence software development, and now they are bringing that emphasis to Windows 10.
With the most recent Windows 10 update, to build 10586, Intel Speed Shift has finally been enabled for Skylake users. And since you cannot disable the feature once it's installed, this is the one and only time we'll be able to measure its performance impact on our test systems. So let's see if Intel's claims of improved user experiences stand up to our scrutiny.
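Speed Shift's benefit is hard to see in sustained benchmarks; it shows up in how quickly short bursts of work complete after an idle period. Below is a rough, machine-dependent sketch of that kind of measurement in Python; the workload size, sleep duration and iteration counts are arbitrary choices for illustration, not part of any official test methodology.

```python
import time

def burst(n=200_000):
    """A short integer-sum workload, enough to occupy a core briefly."""
    s = 0
    for i in range(n):
        s += i
    return s

def time_burst():
    """Wall-clock time for one burst, in seconds."""
    t0 = time.perf_counter()
    burst()
    return time.perf_counter() - t0

# First burst after an idle period: the core may start at a low clock
# and has to ramp up mid-workload.
time.sleep(1.0)
cold = time_burst()

# Repeated bursts with no idle gap: the core should already be at or
# near its boost clock, so the best per-burst time settles lower.
warm = min(time_burst() for _ in range(20))

print(f"cold: {cold * 1e3:.2f} ms, warm: {warm * 1e3:.2f} ms")
```

On a Speed Shift-enabled system the gap between the cold and warm timings should shrink, since the hardware can ramp the clock far faster than an OS-driven governor.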
Cat 12 Modem, Wi-Fi Adaptive Calling
If you have been following PC Perspective over the last several months, it would have been hard to miss news about the upcoming release of Qualcomm's latest flagship SoC for smartphones and tablets, the Snapdragon 820. Beginning in early August with discussion of the Adreno 5xx GPU architecture, followed by information covering the Hexagon 680 DSP (digital signal processor), then details on the LTE modem in the SoC (the X12), and ending with information on the Kryo CPU cores, the release of the Snapdragon 820 has been drawn out, if nothing else.
The emphasis on distribution of data was likely an attempt to rebuild trust with enthusiast consumers, media and even the OEMs, as the launch of the Snapdragon 810 was troubled by overheating concerns and second revisions. Several high-impact flagship smartphones used the SoC, including the LG G4 (correction: the G4 shipped with the Snapdragon 808, while the HTC One M9, OnePlus 2, Sony Xperia Z5 and others use the 810), but with Samsung moving away from Qualcomm parts to its own designs, the processor didn't see nearly the ubiquitous adoption that we had expected and witnessed in previous generations.
Qualcomm invited some media and analysts out to New York this week to take the cover off of the Snapdragon 820 completely, at least as far as features are concerned. We still were not able to get to the meat of the details surrounding the CPU / Kryo implementation or architectural improvements, but we were promised those would be coming closer to product availability in 2016. Instead, Qualcomm wanted to show off the consumer benefits that phones and tablets based on Snapdragon 820 could feature (depending on OEM implementation); that meant lots of demos and lots of time with product and feature managers.
Meet EMTEC's powerful portable peripherals
You may not think you have heard of EMTEC before, but if you are old enough to remember audio and video tape then their original name will ring a bell: they were once known as BASF Magnetics. BASF officially launched the new division in 2000, focused on modern storage such as hard drives and flash.
If you have seen an oddly shaped flash drive or ones made in the shapes of Looney Tunes or Angry Birds then you have run into EMTEC products. As you can see above their product line is quite varied and includes the Power Connect for Mobile devices and the Wi-Fi Hard Drive P600, both of which they have sent for us to review.
The packaging is reminiscent of a gaming mouse box, with a Velcro flap so you can open the box to see the device inside. Even better is the lack of clamshell packaging; you won't have to risk a finger trying to open them.
The WiFi Drive P600 is designed for portability; it is slightly larger than a deck of cards and is available in 1TB as well as 500GB models, the latter of which is the version we received. You can sync the hard drive wirelessly, over the LAN connection present on the bottom of the device, or through the USB 3.0 port, which is also the drive's recharging port. You can connect your devices to this drive in numerous ways, including setting it up as a Samba server or through DLNA if your devices are compatible.
The Power Connect U600 is perhaps the more interesting of the two devices for people on the go. With a large enough microSD card installed it can fulfill the same role as the WiFi HDD, as it offers the same connectivity choices, including a LAN port, with the exception of DLNA functionality, which is replaced with UPnP. In addition to offering portable storage, it can function as a Wi-Fi hotspot, and with the internal 5200 mAh battery it can charge your phone when you are away from power.
Last month NVIDIA introduced the world to the GTX 980 in a new form factor for gaming notebooks. Using the same Maxwell GPU and delivering the same performance levels, but with slightly tweaked power delivery and TDPs, notebooks powered by the GTX 980 promise to be a noticeable step faster than anything before them.
Late last week I got my hands on the updated MSI GT72S Dominator Pro G, the first retail ready gaming notebook to not only integrate the new GTX 980 GPU but also an unlocked Skylake mobile processor.
This machine is something to behold - though it looks very similar to previous GT72 versions, it hides hardware unlike anything we have been able to carry in a backpack before. And the sexy red exterior with the MSI Dragon Army logo emblazoned across the back definitely helps it stand out in a crowd. If you happen to be in a crowd of notebooks.
A quick spin around the GT72S reveals a sizeable collection of hardware and connections. On the left you'll find a set of four USB 3.0 ports as well as four audio inputs and outputs and an SD card reader.
On the opposite side there are two more USB 3.0 ports (totaling six) and the optical / Blu-ray burner. With that many USB 3.0 ports you should never struggle with accessory availability - headset, mouse, keyboard, hard drive and portable fan? Check.
Pack a full GTX 980 on the go!
For many years, a truly mobile gaming system has been attainable if you were willing to pay the premium for high-performance components. But anyone who has done research in this field will tell you that, though they were named similarly, the mobile GPUs from both AMD and NVIDIA had a tendency to be noticeably slower than their desktop counterparts. A GeForce GTX 970M, for example, had a CUDA core count only slightly higher than the desktop GTX 960, and 30% lower than the true desktop GTX 970. So even though you were getting fantastic mobile performance, desktop users continued to hold a dominant position over mobile gamers in PC gaming.
This fall, NVIDIA is changing that with the introduction of the GeForce GTX 980 for gaming notebooks. Notice I did not put an 'M' at the end of that name; it's not an accident. NVIDIA has found a way, through binning and component design, to cram the entirety of a GM204-based Maxwell GTX 980 GPU inside portable gaming notebooks.
The results are impressive and the implications for PC gamers are dramatic. Systems built with the GTX 980 will include the same 2048 CUDA cores, 4GB of GDDR5 running at 7.0 GHz and will run at the same base and typical GPU Boost clocks as the reference GTX 980 cards you can buy today for $499+. And, while you won't find this GPU in anything called a "thin and light", 17-19" gaming laptops do allow for portability of gaming unlike any SFF PC.
So how did they do it? NVIDIA has found a way to get a desktop GPU with a 165 watt TDP into a form factor that has a physical limit of 150 watts (for the MXM module implementations at least) through binning, component selection and improved cooling. Not only that, but there is enough headroom to allow for some desktop-class overclocking of the GTX 980 as well.
A Diverse Lineup
ThinkPads have always been one of our favorite notebook brands here at PC Perspective. While there certainly has been some competition from well-designed portables such as the Dell XPS 13 and Microsoft Surface Pro 3, the ThinkPad line remains a solid choice for power users.
We had the chance to look at a lot of Lenovo's ThinkPad lineup for Broadwell, and as this generation comes to a close we decided to give a brief overview of the diversity available. Skylake-powered notebooks may be just on the horizon, but the comparisons of form factor and usability should remain mostly applicable into the next generation.
Within the same $1200-$1300 price range, Lenovo offers a myriad of portable machines with roughly the same hardware in vastly different form factors.
First, let's take a look at the more standard ThinkPads.
Lenovo ThinkPad T450s
The ThinkPad T450s is my default recommendation for anyone looking for a notebook in the $1000+ range. Featuring a 14" 1080p display and an Intel Core i5-5300U processor, it will perform great for the majority of users. While you won't be using this machine for 3D Modeling or CAD/CAM applications, general productivity tasks will feel right at home here.
Technically classified as an Ultrabook, the T450s won't exactly be turning any heads with its thinness. Lenovo strikes a balance here, making the notebook as thin as possible at 0.83" while retaining features such as a gigabit Ethernet port, three USB 3.0 ports, an SD card reader, and plenty of display connectivity with Mini DisplayPort and VGA.
The Dell Venue 10 7000 Series tablet features a stunning 10.5" OLED screen and is designed to mate perfectly with the optional keyboard. So how does it perform as both a laptop and a tablet? Read on for the full review!
To begin with I will simply say the keyboard should not be an optional accessory. There, I've said it. As I used the Venue 10 7000, which arrived bundled with the keyboard, I was instantly excited about this design. The Venue 10 is a device as remarkable for its incredible screen as for any other feature, but once coupled with the magnetically attached keyboard it becomes something more - and quite different from existing implementations of the transforming tablet. More than a simple accessory, the keyboard felt like it was really a part of the device when connected, and made it feel like a real laptop.
I'm getting way ahead of myself here so let's go back to the beginning, and back to a world where one might consider purchasing this tablet by itself. At $499 for the 16GB model you might reasonably ask how it compares to the identically-priced Apple iPad Air 2. Well, most of the comparison is going to be software/app related as the Venue 10 7000 is running Android 5.1 Lollipop, and of course the iPad runs iOS. The biggest difference between these tablets (besides the keyboard integration) becomes the 10.5-inch, 2560x1600 OLED screen, and oh what a screen it is!
A third primary processor
As the Hot Chips conference begins in Cupertino this week, Qualcomm is set to divulge another set of information about the upcoming Snapdragon 820 processor. Earlier this month the company revealed details about the Adreno 5xx GPU architecture, showcasing improved performance and power efficiency while also adding a new Spectra 14-bit image processor. Today we shift to what Qualcomm calls the “third pillar in the triumvirate of programmable processors” that make up the Snapdragon SoC. The Hexagon DSP (digital signal processor), introduced initially by Qualcomm in 2004, has gone through a massive architecture shift and even programmability shift over the last 10 years.
Qualcomm believes that building a balanced SoC for mobile applications is all about heterogeneous computing with no one processor carrying the entire load. The majority of the work that any modern Snapdragon processor must handle goes through the primary CPU cores, the GPU or the DSP. We learned about upgrades to the Adreno 5xx series for the Snapdragon 820 and we are promised information about Kryo CPU architecture soon as well. But the Hexagon 600-series of DSPs actually deals with some of the most important functionality for smartphones and tablets: audio, voice, imaging and video.
Interestingly, Qualcomm opened up the DSP to programmability just four years ago, giving developers the ability to write custom code and software to take advantage of the specific performance capabilities that the DSP offers. Custom photography, videography and sound applications could benefit greatly in terms of performance and power efficiency by utilizing the Qualcomm DSP rather than the primary system CPU or GPU. As of this writing, Qualcomm claims there are "hundreds" of developers actively writing code targeting its family of Hexagon processors.
The Hexagon DSP in Snapdragon 820 consists of three primary partitions. The main compute DSP works in conjunction with the GPU and CPU cores and will do much of the heavy lifting for encompassed workloads. The modem DSP aids the cellular modem in communication throughput. The new guy here is the lower power DSP in the Low Power Island (LPI) that shifts how always-on sensors can communicate with the operating system.
After spending some time in the computer hardware industry, it's easy to become jaded about trade shows and unannounced products. The vast majority of hardware we see at events like CES every year is completely expected beforehand. While this doesn't mean that these products are bad by any stretch, they can be difficult to get excited about.
Every once in a while, however, we find ourselves with our hands on something completely unexpected. Hidden away in a back room of Lenovo's product showcase at CES this year, we were told there was a product that would amaze us — called the LaVie.
And they were right.
Unfortunately, the Lenovo LaVie-Z is one of those products that you can't truly understand until you get it in your hands. Billed as the world's lightest 13.3" notebook, the standard LaVie-Z comes in at a weight of just 1.87 lbs. The touchscreen-enabled LaVie-Z 360 gains a bit of weight, coming in at 2.04 lbs.
While these numbers are a bit difficult to wrap your head around, I'll try to provide some context. For example, the Google Nexus 9 weighs 0.94 lbs. At just over twice the weight of Google's flagship tablet, Lenovo has provided a full Windows notebook with a Core i7 ultra-mobile processor.
Furthermore, the new 12" Apple MacBook, which people are touting as extremely light, comes in at 2.03 lbs, almost the same weight as the touchscreen version of the LaVie-Z. For the same weight, you gain a much more powerful Intel Core i7 processor in the LaVie compared to the Core M option in the MacBook.
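The comparisons above are easy to sanity-check with the figures quoted here; a trivial arithmetic sketch:

```python
# Weights quoted in the article, in pounds.
lavie_z     = 1.87  # standard LaVie-Z
lavie_z_360 = 2.04  # touchscreen LaVie-Z 360
nexus_9     = 0.94  # Google Nexus 9 tablet
macbook_12  = 2.03  # 12" Apple MacBook

print(f"LaVie-Z vs Nexus 9: {lavie_z / nexus_9:.2f}x the weight")
print(f"LaVie-Z 360 vs 12\" MacBook: {lavie_z_360 - macbook_12:+.2f} lbs")
```

The ratio works out to just under 2x against the Nexus 9, and the touchscreen model is within a hundredth of a pound of the MacBook.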
All of this comes together to provide an experience that is quite unbelievable. Anyone that I have handed one of these notebooks to has been absolutely amazed that it's a real, functioning computer. The closest analog that I have been able to come up with for picking up the LaVie-Z is one of the cardboard placeholder laptops they have at furniture stores.
The personal laptop that I carry day-to-day is a 11" MacBook Air, which only weighs 2.38 lbs, but the LaVie-Z feels infinitely lighter.
However, as impressive as the weight (or lack thereof) of the LaVie-Z is, let's dig deeper into the experience of using the world's lightest notebook.