Design and Compute Performance
I'm going to be honest with you right off the bat: there isn't much more I can say about the MSI GT72S notebook that hasn't already been said either on this website or on the PC Perspective Podcast. Though there are many iterations of this machine, the version we are looking at today is known as the "GT72S Dominator Pro G Dragon-004" and it includes some impressive hardware and design choices. Perhaps you've heard of this processor called "Skylake" and a GPU known as the "GTX 980"?
The GT72S is a gaming notebook in the truest sense of the term. It is big, heavy and bulky, not meant for daily travel or walking around campus for very long distances. It has a 17-in screen, more USB 3.0 ports than most desktop computers and also more gaming horsepower than we've ever seen crammed into that kind of space. That doesn't make it perfect for everyone of course: battery life is poor and you may have to sell one of your kids to be able to afford it. But then, you might be able to afford A LOT if you sold the kids, amiright?
Let's dive into what makes the new MSI GT72S so impressive and why every PC gamer that has a hankering for moving their rig will be drooling.
Design - A Tablet and a Notebook
For the last 30 days or so, I have been using both Microsoft's new Surface Book and Surface Pro 4 as everyday computing devices. The goal was to review these items not from just a handful of days of testing and benchmarking, but with some lengthy time under my belt utilizing both products in a real-world environment. The following is my review with that premise. Enjoy!
A lot has already been said about the design and style of both the updated Surface Pro 4 and the new Surface Book. Let’s start with the Surface Pro 4, as it sees the least dramatic changes from the previous product.
The Surface Pro 4 uses the same kickstand tablet design that made the Surface brand so memorable as well as functional. Many different OEMs are starting to copy the design style because it has a lot of positive merits. For instance, it allows viewing angles from nearly 90 degrees to flat. The Surface Pro 4 is a tablet in its purest form, though. It doesn’t come with a keyboard or trackpad standard – you’ll have to purchase the optional Type Cover. It’s only 8.5mm thick and weighs in at 1.73 lbs without the added keyboard.
The kickstand works exceptionally well, with unlimited positions between the start and stop points of the hinge, and smooth movement between them. It’s strong enough to stand up when the tablet is slid around on a table or desk. The biggest concern I have with the kickstand is that using it on your lap (or on an airplane tray table) is difficult to impossible, depending on the exact configuration of your legs / tray. Because the hinged kickstand needs a surface to make contact with, pushing the Surface Pro back on your legs where the hinged portion extends past your knees won’t work.
From a design and style perspective, I still think the Surface products are among the best that exist on the market today. The magnesium body is sleek and the angles are both professional and aggressive. Even when coupled with the magnetic Type Cover, it won’t look like a toy at the office or on the road.
The new Surface Book is a completely different beast – a unique design and a new product. I am sure that there are some people that simply won’t like the way the notebook looks, but I am not one of them. Though it is technically a tablet and a keyboard dock, the Surface Book only ships as a complete unit, so calling this a notebook or a 2-in-1 convertible feels more accurate than calling it a tablet. It has a larger, more pronounced 13.5-in screen than the Pro, which makes it bigger, heavier and bulkier in your bag as well. The magnesium body shares a lot of design cues with the Pro 4, but it’s the hinge on the Book that really makes it different from any notebook I have used.
Introduction and CPU Performance
We had a chance this week to go hands-on with the Snapdragon 820, the latest flagship SoC from Qualcomm, in a hardware session featuring prototype handsets powered by this new silicon. How did it perform? Read on to find out!
As you would expect from an all-new flagship part, the Snapdragon 820 offers improvements in virtually every category compared to Qualcomm's previous products. And with the 820, Qualcomm is emphasizing not only performance but lower power consumption, with claims of anywhere from 20% to 10x better efficiency across the components that make up this new SoC. Part of these power savings will undoubtedly come as the result of Qualcomm’s decision to move to a quad-core design with the 820, rather than the 8-core design of the 810.
So what exactly comprises a high-end SoC like the Snapdragon 820? Ryan covered the launch in detail back in November (and we introduced aspects of the new SoC in a series of articles leading up to the launch). In brief, the Snapdragon 820 includes a custom quad-core CPU (Kryo), the Adreno 530 GPU, a new DSP (Hexagon 680), a new ISP (Spectra), and a new LTE modem (X12). The previous flagship Snapdragon 810 used stock ARM cores (Cortex-A57, Cortex-A53) in a big.LITTLE configuration, but for various reasons Qualcomm has chosen not to introduce another 8-core SoC with this new product.
The four Kryo CPU cores found in the Snapdragon 820 can operate at speeds of up to 2.2 GHz, and since that is half the core count of the octa-core Snapdragon 810, the IPC (instructions per clock) of this new part will help determine how competitive the SD820's performance will be; but there’s a lot more to the story. This SoC design places equal emphasis on all components therein, and the strategy with the SD820 seems to be leveraging the advanced signal processing capability of the Hexagon 680 to offload work, allowing the CPU to operate with greater efficiency and at lower power.
Skylake Architecture Comes Through
When Intel finally revealed the details surrounding its latest Skylake architecture back in August at IDF, we learned for the first time about a new technology called Intel Speed Shift. The feature moves some of the control of CPU clock speed and ramp-up away from the operating system and into hardware, giving more control to the processor itself and making it less dependent on Windows (and presumably, in the future, other operating systems). This allows the clock speed of a Skylake processor to ramp higher, faster, allowing for better user responsiveness.
It's pretty clear that Intel is targeting this feature addition for tablets and 2-in-1s where the finger/pen to screen interaction is highly reliant on immediate performance to enable improved user experiences. It has long been known that one of the biggest performance deltas between iOS from Apple and Android from Google centers on the ability for the machine to FEEL faster when doing direct interaction, regardless of how fast the background rendering of an application or web browser actually is. Intel has been on a quest to fix this problem for Android for some time, where it has the ability to influence software development, and now they are bringing that emphasis to Windows 10.
With the most recent Windows 10 update, to build 10586, Intel Speed Shift has finally been enabled for Skylake users. And since you cannot disable the feature once it's installed, this is the one and only time we'll be able to measure performance on our test systems both with and without it. So let's see if Intel's claims of improved user experiences stand up to our scrutiny.
Cat 12 Modem, Wi-Fi Adaptive Calling
If you have been following PC Perspective over the last several months it would be hard to miss news about the upcoming release of Qualcomm's latest flagship SoC for smartphones and tablets, the Snapdragon 820. Beginning in early August with discussion of the Adreno 5xx GPU architecture, followed by information covering the Hexagon 680 DSP (digital signal processor) and then details on the LTE modem in the SoC (the X12), and ending with information on the Kryo CPU cores, the release of the Snapdragon 820 processor has been drawn out if nothing else.
The emphasis on distribution of data was likely an attempt to rebuild trust with enthusiast consumers, the media and even the OEMs, as the launch of the Snapdragon 810 was troubled by overheating concerns and second revisions. Several high-impact flagship smartphones used the SoC, including the LG G4 (correction: the G4 shipped with the Snapdragon 808, while the HTC One M9, OnePlus 2, Sony Xperia Z5 and others used the 810), but with Samsung moving away from Qualcomm parts to its own designs, the processor didn't see nearly the ubiquitous adoption that we had expected and witnessed in previous generations.
Qualcomm invited some media and analysts out to New York this week to take the cover off of the Snapdragon 820 completely, at least as far as features are concerned. We still were not able to get to the meat of the details surrounding the Kryo CPU implementation or architectural improvements, but we were promised those would be coming closer to product availability in 2016. Instead, Qualcomm wanted to show off the consumer benefits that phones and tablets based on Snapdragon 820 could feature (depending on OEM implementation); that means lots of demos and lots of time with product and feature managers.
Meet EMTEC's powerful portable peripherals
You may not think you have heard of EMTEC before, but if you are old enough to remember audio and video tape then the company's original name will ring a bell: it was once known as BASF Magnetics. BASF officially launched the new division in 2000, focused on modern storage media such as hard drives and flash.
If you have seen an oddly shaped flash drive or ones made in the shapes of Looney Tunes or Angry Birds then you have run into EMTEC products. As you can see above their product line is quite varied and includes the Power Connect for Mobile devices and the Wi-Fi Hard Drive P600, both of which they have sent for us to review.
The packaging is reminiscent of a gaming mouse's, with a Velcro flap that lets you open the box to see the device inside. Even better is the lack of clamshell packaging; you won't have to risk a finger trying to open them.
The WiFi Drive P600 is designed for portability; it is slightly larger than a deck of cards and is available in 1TB as well as 500GB models, the latter of which is the version we received. You can sync the hard drive wirelessly, over the LAN connection present on the bottom of the device, or through the USB 3.0 port, which is also the drive's recharging port. You can connect your devices to this drive in numerous ways, including setting it up as a Samba server or through DLNA if your devices are compatible.
The Power Connect U600 is perhaps the more interesting of the two devices for people on the go. With a large enough MicroSD card installed it can fulfill the same role as the WiFi HDD, as it offers the same connectivity choices, including a LAN port, with the exception of DLNA functionality, which is replaced with UPnP. In addition to offering portable storage it can function as a WiFi hotspot, and with the internal 5200 mAh battery it will be able to charge your phone when you are away from power.
Last month NVIDIA introduced the world to the GTX 980 in a new form factor for gaming notebooks. Using the same Maxwell GPU and the same performance levels, but with slightly tweaked power delivery and TDPs, notebooks powered by the GTX 980 promise to be a noticeable step faster than anything before them.
Late last week I got my hands on the updated MSI GT72S Dominator Pro G, the first retail ready gaming notebook to not only integrate the new GTX 980 GPU but also an unlocked Skylake mobile processor.
This machine is something to behold - though it looks very similar to previous GT72 versions, this machine hides hardware unlike anything we have been able to carry in a backpack before. And the sexy red exterior with the MSI Dragon Army logo emblazoned across the back definitely helps it stand out in a crowd. If you happen to be in a crowd of notebooks.
A quick spin around the GT72S reveals a sizeable collection of hardware and connections. On the left you'll find a set of four USB 3.0 ports as well as four audio inputs and outputs and an SD card reader.
On the opposite side there are two more USB 3.0 ports (for a total of six) and the optical / Blu-ray burner. With that many USB 3.0 ports you should never struggle with accessory availability - headset, mouse, keyboard, hard drive and portable fan? Check.
Pack a full GTX 980 on the go!
For many years, the idea of a truly mobile gaming system has been attainable if you were willing to pay the premium for high performance components. But anyone that has done research in this field would tell you that, though they were named similarly, the mobile GPUs from both AMD and NVIDIA had a tendency to be noticeably slower than their desktop counterparts. A GeForce GTX 970M, for example, only had a CUDA core count slightly higher than the desktop GTX 960's, and 30% lower than the true desktop GTX 970's. So even though you were getting fantastic mobile performance, desktop users continued to hold a dominant position over mobile gamers in PC gaming.
This fall, NVIDIA is changing that with the introduction of the GeForce GTX 980 for gaming notebooks. Notice I did not put an 'M' at the end of that name; it's not an accident. NVIDIA has found a way, through binning and component design, to cram the entirety of a GM204-based Maxwell GTX 980 GPU inside portable gaming notebooks.
The results are impressive and the implications for PC gamers are dramatic. Systems built with the GTX 980 will include the same 2048 CUDA cores, 4GB of GDDR5 running at 7.0 GHz and will run at the same base and typical GPU Boost clocks as the reference GTX 980 cards you can buy today for $499+. And, while you won't find this GPU in anything called a "thin and light", 17-19" gaming laptops do allow for portability of gaming unlike any SFF PC.
So how did they do it? NVIDIA has found a way to get a desktop GPU with a 165 watt TDP into a form factor that has a physical limit of 150 watts (for the MXM module implementations at least) through binning, component selection and improved cooling. Not only that, but there is enough headroom to allow for some desktop-class overclocking of the GTX 980 as well.
A Diverse Lineup
ThinkPads have always been one of our favorite notebook brands here at PC Perspective. While there certainly has been some competition from well-designed portables such as the Dell XPS 13 and Microsoft Surface Pro 3, the ThinkPad line remains a solid choice for power users.
We had the chance to look at a lot of Lenovo's ThinkPad lineup for Broadwell, and as this generation comes to a close we decided to give a brief overview of the diversity available. Skylake-powered notebooks may be just on the horizon, but the comparisons of form factor and usability should remain mostly applicable into the next generation.
Within the same $1200-$1300 price range, Lenovo offers a myriad of portable machines with roughly the same hardware in vastly different form factors.
First, let's take a look at the more standard ThinkPads.
Lenovo ThinkPad T450s
The ThinkPad T450s is my default recommendation for anyone looking for a notebook in the $1000+ range. Featuring a 14" 1080p display and an Intel Core i5-5300U processor, it will perform great for the majority of users. While you won't be using this machine for 3D Modeling or CAD/CAM applications, general productivity tasks will feel right at home here.
Technically classified as an Ultrabook, the T450s won't exactly be turning any heads with its thinness. Lenovo strikes a balance here, making the notebook as thin as possible at 0.83" while retaining features such as a gigabit Ethernet port, 3 USB 3.0 ports, an SD card reader, and plenty of display connectivity with Mini DisplayPort and VGA.
The Dell Venue 10 7000 Series tablet features a stunning 10.5" OLED screen and is designed to mate perfectly with the optional keyboard. So how does it perform as both a laptop and a tablet? Read on for the full review!
To begin with, I will simply say the keyboard should not be an optional accessory. There, I've said it. As I used the Venue 10 7000, which arrived bundled with the keyboard, I was instantly excited about this design. The Venue 10 is a device as remarkable for its incredible screen as for any other feature, but once coupled with the magnetically attached keyboard it becomes something more - and quite different from existing implementations of the transforming tablet. More than a simple accessory, the keyboard felt like it was really a part of the device when connected, and made it feel like a real laptop.
I'm getting way ahead of myself here so let's go back to the beginning, and back to a world where one might consider purchasing this tablet by itself. At $499 for the 16GB model you might reasonably ask how it compares to the identically-priced Apple iPad Air 2. Well, most of the comparison is going to be software/app related as the Venue 10 7000 is running Android 5.1 Lollipop, and of course the iPad runs iOS. The biggest difference between these tablets (besides the keyboard integration) becomes the 10.5-inch, 2560x1600 OLED screen, and oh what a screen it is!
A third primary processor
As the Hot Chips conference begins in Cupertino this week, Qualcomm is set to divulge another set of information about the upcoming Snapdragon 820 processor. Earlier this month the company revealed details about the Adreno 5xx GPU architecture, showcasing improved performance and power efficiency while also adding a new Spectra 14-bit image processor. Today we shift to what Qualcomm calls the “third pillar in the triumvirate of programmable processors” that make up the Snapdragon SoC. The Hexagon DSP (digital signal processor), introduced initially by Qualcomm in 2004, has gone through a massive architecture shift and even programmability shift over the last 10 years.
Qualcomm believes that building a balanced SoC for mobile applications is all about heterogeneous computing with no one processor carrying the entire load. The majority of the work that any modern Snapdragon processor must handle goes through the primary CPU cores, the GPU or the DSP. We learned about upgrades to the Adreno 5xx series for the Snapdragon 820 and we are promised information about Kryo CPU architecture soon as well. But the Hexagon 600-series of DSPs actually deals with some of the most important functionality for smartphones and tablets: audio, voice, imaging and video.
Interestingly, Qualcomm opened up the DSP to programmability just four years ago, giving developers the ability to write custom code and software to take advantages of the specific performance capabilities that the DSP offers. Custom photography, videography and sound applications could benefit greatly in terms of performance and power efficiency if utilizing the QC DSP rather than the primary system CPU or GPU. As of this writing, Qualcomm claims there are “hundreds” of developers actively writing code targeting its family of Hexagon processors.
The Hexagon DSP in Snapdragon 820 consists of three primary partitions. The main compute DSP works in conjunction with the GPU and CPU cores and will do much of the heavy lifting for encompassed workloads. The modem DSP aids the cellular modem in communication throughput. The new guy here is the lower power DSP in the Low Power Island (LPI) that shifts how always-on sensors can communicate with the operating system.
After spending some time in the computer hardware industry, it's easy to become jaded about trade shows and unannounced products. The vast majority of hardware we see at events like CES every year is completely expected beforehand. While this doesn't mean that these products are bad by any stretch, they can be difficult to get excited about.
Every once in a while, however, we find ourselves with our hands on something completely unexpected. Hidden away in a back room of Lenovo's product showcase at CES this year, we were told there was a product that would amaze us — called the LaVie.
And they were right.
Unfortunately, the Lenovo LaVie-Z is one of those products that you can't truly understand until you get it in your hands. Billed as the world's lightest 13.3" notebook, the standard LaVie-Z comes in at a weight of just 1.87 lbs. The touchscreen-enabled LaVie-Z 360 gains a bit of weight, coming in at 2.04 lbs.
While these numbers are a bit difficult to wrap your head around, I'll try to provide some context. For example, the Google Nexus 9 weighs 0.94 lbs. For just over twice the weight of Google's flagship tablet, Lenovo has provided a full Windows notebook with an i7 ultra-mobile processor.
Furthermore, the new 12" Apple MacBook, which people are touting as extremely light, comes in at 2.03 lbs, almost the same weight as the touchscreen version of the LaVie-Z. For the same weight, you also gain a much more powerful Intel i7 processor in the LaVie, compared to the Intel Core M option in the MacBook.
All of this comes together to provide an experience that is quite unbelievable. Anyone that I have handed one of these notebooks to has been absolutely amazed that it's a real, functioning computer. The closest analog that I have been able to come up with for picking up the LaVie-Z is one of the cardboard placeholder laptops they have at furniture stores.
The personal laptop that I carry day-to-day is an 11" MacBook Air, which weighs only 2.38 lbs, but the LaVie-Z feels infinitely lighter.
However, as impressive as the weight (or lack thereof) of the LaVie-Z is, let's dig deeper into the experience of using the world's lightest notebook.
Introduction and First Impressions
The MSI GT72 Dominator Pro G gaming laptop is a beast of a portable, with a GeForce GTX 980M graphics card and a 5th-Gen Intel Core i7 processor within its massive frame. And this iteration of the GT72 features NVIDIA's G-SYNC technology, which should help provide smooth gameplay on its 75 Hz IPS display.
The gaming laptop market is filled with options at just about any price you can imagine (as long as your imagination starts at around $1000), and there are seemingly limitless combinations of specs and minute configuration differences even within a particular brand’s offering. A few names stand out in this market, and MSI has created a product meant to stand tall against the likes of Alienware and ASUS ROG. And it doesn’t just stand tall, it stands wide - and deep for that matter. Running about the size of home plate on a regulation baseball diamond (well, approximately anyway), this is nearly 8 ½ lbs of PC gaming goodness.
Not everyone needs a 17-inch notebook, but there’s something awesome about these giant things when you see them in person. The design of this GT72 series is reminiscent of an exotic sports car (gaming laptops in general seem to have fully embraced the sports car theme), and if you’re considering completely replacing a desktop for gaming and all of your other computing the extra space it takes up is more than worth it if you value a large display and full keyboard. Doubtless there are some who would simply be augmenting a desktop experience with a supremely powerful notebook like this, but for most people laptops like this are a major investment that generally replaces the need for a dedicated PC tower.
What about the cost? It certainly isn’t “cheap” considering the top-of-the-line specs, and price is clearly the biggest barrier to entry with a product like this - far beyond the gargantuan size. Right off the bat I’ll bring up this laptop’s $2099 retail price - and not because I think it’s high. It’s actually very competitive as equipped. And in addition to competitive pricing MSI is also ahead of the curve a bit with its adoption of the 5th-Gen Core i7 Broadwell mobile processors, while most gaming laptops are still on Haswell. Broadwell’s improved efficiency should help with battery life a bit, but your time away from a power plug is always going to be limited with gaming laptops!
Gaming laptops are something that most people are quick to reject as out of their price range. There is a lot of sense in this train of thought. We know that laptop components are inherently lower performing than their desktop counterparts, and significantly more expensive. So the idea of spending more money for less powerful components seems like a bad trade off for the added gains of portability for many gamers.
However, we also seem to be in a bit of a plateau as far as generation-to-generation performance gain with desktop components. Midrange processors from a few generations ago are still more than capable of playing the vast majority of games, and even lower-end modern GPUs are able to game at 1080p.
So maybe it's time to take another look at the sub-$1000 gaming notebook options, and that's exactly what we are doing today with the Acer Aspire V 15 Nitro Black Edition.
The Aspire V Nitro is equipped with fairly modest components compared to what most people think of as gaming laptops. Where machines such as the MSI GT70 Dominator or ASUS G751 seem to take the kitchen-sink approach to mobile gaming machines, the Aspire V is a more carefully balanced option.
Acer Aspire V 15 Nitro Black Edition
Processor: Intel Core i7-4720HQ 2.6 GHz
Graphics Card: NVIDIA GTX 960M 4GB
Storage: 1 TB Hard Drive
Dimensions (W x D x H): 15.34" x 10.14" x 0.86" - 0.94"
Anchored by an Intel Core i7-4720HQ and a GTX 960M, the Aspire V Nitro isn't trying to reach the top of the mobile performance stack. A 15.6" display along with 8GB of RAM and a single 1TB spindle drive are all logical choices for a machine aimed at gaming on a budget.
While it's difficult for us to recommend buying any machine without an SSD these days, a 1TB drive is great for game storage on a machine like this. There are also other configuration options which add SATA M.2 SSDs alongside the 1TB drive, and we managed to open up our sample and install an SSD ourselves with little pain.
Business Model Based on Partnerships
Alexandru Voica works for Imagination Technologies. His background includes research in computer graphics at the School of Advanced Studies Sant'Anna in Pisa and a brief stint as a CPU engineer, working on several high-profile 32-bit processors used in many mobile and embedded devices today. You can follow Alex on Twitter @alexvoica.
Some months ago my colleague Rys Sommefeldt wrote an article offering his (deeply) technical perspective on how a chip gets made, from R&D to manufacturing. While his bildungsroman of silicon production covers a lot of the engineering details, it is light on the business side of things; and that is a good thing, because it gives me the opportunity to steal some of his spotlight!
This article will give you a breakdown of the IP licensing model, describing the major players and the relationships between them. It is not designed to be a complete guide by any means and some parts might already sound familiar, but I hope it is a comprehensive overview that can be used by anyone who is new to product manufacturing in general.
The diagram below offers an analysis of the main categories of companies involved in the semiconductor food chain. Although I’m going to attempt to paint a broad picture, I will mainly offer examples based on the ecosystem formed around Imagination (since that is what I know best).
A simplified view of the manufacturing chain
Let’s work our way from left to right.
Traditionally, IP vendors are the companies that design and sell silicon IP. ARM and Imagination Technologies are perhaps the most renowned, for their sub-brands: Cortex CPU + Mali GPU and MIPS CPU + PowerVR GPU, respectively.
Given the rapid evolution of the semiconductor market, such companies continue to evolve their business models beyond point solutions to become one-stop shops that offer a wide variety of IP cores and platforms, comprising CPUs, graphics, video, connectivity, cloud software and more.
Despite its market dominance, Qualcomm may be one of the least known contenders in the battle for the mobile space. While players like Apple, Samsung, and even NVIDIA are often cited as the most exciting and most revolutionary, none come close to the sheer sales, breadth of technology, and market share that Qualcomm occupies. Brands like Krait and Snapdragon have helped push the company into the top 3 semiconductor companies in the world, following only Intel and Samsung.
In July 1985, seven industry veterans came together in the den of Dr. Irwin Jacobs’ San Diego home to discuss an idea. They wanted to build “Quality Communications” (thus the name Qualcomm) and outlined a plan that evolved into one of the telecommunications industry’s great start-up success stories.
Though Qualcomm sold its own handset business to Kyocera in 1999, many of today’s most popular mobile devices are powered by Qualcomm’s Snapdragon mobile chipsets with integrated CPU, GPU, DSP, multimedia CODECs, power management, baseband logic and more. In fact, the typical “chipset” from Qualcomm encompasses up to 20 different chips of different functions besides just the main application processor. If you are an owner of a Galaxy Note 4, Motorola Droid Turbo, Nexus 6, or Samsung Galaxy S5, then you are most likely a user of one of Qualcomm’s Snapdragon chipsets.
Qualcomm’s GPU History
Before 2006, the mobile GPU as we know it today was largely unnecessary. Feature phones and “dumb” phones still made up the large majority of the market, with smartphones and mobile tablets in the early stages of development. At this point all the visual data presented on screen, whether on a small monochrome display or the color screen of a PDA, was drawn through a software renderer running on traditional CPU cores.
But by 2007, the first fixed-function, OpenGL ES 1.0 class of GPUs started shipping in mobile devices. These dedicated graphics processors were originally focused on drawing and updating the user interface on smartphones and personal data devices. Eventually these graphics units were used for what would be considered the most basic gaming tasks.
Introduction and Specifications
The ASUS Zenfone 2 is a 5.5-inch smartphone with a premium look and the specs to match. But the real story here is that it sells for just $199 or $299 unlocked, making it a tempting alternative to contract phones without the concessions often made with budget devices; at least on paper. Let's take a closer look to see how the new Zenfone 2 stacks up! (Note: a second sample unit was provided by Gearbest.com.)
When I first heard about the Zenfone 2 from ASUS I was eager to check it out given its mix of solid specs, nice appearance, and a startlingly low price. ASUS has created something that has the potential to transcend the disruptive nature of a phone like the Moto E, itself a $149 alternative to contract phones that we reviewed recently. With its premium specs to go along with very low unlocked pricing, the Zenfone 2 could be more than just a bargain device, and if it performs well it could make some serious waves in the smartphone industry.
The Zenfone 2 also features a 5.5-inch IPS LCD screen with 1920x1080 resolution (in line with an iPhone 6 Plus or the OnePlus One), and beyond the internal hardware ASUS has created a phone that looks every bit the part of a premium device that one would expect to cost hundreds more. In fact, without spoiling anything up front, I will say that the context of price won't be necessary to judge the merit of the Zenfone 2; it stands on its own as a smartphone, and not simply a budget phone.
The big question is going to be how the Zenfone 2 compares to existing phones, and with its quad-core Intel Atom SoC this is something of an unknown. Intel has been making a push to enter the U.S. smartphone market (with earlier products more widely available in Europe) and the Zenfone 2 marks an important milestone for both Intel and ASUS in this regard. The Z3580 SoC powering my review unit certainly sounds fast on paper with its four cores clocked up to 2.33 GHz and no less than 4 GB of RAM on board (plus a solid 64 GB of onboard storage as well).
Introduction and Design
With the introduction of the ASUS G751JT-CH71, we’ve now got our first look at the newest ROG notebook design revision. The celebrated design language remains the same, and the machine’s lineage is immediately discernible. However, unlike the $2,000 G750JX-DB71 unit we reviewed a year and a half ago, this particular G751JT configuration is 25% less expensive at just $1,500. So first off, what’s changed on the inside?
(Editor's Note: This is NOT the recent G-Sync version of the ASUS G751 notebook that was announced at Computex. This is the previously released version, one that I am told will continue to sell for the foreseeable future and one that will come at a lower overall price than the G-Sync enabled model. Expect a review on the G-Sync derivative very soon!)
Quite a lot, as it turns out. For starters, we’ve moved all the way from the 700M series to the 900M series—a leap which clearly ought to pay off in spades in terms of GPU performance. The CPU and RAM remain virtually equivalent, while the battery has migrated from external to internal and enjoyed a 100 mAh bump in the process (from 5900 to 6000 mAh). So what’s with the lower price then? Well, apart from the age difference, it’s the storage: the G750JX featured both a 1 TB storage drive and a 256 GB SSD, while the G751JT-CH71 drops the SSD. That’s a small sacrifice in our book, especially when an SSD is so easily added thereafter. By the way, if you’d rather simply have ASUS handle that part of the equation for you, you can score a virtually equivalent configuration (chipset and design evolutions notwithstanding) in the G751JT-DH72 for $1750—still $250 less than the G750JX we reviewed.
Digging into a specific market
A little while ago, I decided to think about processor design as a game. You are given a budget of complexity, which is determined by your process node, power, heat, die size, and so forth, and the objective is to lay out features in the way that suits your goal and workload best. While not the topic of today's post, GPUs are a great example of what I mean. They make the assumption that in a batch of work, nearby tasks are very similar, such as the math behind two neighboring pixels on the screen. This assumption allows GPU manufacturers to save complexity by chaining dozens of cores together into not-quite-independent work groups. The circuit fits the work better, and thus it lets more get done in the same complexity budget.
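The lockstep idea behind those work groups can be sketched in a few lines of Python. This is purely illustrative, not any vendor's API: a toy "work group" applies one shared instruction stream (add, then clamp) across a batch of neighboring pixels, which is the assumption that lets a GPU skip per-item control logic.

```python
# Toy model of SIMD-style execution: one operation applied in lockstep
# to a group of neighboring pixels, versus fully independent per-pixel work.

def scalar_brighten(pixels, amount):
    # CPU-style: each pixel is an independent task with its own control flow.
    return [min(p + amount, 255) for p in pixels]

def workgroup_brighten(pixels, amount, width=8):
    # GPU-style: pixels are handled in groups of `width` lanes; every lane
    # in a group executes the same "add then clamp" step on its own data,
    # so the control logic is paid for once per group, not once per pixel.
    out = []
    for i in range(0, len(pixels), width):
        group = pixels[i:i + width]
        out.extend(min(p + amount, 255) for p in group)
    return out

row = [10, 12, 11, 250, 252, 9, 8, 200]
print(workgroup_brighten(row, 10))  # same result, cheaper control logic
```

The results are identical; the savings come from where the complexity budget is spent, which is exactly the trade-off described above.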
Carrizo is aiming at a 63 million unit per year market segment.
This article is about Carrizo, though. It is AMD's sixth-generation APU, counting from Llano's release in June 2011. For this launch, Carrizo targets the 15W and 35W power envelopes for $400-$700 USD notebook devices. AMD needed to increase efficiency on the same 28nm process that we have seen in their product stack since Kabini and Temash were released in May of 2013. They tasked their engineers with optimizing the APU's design for these constraints, which led to dense layouts and clever features on the same budget of complexity, rather than smaller transistors or a bigger die.
15W was their primary target, and they claim to have exceeded their own expectations.
Backing up for a second. Beep. Beep. Beep. Beep.
When I met with AMD last month, I brought up the Bulldozer architecture with many individuals. I suspected that it was quite a clever design that didn't reach its potential because of external factors. As I said at the start of this editorial, processor design is a game, and if you can save complexity by knowing your workload, you can do more with less.
Bulldozer looked like it wanted to take a shortcut by cutting elements that its designers believed would be redundant going forward. First and foremost, two cores share a single floating-point unit (FPU). While you need some floating-point capacity, upcoming workloads could offload that math to the GPU, which sits right there on the same die, for a massive increase in performance. As such, the complexity dedicated to every second FPU can be cut and used for something else. You can see this trend throughout various elements of the architecture.
Announced at last June's Google I/O event, Android TV is a platform developed by Google, running Android 5.0 and higher, that aims to create an interactive experience for the TV. This platform can be built into a TV directly as well as into set-top style boxes, like the NVIDIA SHIELD we are looking at today. The idea is to bring the breadth of apps and content to the TV through the Android operating system in a way that is both convenient and intuitive.
NVIDIA announced SHIELD back in March at GDC as the first product to use the company’s latest Tegra processor, the X1. This SoC combines an 8-core big.LITTLE ARM processor design with a 256-core implementation of the NVIDIA Maxwell GPU architecture, providing GPU performance previously unseen in an Android device. I have already spent some time with the NVIDIA SHIELD at various events and the promise was clearly there to make it a leading option for Android TV adoption, but obviously there were questions to be answered.
Today’s article will focus on my early impressions of the NVIDIA SHIELD, having used it both in the office and at home for a handful of days. As you’ll see during the discussion, there are still some things to be ironed out and some functionality that needs to be added before SHIELD and Android TV can really be called a must-buy product. But I do think it will get there.
And though this review will focus on the NVIDIA SHIELD, it’s impossible not to tie the success of SHIELD to the success of Google’s Android TV. The dominant use case for SHIELD is as a media playback device, with the gaming functionality as a really cool side project for enthusiasts and gamers looking for another outlet. For SHIELD to succeed, Google needs to prove that Android TV can improve on other integrated smart TV platforms as well as other set-top box platforms like Boxee, Roku and even the upcoming Apple TV refresh.
But first, let’s get an overview of the NVIDIA SHIELD device, pricing and specifications, before diving into my experiences with the platform as a whole.