Meet EMTEC's powerful portable peripherals
You may not think you have heard of EMTEC before, but if you are old enough to remember audio and video tape, their original name will ring a bell: they were once known as BASF Magnetics. BASF officially launched the new division in 2000, focused on modern magnetic storage such as hard drives and flash.
If you have seen an oddly shaped flash drive, or ones made in the shapes of Looney Tunes or Angry Birds characters, then you have run into EMTEC products. As you can see above, their product line is quite varied and includes the Power Connect for mobile devices and the Wi-Fi Hard Drive P600, both of which they have sent to us for review.
The packaging is reminiscent of a gaming mouse box, with a Velcro flap so you can open the box to see the device inside. Even better is the lack of clamshell packaging; you won't have to risk a finger trying to open them.
The WiFi Drive P600 is designed for portability; it is slightly larger than a deck of cards and is available in 1TB as well as 500GB models, the latter of which is the version we received. You can sync the hard drive wirelessly, over the LAN connection present on the bottom of the device, or through the USB 3.0 port, which also serves as the drive's recharging port. You can connect your devices to this drive in numerous ways, including setting it up as a Samba server or through DLNA if your devices are compatible.
The Power Connect U600 is perhaps the more interesting of the two devices for people on the go. With a large enough MicroSD card installed it can fill the same role as the WiFi HDD, as it offers the same connectivity choices, including a LAN port, with the exception of DLNA functionality, which is replaced with UPnP. In addition to offering portable storage it can function as a Wi-Fi hotspot, and with its internal 5200 mAh battery it can charge your phone when you are away from power.
Last month NVIDIA introduced the world to the GTX 980 in a new form factor for gaming notebooks. Using the same Maxwell GPU and delivering the same performance levels, but with slightly tweaked power delivery and TDPs, notebooks powered by the GTX 980 promise to be a noticeable step faster than anything before them.
Late last week I got my hands on the updated MSI GT72S Dominator Pro G, the first retail-ready gaming notebook to integrate not only the new GTX 980 GPU but also an unlocked Skylake mobile processor.
This machine is something to behold. Though it looks very similar to previous GT72 versions, it hides hardware unlike anything we have been able to carry in a backpack before. And the sexy red exterior with the MSI Dragon Army logo emblazoned across the back definitely helps it stand out in a crowd. If you happen to be in a crowd of notebooks.
A quick spin around the GT72S reveals a sizeable collection of hardware and connections. On the left you'll find a set of four USB 3.0 ports as well as four audio inputs and outputs and an SD card reader.
On the opposite side there are two more USB 3.0 ports (six in total) and the optical / Blu-ray burner. With that many USB 3.0 ports you should never struggle with accessory availability - headset, mouse, keyboard, hard drive and portable fan? Check.
Pack a full GTX 980 on the go!
For many years, a truly mobile gaming system has been attainable if you were willing to pay the premium for high performance components. But anyone who has done research in this field will tell you that, although they were named similarly, the mobile GPUs from both AMD and NVIDIA had a tendency to be noticeably slower than their desktop counterparts. A GeForce GTX 970M, for example, only had a CUDA core count slightly higher than that of the desktop GTX 960, and one 30% lower than that of the true desktop GTX 970. So even though you were getting fantastic mobile performance, desktop users continued to hold a dominant position over mobile gamers in PC gaming.
This fall, NVIDIA is changing that with the introduction of the GeForce GTX 980 for gaming notebooks. Notice I did not put an 'M' at the end of that name; it's not an accident. NVIDIA has found a way, through binning and component design, to cram the entirety of a GM204-based Maxwell GTX 980 GPU inside portable gaming notebooks.
The results are impressive and the implications for PC gamers are dramatic. Systems built with the GTX 980 will include the same 2048 CUDA cores, 4GB of GDDR5 running at 7.0 GHz and will run at the same base and typical GPU Boost clocks as the reference GTX 980 cards you can buy today for $499+. And, while you won't find this GPU in anything called a "thin and light", 17-19" gaming laptops do allow for portability of gaming unlike any SFF PC.
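As a quick sanity check on those quoted numbers, the peak memory bandwidth follows directly from the 7.0 GHz effective GDDR5 data rate. Note that the 256-bit bus width below is an assumption on my part (it matches the standard desktop GTX 980, but it is not stated in the specs above):

```python
# Back-of-the-envelope check of the GTX 980 specs quoted above.
effective_rate_gbps = 7.0   # GDDR5 effective data rate, Gb/s per pin (stated)
bus_width_bits = 256        # assumed: standard desktop GTX 980 bus width
cuda_cores = 2048           # stated above

# Bandwidth = data rate per pin * number of pins, converted from bits to bytes.
bandwidth_gb_s = effective_rate_gbps * bus_width_bits / 8
print(f"Peak memory bandwidth: {bandwidth_gb_s:.0f} GB/s")  # 224 GB/s
```

If that assumption holds, the notebook part matches the desktop card's roughly 224 GB/s of memory bandwidth, core for core and gigabyte for gigabyte.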
So how did they do it? NVIDIA has found a way to get a desktop GPU with a 165 watt TDP into a form factor that has a physical limit of 150 watts (for the MXM module implementations at least) through binning, component selection and improved cooling. Not only that, but there is enough headroom to allow for some desktop-class overclocking of the GTX 980 as well.
A Diverse Lineup
ThinkPads have always been one of our favorite notebook brands here at PC Perspective. While there certainly has been some competition from well-designed portables such as the Dell XPS 13 and Microsoft Surface Pro 3, the ThinkPad line remains a solid choice for power users.
We had the chance to look at a lot of Lenovo's ThinkPad lineup for Broadwell, and as this generation comes to a close we decided to give a brief overview of the diversity available. Skylake-powered notebooks may be just on the horizon, but the comparisons of form factor and usability should remain mostly applicable into the next generation.
Within the same $1200-$1300 price range, Lenovo offers a myriad of portable machines with roughly the same hardware in vastly different form factors.
First, let's take a look at the more standard ThinkPads.
Lenovo ThinkPad T450s
The ThinkPad T450s is my default recommendation for anyone looking for a notebook in the $1000+ range. Featuring a 14" 1080p display and an Intel Core i5-5300U processor, it will perform great for the majority of users. While you won't be using this machine for 3D Modeling or CAD/CAM applications, general productivity tasks will feel right at home here.
Technically classified as an Ultrabook, the T450s won't exactly be turning any heads with its thinness. Lenovo strikes a balance here, making the notebook as thin as possible at 0.83" while retaining features such as a gigabit Ethernet port, three USB 3.0 ports, an SD card reader, and plenty of display connectivity with Mini DisplayPort and VGA.
The Dell Venue 10 7000 Series tablet features a stunning 10.5" OLED screen and is designed to mate perfectly with the optional keyboard. So how does it perform as both a laptop and a tablet? Read on for the full review!
To begin with I will simply say the keyboard should not be an optional accessory. There, I've said it. As I used the Venue 10 7000, which arrived bundled with the keyboard, I was instantly excited about this design. The Venue 10 is a device as remarkable for its incredible screen as for any other feature, but once coupled with the magnetically attached keyboard it becomes something more - and quite different from existing implementations of the transforming tablet. More than a simple accessory, the keyboard felt like it was really a part of the device when connected, and made it feel like a real laptop.
I'm getting way ahead of myself here so let's go back to the beginning, and back to a world where one might consider purchasing this tablet by itself. At $499 for the 16GB model you might reasonably ask how it compares to the identically-priced Apple iPad Air 2. Well, most of the comparison is going to be software/app related as the Venue 10 7000 is running Android 5.1 Lollipop, and of course the iPad runs iOS. The biggest difference between these tablets (besides the keyboard integration) becomes the 10.5-inch, 2560x1600 OLED screen, and oh what a screen it is!
A third primary processor
As the Hot Chips conference begins in Cupertino this week, Qualcomm is set to divulge another set of information about the upcoming Snapdragon 820 processor. Earlier this month the company revealed details about the Adreno 5xx GPU architecture, showcasing improved performance and power efficiency while also adding a new Spectra 14-bit image processor. Today we shift to what Qualcomm calls the “third pillar in the triumvirate of programmable processors” that make up the Snapdragon SoC. The Hexagon DSP (digital signal processor), introduced initially by Qualcomm in 2004, has gone through a massive architecture shift and even programmability shift over the last 10 years.
Qualcomm believes that building a balanced SoC for mobile applications is all about heterogeneous computing with no one processor carrying the entire load. The majority of the work that any modern Snapdragon processor must handle goes through the primary CPU cores, the GPU or the DSP. We learned about upgrades to the Adreno 5xx series for the Snapdragon 820 and we are promised information about Kryo CPU architecture soon as well. But the Hexagon 600-series of DSPs actually deals with some of the most important functionality for smartphones and tablets: audio, voice, imaging and video.
Interestingly, Qualcomm opened up the DSP to programmability just four years ago, giving developers the ability to write custom code and software to take advantages of the specific performance capabilities that the DSP offers. Custom photography, videography and sound applications could benefit greatly in terms of performance and power efficiency if utilizing the QC DSP rather than the primary system CPU or GPU. As of this writing, Qualcomm claims there are “hundreds” of developers actively writing code targeting its family of Hexagon processors.
The Hexagon DSP in Snapdragon 820 consists of three primary partitions. The main compute DSP works in conjunction with the GPU and CPU cores and will do much of the heavy lifting for encompassed workloads. The modem DSP aids the cellular modem in communication throughput. The new guy here is the lower power DSP in the Low Power Island (LPI) that shifts how always-on sensors can communicate with the operating system.
After spending some time in the computer hardware industry, it's easy to become jaded about trade shows and unannounced products. The vast majority of hardware we see at events like CES every year is completely expected beforehand. While this doesn't mean that these products are bad by any stretch, they can be difficult to get excited about.
Every once in a while, however, we find ourselves with our hands on something completely unexpected. Hidden away in a back room of Lenovo's product showcase at CES this year, we were told there was a product that would amaze us: the LaVie.
And they were right.
Unfortunately, the Lenovo LaVie-Z is one of those products that you can't truly understand until you get it in your hands. Billed as the world's lightest 13.3" notebook, the standard LaVie-Z comes in at a weight of just 1.87 lbs. The touchscreen-enabled LaVie-Z 360 gains a bit of weight, coming in at 2.04 lbs.
While these numbers are a bit difficult to wrap your head around, I'll try to provide some context. For example, the Google Nexus 9 weighs 0.94 lbs. For just about twice the weight of Google's flagship tablet, Lenovo has provided a full Windows notebook with an i7 ultra mobile processor.
Furthermore the new 12" Apple MacBook which people are touting as being extremely light comes in at 2.03 lbs, almost the same weight as the touchscreen version of the LaVie-Z. For the same weight, you also gain a much more powerful Intel i7 processor in the LaVie, when compared to the Intel Core-M option in the MacBook.
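A quick bit of arithmetic, using only the weights quoted above, puts the comparisons in perspective:

```python
# Weight comparisons from the figures stated in the text (all in lbs).
lavie_z = 1.87        # standard LaVie-Z
lavie_z_360 = 2.04    # touchscreen LaVie-Z 360
nexus_9 = 0.94        # Google Nexus 9 tablet
macbook_12 = 2.03     # 12" Apple MacBook

# The full notebook weighs roughly twice as much as Google's tablet...
print(f"LaVie-Z vs Nexus 9: {lavie_z / nexus_9:.2f}x")          # 1.99x
# ...and the touchscreen model is within a hair of the MacBook.
print(f"LaVie-Z 360 vs MacBook: {lavie_z_360 - macbook_12:.2f} lbs heavier")
```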
All of this comes together to provide an experience that is quite unbelievable. Anyone that I have handed one of these notebooks to has been absolutely amazed that it's a real, functioning computer. The closest analog that I have been able to come up with for picking up the LaVie-Z is one of the cardboard placeholder laptops they have at furniture stores.
The personal laptop that I carry day-to-day is an 11" MacBook Air, which weighs only 2.38 lbs, but the LaVie-Z feels infinitely lighter.
However, as impressive as the weight (or lack thereof) of the LaVie-Z is, let's dig deeper into the experience of using the world's lightest notebook.
Introduction and First Impressions
The MSI GT72 Dominator Pro G gaming laptop is a beast of a portable, with a GeForce GTX 980M graphics card and a 5th-Gen Intel Core i7 processor within its massive frame. And this iteration of the GT72 features NVIDIA's G-SYNC technology, which should help provide smooth gameplay on its 75 Hz IPS display.
The gaming laptop market is filled with options at just about any price you can imagine (as long as your imagination starts at around $1000), and there are seemingly limitless combinations of specs and minute configuration differences even within a particular brand’s offering. A few names stand out in this market, and MSI has created a product meant to stand tall against the likes of Alienware and ASUS ROG. And it doesn’t just stand tall, it stands wide - and deep for that matter. Running about the size of home plate on a regulation baseball diamond (well, approximately anyway), this is nearly 8 ½ lbs of PC gaming goodness.
Not everyone needs a 17-inch notebook, but there’s something awesome about these giant things when you see them in person. The design of this GT72 series is reminiscent of an exotic sports car (gaming laptops in general seem to have fully embraced the sports car theme), and if you’re considering completely replacing a desktop for gaming and all of your other computing, the extra space it takes up is more than worth it, provided you value a large display and full keyboard. Doubtless some would simply be augmenting a desktop experience with a supremely powerful notebook like this, but for most people a laptop like this is a major investment that generally replaces the need for a dedicated PC tower.
What about the cost? It certainly isn’t “cheap” considering the top-of-the-line specs, and price is clearly the biggest barrier to entry with a product like this - far beyond the gargantuan size. Right off the bat I’ll bring up this laptop’s $2099 retail price - and not because I think it’s high. It’s actually very competitive as equipped. And in addition to competitive pricing MSI is also ahead of the curve a bit with its adoption of the 5th-Gen Core i7 Broadwell mobile processors, while most gaming laptops are still on Haswell. Broadwell’s improved efficiency should help with battery life a bit, but your time away from a power plug is always going to be limited with gaming laptops!
Gaming laptops are something most people are quick to reject as out of their price range, and there is a lot of sense in that train of thought. We know that laptop components are inherently lower performing than their desktop counterparts, and significantly more expensive. So for many gamers, spending more money on less powerful components seems like a bad trade-off for the added portability.
However, we also seem to be in a bit of a plateau as far as generation-to-generation performance gain with desktop components. Midrange processors from a few generations ago are still more than capable of playing the vast majority of games, and even lower-end modern GPUs are able to game at 1080p.
So maybe it's time to take another look at the sub-$1000 gaming notebook options, and that's exactly what we are doing today with the Acer Aspire V 15 Nitro Black Edition.
The Aspire V Nitro is equipped with fairly modest components compared to what most people picture when they think of gaming laptops. Where machines such as the MSI GT70 Dominator or ASUS G751 seem to take the kitchen-sink approach to mobile gaming machines, the Aspire V is a more carefully balanced option.
| Acer Aspire V 15 Nitro Black Edition | |
| --- | --- |
| Processor | Intel Core i7-4720HQ 2.6 GHz |
| Graphics Card | NVIDIA GTX 960M 4GB |
| Storage | 1 TB Hard Drive |
| Dimensions (W x D x H) | 15.34" x 10.14" x 0.86" - 0.94" |
Anchored by an Intel Core i7-4720HQ and a GTX 960M, the Aspire V Nitro isn't trying to reach the top of the mobile performance stack. A 15.6" display along with 8GB of RAM and a single 1TB spindle drive are all logical choices for a machine aimed at gaming on a budget.
While it's difficult for us to recommend buying any machine without an SSD these days, a 1TB drive is great for game storage on a machine like this. There are also other configuration options which add SATA M.2 SSDs alongside the 1TB drive, and we managed to open up our sample and install an SSD ourselves with little pain.
Business Model Based on Partnerships
Alexandru Voica works for Imagination Technologies. His background includes research in computer graphics at the School of Advanced Studies Sant'Anna in Pisa and a brief stint as a CPU engineer, working on several high-profile 32-bit processors used in many mobile and embedded devices today. You can follow Alex on Twitter @alexvoica.
Some months ago my colleague Rys Sommefeldt wrote an article offering his (deeply) technical perspective on how a chip gets made, from R&D to manufacturing. While his bildungsroman production covers a lot of the engineering details behind silicon production, it is light on the business side of things; and that is a good thing because it gives me the opportunity to steal some of his spotlight!
This article will give you a breakdown of the IP licensing model, describing the major players and the relationships between them. It is not designed to be a complete guide by any means and some parts might already sound familiar, but I hope it is a comprehensive overview that can be used by anyone who is new to product manufacturing in general.
The diagram below offers an analysis of the main categories of companies involved in the semiconductor food chain. Although I’m going to attempt to paint a broad picture, I will mainly offer examples based on the ecosystem formed around Imagination (since that is what I know best).
A simplified view of the manufacturing chain
Let’s work our way from left to right.
First up are the IP providers: traditionally, these are the companies that design and sell silicon IP. ARM and Imagination Technologies are perhaps the most renowned for their sub-brands: Cortex CPU + Mali GPU and MIPS CPU + PowerVR GPU, respectively.
Given the rapid evolution of the semiconductor market, such companies continue to evolve their business models beyond point solutions to become one-stop shops offering a wide variety of IP cores and platforms, comprising CPUs, graphics, video, connectivity, cloud software and more.
Qualcomm’s GPU History
Despite its market dominance, Qualcomm may be one of the least known contenders in the battle for the mobile space. While players like Apple, Samsung, and even NVIDIA are often cited as the most exciting and most revolutionary, none come close to the sheer sales, breadth of technology, and market share that Qualcomm occupies. Brands like Krait and Snapdragon have helped push the company into the top 3 semiconductor companies in the world, following only Intel and Samsung.
Qualcomm was founded in July 1985, when seven industry veterans came together in the den of Dr. Irwin Jacobs’ San Diego home to discuss an idea. They wanted to build “Quality Communications” (thus the name Qualcomm) and outlined a plan that evolved into one of the telecommunications industry’s great start-up success stories.
Though Qualcomm sold its own handset business to Kyocera in 1999, many of today’s most popular mobile devices are powered by Qualcomm’s Snapdragon mobile chipsets with integrated CPU, GPU, DSP, multimedia CODECs, power management, baseband logic and more. In fact the typical “chipset” from Qualcomm encompasses up to 20 different chips of different functions besides just the main application processor. If you are an owner of a Galaxy Note 4, Motorola Droid Turbo, Nexus 6, or Samsung Galaxy S5, then you are most likely a user of one of Qualcomm’s Snapdragon chipsets.
Before 2006, the mobile GPU as we know it today was largely unnecessary. Feature phones and “dumb” phones still made up the large majority of the market, with smartphones and mobile tablets still in the early stages of development. At that point all the visual data presented on the screen, whether a small monochrome display or the color screen of a PDA, was drawn through a software renderer running on traditional CPU cores.
But by 2007, the first fixed-function, OpenGL ES 1.0 class of GPUs started shipping in mobile devices. These dedicated graphics processors were originally focused on drawing and updating the user interface on smartphones and personal data devices. Eventually these graphics units were used for what would be considered the most basic gaming tasks.
Introduction and Specifications
The ASUS Zenfone 2 is a 5.5-inch smartphone with a premium look and the specs to match. But the real story here is that it sells for just $199 or $299 unlocked, making it a tempting alternative to contract phones without the concessions often made with budget devices; at least on paper. Let's take a closer look to see how the new Zenfone 2 stacks up! (Note: a second sample unit was provided by Gearbest.com.)
When I first heard about the Zenfone 2 from ASUS I was eager to check it out given its mix of solid specs, nice appearance, and a startlingly low price. ASUS has created something that has the potential to transcend the disruptive nature of a phone like the Moto E, itself a $149 alternative to contract phones that we reviewed recently. With its premium specs to go along with very low unlocked pricing, the Zenfone 2 could be more than just a bargain device, and if it performs well it could make some serious waves in the smartphone industry.
The Zenfone 2 also features a 5.5-inch IPS LCD screen with 1920x1080 resolution (in line with an iPhone 6 Plus or the OnePlus One), and beyond the internal hardware ASUS has created a phone that looks every bit the part of a premium device that one would expect to cost hundreds more. In fact, without spoiling anything up front, I will say that the context of price won't be necessary to judge the merit of the Zenfone 2; it stands on its own as a smartphone, and not simply a budget phone.
The big question is going to be how the Zenfone 2 compares to existing phones, and with its quad-core Intel Atom SoC this is something of an unknown. Intel has been making a push to enter the U.S. smartphone market (with earlier products more widely available in Europe) and the Zenfone 2 marks an important milestone for both Intel and ASUS in this regard. The Z3580 SoC powering my review unit certainly sounds fast on paper with its 4 cores clocked up to 2.33 GHz, and no less than 4 GB of RAM on board (and a solid 64 GB onboard storage as well).
Introduction and Design
With the introduction of the ASUS G751JT-CH71, we’ve now got our first look at the newest ROG notebook design revision. The celebrated design language remains the same, and the machine’s lineage is immediately discernible. However, unlike the $2,000 G750JX-DB71 unit we reviewed a year and a half ago, this particular G751JT configuration is 25% less expensive at just $1,500. So first off, what’s changed on the inside?
(Editor's Note: This is NOT the recent G-Sync version of the ASUS G751 notebook that was announced at Computex. This is the previously released version, one that I am told will continue to sell for the foreseeable future and one that will come at a lower overall price than the G-Sync enabled model. Expect a review on the G-Sync derivative very soon!)
Quite a lot, as it turns out. For starters, we’ve moved all the way from the 700M series to the 900M series—a leap which clearly ought to pay off in spades in terms of GPU performance. The CPU and RAM remain virtually equivalent, while the battery has migrated from external to internal and enjoyed a 100 mAh bump in the process (from 5900 to 6000 mAh). So what’s with the lower price then? Well, apart from the age difference, it’s the storage: the G750JX featured both a 1 TB storage drive and a 256 GB SSD, while the G751JT-CH71 drops the SSD. That’s a small sacrifice in our book, especially when an SSD is so easily added thereafter. By the way, if you’d rather simply have ASUS handle that part of the equation for you, you can score a virtually equivalent configuration (chipset and design evolutions notwithstanding) in the G751JT-DH72 for $1750—still $250 less than the G750JX we reviewed.
Digging into a specific market
A little while ago, I decided to think about processor design as a game. You are given a budget of complexity, which is determined by your process node, power, heat, die size, and so forth, and the objective is to lay out features in the way that suits your goal and workload best. While not the topic of today's post, GPUs are a great example of what I mean. They make the assumption that in a batch of work, nearby tasks are very similar, such as the math behind two neighboring pixels on the screen. This assumption allows GPU manufacturers to save complexity by chaining dozens of cores together into not-quite-independent work groups. The circuit fits the work better, and thus it lets more get done in the same complexity budget.
Carrizo is aiming at a 63 million unit per year market segment.
This article is about Carrizo, though. This is AMD's sixth-generation APU, in a line starting with Llano's release in June 2011. For this launch, Carrizo is targeting the 15W and 35W power envelopes for $400-$700 USD notebook devices. AMD needed to increase efficiency on the same 28nm process that we have seen in their product stack since Kabini and Temash were released in May of 2013. They tasked their engineers to optimize the APU's design for these constraints, which led to dense architectures and clever features on the same budget of complexity, rather than smaller transistors or a bigger die.
15W was their primary target, and they claim to have exceeded their own expectations.
Backing up for a second. Beep. Beep. Beep. Beep.
When I met with AMD last month, I brought up the Bulldozer architecture with many individuals. I suspected that it was a quite clever design that didn't reach its potential because of external factors. As I said at the start of this editorial, processor design is a game and, if you can save complexity by knowing your workload, you can do more with less.
Bulldozer looked like it wanted to take a shortcut by cutting elements that its designers believed would be redundant going forward. First and foremost, two cores share a single floating point unit. While you need some floating point capacity, upcoming workloads could use the GPU, which is right there on the same die, for a massive increase in performance. As such, the complexity dedicated to every second FPU can be cut and used for something else. You can see this trend throughout various elements of the architecture.
Announced in June at last year’s Google I/O event, Android TV is a platform developed by Google, running Android 5.0 and higher, that aims to create an interactive experience for the TV. This platform can be built into a TV directly as well as into set-top style boxes, like the NVIDIA SHIELD we are looking at today. The idea is to bring the breadth of apps and content to the TV through the Android operating system in a way that is both convenient and intuitive.
NVIDIA announced SHIELD back in March at GDC as the first product to use the company’s latest Tegra processor, the X1. This SoC combines an 8-core big.LITTLE ARM processor design with a 256-core implementation of the NVIDIA Maxwell GPU architecture, providing GPU performance previously unseen in an Android device. I have already spent some time with the NVIDIA SHIELD at various events and the promise was clearly there to make it a leading option for Android TV adoption, but obviously there were questions to be answered.
Today’s article will focus on my early impressions with the NVIDIA SHIELD, having used it both in the office and at home for a handful of days. As you’ll see during the discussion there are still some things to be ironed out, some functionality that needs to be added before SHIELD and Android TV can really be called a must-buy product. But I do think it will get there.
And though this review will focus on the NVIDIA SHIELD, it’s impossible not to marry the success of SHIELD with the success of Google’s Android TV. The dominant use case for SHIELD is as a media playback device, with the gaming functionality as a really cool side project for enthusiasts and gamers looking for another outlet. For SHIELD to succeed, Google needs to prove that Android TV can improve over other integrated smart TV platforms as well as other set-top box platforms like Boxee, Roku and even the upcoming Apple TV refresh.
But first, let’s get an overview of the NVIDIA SHIELD device, pricing and specifications, before diving into my experiences with the platform as a whole.
Introduction: Improving Portable Sound
The Calyx PaT is a very small USB DAC and headphone amp that can be used with PCs and mobile devices, offering the possibility of better sound from just about any digital source. So how does it sound? Let’s find out!
The PaT is a very interesting little device, to be sure. It rather resembles a large domino and weighs less than 1 ounce thanks to an ultra-light aluminum construction. It requires no battery or power source other than its micro USB connection, yet it provides sufficient power (0.8 V output) for in-ear monitors and efficient headphones through its 3.5mm headphone jack. Inside is a proprietary mix of DAC and amplifier circuitry, and like other products produced by Calyx, a Korean company with little presence in the United States, there is the promise of a dedication to great sound. Did Calyx pull it off with the diminutive PaT?
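To put that 0.8 V output figure in context, a quick power calculation (P = V²/R) shows what it means for different headphone loads. The impedances below are illustrative assumptions of my own, not PaT specifications:

```python
# Rough output power from the PaT's stated 0.8 V output into common
# headphone loads. Impedances are illustrative assumptions, not specs.
v_out = 0.8  # volts RMS, from the article

for impedance_ohms in (16, 32, 300):
    # P = V^2 / R, converted from watts to milliwatts
    power_mw = (v_out ** 2) / impedance_ohms * 1000
    print(f"{impedance_ohms:>3} ohm load: {power_mw:.1f} mW")
```

Tens of milliwatts is plenty for sensitive in-ear monitors and efficient headphones, which lines up with Calyx's positioning of the device; high-impedance studio headphones would be left with only a couple of milliwatts.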
Improving Portable Sound
Outboard DACs and headphone amplifiers for computers and mobile devices are nothing new, with recent products like AudioQuest’s Dragonfly a prime example in the portable USB DAC market (though it offers no mobile support). When I first heard about the PaT during CES it was still in the prototype stage, but I was interested because of the Calyx name if nothing else, as I already owned the Calyx M DAP and had been quite honestly blown away by the sound.
So what need might I have for the interestingly-named PaT (pronounced "paat", meaning "bean" in reference to the small size), which is itself a DAC that requires another device to play music files? It wasn’t until I had the opportunity to speak with Calyx president Seungmok Yi during CES (via video chat as I couldn’t attend the show) that I started to realize that this could be a compelling product, not just for the $99 price tag - a bargain for an audiophile product - but because of how versatile the PaT can be. You don't have to identify as an "audiophile" to appreciate the clearer and more detailed sound of a good DAC, especially when so many of us simply haven't heard one, least of all on a mobile device.
ARM Releases Cortex-A72 for Licensing
On February 3rd, ARM announced a slew of new designs, including the Cortex A72. Few details were shared with us, but what we learned was that it could potentially redefine power and performance in the ARM ecosystem. Ryan was invited to London to participate in a deep dive of what ARM has done to improve its position against market behemoth Intel in the very competitive mobile space. Intel has a leg up on process technology with their 14nm Tri-Gate process, but they are continuing to work hard in making their x86 based processors more power efficient, while still maintaining good performance. There are certain drawbacks to using an ISA that is focused on high performance computing rather than being designed from scratch to provide good performance with excellent energy efficiency.
ARM has been on a pretty good roll with its Cortex A9, A7, A15, A17, A53, and A57 parts over the past several years. These designs have been utilized in a multitude of products and scenarios, with configurations that have scaled up to 16 cores. While each iteration has improved upon the previous, ARM is facing the specter of Intel’s latest generation of highly efficient x86 SoCs built on the 2nd-gen 14nm Tri-Gate process. Several things have fallen into place to help ARM stay competitive, but we also cannot ignore the experience and design hours that have led to this product.
(Editor's Note: During my time with ARM last week it became very apparent that the company is not standing still, nor is it satisfied with its current status. With competition from Intel, Qualcomm and others ramping up over the next 12 months in both the mobile and server markets, ARM will depend more than ever on the evolution of its core and GPU designs to maintain advantages in performance and efficiency. As Josh details here, the Cortex-A72 appears to be an incredibly impressive design, and every indication, including conversations I have had with others outside of ARM, suggests it will be an incredibly successful product.)
Cortex A72: Highest Performance ARM Cortex
ARM has been ubiquitous in mobile applications since it first started licensing its designs in the 1990s. Its cores were found seemingly everywhere, yet most people wouldn’t recognize the ARM name because the chips were fabricated and sold by licensees under their own brands. Companies like TI, Qualcomm, Apple, DEC and others all licensed and adopted ARM technology in one form or another.
ARM’s importance grew dramatically with the arrival of increasingly complex cellphones and smartphones, and it gained further attention through multimedia devices such as the Microsoft Zune. What was once a fairly niche company with low-performance, low-power offerings became the 800 pound gorilla of the mobile market, with billions of ARM-based chips sold every year. To stay in that position ARM has worked aggressively to keep delivering excellent power characteristics in its parts, but it is now focusing squarely on overall performance and capabilities to address not only the smartphone market but also the higher-performance computing and server spaces where it wants a significant presence.
When I was first handed the Intel Compute Stick at CES back in January, my mind began to race with questions. The first set was centered on the capabilities of the device itself: where could it be used, how much performance could Intel pack into it, and just how many users would be interested in a product like this? Another set of questions was much more philosophical in nature: why was Intel going in this direction, does this mean an end to the emphasis on high performance componentry from Intel, and who comes up with these darned part numbers?
I have since settled my mind on the issues surrounding Intel’s purpose with the Compute Stick and began to dive into the product itself. On the surface the Intel Compute Stick is a product entering late into a potentially crowded market. We already have devices like the Roku, Google Chromecast, the Apple TV, and even the Amazon Fire TV Stick. All of those devices share some of the targets and goals of the Compute Stick, but the one area where Intel’s product really stands out is flexibility. The Roku has the most pre-built applications and “channels” for a streaming media box. The Chromecast is dirt cheap at just $30 or so. Even Amazon’s Fire TV Stick is clearly the best choice for streaming Amazon’s own multimedia services. But the Intel Compute Stick can do all of those things – in addition to operating as a standalone PC with Windows or Linux. Anything you can do I can do better…
But it’s not a product without a few flaws, most of which revolve around the state of current operating system designs for TVs and larger displays. Performance obviously isn’t peeling the paint off any walls, as you would expect. But I still think that, at $150 with a full copy of Windows 8.1 with Bing, the Intel Compute Stick is going to find more fans than you might have first expected.
Battle of the Sixes, they call it
GameBench is a low-level application, released in 2014, that attempts to bring the technical analysis and benchmarking capability of the PC to mobile devices. You might remember that I showed some early results and discussed our use of GameBench in my Dell Venue 8 7000 review a few months back; my understanding of the software was just beginning at that time and has continued to grow as I spend more time with it.
The idea is simple yet powerful: GameBench gives Android users, and soon iOS users, the ability to monitor the frame rate of nearly any game or 3D application you can run on a phone or tablet, accurately measuring real-world performance. This is similar to what we have done for years on the PC with FRAPS, allowing us to gather average frames-per-second data over time. Such data was previously unavailable to consumers, or the press for that matter, and could be a very powerful tool for device-to-device comparisons going forward. The ability to benchmark actual games and applications, gathering data that reflects the consumer experience rather than the synthetic graphics tests we have been forced to use in the past, will fundamentally change how we test and compare mobile hardware.
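To make the measurement concrete, here is a minimal sketch of the kind of calculation a FRAPS-style tool performs: given per-frame presentation timestamps, it derives the elapsed capture time and the average frames per second over that window. The function name and the sample timestamps below are hypothetical, purely for illustration; GameBench's actual internal implementation is not public.

```python
def average_fps(timestamps):
    """Compute average frames per second from a list of per-frame
    presentation timestamps (in seconds, ascending order)."""
    if len(timestamps) < 2:
        raise ValueError("need at least two frames to measure a rate")
    elapsed = timestamps[-1] - timestamps[0]
    frame_intervals = len(timestamps) - 1  # N frames define N-1 intervals
    return frame_intervals / elapsed

# Example: 5 frames spaced ~16.7 ms apart, i.e. roughly 60 FPS
stamps = [0.0, 0.0167, 0.0334, 0.0501, 0.0668]
print(round(average_fps(stamps), 1))  # → 59.9
```

A real tool captures these timestamps continuously while the game runs, reporting the average over the whole session (and often per-second values, which is what makes device-to-device comparisons possible).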
Image source: GameBench.net
Today, GameBench itself released a small report meant to showcase some of the kinds of data the software can gather while also revealing early support for Apple’s iPhone and iPad devices. Primary competitors for the comparison include the Apple iPhone 6, the Samsung Galaxy S6, HTC One M9 and Motorola Nexus 6. I was able to get an early look at the report and offer some feedback, while sharing with our readers my views on the results.
GameBench tested those four devices in a total of 10 games:
- Asphalt 8: Airborne
- Real Racing 3
- Dead Trigger 2
- Kill Shot
- Modern Combat 5: Blackout
- Boom Beach
- XCOM: Enemy Unknown
- GTA: San Andreas
- Marvel: Contest of Champions
- Monument Valley
These games vary in price and play style, but all of them sit in the top 50 games lists for each platform and are known for their graphically intense settings and visuals.
Introduction and First Impressions
The ASUS X205 offers the full Windows 8.1 notebook experience for the cost of a Chromebook, and the design offers a surprising amount of polish for the price. Is this $199 Atom-powered notebook a viable solution as a daily driver? We're about to find out.
What do you use a laptop for? A thoughtful answer to this question can be the most important part of selecting your next notebook PC, and if your needs are modest there is a growing number of very low-cost options on the market. For example, I personally do not play games on a laptop, typically alternating between web, email, and Microsoft Office. For me, then, the most important aspects of a notebook are screen quality, keyboard, trackpad, and battery life. High performance is not of utmost importance, and I assure myself of at least speedy load times by always choosing (or installing) a solid-state drive. For those reasons, when I first read the description and specifications of the ASUS X205 notebook, I took notice.
The X205 is a small notebook with an 11.6” display at 1366x768 resolution, essentially matching the form factor of Apple's 11.6" MacBook Air. It is powered by a quad-core Intel Atom processor with 2GB of RAM, and onboard storage is solid-state - though limited to 32GB and of the slower eMMC variety (in keeping with many Chromebooks). There is adequate connectivity as well, with the expected wireless card and two USB 2.0 ports. One aspect of this design that intrigued me was the trackpad, which ASUS claims uses "smartphone technology", indicating a touchscreen digitizer implementation. Smoothness and accuracy are the biggest problems I find with most inexpensive notebook trackpads, and if this one turns out to be a strong performer it would be a major boon to the X205's overall usability. I opted for the Microsoft Signature Edition of the X205TA, which carries the same $199 retail price but does not come preloaded with any trialware or other junk software.
At the outset this feels like a compelling product simply because it retails for the same price as an average Chromebook while offering the flexibility of a full Windows 8.1 installation. Granted, this is the “Windows 8.1 with Bing” version found on low-cost, low-power devices like this, but it offers the functionality of the standard version. While Chrome OS and Google's productivity apps are great for many people, the ability to install and run Windows applications made this highly preferable to a Chromebook for me. Of course, beyond the operating system, the overall experience of using the laptop will ultimately decide the viability of this inexpensive product, so without further preamble let's dive right into the X205TA notebook!