Manufacturer: StarTech

Introduction and Design

PB063557.jpg

We’re always on the hunt for good docking stations, and sometimes it can be difficult to locate one when you aren’t afforded the luxury of a dedicated docking port. Fortunately, with the advent of USB 3.0 and the greatly improved bandwidth that comes along with it, the options have become considerably more robust.

Today, we’ll take a look at StarTech’s USB3SDOCKHDV, more specifically labeled the Universal USB 3.0 Laptop Docking Station - Dual Video HDMI DVI VGA with Audio and Ethernet (whew). This docking station carries an MSRP of $155, though its street price is around $125 at resellers such as Amazon (currently $123 on Amazon.com). That positions it well above other StarTech options, such as the $100 USBVGADOCK2, which offers just one video output (VGA), 10/100 Ethernet, and four USB 2.0 ports.

The big selling points of the USB3SDOCKHDV are its addition of three USB 3.0 ports and Gigabit Ethernet—but most enticingly, its purported ability to provide three total screens simultaneously (including the connected laptop’s LCD) by way of dual HD video output. This video output can be achieved by way of either HDMI + DVI-D or HDMI + VGA combinations (but not by VGA + DVI-D). We’ll be interested to see how well this functionality works, as well as what sort of toll it takes on the CPU of the connected machine.

Continue reading our review of the StarTech USB3SDOCKHDV USB 3.0 Docking Station!!!

Author:
Subject: Mobile
Manufacturer: NVIDIA

Once known as Logan, now known as K1

NVIDIA has bet big on Tegra.  Since the introduction of the SoC's first iteration, that much was clear.  With the industry push to mobile computing and the decreased importance of the classic PC design, developing and gaining traction with a mobile processor was not only an expansion of the company’s portfolio but a critical shift in the mindset of a graphics giant. 

The problem thus far is that while NVIDIA continues to enjoy success in the workstation and consumer discrete graphics markets, the Tegra line of system-on-chip processors has faltered.  Design wins have been tough to come by, and other companies with feet already firmly planted on this side of the hardware fence continue to innovate and seal deals with customers.  Qualcomm is the dominant player in mobile processors, with Samsung, MediaTek, and others all fighting for the same customers NVIDIA needs.  While press conferences and releases have been all smiles and sunshine since day one, the truth is that Tegra hasn’t grown at the rate NVIDIA had hoped.

Solid products based on NVIDIA Tegra processors have been released.  The first Google Nexus 7 used the Tegra 3 processor and was considered by most to be the best Android tablet on the market, until it was succeeded by the 2013 iteration of the Nexus 7 this year.  Tegra 4 slipped backwards, though – the NVIDIA SHIELD mobile gaming device was the answer for a company eager to show the market that it could build compelling and relevant hardware.  It has only partially succeeded in that task.

denver2.jpg

With today’s announcement of the Tegra K1, previously known as Logan or Tegra 5, NVIDIA hopes to once again spark a fire under partners and developers, showing them that NVIDIA’s dominance in PC graphics has clear benefits for the mobile segment as well.  During a meeting with NVIDIA about Tegra K1, Dan Vivoli, Senior VP of marketing and a 16-year employee, equated the release of the K1 to that of the original GeForce GPU.  That is a lofty ambition, and it puts a lot of pressure on the entire Tegra team, not to mention the K1 product itself, to live up to.

Tegra K1 Overview

What we previously knew as Logan or Tegra 5 (and it was actually called Tegra 5 until just a couple of days ago) is now being released as the Tegra K1.  The ‘K’ designation indicates the graphics architecture that powers the SoC, in this case Kepler.  Also, it’s the first one.  So, K1.

The CPU side of the Tegra K1 looks very familiar: four ARM Cortex-A15 “r3” cores with 2MB of L2 cache, plus a fifth A15 core used in lower-power situations.  This 4+1 design is the same one introduced with the Tegra 4 processor last year, and it allows NVIDIA to implement its own unique style of “big.LITTLE” design.  Tegra K1 includes some slight modifications to the cores that improve performance and efficiency, but not by much – the main CPU is very similar to that of the Tegra 4.
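As a rough illustration of the 4+1 idea, a cluster-migration policy can be sketched in a few lines of Python. This is entirely our own toy model, with made-up thresholds and function names; it is not NVIDIA’s actual governor:

```python
import math

# Toy sketch of a 4+1 cluster-migration policy, loosely modeled on the
# idea behind Tegra's companion core. The threshold and structure are
# illustrative assumptions, not NVIDIA's real scheduler.

COMPANION_MAX_LOAD = 0.25   # fraction of one core's capacity


def active_cores(total_load):
    """Pick which cluster handles the current load.

    total_load is expressed in 'cores worth' of work (0.0 .. 4.0).
    """
    if total_load <= COMPANION_MAX_LOAD:
        # idle/background work stays on the single low-power core
        return ("companion", 1)
    # otherwise wake only as many performance A15 cores as the load requires
    needed = min(4, max(1, math.ceil(total_load)))
    return ("performance", needed)


print(active_cores(0.1))   # background task: companion core only
print(active_cores(2.5))   # heavy load: several performance cores
```

The key property, as with big.LITTLE, is that the two clusters are never active at the same time: work migrates wholesale between the low-power core and the performance cores.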

NVIDIA also unveiled late last night another version of the Tegra K1 that replaces the quad A15 cores with two of the company's custom-designed Denver CPU cores.  Project Denver, announced in early 2011, is NVIDIA's attempt at building its own core design based on the ARMv8 64-bit ISA.  This puts this iteration of Tegra K1 in the same category of custom designs as Apple's A7 and Qualcomm's Krait processors.  When these are finally available in the wild, it will be incredibly intriguing to see how well NVIDIA's architects did in the first true CPU design from the GPU giant.

Continue reading about NVIDIA's new Tegra K1 SoC with Kepler-based graphics!

Coverage of CES 2014 is brought to you by AMD!

PC Perspective's CES 2014 coverage is sponsored by AMD.

Follow all of our coverage of the show at http://pcper.com/ces!

Subject: Mobile
Manufacturer: Lenovo

Introduction and Design

PA273501.jpg

Contortionist PCs are a big deal these days as convertible models take the stage to help bridge the gap between notebook and tablet. But not everyone wants to drop a grand on a convertible, and not everyone wants a 12-inch notebook, either. Meanwhile, these same people may not wish to blow their cash on an underpowered (and far less capable) Chromebook or tablet. It’s for these folks that Lenovo has introduced the IdeaPad Flex 14 Ultrabook, which occupies a valuable middle ground between the extremes.

The Flex 14 looks an awful lot like a Yoga at first glance, with the same sort of acrobatic design and a thoroughly IdeaPad styling (Lenovo calls it a “dual-mode notebook”). The specs are also similar to that of the x86 Yoga, though with the larger size (and later launch), the Flex also manages to assemble a slightly more powerful configuration:

specs.png

The biggest internal differences here are the i5-4200U CPU, which is a 1.6 GHz Haswell model with a TDP of 15 W and the ability to Turbo Boost (versus the Yoga 11S’ i5-3339Y, which is Ivy Bridge with a marginally lower TDP of 13 W and no Turbo Boost), the integrated graphics improvements that follow with the newer CPU, and a few more ports made possible by the larger chassis. Well, and the regression to a TN panel from the Yoga 11S’ much-appreciated IPS display, which is a bummer. Externally, your wallet will also appreciate a $250 drop in price: our model, as configured here, retails for just $749 (versus the $999 Yoga 11S we reviewed a few months back).

You can actually score a Flex 14 for as low as $429 (as of this writing), by the way, but if you’re after any sort of respectable configuration, that price quickly climbs above the $500 mark. Ours is the least expensive option currently available with both a solid-state drive and an i5 CPU.

Continue reading our review of the Lenovo IdeaPad Flex 14!!!

Author:
Subject: Mobile
Manufacturer: NVIDIA

Streaming games straight from NVIDIA

Over the weekend NVIDIA released a December update for the SHIELD Android mobile gaming device that included a very interesting, and somewhat understated, new feature: Beta support for NVIDIA GRID.  

You have likely heard of GRID before; NVIDIA has been pushing it as part of the company's vision of bringing GPU computing to every facet and market.  GRID is aimed at creating GPU-based server farms to enable mobile, streaming gaming for users across the country and around the world.  While initially NVIDIA only talked about working with partners to launch streaming services based on GRID, the company has obviously changed its tune slightly with this limited release.

01_0.png

If you own a SHIELD, and install the most recent platform update, you'll find a new icon in your NVIDIA SHIELD menu called GRID Beta.  The first time you start this new application, it will attempt to measure your bandwidth and latency to offer up an opinion on how good your experience should be.  NVIDIA is asking for at least 10 Mbps of sustained bandwidth, and wants round trip latency under 60 ms from your location to their servers.
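The kind of go/no-go verdict that bandwidth test produces can be sketched from NVIDIA’s stated thresholds (10 Mbps sustained, 60 ms round trip). The function and verdict strings below are our own illustration, not NVIDIA’s API:

```python
# Hypothetical sketch of a streaming-readiness check using NVIDIA's
# stated GRID Beta requirements: at least 10 Mbps sustained bandwidth
# and under 60 ms round-trip latency to the servers.

MIN_BANDWIDTH_MBPS = 10.0
MAX_LATENCY_MS = 60.0


def grid_readiness(bandwidth_mbps, latency_ms):
    """Return a rough verdict for a measured connection."""
    meets_bw = bandwidth_mbps >= MIN_BANDWIDTH_MBPS
    meets_lat = latency_ms < MAX_LATENCY_MS
    if meets_bw and meets_lat:
        return "good"
    if meets_bw or meets_lat:
        return "marginal"   # one requirement met; expect hiccups
    return "poor"


print(grid_readiness(25.0, 40.0))   # both requirements met
print(grid_readiness(25.0, 95.0))   # latency too high, e.g. far from the servers
```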

02_0.png

Currently, servers are ONLY located in Northern California, so the farther away you are, the more likely you are to run into problems.  However, doing some testing in Kentucky and Ohio resulted in very playable gaming scenarios, though we did run into some connection problems that might be load-based or latency-based.

04_0.png

After the network setup portion, users are shown 8 different games that they can try: Darksiders, Darksiders II, Street Fighter X Tekken, Street Fighter IV, Alan Wake, The Witcher 2, Red Faction: Armageddon, and Trine 2.  You are free to play them at no charge during this beta, though you can be sure they will be removed at some point; just a reminder.  Saves work well, and we were able to save and resume games of Darksiders II on GRID easily and quickly.

06_0.png

Starting up the game was fast, about on par with starting up a game on a local PC, though obviously the server is loading it in the background.  Once the game is up and running, you are met with some button mapping information provided by NVIDIA for that particular game (great addition) and then you jump into the menus as if you were running it locally.

Continue reading our first hands on with NVIDIA GRID on SHIELD!!

Author:
Subject: Mobile
Manufacturer: EVGA

NVIDIA Tegra Note Program

Clearly, NVIDIA’s Tegra line has not been as successful as the company had hoped and expected.  The move for the discrete GPU giant into the highly competitive world of the tablet and phone SoCs has been slower than expected, and littered with roadblocks that were either unexpected or that NVIDIA thought would be much easier to overcome. 

IMG_1879.JPG

The truth is that this was always a long play for the company; success was never going to come overnight, and anyone who thought that was likely was deluded.  Part of it has to do with the development cycle of the ARM ecosystem.  NVIDIA is used to a rather quick development, production, marketing, and sales pattern thanks to its time in high-performance GPUs, but the SoC world is quite different.  By the time a device based on a Tegra chip reaches the retail channel, it has gone through an OEM development cycle, an NVIDIA SoC development cycle, and even an ARM Cortex CPU development cycle.  The result is an extended time frame from initial product announcement to retail availability.

Partly due to this, and partly due to limited design wins in the mobile markets, NVIDIA has started to develop internally designed end-user devices that utilize its Tegra SoC processors.  This has the benefit of being much faster to market – while most SoC vendors develop reference platforms during the normal course of business, NVIDIA is essentially going to perfect and productize them.

Continue reading our review of the NVIDIA Tegra Note 7 $199 Tablet!!

Subject: Mobile
Manufacturer: MSI

Introduction and Design

P9173249.jpg

With few exceptions, it’s generally been taken for granted that gaming notebooks are going to be hefty devices. Portability is rarely the focus, with weight and battery life alike usually sacrificed in the interest of sheer power. But the MSI GE40 2OC—the lightest 14-inch gaming notebook currently available—seeks to compromise while retaining the gaming prowess. Trending instead toward the form factor of a large Ultrabook, the GE40 is both stylish and manageable (and perhaps affordable at around $1,300)—but can its muscle withstand the reduction in casing real estate?

While it can’t hang with the best of the 15-inch and 17-inch crowd, in context with its 14-inch peers, the GE40’s spec sheet hardly reads like it’s been the subject of any sort of game-changing handicap:

specs.png

One of the most popular CPUs for Haswell gaming notebooks has been the 2.4 GHz (3.4 GHz Turbo) i7-4700MQ. But the i7-4702MQ in the GE40 2OC is nearly as powerful (managing 2.2 GHz and 3.2 GHz in those same areas, respectively), and it features a TDP that’s 10 W lower at just 37 W. That’s ideal for notebooks such as the GE40, which seek to provide a thinner case in conjunction with uncompromising performance. Meanwhile, the NVIDIA GTX 760M is no slouch, even if it isn’t on the same level as the 770M and 780M GPUs that we’ve been seeing in some 15.6-inch and 17.3-inch gaming beasts.

Elsewhere, it’s business as usual, with 8 GB of RAM and a 120 GB SSD rounding out the major bullet points. Nearly everything here is on par with the best of rival 14-inch gaming models with the exception of the 900p screen resolution (which is bested by some notebooks, such as Dell’s Alienware 14 and its 1080p panel).

Continue reading our review of the MSI GE40 2OC!!!

Author:
Manufacturer: ARM

ARM is Serious About Graphics

Ask most computer users from 10 years ago who ARM is, and very few would give the correct answer.  Some well-informed people might mention “Intel” and “StrongARM” or “XScale”, but ARM remained a shadowy presence until we saw the rise of the smartphone.  Since then, ARM has built up its brand, much to the chagrin of companies like Intel and AMD.  Partners such as Samsung, Apple, Qualcomm, MediaTek, Rockchip, and NVIDIA have all worked with ARM to produce chips based on the ARMv7 architecture, with Apple the first to release an ARMv8 (64-bit) SoC.  The multitude of ARM architectures likely makes up the most shipped chips in the world, ranging from very basic processors to the very latest Apple A7 SoC.

t700_01.jpg

The ARMv7 and ARMv8 architectures are very power efficient, yet provide enough performance to handle the vast majority of tasks performed on smartphones and tablets (as well as a handful of laptops).  With the growth of visual computing, ARM has also dedicated itself to designing competent graphics for its chips.  The Mali architecture is aimed at being an affordable option for those without their own graphics design groups (unlike NVIDIA and Qualcomm), yet competitive with others that are willing to license out their IP (such as Imagination Technologies).

ARM was in fact one of the first to license out the very latest graphics technology to partners in the form of the Mali-T600 series of products.  These modules were among the first to support OpenGL ES 3.0 (compatible with 2.0 and 1.1) and DirectX 11.  The T600 architecture is very comparable to Imagination Technologies’ Series 6 and the Qualcomm Adreno 300 series of products.  Currently NVIDIA does not have a unified mobile architecture in production that supports OpenGL ES 3.0/DX11, but they are adapting the Kepler architecture to mobile and will be licensing it to interested parties.  Qualcomm does not license out Adreno after buying that group from AMD (Adreno is an anagram of Radeon).

Click to read the entire article here!

Subject: Mobile
Manufacturer: ASUS

Introduction and Design

P9033132.jpg

As we’re swimming through the veritable flood of Haswell refresh notebooks, we’ve stumbled across the latest in a line of very popular gaming models: the ASUS G750JX-DB71.  This notebook is the successor to the well-known G75 series, which topped out at an Intel Core i7-3630QM with NVIDIA GeForce GTX 670MX dedicated graphics.  Now, ASUS has jacked up the specs a little more, including the latest 4th-gen CPUs from Intel as well as 700-series NVIDIA GPUs.

Our ASUS G750JX-DB71 test unit features the following specs:

specs.png

Of course, the closest comparison to this unit is the recently reviewed MSI GT60-2OD-026US, which featured nearly identical specifications apart from a 15.6” screen, a better GPU (a GTX 780M with 4 GB GDDR5), and a slightly different CPU (the Intel Core i7-4700HQ).  In case you’re wondering what the difference is between the ASUS G750JX’s Core i7-4700MQ and the GT60’s i7-4700HQ, it’s very minor: the HQ features a slightly faster integrated graphics Turbo frequency (1.2 GHz vs. 1.15 GHz) and supports Intel Virtualization Technology for Directed I/O (VT-d).  Since the G750JX doesn’t support Optimus, we won’t ever be using the integrated graphics, and unless you’re doing a lot with virtual machines, VT-d isn’t likely to offer any benefits, either.  So for all intents and purposes, the CPUs are equivalent—meaning the biggest overall performance difference (on the spec sheet, anyway) lies with the GPU and the storage devices (where the G750JX offers more solid-state storage than the GT60).  It’s no secret that the MSI GT60 burned up our benchmarks—so the real question is how close the ASUS G750JX comes to its pedestal and, if the differences are considerable, whether they are justified.

At an MSRP of around $2,000 (though it can be found for around $100 less), the ASUS G750JX-DB71 competes directly with the likes of the MSI GT60, too (which is priced equivalently).  The question, of course, is whether it truly competes.  Let’s find out!

P9033137.jpg

Continue reading our review of the ASUS G750JX-DB71 Gaming Notebook!!!

Manufacturer: Scott Michaud

A new generation of Software Rendering Engines.

We have been busy with side projects here at PC Perspective over the last year. Ryan has nearly broken his back rating the frames. Ken, along with running the video equipment and "getting an education", developed a hardware switching device for Wirecast and XSplit.

My project, "Perpetual Motion Engine", has been researching and developing a GPU-accelerated software rendering engine. Now, to be clear, this is in very early development for the moment. The point is not to draw beautiful scenes. Not yet. The point is to show what OpenGL and DirectX do, and what limits are removed when you do the math directly.

Errata: BioShock uses a modified Unreal Engine 2.5, not 3.

In the above video:

  • I show the problems with graphics APIs such as DirectX and OpenGL.
  • I talk about what those APIs attempt to solve, finding color values for your monitor.
  • I discuss the advantages of boiling graphics problems down to general mathematics.
  • Finally, I prove the advantages of boiling graphics problems down to general mathematics.

I would recommend watching the video, first, before moving forward with the rest of the editorial. A few parts need to be seen for better understanding.

Click here, after you watch the video, to read more about GPU-accelerated Software Rendering.

Manufacturer: Microsoft

If Microsoft was left to their own devices...

Microsoft's Financial Analyst Meeting 2013 set the stage, literally, for Steve Ballmer's last annual keynote to investors. The speech promoted Microsoft, its potential, and its unique position in the industry. He proclaimed, firmly, the company's desire to be a devices and services company.

The explanation, however, does not befit either industry.

microsoft-ballmer-goodbye.jpg

Ballmer noted, early in the keynote, how Bing is the only notable competitor to Google Search. He wanted to make it clear, to investors, that Microsoft needs to remain in the search business to challenge Google. The implication is that Microsoft can fill the cracks where Google does not, or even cannot, and establish a business from that foothold. I agree. Proprietary products (which are not inherently bad, by the way), such as Google Search, require one or more rivals to fill the overlooked or under-served niches. A legitimate business can be established from that basis.

It is the following, similar, statement which troubles me.

Ballmer later mentioned, in the same vein, how Microsoft is among the few making fundamental operating system investments. Like search, the implication is that operating systems are proprietary products which must compete against one another. This, albeit subtly, does not match their vision as a devices and services company. The point of a proprietary platform is to own the ecosystem, from end to end, and to derive your value from that control. The product is not a device; the product is not a service; the product is a platform. This makes sense to them because, from birth, they were a company which sold platforms.

A platform as a product is not a device, nor is it a service.

Keep reading to see what Microsoft... probably still cannot.

Author:
Subject: Processors, Mobile
Manufacturer: Intel

A Whole New Atom Family

This past spring I spent some time with Intel at its offices in Santa Clara to learn about a brand new architecture called Silvermont.  Built for and targeted at low power platforms like tablets and smartphones, Silvermont was not simply another refresh of the aging Atom processors that were all based on Pentium cores from years ago; instead Silvermont was built from the ground up for low power consumption and high efficiency to compete against the juggernaut that is ARM and its partners.  My initial preview of the Silvermont architecture had plenty of detail about the change to an out-of-order architecture, the dual-core modules that comprise it and the power optimizations included. 

slides01.jpg

Today, during the annual Intel Developer Forum held in San Francisco, we are finally able to reveal the remaining details about the new Atom processors based on Silvermont, code named Bay Trail.  Not only do we have new information about the designs, but we were able to get our hands on some reference tablets integrating Bay Trail and the new Atom Z3000 series of SoCs to benchmark and compare to offerings from Qualcomm, NVIDIA and AMD.

 


It should be no surprise to anyone that the name “Intel Atom Processor” has had a stigma attached to it almost since its initial release during the netbook craze.  It was known for being slow and hastily put together, though it was still a very successful product in terms of sales.  With each successive release and update, from Diamondville to Pineview to Cedarview, Atom was improved, but only marginally so.  Even with Medfield and Clover Trail, the products were based around that legacy architecture, and it showed.  Tablets and systems based on Clover Trail saw only moderate success and lukewarm reviews.

slides02.jpg

With Silvermont the Atom brand gets a second chance.  Some may consider it a fifth or sixth chance, but Intel is sticking with the name.  Silvermont as an architecture is incredibly flexible and will find its way into several Intel products like Avoton, Bay Trail and Merrifield, and into segments from micro-servers to smartphones to convertible tablets.  Not only that, but Intel is aware that Windows isn’t the only game out there anymore, and the company will support the architecture across Linux, Android and Windows environments.

Atom has been in tablets for some time now, starting in September of last year with Clover Trail designs being announced during IDF.  In February we saw the initial Android-based options also filter out, again based on Clover Trail.  They were okay, but really only stop-gaps to prove that Intel was serious about the space.  The real test will be this holiday season with Bay Trail at the helm.

slides03.jpg

While we always knew these Bay Trail platforms were going to be branded as Atom, we now have the full details on the numbering scheme and productization of the architecture.  The Atom Z3700 series will consist of quad-core SoCs with Intel HD Graphics (the same design as in the Core processor series, though with fewer execution units) that will support Windows and Android operating systems.  The Atom Z3600 series will be dual-core processors, still with Intel HD Graphics, targeted only at the Android market.

Continue reading our review of the Intel Bay Trail processor, Atom Z3770!!

Subject: Mobile
Manufacturer: Lenovo

Introduction and Design

P7192827.jpg

It seems like only yesterday (okay, last month) that we were testing the IdeaPad Yoga 11, which was certainly an interesting device. That’s primarily because of what it represents: namely, the slow merging of the tablet and notebook markets. You’ve probably heard people proclaiming the death of the PC as we know it. Not so fast—while it’s true that tablets have eaten into the sales of what were previously low-powered notebooks and now-extinct netbooks, there is still no way to replace the utility of a physical keyboard and the sensibility of a mouse cursor. Touch-centric devices are hard to beat when entertainment and education are the focus of a purchase, but as long as productivity matters, we aren’t likely to see traditional means of input and a range of connectivity options disappear anytime soon.

The IdeaPad Yoga 11 leaned so heavily in the direction of tablet design that it arguably was more tablet than notebook. That is, it featured a tablet-grade SoC (the NVIDIA Tegra 3) as opposed to a standard Intel or AMD CPU, an 11” display, and phenomenal battery life that can only be compared to that of other ARM-based tablets. But, of course, with those allegiances come necessary concessions, not least of which are the inability to run x86 applications and the consequential half-baked experiment that is Windows RT.

P7192796.jpg

Fortunately, there’s always room for compromise, and for those of us searching for something closer to a notebook than the original Yoga 11, we’re now afforded the option of the 11S. Apart from being nearly identical in terms of form factor, the $999 (as configured) Yoga 11S adopts a standard x86 chipset with Intel ULV CPUs, which allows it to run full-blown Windows 8. That positions it squarely in-between the larger x86 Yoga 13 and the ARM-based Yoga 11, which makes it an ideal candidate for someone hoping for the best of both worlds. But can it survive the transition, or do its compromises outstrip its gains?

Our Yoga 11S came equipped with a fairly standard configuration:

specs.png

Unless you’re comparing to the Yoga 11’s specs, not much about this stands out. The Core i5-3339Y is the first thing that jumps out at you; in place of the NVIDIA Tegra 3 ARM-based SoC of the original Yoga 11, it’s a much more powerful chip with a 13 W TDP and (thanks to its x86 architecture) the ability to run Windows 8 and standard Windows applications. Next on the list is the included 8 GB of DDR3 RAM—versus just 2 GB on the Yoga 11. Finally, there’s USB 3.0 and a much larger SSD (256 GB vs. 64 GB)—all valuable additions. One thing that hasn’t changed, meanwhile, is the battery size. Surely you’re wondering how this will affect the longevity of the notebook under typical usage. Patience; we’ll get to that in a bit! First, let’s talk about the general design of the notebook.

Continue reading our review of the Lenovo IdeaPad Yoga 11S Convertible Notebook!

Author:
Subject: Storage, Mobile
Manufacturer: Corsair

500GB on the go

Corsair seems to have its fingers in just about everything these days, so why not mobile storage?  The Voyager Air is a multi-function device that Corsair describes as a "portable wireless drive, home network drive, USB drive, and wireless hub."  This battery-powered device is meant to act as a mobile hard drive for users who need more storage on the go, supporting PCs and Macs as well as iOS and Android devices.

The Voyager Air can also act as a basic home NAS device with a Gigabit Ethernet connection on board for all the computers on your local network. And if you happen to have DLNA ready Blu-ray players or TVs nearby, they can access the video and audio stored on the Voyager Air as well.

IMG_9838_0.JPG

Available in either red or black, and in 500GB and 1TB capacities, the Voyager Air is slim and sleek, meant to be seen rather than hidden in a closet. 

IMG_9836_0.JPG

The front holds the power switch and WiFi on/off switch, as well as back-lit icons indicating power, battery life, and connection status. 

IMG_9835_0.JPG

Continue reading our review of the Corsair Voyager Air 500GB Wireless USB 3.0 HDD!!

Author:
Subject: Mobile
Manufacturer: NVIDIA

The Hardware

Dear NVIDIA,

It has come to my attention that you are planning on producing and selling a device to be called “NVIDIA SHIELD.”  It should be noted that even though it shares the same name, this device has no matching attributes of the super-hero comic-based security agency.  Please adjust.

 

When SHIELD was previewed to the world at CES in January of this year, there were a hundred questions about the device.  What would it cost?  Would the build quality stand up to expectations?  Would the Android operating system hold up as a dedicated gaming platform?  After months of waiting, a SHIELD unit finally arrived in our offices in early July, giving us plenty of time (I thought) to really get a feel for the device and its strengths and weaknesses.  As it turned out, though, it still seemed like an inadequate amount of time to really gauge this product.  But I am going to take a stab at it, feature by feature.

IMG_9794.JPG

NVIDIA SHIELD aims to be a mobile gaming platform based on Android, with a flip-out touch-screen interface, a high-quality console-style integrated controller, and added features like PC game streaming and Miracast support.

Initial Unboxing and Overview of Product Video

 

The Hardware

At the heart of NVIDIA SHIELD is the brand new Tegra 4 SoC, NVIDIA’s latest entry into the world of mobile processors.  Tegra 4 is a quad-core, ARM Cortex-A15 based SoC that includes a fifth A15 core built on a lower-power optimized process technology to run background and idle tasks using less power.  This is very similar to what NVIDIA did with Tegra 3’s 4+1 technology, and to how ARM tackles the problem with its big.LITTLE philosophy. 

t4.jpg

Continue reading our review of the NVIDIA SHIELD Android gaming device!!

Author:
Subject: Storage, Mobile
Manufacturer: Promise
Tagged:

Overview

Since the initial release of the first computers with Intel’s Thunderbolt technology, Promise has been at the forefront of Thunderbolt-enabled storage devices. With the Pegasus R4 and R6, Promise was the first company to provide an external RAID solution with a Thunderbolt interface.
 
r4.jpg
 
Last year, we took a look at the Pegasus R4 in our initial Windows Thunderbolt testing and were extremely satisfied with the performance we saw. Since then, a Pegasus Thunderbolt RAID device filled with SSDs has been crucial to our Frame Rating graphics testing methodology, providing the extremely high bandwidth we need to capture uncompressed video.
 
Today we are taking a look at a different class of storage device from Promise: the Pegasus J2. The J2 is an external Thunderbolt-based SSD which Promise says is capable of speeds up to 550 MB/s write and 750 MB/s read. As it is one of the only standalone Thunderbolt drives we have seen, we were eager to take a look and evaluate these claims.
 
IMG_0155.JPG
 
The best way to describe the size of the Pegasus J2 is that it is approximately the same as a standard deck of playing cards. While it may not be as small as some of the external USB 3.0 SSDs we have seen, the J2 remains a reasonable size for throwing in a backpack or briefcase on the go.
 
Internally, the J2 consists of two mSATA SSDs, each sitting behind an ASMedia 1061 PCI Express SATA 6G controller, which is in turn connected to Intel’s Port Ridge Thunderbolt controller. Due to the lack of RAID functionality in the ASMedia 1061, the SSDs appear as two separate logical drives, relying on software RAID inside whatever OS you are using.
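To see why OS-level software RAID matters here, consider what a RAID-0 layer actually does: it maps each logical stripe to alternating drives so reads and writes hit both SSDs in parallel. The sketch below is a simplified illustration of that mapping; the stripe size and function name are our own assumptions, not how any particular OS implements it:

```python
# Simplified sketch of RAID-0 striping across the J2's two logical drives.
# A software RAID layer translates each logical block number into a
# (drive, block-on-drive) pair, alternating drives stripe by stripe.

def raid0_location(logical_block, stripe_size_blocks=128, num_drives=2):
    """Map a logical block number to (drive index, block on that drive)."""
    stripe = logical_block // stripe_size_blocks
    offset = logical_block % stripe_size_blocks
    drive = stripe % num_drives                 # alternate between drives
    block_on_drive = (stripe // num_drives) * stripe_size_blocks + offset
    return drive, block_on_drive


print(raid0_location(0))     # first stripe lands on drive 0
print(raid0_location(128))   # second stripe lands on drive 1
```

Because consecutive stripes land on different drives, sequential transfers are split across both SSDs, which is how a striped pair can approach the combined throughput of the two drives.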
 
IMG_0166.JPG
 
The SSDs themselves are based on the Phison PS3108 controller. While Phison doesn’t get much coverage for its SSD controllers, they have been found in value SSDs from the likes of Kingston, Patriot, and other companies for a few years at this point. 
 
Author:
Manufacturer: NVIDIA

NVIDIA Finally Gets Serious with Tegra

Tegra has had an interesting run of things.  The original Tegra 1 was utilized only by Microsoft with the Zune.  Tegra 2 saw better adoption, but did not produce the design wins to propel NVIDIA to a leadership position in cell phones and tablets.  Tegra 3 found a spot in Microsoft’s Surface, but that has turned out to be a far more bitter experience than expected.  Tegra 4 has so far been integrated into a handful of products and is being featured in NVIDIA’s upcoming Shield product.  It also hit some production snags that brought it to market later than expected.

I think the primary issue with the first three generations of products is pretty simple.  There was a distinct lack of differentiation from the other ARM based products around.  Yes, NVIDIA brought their graphics prowess to the market, but never in a form that distanced itself adequately from the competition.  Tegra 2 boasted GeForce-based graphics, but we did not find out until later that it consisted of basically four pixel shaders and four vertex shaders that had more in common with the GeForce 7800/7900 series than with any of the modern unified architectures of the time.  Tegra 3 boasted a big graphical boost, but it came in the form of doubling the pixel shader units while leaving the vertex units alone.

kepler_smx.jpg

While NVIDIA had very strong developer relations and a leg up on the competition in terms of software support, it was never enough to propel Tegra beyond a handful of devices.  NVIDIA is trying to rectify that with Tegra 4 and the 72 shader units that it contains (still divided between pixel and vertex units).  Tegra 4 is not perfect in that it is late to market and the GPU is not OpenGL ES 3.0 compliant.  ARM, Imagination Technologies, and Qualcomm are offering new graphics processing units that are not only OpenGL ES 3.0 compliant, but also offer OpenCL 1.1 support.  Tegra 4 does not support OpenCL.  In fact, it does not support NVIDIA’s in-house CUDA.  Ouch.

Jumping into a new market is not an easy thing, and invariably mistakes will be made.  NVIDIA worked hard to make a solid foundation with their products, and certainly they had to learn to walk before they could run.  Unfortunately, running effectively entails having design wins due to outstanding features, performance, and power consumption.  NVIDIA was really only average in all of those areas.  NVIDIA is hoping to change that.  Their first salvo into offering a product that offers features and support that is a step above the competition is what we are talking about today.

Continue reading our article on the NVIDIA Kepler architecture making its way to mobile markets and Tegra!

Subject: Mobile
Manufacturer: MSI

Introduction and Design

P7102677.jpg

With the release of Haswell upon us, we’re being treated to an impactful refresh of some already-impressive notebooks. Chief among the benefits are the much-championed battery life improvements—and while better power efficiency is obviously valuable where portability is a primary focus, beefier models can also benefit by way of increased versatility. Sure, gaming notebooks are normally tethered to an AC adapter, but when it’s time to unplug for some more menial tasks, it’s good to know that you won’t be out of juice in a couple of hours.

Of course, an abundance of gaming muscle never hurts, either. As the test platform for one of our recent mobile GPU analyses, MSI’s 15.6” GT60 gaming notebook is, for lack of a better description, one hell of a beast. Following up on Ryan’s extensive GPU testing, we’ll now take a more balanced and comprehensive look at the GT60 itself. Is it worth the daunting $1,999 MSRP? Does the jump to Haswell provide ample and economical benefits? And really, how much of a difference does it make in terms of battery life?

Our GT60 test machine featured the following configuration:

specs.png

In case it wasn’t already apparent, this device makes no compromises. Sporting a desktop-grade GPU and a quad-core Haswell CPU, it looks poised to be the most powerful notebook we’ve tested to date. Other configurations exist as well, spanning various CPU, GPU, and storage options. However, all available GT60 configurations feature a 1080p anti-glare screen, discrete graphics (starting at the GTX 670M), Killer Gigabit LAN, and a case built from metal and heavy-duty plastic. They also come preconfigured with Windows 8, so the only way to get Windows 7 with your GT60 is to purchase it through a reseller that performs customizations.

P7102684.jpg

Continue reading our review of the MSI GT60 Gaming Notebook!!

Author:
Manufacturer: NVIDIA

Another Wrench – GeForce GTX 760M Results

Just recently, I evaluated some of the current processor-integrated graphics options using our new Frame Rating performance metric. The results were very interesting, proving Intel has done some great work with its new HD 5000 graphics option for Ultrabooks. You might have noticed that the MSI GE40 didn’t just come with integrated HD 4600 graphics but also included a discrete NVIDIA GeForce GTX 760M on board.  While that previous article focused on the integrated graphics of Haswell, Trinity, and Richland, I did find some noteworthy results with the GTX 760M that I wanted to investigate and present.

IMG_0141.JPG

The MSI GE40 is a new Haswell-based notebook that includes the Core i7-4702MQ quad-core processor and Intel HD 4600 graphics.  Along with it, MSI has included the Kepler-architecture GeForce GTX 760M discrete GPU.

760mspecs.png

This GPU offers 768 CUDA cores running at a 657 MHz base clock, but it can stretch higher with GPU Boost technology.  It is configured with 2GB of GDDR5 memory running at 2.0 GHz.
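For context, those memory figures imply a theoretical peak bandwidth that is easy to work out. A minimal sketch, assuming a 128-bit memory interface (typical for this class of mobile GPU, though not stated in the spec table) and that GDDR5's quoted 2.0 GHz doubles to an effective 4.0 Gbps per pin:

```python
# Theoretical memory bandwidth sketch for the GTX 760M's GDDR5.
# Assumptions (not in the spec table above): a 128-bit memory bus, and
# an effective data rate of twice the quoted 2.0 GHz clock.
effective_gbps_per_pin = 2.0 * 2   # 4.0 Gbps per pin
bus_width_bits = 128               # assumed

bandwidth_gb_s = effective_gbps_per_pin * bus_width_bits / 8
print(bandwidth_gb_s)  # 64.0 GB/s theoretical peak
```

Real-world throughput will of course fall short of that ceiling, but it gives a sense of the headroom this GPU has over any integrated solution sharing system DDR3.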

If you didn’t read the previous integrated graphics article, linked above, some of its data will be spoiled here, so you might want to get a baseline of information by reading through it first.  Also, remember that we are using our Frame Rating performance evaluation system for this testing – a key differentiator from most other mobile GPU testing.  In fact, it is that difference that allowed us to spot an interesting issue with the configuration we are showing you today.

If you are not familiar with the Frame Rating methodology, and how we had to change some things for mobile GPU testing, I would really encourage you to read this page of the previous mobility Frame Rating article for the scoop.  The data presented below depends on that background knowledge!

Okay, you’ve been warned – on to the results.

Continue reading our story about GeForce GTX 760M Frame Rating results and Haswell Optimus issues!!

Author:
Manufacturer: Various

Battle of the IGPs

Our long journey with Frame Rating, a new capture-based analysis tool to measure graphics performance of PCs and GPUs, began almost two years ago as a way to properly evaluate the real-world experiences for gamers.  What started as a project attempting to learn about multi-GPU complications has really become a new standard in graphics evaluation and I truly believe it will play a crucial role going forward in GPU and game testing. 

Today we use these Frame Rating methods and tools, which are elaborately detailed in our Frame Rating Dissected article, and apply them to a completely new market: notebooks.  Even though Frame Rating was meant for high-performance discrete desktop GPUs, the theory and science behind the entire process is completely applicable to notebook graphics, and even to the integrated graphics solutions on Haswell processors and Richland APUs.  It is also able to measure the performance of discrete/integrated graphics combos from NVIDIA and AMD in a unique way that has already turned up some interesting results.


Even though neither side wants us to call it this, we are testing integrated graphics today.  With the release of Intel’s Haswell processor (the Core i7/i5/i3 4000 series), the company has noticeably upgraded the graphics on several of its mobile and desktop products.  In my first review of the Core i7-4770K, a desktop LGA1150 part, the integrated graphics now known as the HD 4600 were only slightly faster than the graphics of the previous generation Ivy Bridge and Sandy Bridge parts.  Even though we had all the technical details of the HD 5000 and Iris / Iris Pro graphics options, no desktop parts actually utilize them, so we had to wait for some more hardware to show up.

 

mbair.JPG

When Apple held a press conference and announced new MacBook Air machines that used Intel’s Haswell architecture, I knew I could count on Ken to go and pick one up for himself.  Of course, before I let him start using it for his own purposes, I made him sit through a few agonizing days of benchmarking and testing in both Windows and Mac OS X environments.  Ken has already posted a review of the MacBook Air 11-in model ‘from a Windows perspective’ and in that we teased that we had done quite a bit more evaluation of the graphics performance to be shown later.  Now is later.

So the first combatant in our integrated graphics showdown with Frame Rating is the 11-in MacBook Air: a small but powerful Ultrabook that sports more than 11 hours of battery life (in OS X, at least) and includes the new HD 5000 integrated graphics.  Alongside that battery life is the GT3 variation of the new Intel processor graphics, which doubles the number of execution units compared to the GT2.  The GT2 is the architecture behind the HD 4600 graphics that ships with nearly all of the desktop processors, and many of the notebook versions, so I am very curious how this comparison is going to stand.
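To put that GT2-versus-GT3 doubling in rough numbers, here is a small sketch. The execution unit (EU) counts are the commonly published figures for Haswell graphics; the boost clocks used here are assumptions for illustration and vary by SKU:

```python
# Rough single-precision peak-throughput comparison of Haswell's GT2
# and GT3 graphics tiers. Clocks below are illustrative assumptions.
FLOPS_PER_EU_PER_CLOCK = 16  # 2 issue ports x SIMD-4 x FMA (2 ops)

def peak_gflops(eus, clock_ghz):
    """Theoretical single-precision peak throughput in GFLOPS."""
    return eus * FLOPS_PER_EU_PER_CLOCK * clock_ghz

hd4600_gt2 = peak_gflops(20, 1.25)  # GT2: 20 EUs, assumed 1.25 GHz boost
hd5000_gt3 = peak_gflops(40, 1.10)  # GT3: 40 EUs, assumed 1.10 GHz boost

print(hd4600_gt2, hd5000_gt3)
```

Note that doubling the EUs does not quite double the theoretical throughput, because the GT3 part in an Ultrabook typically runs at a lower boost clock to stay inside its power envelope – another reason the real-world Frame Rating results are worth watching.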

Continue reading our story on Frame Rating with Haswell, Trinity and Richland!!

Subject: Mobile
Manufacturer: Lenovo

Introduction and Design

P6052455_0.jpg

As headlines mount championing the supposed shift toward tablets for the average consumer, PC manufacturers continue to devise clever hybrid solutions to try to lure those on the fence back toward more traditional machines.  Along with last year’s IdeaPad Yoga 13 and ThinkPad Twist, Lenovo shortly thereafter launched the smallest of the bunch: an 11.6” convertible tablet PC with a 5-point touch 720p IPS display.

Unlike its newer, more powerful counterpart, the Yoga 11S, it runs Windows RT and features an NVIDIA Tegra 3 quad-core system-on-a-chip (SoC).  There are pros and cons to this configuration in contrast to the 11S.  For starters, the lower-voltage, fanless design of the 11 guarantees superior battery life (something which we’ll cover in detail in just a bit).  It’s also consequently (slightly) smaller and lighter than the 11S, which gains a hair in height and weighs around a quarter pound more.  But, as you’re probably aware, Windows RT doesn’t qualify as a fully-functional version of Windows—and, in fact, the Yoga 11’s versatility is constrained by the relatively meager selection of apps available in the Windows Store.  The other obvious difference is architecture and chipset: the Yoga 11’s phone- and tablet-grade ARM-based NVIDIA Tegra 3 is replaced on the 11S by Intel’s Ivy Bridge ULV Core processors.

But let’s forget about that for a moment.  What it all boils down to is that these two machines, while similar in terms of design, are different enough (both in terms of specs and price) to warrant a choice between them based on your intended use.  The IdeaPad Yoga 11 configuration we reviewed can currently be found for around $570 at retailers such as Amazon and Newegg.  In terms of its innards:

specs_0.png

If it looks an awful lot like the specs of your latest smartphone, that’s probably because it is.  The Yoga 11 banks on the fact that such ARM-based SoCs have become powerful enough to run a modern personal computer comfortably—and by combining the strengths of an efficient, low-power chipset with the body of a notebook, it reaps benefits from both categories.  Of course, there are trade-offs involved, starting with the 2 GB memory ceiling of the chipset and extending to the aforementioned limitations of Windows RT.  So the ultimate question is, once those trade-offs are considered, is the Yoga 11 still worth the investment?

Continue reading our review of the Lenovo IdeaPad Yoga 11 Tegra 3 notebook!!