Manufacturer: NVIDIA

A necessary gesture

NVIDIA views the gaming landscape as a constantly shifting medium that starts with the PC, but the company also sees mobile gaming, cloud gaming and even console gaming as part of the overall ecosystem.  All of it is tied together by an investment in content – the game developers and game publishers that make the games we play on PCs, tablets, phones and consoles.

nv14.jpg

The slide above shows NVIDIA's targeting for each segment – except for consoles, obviously.  NVIDIA GRID will address the cloud gaming infrastructure, GeForce and the GeForce Experience will continue with the PC systems, and NVIDIA SHIELD and the Tegra SoC will get the focus for the mobile and tablet spaces.  I find it interesting that NVIDIA has specifically called out Steam under the PC – maybe a hint of the future for the upcoming Steam Box?

The primary point of focus for today’s press meeting was to talk about the commitment that NVIDIA has to the gaming world and to developers.  AMD has been talking up their 4-point attack on gaming that starts really with the dominance in the console markets.  But NVIDIA has been the leader in the PC world for many years and doesn’t see that changing.

nv02.jpg

With several global testing facilities, the most impressive of which exists in Russia, NVIDIA tests more games, more hardware and more settings combinations than you can possibly imagine.  They tune drivers and find optimal playing settings for more than 100 games that are now wrapped up into the GeForce Experience software.  They write tools for developers to find software bottlenecks and test for game streaming latency (with the upcoming SHIELD). They invest more in those areas than any other hardware vendor.

nv03.jpg

This is a list of technologies that NVIDIA claims they invented or developed – an impressive list that includes things like programmable shaders, GPU compute, Boost technology and more. 

nv04.jpg

Many of these turned out to be very important in the development and advancement of gaming – not for PCs but for ALL gaming. 

Continue reading our editorial on NVIDIA's stance on its future in PC gaming!!

NVIDIA Announces Shield Pricing, Taking Pre-orders

Subject: General Tech, Mobile | May 14, 2013 - 09:06 AM |
Tagged: tegra 4, tegra, shield, project shield, nvidia

Solid information about the NVIDIA Shield (no longer called Project Shield) is finally becoming available with a blog post written up today on NVIDIA's website.  The company will begin accepting pre-orders from users that have previously signed up for the Shield mailing list while the rest of you will have to wait until May 20th to plop down your money. 

The cost?  $349.  Newegg, GameStop, Micro Center and Canada Computers will carry it.

NV_Shield_Angled_Left_RealBoxing_LR.JPG

If you want to sign up ahead of the official June release of the Tegra 4 powered mobile Android gaming device, you'll have to head over to shield.nvidia.com.

NVIDIA does point out in the blog that the PC game streaming feature – the one thing that I truly believe makes Shield a compelling gaming device – will be launching as a BETA feature.

And GeForce game streaming, launching as a beta feature, will give SHIELD the power to access your NVIDIA GeForce GTX GPU-powered computer from the comfort of your couch. We’re working on streaming your favorite PC games to SHIELD, including great titles from Steam.

High level features of the device, for those of you that are unaware, include:

  • Tegra 4 – The world’s fastest mobile processor delivers rich graphics and unbeatable performance thanks to 72 GPU cores, four CPU cores and 2GB of RAM
  • Console-grade controller – Precise control thanks to dual analog joysticks, a full-sized D-Pad, left and right analog triggers, full-sized bumpers and A/B/X/Y buttons
  • Multi-touch display – 5-inch, 720p retinal multi-touch display for high-fidelity visuals
  • Integrated speakers – Custom, bass reflex, tuned port audio system – we think this is SHIELD’s sleeper feature
  • Wi-Fi – 802.11n 2X2 MIMO game-speed Wi-Fi for seamless game streaming
  • Pure Android – Latest Android Jelly Bean operating system from Google, for access to Android games and apps
  • There’s more – We put into SHIELD everything we would want in a premium mobile gaming device: 16 GB memory, GPS, Bluetooth 3.0, a mini-HDMI output, micro-USB 2.0, a microSD storage slot, a 3.5-mm stereo headphone jack. See the full spec sheet, here.

NV_Shield_Front_Open_LR.JPG

We took a look at the NVIDIA Shield device at CES this year and posted a video of our experiences, so check it out below. 

NVIDIA has also posted a separate blog that talks about some of the upcoming Android games that will highlight the power of the Tegra 4 mobile processor including Broken Age and Costume Quest from Double Fine, Chuck's Challenge from Niffler and more. 

Broken_Age_001_Wtrmrkd_2013.jpg

I think many people at NVIDIA, as well as in the media, are very curious to see what the reaction to Shield will actually be upon its release.  I am very excited to test it out in real-world, long-term usage models, but I definitely have doubts about the market's desire for another mobile gaming platform. 

Leave me your thoughts in the comments below!!

Source: NVIDIA

NVIDIA shows Project Shield manufacturing mold, building hype

Subject: Mobile | May 10, 2013 - 03:28 PM |
Tagged: tegra 4, tegra, shield, project shield, nvidia

After the initial announcement at CES in January, NVIDIA has been trying hard to keep excitement and interest about Project Shield going.  The upcoming Tegra 4-powered mobile Android-based gaming machine will be launched sometime in the summer; both Computex and E3 would make perfect timing. 

NVIDIA passed us a photo of the mold for the casing of Project Shield and though you don't really get any awesome new information out of it, I thought I would share.

Project_Shield_Mold.jpg

The photo you see below shows the production mold that's used to craft the ergonomic casing that houses Project SHIELD's high-powered components: Tegra 4, 5-inch 720p HD retinal touchscreen, Stereo Bass Reflex Speakers, WiFi, accelerometer, gyro, a massive battery, and more. 

To create the casing, we inject a polycarbonate material into the RHCM (Rapid Heat Cycle Molding) tool at 10,800 PSI and 300 degrees Celsius. We use a polycarbonate mixture comprised of 90% Sabic 500ECR-739 PC and 10% glass. This material and injection molding process ensures a sturdy yet lightweight casing that will deliver hours of gaming with no fatigue.

 

In case you are behind on what Project Shield is, you should check out the hands-on video we made during our time with the device last January.

What do you think...are you excited about the launch of this device?  Do any of its features really make you want to buy it once available?

Source: NVIDIA

NVIDIA's plans for Tegra and Tesla

Subject: General Tech | April 24, 2013 - 01:38 PM |
Tagged: Steve Scott, nvidia, HPC, tesla, logan, tegra

The Register had a chance to sit down with Steve Scott, once CTO of Cray and now CTO of NVIDIA's Tesla projects, to discuss the future of their add-in cards as well as that of x86 in the server room.  They discussed Tegra and why it is not receiving the same amount of attention at NVIDIA as Tesla is, as well as some of the fundamental differences in the chips both currently and going forward.  NVIDIA plans to unite GPU and CPU onto both families of chips, likely with a custom interface as opposed to placing them on the same die, though both will continue to be designed for very different functions.  A lot of the article focuses on Tegra, its memory bandwidth and most importantly its networking capabilities, as it seems NVIDIA is focused on the server room and providing hundreds or thousands of interconnected Tegra processors to compete directly with x86 offerings.  Read on for the full interview.

ELreg_nvidia_echelon_system.jpg

"Jen-Hsun Huang, co-founder and CEO of Nvidia has been perfectly honest about the fact that the graphics chip maker didn't intend to get into the supercomputing business. Rather, it was founded by a bunch of gamers who wanted better graphics cards to play 3D games. Fast forward two decades, though, and the Nvidia Tesla GPU coprocessor and the CUDA programming environment have taken the supercomputer world by storm."

Here is some more Tech News from around the web:

Tech Talk

Source: The Register

Khronos Group Announces WebGL 1.0.2 Specification and Extensions, Formal Release Expected in April

Subject: General Tech | March 28, 2013 - 12:54 AM |
Tagged: webgl 1.0.2, webgl, web browser, tegra, programming

The Khronos Group recently announced that the WebGL 1.0.1 specification is compliant across mobile and desktop systems on a number of platforms. Chrome 25 and Firefox 19 support WebGL 1.0.1 on Windows, Mac, and Linux operating systems. Further, mobile devices with Tegra SoCs can tap into WebGL using a WebGL-enhanced Android browser.

webgl_logo.jpg

Additionally, the WebGL 1.0.2 specification and its extensions have been submitted for ratification, with a formal release expected in April. According to the press release, the following features are being rolled into the WebGL 1.0.2 specification:

  • "adds many clarifications for specification behavioral precision
  • mandates support for certain combinations of framebuffer formats, to ease developer adoption;
  • clarifies interactions with the encompassing HTML5 platform, including the browser compositor and high-DPI displays;
  • dramatically increases the number of conformance tests to roughly 21,000 to improve both the breadth and depth of test coverage - thanks principally to work by Gregg Tavares at Google and the OpenGL ES working group."

The WebGL working group has also submitted several optional extensions for ratification along with the 1.0.2 spec. On the developer side of things is a JavaScript debugger for WebGL API calls and shaders, and functionality to communicate when a mobile device is powered off while WebGL is running. The other extension deals with adding additional 3D features from OpenGL ES 2.0, including anisotropic filtering (AF), vertex array objects, and standard derivative functions.

Khronos President and NVIDIA Vice President of Mobile Content Neil Trevett stated that "The close cooperation between browser and silicon vendors to ensure the GPU is being reliably and securely exposed to the Web is ongoing proof that Khronos is a highly productive forum to evolve this vital Web standard." Meanwhile, WebGL Working group chair Ken Russell indicated that WebGL 1.0.2 is "a major milestone in bringing the power of the GPU to the Web.”

Although there are security concerns to consider, WebGL does open up some interesting opportunities for new web services. The full press release can be found here.

Source: Khronos

NVIDIA's Project SHIELD is just about ready

Subject: General Tech | March 26, 2013 - 01:49 PM |
Tagged: tegra 4, tegra, shield, nvidia, Tegrazone

Remember Project Shield from CES and before?  The Inquirer has managed to get their hands on an actual console at the Game Developers Conference and played a bit of Need For Speed streamed from a PC onto the Shield.  Project Shield itself is a Tegra 4 powered controller running Android 4.2 with a 5" 720p display attached and wireless connectivity.  The actual game is streamed wirelessly from a PC with a Kepler GPU via the Tegrazone application, so the real performance limit comes from latency, similar to the company once known as OnLive.  The Inq was not quite ready to toss their money at Project Shield, but it was close.

TheInq_nvidia-shield-console-front-540x334.JPG

"CHIP DESIGNER Nvidia caused something of a stir at CES when it announced the Project Shield handheld games console, and with its launch nearing, the firm is letting people try its first own-brand game console, which we managed to get our hands on at this week's GDC gaming conference in San Francisco."

Here is some more Tech News from around the web:

Tech Talk

Source: The Inquirer

Podcast #243 - ASUS Crosshair V Formula-Z, MSI Z77A-G45 Thunderbolt, 2TB SSDs and more!

Subject: General Tech | March 21, 2013 - 02:54 PM |
Tagged: z77a-g45 thunderbolt, video, tegra, quadro, podcast, GTX 690, GTC 2013, DDR3-3000, Crosshair V Formula Z, 2tb ssd

PC Perspective Podcast #243 - 03/21/2013

Join us this week as we discuss the ASUS Crosshair V Formula-Z, MSI Z77A-G45 Thunderbolt, 2TB SSDs and more!

You can subscribe to us through iTunes and you can still access it directly through the RSS page HERE.

The URL for the podcast is: http://pcper.com/podcast - Share with your friends!

  • iTunes - Subscribe to the podcast directly through the iTunes Store
  • RSS - Subscribe through your regular RSS reader
  • MP3 - Direct download link to the MP3 file

Hosts: Josh Walrath, Jeremy Hellstrom, Allyn Malventano, Morry Teitelman, and sometimes Ken Addison

This Podcast is brought to you by MSI!

Program length: 1:18:24

Podcast topics of discussion:
  1. Week in Review:
  1. News items of interest:
  2. Closing:
    1. 1:09:50 Hardware / Software Pick of the Week
      1. Morry: Memory, always more memory - G.Skill Sniper 1866 16GB DDR3
  3. 1-888-38-PCPER or podcast@pcper.com

 

NVIDIA Details Tegra 4 and Tegra 4i Graphics

Subject: Graphics Cards | February 25, 2013 - 08:01 PM |
Tagged: nvidia, tegra, tegra 4, Tegra 4i, pixel, vertex, PowerVR, mali, adreno, geforce

 

When Tegra 4 was introduced at CES there was precious little information about the setup of the integrated GPU.  We all knew that it would be a much more powerful GPU, but we were not entirely sure how it was set up.  Now NVIDIA has finally released a slew of whitepapers that deal with not only the GPU portion of Tegra 4, but also some of the low level features of the Cortex A15 processor.  For this little number I am just going over the graphics portion.

layout.jpg

This robust looking fellow is the Tegra 4.  Note the four pixel "pipelines" that can output 4 pixels per clock.

The graphics units on the Tegra 4 and Tegra 4i are identical in overall architecture; the 4i simply has fewer units, arranged slightly differently.  Tegra 4 is comprised of 72 units, 48 of which are pixel shaders.  These pixel shaders are VLIW-based VEC4 units.  The other 24 units are vertex shaders.  The Tegra 4i is comprised of 60 units, 48 of which are pixel shaders and 12 of which are vertex shaders.  We knew at CES that it was not a unified shader design, but we were still unsure of the overall makeup of the part.  There are some very good reasons why NVIDIA went this route, as we will soon explore.
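The split described above tallies up exactly to the advertised unit counts – a quick sanity check, with all numbers taken directly from NVIDIA's figures:

```python
# Shader unit counts for each chip, per NVIDIA's whitepaper figures.
tegra4  = {"pixel": 48, "vertex": 24}
tegra4i = {"pixel": 48, "vertex": 12}

for name, cfg in (("Tegra 4", tegra4), ("Tegra 4i", tegra4i)):
    total = cfg["pixel"] + cfg["vertex"]
    print(f"{name}: {total} units ({cfg['pixel']} pixel + {cfg['vertex']} vertex)")
# Tegra 4 totals 72 units, Tegra 4i totals 60 – matching the counts above.
```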

If NVIDIA were to transition to unified shaders, it would increase the overall complexity and power consumption of the part.  Each shader unit would have to be able to handle both vertex and pixel workloads, which means more transistors are needed to handle it.  Simpler shaders focused on either pixel or vertex operations are more efficient at what they do, both in terms of transistors used and power consumption.  This is the same train of thought when using fixed function units vs. fully programmable.  Yes, the programmability will give more flexibility, but the fixed function unit is again smaller, faster, and more efficient at its workload.

layout_4i.jpg

On the other hand here we have the Tegra 4i, which gives up half the pixel pipelines and vertex shaders, but keeps all 48 pixel shaders.

If there was one surprise here, it would be that the part is not completely OpenGL ES 3.0 compliant.  It is lacking one major function that is required for certification: this particular part cannot render at FP32 precision.  It has been quite a few years since we have heard of anything in the PC market that could not do FP32, but it is quite common in the power and transistor conscious mobile market.  NVIDIA decided to go with an FP20 partial precision setup.  They claim that for all intents and purposes, it will not be noticeable to the human eye.  Colors will still be rendered properly and artifacts will be few and far between.  Remember back in the day when NVIDIA supported FP16 and FP32 while they chastised ATI for choosing FP24 with the Radeon 9700 Pro?  Times have changed a bit.  Going with FP20 is again a power and transistor saving decision.  The part still supports DX9.3 and OpenGL ES 2.0, but it is not fully OpenGL ES 3.0 compliant.  This is not to say that it does not support any 3.0 features – it in fact supports quite a bit of the functionality required by 3.0 – but it is still not fully compliant.
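To get a feel for why FP20 shading can be visually indistinguishable, here is a small Python sketch.  NVIDIA has not published the exact bit layout of its FP20 format, so a 13-bit mantissa (20 bits minus a sign bit and an assumed 6-bit exponent) is used purely for illustration:

```python
import struct

def quantize_mantissa(x: float, mantissa_bits: int) -> float:
    """Truncate a float32 value's 23-bit mantissa down to `mantissa_bits` bits."""
    bits = struct.unpack(">I", struct.pack(">f", x))[0]
    drop = 23 - mantissa_bits            # low-order mantissa bits to discard
    bits &= ~((1 << drop) - 1)           # zero them out
    return struct.unpack(">f", struct.pack(">I", bits))[0]

color = 0.7310                           # a color channel value in [0, 1]
fp20_like = quantize_mantissa(color, 13)
error = abs(color - fp20_like)

# A 13-bit mantissa step (~1.2e-4 relative) is far below the 1/255 (~3.9e-3)
# granularity of an 8-bit display channel, so the rounding is invisible.
print(error < 1 / 255)                   # True
```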

This will be an interesting decision to watch over the next few years.  The latest Mali 600 series, PowerVR 6 series, and Adreno 300 series solutions all support OpenGL ES 3.0.  Tegra 4 is the odd man out.  While most developers have no plans to go to 3.0 anytime in the near future, it will eventually be implemented in software.  When that point comes, then the Tegra 4 based devices will be left a bit behind.  By then NVIDIA will have a fully compliant solution, but that is little comfort for those buying phones and tablets in the near future that will be saddled with non-compliance once applications hit.

ogles_feat.jpg

The list of OpenGL ES 3.0 features that are actually present in Tegra 4, but the lack of FP32 relegates it to 2.0 compliant status.

The core speed is increased to 672 MHz, well up from the 520 MHz in Tegra 3 (8 pixel and 4 vertex shaders).  The GPU can output four pixels per clock, double that of Tegra 3.  Once we consider the extra clock speed and pixel pipelines, the Tegra 4 increases pixel fillrate by 2.6x.  Pixel and vertex shading will get a huge boost in performance due to the dramatic increase of units and clockspeed.  Overall this is a very significant improvement over the previous generation of parts.
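That 2.6x figure checks out from the numbers in the paragraph above – a back-of-the-envelope calculation that ignores memory bandwidth and other real-world limits:

```python
# Theoretical pixel fillrate = core clock (MHz) x pixels output per clock.
tegra3_mpix = 520 * 2   # 520 MHz, 2 pixels/clock -> Mpix/s
tegra4_mpix = 672 * 4   # 672 MHz, 4 pixels/clock -> Mpix/s

print(f"Tegra 3: {tegra3_mpix} Mpix/s, Tegra 4: {tegra4_mpix} Mpix/s")
print(f"Fillrate increase: {tegra4_mpix / tegra3_mpix:.1f}x")  # 2.6x
```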

The Tegra 4 can output to a 4K display natively, and that is not the only new feature for this part.  Here is a quick list:

  • 2x/4x Multisample Antialiasing (MSAA)
  • 24-bit Z (versus 20-bit Z in the Tegra 3 processor) and 8-bit stencil
  • 4K x 4K texture size, including non-power-of-two textures (versus 2K x 2K in the Tegra 3 processor) – for higher quality textures and easier porting of full-resolution textures from console and PC games to the Tegra 4 processor; good for high resolution displays
  • 16:1 depth (Z) compression and 4:1 color compression (versus none in the Tegra 3 processor) – lossless compression that reduces bandwidth to/from the frame buffer, especially effective in antialiasing when processing multiple samples per pixel
  • Depth textures
  • Percentage Closer Filtering for shadow texture mapping and soft shadows
  • Texture border color to eliminate coarse MIP-level bleeding
  • sRGB for texture filtering, render surfaces and MSAA down-filter

Note: CSAA is no longer supported in Tegra 4 processors.

This is a big generational jump, and now we only have to see how it performs against the other top end parts from Qualcomm, Samsung, and others utilizing IP from Imagination and ARM.
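To put the 16:1 depth compression in context, here is a rough best-case estimate of the Z-buffer traffic it could save on a 4x MSAA pass at 720p.  The resolution, sample count and Z/stencil format come from the article; the savings assume the ideal compression ratio, which real scenes will not always hit:

```python
width, height = 1280, 720     # 720p render target
samples = 4                   # 4x MSAA, newly supported on Tegra 4
bytes_per_sample = 4          # 24-bit Z + 8-bit stencil

raw_bytes = width * height * samples * bytes_per_sample
best_case = raw_bytes / 16    # 16:1 lossless depth compression, best case

print(f"Uncompressed Z traffic per full-screen pass: {raw_bytes / 2**20:.1f} MiB")
print(f"Best-case compressed:                        {best_case / 2**20:.2f} MiB")
```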

Source: NVIDIA

NVIDIA Releases Tegra 4i: I Shall Name It... Mini-Me!

Subject: Processors | February 20, 2013 - 09:35 PM |
Tagged: Tegra 4i, tegra 4, tegra 3, Tegra 2, tegra, phoenix, nvidia, icera, i500

 

The NVIDIA Tegra 4 and Shield project were announced at this year’s CES, but there were other products in the pipeline that were just not quite ready to see the light of day at that time.  While Tegra 4 is an impressive looking part for mobile applications, it is not entirely appropriate for the majority of smart phones out there.  Sure, the nebulous “Superphone” category will utilize Tegra 4, but that is not a large part of the smartphone market.  The two basic issues with Tegra 4 are that it pulls a bit more power at its rated clockspeeds than some manufacturers like, and that it does not contain a built-in modem for communication needs.

Tegra 4i_die_shot.png

The die shot of the Tegra 4i.  A lot going on in this little guy.

NVIDIA bought up UK modem designer Icera to help create true all-in-one SOCs.  Icera has a unique method of building modems that the company says is not only more flexible than what others are offering, but also much more powerful.  These modems skip many of the fixed function units that most modems are made of and instead rely on high speed general purpose compute units and an interesting software stack, creating smaller modems with greater flexibility when it comes to wireless standards.  At CES NVIDIA showed off the first product of this acquisition, the i500.  This is a standalone chip and is set to be offered alongside the Tegra 4 SOC.

Yesterday NVIDIA introduced the Tegra 4i, formerly codenamed “Grey”.  This is a combined Tegra SOC with the Icera i500 modem.  This is not exactly what we were expecting, but the results are actually quite exciting.  Before I get too out of hand about the possibilities of the chip, I must make one thing perfectly clear.  The chip itself will not be available until Q4 2013.  It will be released in limited products with greater availability in Q1 2014.  While NVIDIA is announcing this chip, end users will not get to use it until much later this year.  I believe this issue is not so much that NVIDIA cannot produce the chips, but rather the design cycles of new and complex cell phones do not allow for rapid product development.

NV_T4i_Feat.png

Tegra 4i really should not be confused for the slightly earlier Tegra 4.  The 4i actually uses the 4th revision of the Cortex A9 processor rather than the Cortex A15 in the Tegra 4.  The A9 has been a mainstay of modern cell phone processors for some years now and offers a great deal of performance when considering die size and power consumption.  The 4th revision improves IPC of the A9 in a variety of ways (memory management, prefetch, buffers, etc.), so it will perform better than previous Cortex A9 solutions.  Performance will not approach that provided by the much larger and complex A15 cores, but it is a nice little boost from what we have previously seen.

The Tegra 4 features a 72 core GPU (though NVIDIA has still declined to detail the specifics of their new mobile graphics technology- these ain’t Kepler though), while the 4i features a nearly identical unit featuring 60 cores.  There is no word so far as to what speed these will be running at or how performance really compares to the latest graphics products from ARM, Imagination, or Qualcomm.

The chip is made on TSMC’s 28 nm HPM process and features core speeds up to 2.3 GHz.  We again have no information on whether that will be all four cores at that speed or turbo functionality with one core.  The design adopts the previous 4+1 core setup with four high speed cores and one power saving core.  Considering how small each core is (Cortex A9 or A15), it is not a waste of silicon compared to the potential power savings.  HPM is TSMC's high performance mobile variant, as opposed to the low power LPM process used for Tegra 4.  My guess here is that the A9 cores are not going to pull all that much power anyway due to their simpler design as compared to A15, and hitting 2.3 GHz is also a factor in the process decision.  Also consider that the +1 core is fabricated slightly differently than the other four to allow for slower transistor switching speed with much lower leakage.

NV_T4_Comp.png

The die size looks to be in the 60 to 65 mm² range.  This is not a whole lot larger than the original Tegra 2, which was around 50 mm².  Consider that the Tegra 4i has three more cores, a larger and more capable GPU portion, and the integrated Icera i500 modem.  The modem is a full Cat 3 LTE capable unit (100 Mbps), so bandwidth should not be an issue for phones built around it.  The chip has all of the features of the larger Tegra 4, such as the Computational Photography Architecture, Image Signal Processor, video engine, and the "optimized memory interface".  All of those neat things that NVIDIA showed off at CES will be included.  The only other major feature that is not present is the ability to output 3200x2000 resolutions; this particular chip is limited to 1920x1200.  Not a horrific tradeoff considering this will be a smartphone SOC with a max of 1080P resolution for the near future.

We expect to see Tegra 4 out in late Q2 in some devices, but not a lot.  While Tegra 4 is certainly impressive, I would argue that Tegra 4i is the more marketable product with a larger chance of success.  If it were available today, I would expect its market impact to be similar to what we saw with the original 28nm Krait SOCs from Qualcomm last year.  There is simply a lot of good technology in this core.  It is small, it has a built-in modem, and performance per mm squared looks to be pretty tremendous.  Power consumption will be appropriate for handhelds, and perhaps might turn out to be better than most current solutions built on 28 nm and 32 nm processes.

NV_Phoenix.png

NVIDIA also developed the Phoenix Reference Phone which features the Tegra 4i.  This is a rather robust looking unit with a 5” screen and 1080P resolution.  It has front and rear facing cameras, USB and HDMI ports, and is only 8 mm thin.  Just as with the original Tegra 3 it features the DirectTouch functionality which uses the +1 core to handle all touch inputs.  This makes it more accurate and sensitive as compared to other solutions on the market.

Overall I am impressed with this product.  It is a very nice balance of performance, features, and power consumption.  As mentioned before, it will not be out until Q4 2013.  This will obviously give the competition some time to hone their own products and perhaps release something that will not only compete well with Tegra 4i in its price range, but exceed it in most ways.  I am not entirely certain of this, but it is a potential danger – a low one, though, as the design cycles for complex and feature packed cell phones are longer than 6 to 7 months.  While NVIDIA has had some success in the SOC market, they have not had a true homerun yet.  Tegra 2 and Tegra 3 had their fair share of design wins, but did not ship in numbers that came anywhere close to Qualcomm or Samsung.  Perhaps Tegra 4i will be that breakthrough part for NVIDIA?  Hard to say, but when we consider how aggressive this company is, how deep their developer relations are, and how feature packed these products seem to be, I think NVIDIA will continue to gain traction and marketshare in the SOC market.

Source: NVIDIA

NVIDIA Project SHIELD Development Detailed - Idea to Launch in One Year

Subject: Mobile | January 30, 2013 - 06:13 PM |
Tagged: tegra 4, tegra, shield, nvidia

A very interesting blog post by NVIDIA's Brian Caulfield tells the story of "How Project SHIELD Got Built" and you might be surprised about the timeline they were on.  In the story Caulfield details the team's move from idea to the CES release in under 12 months:

In less than a year, SHIELD has grown from an idea dreamed up by Jen-Hsun, Tony, and a handful of others into a conspiracy involving hundreds of gaming fanatics across every department at NVIDIA. “We’ve been talking on and off about building something for more than five years, maybe 10,” says Tony.

shield1.jpg

Caulfield goes on to detail several steps in the process including design, production, the first module shown to Huang, NVIDIA CEO, and more.  After realizing that they had the hardware with Tegra 4 and the software (both GeForce Experience and the controller drivers used by games from the TegraZone store), Tony Tomasi surmised that the company should "just build a device with a great controller built in."

The first prototype, assembled in early 2012, was little more than a game controller fastened to a smartphone with wood. From that crude beginning, NVIDIA’s team of industrial designers sculpted a device that could fit in a user’s hands. No outsourcing required: NVIDIA has a team of veterans who have already shaped the look of a number of products built around NVIDIA’s processors, such as the drool-worthy GeForce GTX 690.

Unfortunately, no images of that wood-clad version of the SHIELD were shared, but the idea is amusing nonetheless.  NVIDIA does admit that the killer feature of the device is the ability to stream PC games from a GeForce powered machine to the SHIELD remotely.

Streaming games from PCs equipped with NVIDIA GeForce GTX 650 or better GPUs puts cutting-edge games on SHIELD on day one. As NVIDIA’s smart, funny marketing VP Ujesh Desai put it, when cynical gamers ask the eternal question – ‘but can it play Crysis’ – NVIDIA will have a simple answer ‘yes it does.’

And apparently some developers have been "lobbying" NVIDIA to make a console for years - a fact that I find both interesting and hilarious. 

shield2.jpg

Overall the post is incredibly insightful, if a bit overly "marketing-ish" about the product it discusses.  With such a tight timeline on the design and build, I am curious to see if NVIDIA will be able to meet the release deadline it set during CES: Q2 2013.  Also, many lines in the blog post are obviously meant to temper the fact that NVIDIA chips will find their way into exactly zero (0) consoles this generation, and it is becoming more and more obvious that SHIELD is a reaction to that fact. 

We are eager to learn more and get our hands on it again!

Source: NVIDIA