Podcast #422 - Samsung 960 Pro, Acer Z850 Projector, Surface Studio and more!

Subject: General Tech | October 27, 2016 - 12:19 PM |
Tagged: z850, x50, video, tegra, switch, surface studio, Samsung, qualcomm, podcast, Optane, nvidia, Nintendo, microsoft, Intel, gtx 1050, Fanatec, evga, acer, 960 PRO, 5G

PC Perspective Podcast #422 - 10/27/16

Join us this week as we discuss the Samsung 960 Pro, Fanatec racing gear, an Acer UltraWide projector, Optane leaks, MS Surface Studio and more!

You can subscribe to us through iTunes and you can still access it directly through the RSS page HERE.

The URL for the podcast is: http://pcper.com/podcast - Share with your friends!

Hosts:  Ryan Shrout, Allyn Malventano, Josh Walrath, Jeremy Hellstrom

Program length: 1:47:11

  1. Join our spam list to get notified when we go live!
  2. Patreon
  3. Fragging Frogs VLAN 14
  4. Week in Review:
    1. 0:06:00 Fanatec ClubSport V2 Ecosystem Review: What is Realism Worth?
    2. 0:25:20 Samsung 960 PRO 2TB M.2 NVMe SSD Full Review - Even Faster!
    3. 0:45:35 Acer Predator Z850 UltraWide 24:9 Gaming Projector Review
    4. 0:54:28 EVGA SuperNOVA 750W G2L Power Supply Review
  5. Today’s episode is brought to you by Harry’s! Use code PCPER at checkout!
  6. News items of interest:
    1. 1:00:50 GTX 1050 and 1050Ti
    2. 1:05:30 Intel Optane (XPoint) First Gen Product Specifications Leaked
    3. 1:11:20 Microsoft Introduces Surface Studio AiO Desktop PC
    4. 1:21:45 Microsoft Windows 10 Creators Update Formally Announced
    5. 1:25:25 Qualcomm Announces Snapdragon X50 5G Modem
    6. 1:31:55 NVIDIA Tegra SoC powers new Nintendo Switch gaming system
  7. Hardware/Software Picks of the Week
    1. Ryan: Chewbacca Hoodie
    2. Jeremy: The Aimpad R5 is actually much cooler than I thought
    3. Josh: Solid for the price. Get on special!
    4. Allyn: Factorio
  8. http://pcper.com/podcast
  9. http://twitter.com/pcper
  10. Closing/Outro

Subscribe to the PC Perspective YouTube Channel for more videos, reviews and podcasts!!

NVIDIA Tegra SoC powers new Nintendo Switch gaming system

Subject: Processors, Mobile | October 20, 2016 - 11:40 AM |
Tagged: Nintendo, switch, nvidia, tegra

It's been a hell of a 24 hours for NVIDIA and the Tegra processor. A platform that many considered dead in the water, after it failed to find its way into smartphones or an appreciable number of consumer tablets, has had two major design wins revealed. First, it was revealed that NVIDIA is powering the new fully autonomous driving system in the Autopilot 2.0 hardware implementation in Tesla's current Model S and X and upcoming Model 3 cars.

Now, we know that Nintendo's long rumored portable and dockable gaming system called Switch is also powered by a custom NVIDIA Tegra SoC.

20-nintendo-switch-1200x923.jpg

We don't know much about the hardware that gives the Switch life, but NVIDIA did post a short blog with some basic information worth looking at. Based on it, we know that the Tegra processor powering this Nintendo system is completely custom and likely uses Pascal-architecture GPU CUDA cores, though we don't know how many there are or how powerful it will be. It will likely exceed the performance of the Nintendo Wii U, which managed only 0.35 TFLOPS from 320 AMD-based stream processors. How much faster we just don't know yet.

On the CPU side we assume that this is built using an ARM-based processor, most likely off-the-shelf core designs to keep things simple. Basing it on custom designs like Denver might not be necessary for this type of platform. 

Nintendo has traditionally used custom operating systems for its consoles, and that seems to be what is happening with the Switch as well. NVIDIA mentions a couple of times how much work the technology vendor put into custom APIs, physics engines, new libraries, etc.

The Nintendo Switch’s gaming experience is also supported by fully custom software, including a revamped physics engine, new libraries, advanced game tools and libraries. NVIDIA additionally created new gaming APIs to fully harness this performance. The newest API, NVN, was built specifically to bring lightweight, fast gaming to the masses.

We’ve optimized the full suite of hardware and software for gaming and mobile use cases. This includes custom operating system integration with the GPU to increase both performance and efficiency.

The system itself looks pretty damn interesting, with the ability to switch (get it?) between a docked-to-your-TV configuration and a mobile one with attached or wireless controllers. Check out the video below for a preview.

I've asked both NVIDIA and Nintendo for more information on the hardware side, but these guys tend to be tight lipped on the custom silicon going into console hardware. Hopefully one or the other is excited to tell us about the technology so we can have some interesting specifications to discuss and debate!

UPDATE: A story on The Verge claims that Nintendo "took the chip from the Shield" and put it in the Switch. This is more than likely completely false; the Shield is a significantly dated product and that kind of statement could undersell the power and capability of the Switch and NVIDIA's custom SoC quite dramatically.

Source: Nintendo

NVIDIA Teases Low Power, High Performance Xavier SoC That Will Power Future Autonomous Vehicles

Subject: Processors | October 1, 2016 - 06:11 PM |
Tagged: xavier, Volta, tegra, SoC, nvidia, machine learning, gpu, drive px 2, deep neural network, deep learning

Earlier this week at its first GTC Europe event in Amsterdam, NVIDIA CEO Jen-Hsun Huang teased a new SoC code-named Xavier that will be used in self-driving cars and feature the company's newest custom ARM CPU cores and Volta GPU. The new chip will begin sampling at the end of 2017 with product releases using the future Tegra (if they keep that name) processor as soon as 2018.

NVIDIA_Xavier_SOC.jpg

NVIDIA's Xavier is promised to be the successor to the company's Drive PX 2 system, which uses two Tegra X2 SoCs and two discrete Pascal MXM GPUs on a single water cooled platform. The claims are even more impressive when you consider that NVIDIA is not only promising to replace those four processors with a single chip, but reportedly to do so at 20W – less than a tenth of the Drive PX 2's TDP!

The company has not revealed all the nitty-gritty details, but it did tease out a few bits of information. The new processor will feature 7 billion transistors, will be built on a refined 16nm FinFET process, and will consume a mere 20W. It can process two 8K HDR video streams and can hit 20 TOPS (NVIDIA's own rating for deep learning INT8 operations).

Specifically, NVIDIA claims that the Xavier SoC will use eight custom ARMv8 (64-bit) CPU cores (it is unclear whether these will be a refined Denver architecture or something else) and a GPU based on its upcoming Volta architecture with 512 CUDA cores. In an interesting twist, NVIDIA is also including a "Computer Vision Accelerator" (CVA) on the SoC, though the company did not go into many details. This bit of silicon may explain how the ~300mm2 die with 7 billion transistors is able to match the 7.2 billion transistor Pascal-based Tesla P4 (2560 CUDA cores) graphics card at deep learning (tera-operations per second) tasks – that, and the incremental improvements from moving to Volta and new ARMv8 CPU cores on a refined 16nm FF+ process.

             | Drive PX                                      | Drive PX 2                                | NVIDIA Xavier                               | Tesla P4
CPU          | 2 x Tegra X1 (8 x A57 total)                  | 2 x Tegra X2 (8 x A57 + 4 x Denver total) | 1 x Xavier SoC (8 x custom ARM + 1 x CVA)   | N/A
GPU          | 2 x Tegra X1 (Maxwell) (512 CUDA cores total) | 2 x Tegra X2 GPUs + 2 x Pascal GPUs       | 1 x Xavier SoC GPU (Volta) (512 CUDA cores) | 2560 CUDA cores (Pascal)
TFLOPS       | 2.3                                           | 8                                         | ?                                           | 5.5
DL TOPS      | ?                                             | 24                                        | 20                                          | 22
TDP          | ~30W (2 x 15W)                                | 250W                                      | 20W                                         | up to 75W
Process Tech | 20nm                                          | 16nm FinFET                               | 16nm FinFET+                                | 16nm FinFET
Transistors  | ?                                             | ?                                         | 7 billion                                   | 7.2 billion

For comparison, the currently available Tesla P4, based on NVIDIA's Pascal architecture, has a TDP of up to 75W and is rated at 22 TOPS. This would suggest that Volta is a much more efficient architecture (at least for deep learning and half precision)! I am not sure how NVIDIA is able to match its GP104 with only 512 Volta CUDA cores, though its definition of a "core" could have changed and/or the CVA processor may be responsible for closing that gap. Unfortunately, NVIDIA did not disclose a TFLOPS rating for Xavier, so it is difficult to compare, and it may not match GP104 at higher precision workloads. It could be wholly optimized for INT8 operations rather than floating point performance. Beyond that I will let Scott dive into those particulars once we have more information!
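Putting the vendor numbers from the table above on a common footing makes the efficiency claim easier to see. A rough sketch in Python (every input is NVIDIA's own claimed rating, nothing measured):

```python
# Deep learning TOPS per watt, computed from the claimed figures above.
claimed = {
    "Drive PX 2": (24, 250),   # (DL TOPS, TDP in watts)
    "Xavier":     (20, 20),
    "Tesla P4":   (22, 75),
}

efficiency = {name: tops / watts for name, (tops, watts) in claimed.items()}
for name, tops_per_watt in efficiency.items():
    print(f"{name}: {tops_per_watt:.2f} TOPS/W")
# Xavier works out to 1.00 TOPS/W, roughly 3x the Tesla P4 and 10x Drive PX 2.
```

If the 20 TOPS at 20W claim holds, Xavier would be about an order of magnitude more efficient than the platform it replaces.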

Xavier is more of a teaser than anything, and the chip could very well change dramatically and/or not hit the claimed performance targets. Still, it sounds promising, and it is always nice to speculate over road maps. It is an intriguing chip, and I am ready for more details, especially on the Volta GPU and just what exactly that Computer Vision Accelerator is (and whether it will be easy to program for). I am a big fan of the "self-driving car" and I hope that it succeeds. The momentum certainly looks set to continue as Tesla, VW, BMW, and other automakers push the envelope of what is possible and plan future cars that will include smart driving assists and even cars that can drive themselves. The more local computing power we can throw at automobiles the better; while massive datacenters can be used to train the neural networks, local hardware to run them and make decisions is necessary (you don't want internet latency contributing to the decision of whether to brake or not!).

I hope that NVIDIA's self-proclaimed "AI Supercomputer" turns out to be at least close to the performance they claim! Stay tuned for more information as it gets closer to launch (hopefully more details will emerge at GTC 2017 in the US).

What are your thoughts on Xavier and the whole self-driving car future?


Source: NVIDIA

Podcast #410 - Data Recovery, New Titan X Launch, AMD builds a GPU with SSDs and more!!

Subject: Editorial | July 28, 2016 - 01:03 PM |
Tagged: XSPC, wings, windows 10, VR, video, titan x, tegra, Silverstone, sapphire, rx 480, Raystorm, RapidSpar, radeon pro ssg, quadro, px1, podcast, p6000, p5000, nvidia, nintendo nx, MX300, gp102, evga, dg-87, crucial, angelbird

PC Perspective Podcast #410 - 07/28/2016

Join us this week as we discuss the new Pascal based Titan X, an AMD graphics card with 1TB of SSD storage on-board, data recovery with RapidSpar and more!!

You can subscribe to us through iTunes and you can still access it directly through the RSS page HERE.

The URL for the podcast is: http://pcper.com/podcast - Share with your friends!

Hosts:  Ryan Shrout, Allyn Malventano, Sebastian Peak, and Josh Walrath

Program length: 1:46:33
  1. Week in Review:
  2. News items of interest:
  3. 1:29:15 Hardware/Software Picks of the Week
    1. Allyn: Wii emulation is absolutely usable now (Dolphin 5)
  4. Closing/outro

Rumor: Nintendo NX Uses NVIDIA Tegra... Something

Subject: Graphics Cards, Systems, Mobile | July 27, 2016 - 07:58 PM |
Tagged: nvidia, Nintendo, nintendo nx, tegra, Tegra X1, tegra x2, pascal, maxwell

Okay, so there are a few rumors going around, mostly from Eurogamer / DigitalFoundry, claiming the Nintendo NX is going to be powered by an NVIDIA Tegra system on a chip (SoC). DigitalFoundry, specifically, cites multiple sources who claim that their Nintendo NX development kits integrate the Tegra X1 design, as seen in the Google Pixel C. That said, the Nintendo NX release date, March 2017, does provide enough time for a switch to NVIDIA's upcoming Pascal-based Tegra design, rumored to be called the Tegra X2, which uses NVIDIA's custom-designed Denver CPU cores.

Preamble aside, here's what I think about the whole situation.

First, the Tegra X1 would be quite a small jump in performance over the WiiU. The WiiU's GPU, “Latte”, has 320 shaders clocked at 550 MHz, and it was based on AMD's TeraScale 1 architecture. Because these stream processors have single-cycle multiply-add for floating point values, you can get its FLOP rating by multiplying 320 shaders, 550,000,000 cycles per second, and 2 operations per clock (one multiply and one add). This yields 352 GFLOPs. The Tegra X1 is rated at 512 GFLOPs, which is just 45% more than the previous generation.
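The arithmetic above can be sketched in a few lines. The figures come straight from the paragraph; `peak_gflops` is just an illustrative helper, not anything published by AMD or NVIDIA:

```python
# Peak throughput = shaders x clock x ops-per-cycle, where a single-cycle
# fused multiply-add counts as 2 floating-point operations.
def peak_gflops(shaders, clock_mhz, ops_per_cycle=2):
    return shaders * clock_mhz * 1e6 * ops_per_cycle / 1e9

latte = peak_gflops(320, 550)   # Wii U "Latte" GPU
tegra_x1 = 512.0                # NVIDIA's FP32 rating for Tegra X1, in GFLOPS

print(latte)                    # 352.0
print(tegra_x1 / latte - 1)     # ~0.45, i.e. the ~45% jump noted above
```

The same shaders-times-clock-times-two formula works for any GPU with single-cycle multiply-add, which is why it shows up so often in these back-of-the-envelope console comparisons.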

This is a very tiny jump, unless they indeed use Pascal-based graphics. If this is the case, you will likely see a launch selection of games ported from WiiU and a few games that use whatever new feature Nintendo has. One rumor is that the console will be kind-of like the WiiU controller, with detachable controllers. If this is true, it's a bit unclear how this will affect games in a revolutionary way, but we might be missing a key bit of info that ties it all together.

nvidia-2016-shieldx1consoles.png

As for the choice of ARM over x86... well. First, this obviously allows Nintendo to choose from a wider selection of manufacturers than AMD, Intel, and VIA, and certainly more than IBM with its previous, Power-based chips. It also jibes with Nintendo's interest in the mobile market: they joined The Khronos Group, and I'm pretty sure they've said they are interested in Vulkan, which is becoming the high-end graphics API for Android, supported by Google and others. That said, I'm not sure how many engineers specialize in ARM optimization, as most mobile platforms try to abstract the instruction set as much as possible. But this could be Nintendo's attempt to settle on a standardized instruction set, and they opted for mobile over PC (versus Sony and especially Microsoft, who want consoles to follow high-end gaming on the desktop).

Why? Well that would just be speculating on speculation about speculation. I'll stop here.

CES 2016: NVIDIA Launches DRIVE PX 2 With Dual Pascal GPUs Driving A Deep Neural Network

Subject: General Tech | January 5, 2016 - 01:17 AM |
Tagged: tegra, pascal, nvidia, driveworks, drive px 2, deep neural network, deep learning, autonomous car

NVIDIA is using the Consumer Electronics Show to launch the Drive PX 2 which is the latest bit of hardware aimed at autonomous vehicles. Several NVIDIA products combine to create the company's self-driving "end to end solution" including DIGITS, DriveWorks, and the Drive PX 2 hardware to train, optimize, and run the neural network software that will allegedly be the brains of future self-driving cars (or so NVIDIA hopes).

NVIDIA DRIVE PX 2 Self Driving Car Supercomputer.jpg

The Drive PX 2 hardware is the successor to the Tegra-powered Drive PX released last year. The Drive PX 2 represents a major computational power jump with 12 CPU cores and two discrete "Pascal"-based GPUs! NVIDIA has not revealed the full specifications yet, but they have made certain details available. There are two Tegra SoCs along with two GPUs that are liquid cooled. The liquid cooling consists of a large metal block with copper tubing winding through it and then passing into what looks to be external connectors that attach to a completed cooling loop (an exterior radiator, pump, and reservoir).

There are a total of 12 CPU cores: eight ARM Cortex A57 cores and four "Denver" cores. The discrete graphics are built on the 16nm FinFET process and will use the company's upcoming Pascal architecture. The total package will draw a maximum of 250 watts and will offer up to 8 TFLOPS of computational horsepower and 24 trillion "deep learning operations per second." That last number refers to the number of specialized deep learning instructions the hardware can process per second, which sounds like an impressive amount of power when it comes to making connections and analyzing data to classify it. Drive PX 2 is, according to NVIDIA, 10 times faster than its predecessor at running these specialized instructions and has nearly 4 times the computational horsepower in TFLOPS.
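The "nearly 4 times" figure checks out against the TFLOPS ratings NVIDIA has published (the original Drive PX's 2.3 TFLOPS rating appears in the Xavier comparison table elsewhere in this document). A quick sanity check, nothing more; the 10x deep learning claim can't be verified the same way since NVIDIA never published a TOPS rating for the original Drive PX:

```python
# Drive PX vs. Drive PX 2 peak compute, per NVIDIA's own ratings.
drive_px_tflops = 2.3
drive_px2_tflops = 8.0

ratio = drive_px2_tflops / drive_px_tflops
print(ratio)   # ~3.48, i.e. "nearly 4 times"
```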

Similar to the original Drive PX, the driving AI platform can accept and process the inputs of up to 12 video cameras, and it can also handle LiDAR, RADAR, and ultrasonic sensors. NVIDIA compared the Drive PX 2 to the TITAN X on AlexNet image processing: 2,800 images per second versus the consumer graphics card's 450. While possibly not the best comparison, it does make the new hardware look promising.

NVIDIA DRIVE PX 2 DRIVEWORKS.jpg

Neural networks and machine learning are at the core of what makes autonomous vehicles possible, along with hardware powerful enough to take in a multitude of sensor data and process it fast enough. The software side of things centers on the DriveWorks development kit, which provides specialized instructions and a neural network that can detect objects from sensor input(s), identify and classify them, determine the positions of objects relative to the vehicle, and calculate the most efficient path to the destination.

Specifically, in the press release NVIDIA stated:

"This complex work is facilitated by NVIDIA DriveWorks™, a suite of software tools, libraries and modules that accelerates development and testing of autonomous vehicles. DriveWorks enables sensor calibration, acquisition of surround data, synchronization, recording and then processing streams of sensor data through a complex pipeline of algorithms running on all of the DRIVE PX 2's specialized and general-purpose processors. Software modules are included for every aspect of the autonomous driving pipeline, from object detection, classification and segmentation to map localization and path planning."

DIGITS is the platform used to train the neural network that is then used by the Drive PX 2 hardware. The software is purportedly improving in both accuracy and training time: NVIDIA achieved a 96% accuracy rating at identifying traffic signs (using the traffic sign database from Ruhr University Bochum) after a training session lasting only 4 hours, as opposed to training times of days or even weeks.

NVIDIA claims that the initial Drive PX has been picked up by over 50 development teams (automakers, universities, software developers, et al) interested in autonomous vehicles. Early access to development hardware is expected to be towards the middle of the year with general availability of final hardware in Q4 2016.

The new Drive PX 2 is getting a serious hardware boost with the inclusion of two dedicated graphics processors (the Drive PX was based around two Tegra X1 SoCs), and that should allow automakers to really push what's possible in real time and bring the self-driving car a bit closer to reality and final, (self) drivable products. I'm excited to see that vision come to fruition and am looking forward to seeing what this improved hardware will enable in the auto industry!

Coverage of CES 2016 is brought to you by Logitech!


Follow all of our coverage of the show at http://pcper.com/ces!

Source: NVIDIA
Manufacturer: NVIDIA

SHIELD Specifications

Announced in June 2014 at Google's I/O event, Android TV is a platform developed by Google, running Android 5.0 and higher, that aims to create an interactive experience for the TV. This platform can be built into a TV directly as well as into set-top style boxes, like the NVIDIA SHIELD we are looking at today. The idea is to bring the breadth of apps and content to the TV through the Android operating system in a way that is both convenient and intuitive.

NVIDIA announced SHIELD back in March at GDC as the first product to use the company’s latest Tegra processor, the X1. This SoC combines an 8-core big.LITTLE ARM processor design with a 256-core implementation of the NVIDIA Maxwell GPU architecture, providing GPU performance previously unseen in an Android device. I have already spent some time with the NVIDIA SHIELD at various events and the promise was clearly there to make it a leading option for Android TV adoption, but obviously there were questions to be answered.

DSC01740.jpg

Today’s article will focus on my early impressions with the NVIDIA SHIELD, having used it both in the office and at home for a handful of days. As you’ll see during the discussion there are still some things to be ironed out, some functionality that needs to be added before SHIELD and Android TV can really be called a must-buy product. But I do think it will get there.

And though this review will focus on the NVIDIA SHIELD, it's impossible not to tie the success of SHIELD to the success of Google's Android TV. The dominant use case for SHIELD is as a media playback device, with the gaming functionality as a really cool side project for enthusiasts and gamers looking for another outlet. For SHIELD to succeed, Google needs to prove that Android TV can improve on other integrated smart TV platforms as well as other set-top box platforms like Boxee, Roku and even the upcoming Apple TV refresh.

But first, let’s get an overview of the NVIDIA SHIELD device, pricing and specifications, before diving into my experiences with the platform as a whole.

Continue reading our review of the new NVIDIA SHIELD with Android TV!!

NVIDIA SHIELD and SHIELD Pro Show up on Amazon

Subject: Mobile | May 16, 2015 - 01:00 PM |
Tagged: Tegra X1, tegra, shield pro, shield console, shield, nvidia

UPDATE: Whoops! It appears that Amazon took the listing down... No surprise there. I'm sure we'll be seeing them again VERY SOON. :)

Looks like the release of the new NVIDIA SHIELD console device, first revealed back at GDC in March, is nearly here. A listing for "NVIDIA SHIELD" as well as the new "NVIDIA SHIELD Pro" showed up on Amazon.com today.

shield1.jpg

Though we don't know what the difference between the SHIELD and SHIELD Pro is officially, according to Amazon at least, it appears to be internal storage. The Pro model will ship with 500GB of internal storage, while the non-Pro model will only have 16GB. It seems you'll have to get an SD card for extra storage on the base model if you plan on doing anything other than streaming games through NVIDIA GRID.

shield2.jpg

No pricing is listed yet and there is no release date on the Amazon pages either, but we have always been told this was to be a May or June on-sale date. Both models of the NVIDIA SHIELD will include an HDMI cable, a micro-USB cable and a SHIELD Controller. If you want the remote or stand, you're going to have to pay out a bit more.

shield3.jpg

For those of you that missed out on the original SHIELD announcement from March, here is a quick table detailing the specs, as we knew them at that time. NVIDIA's own Tegra X1 SoC featuring 256 Maxwell GPU cores powers this device using the Android TV operating system, promising 4K video playback, the best performing Android gaming experience and NVIDIA GRID streaming games.

NVIDIA SHIELD Specifications

Processor: NVIDIA® Tegra® X1 processor with 256-core Maxwell™ GPU and 3GB RAM
Video Features: 4K Ultra-HD Ready with 4K playback and capture up to 60 fps (VP9, H.265, H.264)
Audio: 7.1 and 5.1 surround sound pass-through over HDMI; high-resolution audio playback up to 24-bit/192kHz over HDMI and USB; high-resolution audio upsampling to 24-bit/192kHz over USB
Storage: 16 GB
Wireless: 802.11ac 2x2 MIMO 2.4 GHz and 5 GHz Wi-Fi; Bluetooth 4.1/BLE
Interfaces: Gigabit Ethernet; HDMI 2.0; two USB 3.0 (Type A); Micro-USB 2.0; MicroSD slot (supports 128GB cards); IR receiver (compatible with Logitech Harmony)
Gaming Features: NVIDIA GRID™ streaming service; NVIDIA GameStream™
SW Updates: SHIELD software upgrades directly from NVIDIA
Power: 40W power adapter
Weight and Size: 23oz / 654g; height 5.1in / 130mm; width 8.3in / 210mm; depth 1.0in / 25mm
OS: Android TV™, Google Cast™ Ready
Bundled Apps: PLEX
In the box: NVIDIA SHIELD; NVIDIA SHIELD controller; HDMI cable (High Speed); USB cable (Micro-USB to USB); power adapter (includes plugs for North America, Europe, UK)
Requirements: TV with HDMI input, Internet access
Options: SHIELD controller, SHIELD remote, SHIELD stand

 

Source: Amazon.com

Rumor: New NVIDIA SHIELD Portable Coming with Tegra X1

Subject: Mobile | March 30, 2015 - 03:43 PM |
Tagged: Tegra X1, tegra, shield portable, shield, portable, nvidia

UPDATE (3/31/15): Thanks to another tip we can confirm that the new SHIELD P2523 will have the Tegra X1 SoC in it. From this manifest document you'll see the Tegra T210 listed (the same part marketed as X1) as well as the code name "Loki." Remember that the first SHIELD Portable device was code named Thor. Oh, so clever, NVIDIA.

shield3.jpg

Based on a rumor posted by Brad over at Lilliputing, it appears we can expect an updated NVIDIA SHIELD Portable device sometime later in 2015. According to both the Bluetooth and Wi-Fi certification websites, a device going by the name "NVIDIA Shield Portable P2523" has been submitted. There isn't a lot of detail though:

  • 802.11a/b/g/n/ac dual-band 2.4 GHz and 5 GHz WiFi
  • Bluetooth 4.1
  • Android 5.0
  • Firmware version 3.10.61

We definitely have a new device here, as the initial SHIELD Portable did not include 802.11ac support at all. And though no data is there to support it, you have to assume that NVIDIA would be using the new Tegra X1 processor in any new SHIELD devices coming out this year. I already previewed the new SHIELD console from GDC that utilizes that same SoC, but a portable unit might require a lower clocked, lower power version of the processor to help with heat and battery life.

shieldportable.JPG

There’s no information about the processor, screen, or other hardware. But if the new Shield portable is anything like the original, it’ll probably consist of what looks like an Xbox-style game controller with an attached 5 inch display which you can fold up to play games on the go.

And if it’s anything like the new NVIDIA Shield console, it could have a shiny new NVIDIA Tegra X1 processor to replace the aging Tegra 4 chip found in the original Shield Portable.

I wouldn’t be surprised if it also had a higher-resolution display, more memory, or other improvements.

Keep an eye out - NVIDIA may be making a push for even more SHIELD hardware this summer.

Source: Lilliputing
Manufacturer: NVIDIA

Finally, a SHIELD Console

NVIDIA is filling out the family of the SHIELD brand today with the announcement of SHIELD, a set-top box powered by the Tegra X1 processor. SHIELD will run Android TV and act as a game playing, multimedia watching, GRID streaming device. Selling for $199 and available in May of this year, there is a lot to discuss.

Odd naming scheme aside, the SHIELD looks to be an impressive little device, sitting in your home theater or on your desk and bringing a ton of connectivity and performance to your TV. Running Android TV means the SHIELD will have access to the entire library of Google Play media, including music, movies and apps. SHIELD supports 4K video playback at 60 Hz thanks to an HDMI 2.0 connection and fully supports H.265/HEVC decode thanks to the Tegra X1 processor.

10.jpg

Here is a full breakdown of the device's specifications.

NVIDIA SHIELD Specifications

Processor: NVIDIA® Tegra® X1 processor with 256-core Maxwell™ GPU and 3GB RAM
Video Features: 4K Ultra-HD Ready with 4K playback and capture up to 60 fps (VP9, H.265, H.264)
Audio: 7.1 and 5.1 surround sound pass-through over HDMI; high-resolution audio playback up to 24-bit/192kHz over HDMI and USB; high-resolution audio upsampling to 24-bit/192kHz over USB
Storage: 16 GB
Wireless: 802.11ac 2x2 MIMO 2.4 GHz and 5 GHz Wi-Fi; Bluetooth 4.1/BLE
Interfaces: Gigabit Ethernet; HDMI 2.0; two USB 3.0 (Type A); Micro-USB 2.0; MicroSD slot (supports 128GB cards); IR receiver (compatible with Logitech Harmony)
Gaming Features: NVIDIA GRID™ streaming service; NVIDIA GameStream™
SW Updates: SHIELD software upgrades directly from NVIDIA
Power: 40W power adapter
Weight and Size: 23oz / 654g; height 5.1in / 130mm; width 8.3in / 210mm; depth 1.0in / 25mm
OS: Android TV™, Google Cast™ Ready
Bundled Apps: PLEX
In the box: NVIDIA SHIELD; NVIDIA SHIELD controller; HDMI cable (High Speed); USB cable (Micro-USB to USB); power adapter (includes plugs for North America, Europe, UK)
Requirements: TV with HDMI input, Internet access
Options: SHIELD controller, SHIELD remote, SHIELD stand

Obviously the most important feature is the Tegra X1 SoC, built around an 8-core 64-bit ARM processor and a 256 CUDA core Maxwell architecture GPU. This gives the SHIELD set-top more performance than basically any other mobile part on the market, and demos showing Doom 3 and Crysis 3 running natively on the hardware drive the point home. With integrated HEVC decode support, the console is the first Android TV device to offer 4K video content at 60 FPS.

Even though storage comes in at only 16GB, the inclusion of a MicroSD card slot enables expansion by as much as 128GB more for content and local games.

11.jpg

The first choice for networking will be the Gigabit Ethernet port, but the 2x2 dual-band 802.11ac wireless controller means that even those of us that don't have hardwired Internet going to our TV will be able to utilize all the performance and features of SHIELD.

Continue reading our preview of the NVIDIA SHIELD set-top box!!