Computex 2015: ASUS 3800R 34″ Ultrawide 3440x1440 IPS 21:9 Curved G-SYNC Monitor

Subject: Displays | June 1, 2015 - 12:21 PM |
Tagged: nvidia, gsync, g-sync, asus, 3800R

At Computex this week ASUS is showing off a prototype of the new ROG 3800R monitor, a 34-in curved display with a 3440x1440 resolution and G-Sync variable refresh rate capability. ASUS claims on its PCDIY blog that the 21:9 aspect ratio was one of the "most requested" specifications for a new ROG monitor, followed by a curved design. The result is a gorgeous display:

ROG-34-inch-curved-gaming-monitor.jpg

Here's a list of specifications:

  • 34” panel size, optimal for its 3440×1440 resolution
  • 21:9 ultra-wide aspect ratio for increased immersion and improved horizontal workflow
  • IPS based panel for superior color reproduction, black levels and reduction of color shifting
  • NVIDIA G-SYNC equipped, offering smooth, fluid and tear-free gaming. Additionally equipped with a ULMB operating mode for outstanding motion clarity.
  • Frameless design for seamless surround gaming
  • ASUS exclusive GamePlus feature and Turbo Key
  • Ergonomic adjustment including tilt, swivel and height adjustment

Hot damn, we want one of these and we want it yesterday! There is no mention of the refresh rate of the display here, though we did see information from NVIDIA that ASUS was planning a 34x14 60 Hz screen - but we are not sure this is the same model being shown. And the inclusion of ULMB would normally indicate a refresh rate above 60-75 Hz...

Another interesting note: this monitor appears to include both DisplayPort and HDMI connectivity.

This 34-inch 3800R curved display features wide viewing angles, a 3440 x 1440 native resolution, and a 21:9 aspect ratio. It features NVIDIA® G-SYNC™ display technology to deliver smooth, lag-free visuals. G-SYNC synchronizes the display’s refresh rate to the GPU in any GeForce® GTX™-powered PC to eliminate screen tearing and minimize display stutter and input lag. The result is sharper, more vibrant images and more fluid, responsive gameplay. It has extensive connectivity options that include DisplayPort and HDMI.

The above information came from ASUS just a few short hours ago, so you can assume that it is accurate. Could this be the start of panels that integrate dual scalers (the G-Sync module plus something else) to offer more connectivity, or has the G-Sync module been updated to support more inputs? We'll find out!

Author:
Manufacturer: NVIDIA

Specifications

When NVIDIA launched the GeForce GTX Titan X back in March of this year, I knew immediately that the GTX 980 Ti would be close behind. The Titan X was so different from the GTX 980 when it came to pricing and memory capacity (12GB, really??) that NVIDIA had set up the perfect gap in which to place the newly minted GTX 980 Ti. Today we get to take the wraps off of that new graphics card, and I think you'll be impressed with what you find, especially when you compare its value to the Titan X.

Based on the same Maxwell architecture and GM200 GPU, with some minor changes to GPU core count, memory size and boost speeds, the GTX 980 Ti finds itself in a unique spot in the GeForce lineup. Performance-wise it's basically identical in real-world game testing to the GTX Titan X, yet it is priced $350 less than that 12GB behemoth. Couple that with a modest $50 price drop on the GTX 980 and you have all the markers of an enthusiast graphics card that will sell as well as any we have seen in recent generations.

P1020283.jpg

The devil is in all the other details, of course. AMD has its own plans for this summer but the Radeon R9 290X is still sitting there at a measly $320, undercutting the GTX 980 Ti by more than half. NVIDIA seems to be pricing its own GPUs as if it isn't even concerned with what AMD and the Radeon brand are doing. That could be dangerous if it goes on too long, but for today, can the R9 290X put up enough fight with the aging Hawaii XT GPU to make its value case to gamers on the fence?

Will the GeForce GTX 980 Ti be the next high-end GPU to make a splash in the market, or will it make a thud at the bottom of the GPU gene pool? Let's dive into it, shall we?

Continue reading our review of the new NVIDIA GeForce GTX 980 Ti 6GB Graphics Card!!

NVIDIA G-Sync Update: New Monitors, Windowed Mode, V-Sync Options

Subject: Displays | May 31, 2015 - 06:00 PM |
Tagged: nvidia, gsync, g-sync, computex 2015, computex

In conjunction with the release of the new GeForce GTX 980 Ti graphics card today, NVIDIA is making a handful of other announcements around the GeForce brand. The most dramatic of the announcements center around the company's variable refresh monitor technology called G-Sync. I assume that any of you reading this are already intimately familiar with what G-Sync is, but if not, check out this story that dives into how it compares with AMD's rival tech called FreeSync.

First, NVIDIA is announcing a set of seven new G-Sync ready monitors that will be available this summer and fall from ASUS and Acer.

980ti-42.jpg

Many of these displays offer configurations of panels we haven't yet seen in a G-Sync display. Take the Acer X34 for example: this 34-in monitor falls into the 21:9 aspect ratio form factor, with a curved screen and a 3440x1440 resolution. The refresh rate will peak at 75 Hz while also offering the color consistency and viewing angles of an IPS screen. This is the first 21:9, the first 34x14 and the first curved monitor to support G-Sync, and with a 75 Hz maximum refresh it should provide a solid gaming experience. ASUS has a similar model, the PG34Q, though it peaks at a refresh rate of 60 Hz.

ASUS will be updating the wildly popular ROG Swift PG278Q display with the PG279Q, another 27-in monitor with a 2560x1440 resolution. Only this time it will run at 144 Hz with an IPS screen rather than TN, again resulting in improved color clarity, viewing angles and lower eye strain. 

Those of you on the lookout for 4K panels with G-Sync support will be happy to find IPS iterations of that configuration, though they will still peak at a 60 Hz refresh rate - as much a limitation of DisplayPort as anything else.
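To put a rough number on that DisplayPort caveat, here is a quick back-of-the-envelope sketch (my own math, not anything from NVIDIA) comparing the raw pixel data rate of 4K modes against the roughly 17.28 Gbit/s of effective bandwidth DisplayPort 1.2 provides after 8b/10b encoding. Blanking overhead is ignored, so real-world requirements are a bit higher than shown.

```python
# Back-of-the-envelope check of why 4K G-Sync panels top out around 60 Hz
# on DisplayPort 1.2. Blanking overhead is ignored, so the real
# requirements are somewhat higher than shown here.

DP12_EFFECTIVE_GBPS = 21.6 * (8 / 10)  # 4 lanes x 5.4 Gbit/s, 8b/10b coding = 17.28 Gbit/s

def required_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Raw pixel data rate for a given mode, in Gbit/s (no blanking)."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

for hz in (60, 75, 120, 144):
    need = required_gbps(3840, 2160, hz)
    verdict = "fits" if need < DP12_EFFECTIVE_GBPS else "exceeds DP 1.2"
    print(f"3840x2160 @ {hz} Hz needs ~{need:.1f} Gbit/s -> {verdict}")

# 60 Hz (~11.9 Gbit/s) fits comfortably; 120 Hz and up blow past the
# ~17.28 Gbit/s that DP 1.2 can deliver, so 4K VRR stays at 60 Hz for now.
```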

Another technology addition for G-Sync with the 352-series (353-series, sorry!) driver released today is support for windowed mode variable refresh.

gsync-windows.jpg

By working some magic with the DWM (Desktop Window Manager), NVIDIA was able to allow VRR to operate without requiring a game to be in full screen mode. For gamers that like to play windowed or borderless windowed while using secondary or large displays for other side activities, this is going to be a great addition to the G-Sync portfolio.

Finally, after much harassment and public shaming, NVIDIA is going to allow users to choose whether V-Sync is enabled or disabled when the game's render rate exceeds the maximum refresh rate of the attached G-Sync monitor.

gsyncsettings2.png

One of the complaints about G-Sync has been that it is restrictive on the high side of the VRR window for its monitors. While FreeSync lets you selectively enable or disable V-Sync when your frame rate goes above the maximum refresh rate, G-Sync has forced users into a V-Sync-enabled state. The reasoning from NVIDIA was that allowing tearing of any kind with G-Sync enabled would ruin the experience and/or damage the technology's reputation. But now, while the default will still be to keep V-Sync on, gamers will be able to manually set the V-Sync mode to off with a G-Sync monitor.

Why is this useful? Many gamers believe that a drawback to V-Sync enabled gaming is the added latency of waiting for the monitor to refresh before drawing a frame that might otherwise be shown to the user immediately. G-Sync addresses this from frame rates of 1 FPS up to the maximum refresh of the G-Sync monitor (144 Hz, 75 Hz, 60 Hz), but now, rather than being stuck with tear-free but latency-adding V-Sync when gaming over the max refresh, you'll be able to play with tearing on the screen and lower input latency. This could be especially useful for gamers using 60 Hz G-Sync monitors with 4K resolutions.
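To illustrate the latency argument with some simple arithmetic (my own numbers, just to show the scale of the effect): with V-Sync on, a frame that finishes rendering right after a refresh begins can wait up to one full refresh interval before it is scanned out, while with V-Sync off it goes to the screen immediately at the cost of a tear line.

```python
# Rough illustration of the latency trade-off above the G-Sync window.
# With V-Sync on, a finished frame can sit for up to one full refresh
# interval before scanout; with V-Sync off it is displayed immediately,
# trading that wait for a visible tear.

def refresh_interval_ms(refresh_hz):
    """Length of one refresh cycle in milliseconds."""
    return 1000.0 / refresh_hz

for hz in (60, 75, 120, 144):
    worst_case_wait = refresh_interval_ms(hz)
    print(f"{hz} Hz panel: a ready frame may wait up to "
          f"{worst_case_wait:.1f} ms with V-Sync on, ~0 ms with it off")

# A 60 Hz 4K G-Sync monitor is the worst case (up to ~16.7 ms of added wait),
# which is why the new V-Sync-off option matters most for those panels.
```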

Oh, actually one more thing: you'll now be able to enable ULMB (ultra low motion blur) mode in the driver as well without requiring entry into your display's OSD.

gsyncsettings1.png

NVIDIA is also officially announcing G-Sync for notebooks at Computex. More on that in this story!

NVIDIA G-Sync for Notebooks Announced, No Module Required

Subject: Displays, Mobile | May 31, 2015 - 06:00 PM |
Tagged: nvidia, notebooks, msi, mobile, gsync, g-sync, asus

If you remember back to January of this year, Allyn and I posted an article that confirmed the existence of a mobile variant of G-Sync thanks to a leaked driver and an ASUS G751 notebook. Rumors and speculation floated around the Internet ether for a few days, but we eventually got official word from NVIDIA that G-Sync for notebooks was a real thing and that it would launch "soon." That day is finally here with the beginning of Computex.

mobile1.jpg

G-Sync for notebooks has no clever branding, no "G-Sync Mobile" or anything like that, so discussing it will be a bit more difficult since the technologies are different. Going forward, NVIDIA claims that any gaming notebook using NVIDIA GeForce GPUs will be a G-Sync notebook and will support all of the goodness that variable refresh rate gaming provides. This is fantastic news, as notebook gaming often runs at lower frame rates than you would find on a desktop PC because of lower-powered hardware driving displays of comparable (1080p, 1440p) resolution.

mobile2.jpg

Of course, as we discovered in our first look at G-Sync for notebooks back in January, the much debated G-Sync module is not required and will not be present on notebooks featuring the variable refresh technology. So what gives? We went over some of this before, but it deserves to be detailed again.

NVIDIA uses the diagram above to demonstrate the headaches presented by the monitor and GPU communication path before G-Sync was released. You had three different components: the GPU, the monitor scaler and the monitor panel, all of which needed to work together if VRR was going to become a high-quality addition to the gaming ecosystem.

mobile3.jpg

NVIDIA's answer was to take over all aspects of the pathway for pixels from the GPU to the eyeball, creating the G-Sync module and helping OEMs hand-pick the best panels that would work with VRR technology. This helped NVIDIA make sure it could do things to improve the user experience, such as implementing an algorithmic low-frame-rate, frame-doubling capability to maintain smooth and tear-free gaming at frame rates under the panel's physical limitations. It also allows NVIDIA to tune the G-Sync module to the specific panel to help with ghosting and to implement variable overdrive logic.
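As a purely illustrative sketch of that frame-doubling idea (NVIDIA has not published the actual algorithm running on the G-Sync module, and the 30 Hz panel minimum below is an assumption for the example), the logic boils down to re-scanning the previous frame whenever the gap between new frames would exceed what the panel can hold:

```python
# Simplified sketch of low-frame-rate frame doubling. Purely illustrative;
# this is not NVIDIA's published algorithm, and the 30 Hz panel minimum is
# an assumed value for the example.

PANEL_MIN_HZ = 30                      # assumed slowest refresh the panel can hold
MAX_WAIT_MS = 1000.0 / PANEL_MIN_HZ    # ~33.3 ms: longest gap allowed between refreshes

def repeats_for_frame(frame_time_ms):
    """How many times the previous frame must be re-scanned so the gap
    between refreshes never exceeds the panel's limit."""
    repeats = 0
    while frame_time_ms > MAX_WAIT_MS:
        repeats += 1                   # re-send the last frame to keep the panel refreshed
        frame_time_ms -= MAX_WAIT_MS
    return repeats

# A game running at 20 FPS delivers a frame every 50 ms; the previous frame
# would be re-scanned once in the middle of that gap.
print(repeats_for_frame(50.0))   # -> 1
print(repeats_for_frame(120.0))  # -> 3 (roughly 8 FPS gameplay)
```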

980ti-32.jpg

All of this is required because of the incredible amount of variability in the monitor and panel markets today.

mobile4.jpg

But with notebooks, NVIDIA argues, there is no variability at all to deal with. The notebook OEM gets to hand-pick the panel, and the GPU directly interfaces with the screen instead of passing through a scaler chip. (Note that some desktop monitors, like the ever popular Dell 3007WFP, did this as well.) There is no other piece of logic in the way attempting to enforce a fixed refresh rate. Because of that direct connection, the GPU is able to control the data passing between it and the display without any other logic working in the middle. This makes implementing VRR technology much simpler and helps with quality control because NVIDIA can validate the panels with the OEMs.

mobile5.jpg

As I mentioned above, going forward, all new notebooks using GTX graphics will be G-Sync notebooks and that should solidify NVIDIA's dominance in the mobile gaming market. NVIDIA will be picking the panels, and tuning the driver for them specifically, to implement anti-ghosting technology (like what exists on the G-Sync module today) and low frame rate doubling. NVIDIA also claims that the world's first 75 Hz notebook panels will ship with GeForce GTX and will be G-Sync enabled this summer - something I am definitely looking forward to trying out myself.

Though it wasn't mentioned, I am hopeful that NVIDIA will continue to allow users the ability to disable V-Sync at frame rates above the maximum refresh of these notebook panels. With most of them limited to 60 Hz (though this applies to 75 Hz as well), the most demanding gamers are going to want that same promise of minimal latency.

mobile6.jpg

At Computex we'll see a handful of models announced with G-Sync up and running. It should be no surprise of course to see the ASUS G751 with the GeForce GTX 980M GPU on this list as it was the model we used in our leaked driver testing back in January. MSI will also launch the GT72 G with a 1080p G-Sync ready display and GTX 980M/970M GPU option. Gigabyte will have a pair of notebooks: the Aorus X7 Pro-SYNC with GTX 970M SLI and a 1080p screen as well as the Aorus X5 with a pair of GTX 965M in SLI and a 3K resolution (2560x1440) screen. 

This move is great for gamers and I am eager to see what the resulting experience is for users that pick up these machines. I have long been known as a proponent of variable refresh displays and getting access to that technology on your notebook is a victory for NVIDIA's team.

Author:
Manufacturer: NVIDIA

SHIELD Specifications

Announced last June at Google's I/O 2014 event, Android TV is a platform developed by Google, running Android 5.0 and higher, that aims to create an interactive experience for the TV. This platform can be built into a TV directly as well as into set-top style boxes, like the NVIDIA SHIELD we are looking at today. The idea is to bring the breadth of apps and content to the TV through the Android operating system in a way that is both convenient and intuitive.

NVIDIA announced SHIELD back in March at GDC as the first product to use the company’s latest Tegra processor, the X1. This SoC combines an 8-core big.LITTLE ARM processor design with a 256-core implementation of the NVIDIA Maxwell GPU architecture, providing GPU performance previously unseen in an Android device. I have already spent some time with the NVIDIA SHIELD at various events and the promise was clearly there to make it a leading option for Android TV adoption, but obviously there were questions to be answered.

DSC01740.jpg

Today’s article will focus on my early impressions with the NVIDIA SHIELD, having used it both in the office and at home for a handful of days. As you’ll see during the discussion there are still some things to be ironed out, some functionality that needs to be added before SHIELD and Android TV can really be called a must-buy product. But I do think it will get there.

And though this review will focus on the NVIDIA SHIELD, it’s impossible not to tie the success of SHIELD to the success of Google’s Android TV. The dominant use case for SHIELD is as a media playback device, with the gaming functionality as a really cool side project for enthusiasts and gamers looking for another outlet. For SHIELD to succeed, Google needs to prove that Android TV can offer a better experience than other integrated smart TV platforms as well as other set-top box platforms like Boxee, Roku and even the upcoming Apple TV refresh.

But first, let’s get an overview of the NVIDIA SHIELD device, pricing and specifications, before diving into my experiences with the platform as a whole.

Continue reading our review of the new NVIDIA SHIELD with Android TV!!

Podcast #351 - More AMD Fiji Leaks, Rumors on GTX 980 Ti and a great $99 portable DAC!

Subject: Editorial | May 28, 2015 - 01:22 PM |
Tagged: X99, video, sapphire, r9 285, podcast, nvidia, GTX 980 Ti, gigabyte, Fiji, DAC, amd

PC Perspective Podcast #351 - 05/28/2015

Join us this week as we discuss AMD Fiji Leaks, rumors on GTX 980 Ti, a great $99 portable DAC, and more!

You can subscribe to us through iTunes and you can still access it directly through the RSS page HERE.

The URL for the podcast is: http://pcper.com/podcast - Share with your friends!

  • iTunes - Subscribe to the podcast directly through the iTunes Store
  • RSS - Subscribe through your regular RSS reader
  • MP3 - Direct download link to the MP3 file

Hosts: Ryan Shrout, Jeremy Hellstrom, Josh Walrath, Allyn Malventano and Sebastian Peak

Subscribe to the PC Perspective YouTube Channel for more videos, reviews and podcasts!!

Rumor: NVIDIA GeForce GTX 980 Ti Specifications Leaked

Subject: Graphics Cards | May 26, 2015 - 05:03 PM |
Tagged: rumors, nvidia, leaks, GTX 980 Ti, gpu, gm200

Who doesn’t love rumor and speculation about unreleased products? (Other than the manufacturers of such products, of course.) Today VideoCardz is reporting, via HardwareBattle, a GPU-Z screenshot reportedly showing specs for an NVIDIA GeForce GTX 980 Ti.

gpuz_screenshot.png

Image credit: HardwareBattle via VideoCardz.com

First off, the HardwareBattle logo conveniently obscures the hardware ID (as well as ROP/TMU counts). What is visible is the 2816 shader count, which places the card between the GTX 980 (2048) and TITAN X (3072). The 6 GB of GDDR5 memory rides a 384-bit interface at 7 Gbps, so bandwidth should be the same 336 GB/s as the TITAN X. As for core clocks on this GPU (which seems likely to be a cut-down GM200), they too are identical to the TITAN X, with 1000 MHz base and 1076 MHz boost clocks shown in the screenshot.
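For reference, that 336 GB/s figure falls straight out of the listed bus width and memory speed; a quick calculation makes the arithmetic explicit:

```python
# Sanity check of the leaked memory specs: a 384-bit bus with GDDR5 at an
# effective 7 Gbit/s per pin works out to the same 336 GB/s quoted for the
# TITAN X.

bus_width_bits = 384
effective_gbps_per_pin = 7       # GDDR5 effective data rate per pin

bandwidth_gb_s = bus_width_bits * effective_gbps_per_pin / 8  # bits -> bytes
print(f"{bandwidth_gb_s:.0f} GB/s")  # -> 336 GB/s
```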

videocardz_image.PNG

Image credit: VideoCardz.com

We await any official announcement, but from the frequency of the leaks it seems we won’t have to wait too long.

Could it be a 980 Ti, or does the bill of lading lie?

Subject: Graphics Cards | May 21, 2015 - 07:33 PM |
Tagged: rumour, nvidia, 980 Ti

980ti.PNG

The source of leaks and rumours is often unexpected, such as this import data of a shipment headed from China into India. Could this 6GB card be the GTX 980 Ti that so many have theorized would be coming sometime around AMD's release of their new cards? Does the fact that 60,709 Indian Rupees equals 954.447 US Dollars put a damper on your excitement, or could it be that these 6 lonely cards are being sold at a higher price overseas than they might be in the US?

We don't know but we do know there is a mysterious card out there somewhere.

Source: Zauba

NVIDIA Under Attack Again for GameWorks in The Witcher 3: Wild Hunt

Subject: Graphics Cards | May 17, 2015 - 12:04 PM |
Tagged: The Witcher 3, nvidia, hairworks, gameworks, amd

I feel like every few months I get to write more stories focusing on the exact same subject. It's almost as if nothing in the enthusiast market is happening and thus the cycle continues, taking all of us with it on a wild ride of arguments and valuable debates. Late last week I started hearing from some of my Twitter followers that there were concerns surrounding the upcoming release of The Witcher 3: Wild Hunt. Then I found a link to this news post over at Overclock3d.net that put some of the information in perspective.

Essentially, The Witcher 3 uses parts of NVIDIA's GameWorks development tools and APIs, software written by NVIDIA to help game developers take advantage of new technologies and to quickly and easily implement them into games. The problem, of course, is that GameWorks is written and developed by NVIDIA. That means optimizations for AMD Radeon hardware are difficult or impossible, depending on who you want to believe. Clearly it doesn't benefit NVIDIA financially to optimize its software for AMD GPUs, though many in the community would like NVIDIA to give a better effort - for the good of said community.

gwlogo.jpg

Specifically in regards to The Witcher 3, the game implements NVIDIA HairWorks technology to add realism to many of the creatures of the game world. (Actually, the game includes HairWorks, HBAO+, PhysX, Destruction and Clothing, but our current discussion focuses on HairWorks.) All of the marketing and video surrounding The Witcher 3 has been awesome, and the realistic animal fur simulation has definitely been a part of it. However, it appears that AMD Radeon GPU users are concerned that performance with HairWorks enabled will suffer.

An example of The Witcher 3: Wild Hunt with HairWorks

One of the game's developers has been quoted as saying:

Many of you have asked us if AMD Radeon GPUs would be able to run NVIDIA’s HairWorks technology – the answer is yes! However, unsatisfactory performance may be experienced as the code of this feature cannot be optimized for AMD products. Radeon users are encouraged to disable NVIDIA HairWorks if the performance is below expectations.

There are at least several interpretations of this statement floating around the web. First, and most inflammatory, is that NVIDIA is preventing CD Projekt RED from optimizing the feature by withholding source code. Another is that CD Projekt is choosing not to optimize for AMD hardware due to time considerations. The last is that it simply isn't possible to optimize it because of hardware limitations when running HairWorks.

I went to NVIDIA with these complaints about HairWorks and Brian Burke gave me this response:

We are not asking game developers do anything unethical.
 
GameWorks improves the visual quality of games running on GeForce for our customers.  It does not impair performance on competing hardware.
 
Demanding source code access to all our cool technology is an attempt to deflect their performance issues. Giving away your IP, your source code, is uncommon for anyone in the industry, including middleware providers and game developers. Most of the time we optimize games based on binary builds, not source code.
 
GameWorks licenses follow standard industry practice.  GameWorks source code is provided to developers that request it under license, but they can’t redistribute our source code to anyone who does not have a license. 
 
The bottom line is AMD’s tessellation performance is not very good and there is not a lot NVIDIA can/should do about it. Using DX11 tessellation has sound technical reasoning behind it, it helps to keep the GPU memory footprint small so multiple characters can use hair and fur at the same time.
 
I believe it is a resource issue. NVIDIA spent a lot of artist and engineering resources to help make Witcher 3 better. I would assume that AMD could have done the same thing because our agreements with developers don’t prevent them from working with other IHVs. (See also, Project Cars)
 
I think gamers want better hair, better fur, better lighting, better shadows and better effects in their games. GameWorks gives them that.  

Interesting comments for sure. The essential takeaway from this is that HairWorks depends heavily on tessellation performance, and we have known since the GTX 680 was released that NVIDIA's architecture performs better than AMD's GCN for tessellation - often by a significant amount. NVIDIA developed its middleware to utilize the strengths of its own GPU technology, not (though it's clear that some disagree) to negatively impact AMD. Did NVIDIA know that would be the case when it was developing the software? Of course it did. Should it have done something to help AMD GPUs more gracefully fall back? Maybe.

Next, I asked Burke directly if claims that NVIDIA was preventing AMD or the game developer from optimizing HairWorks for other GPUs and platforms were true. I was told that both AMD and CD Projekt had the ability to tune the game, but in different ways. The developer could change the tessellation density based on the specific GPU detected (lower for a Radeon GPU with less tessellation capability, for example), but that would require dedicated engineering from either CD Projekt or AMD. AMD, without access to the source code, should be able to make changes in the driver at the binary level, similar to how most other driver optimizations are built. Burke states that in these instances NVIDIA often sends engineers to work with game developers and that AMD "could have done the same had it chosen to." And again, NVIDIA reiterated that in no way do its agreements with game developers prohibit optimization for AMD GPUs.

It would also be possible for AMD to have pushed for the implementation of TressFX in addition to HairWorks; a similar scenario played out in Grand Theft Auto V where several vendor-specific technologies were included from both NVIDIA and AMD, customized through in-game settings. 

NVIDIA has never been accused of being altruistic; it doesn't often create things and then share them with open arms with the rest of the hardware community. But it has to be understood that game developers know this as well - they are not oblivious. CD Projekt knew that HairWorks performance on AMD would be poor but decided to implement the technology into The Witcher 3 anyway. They were willing to accept performance penalties for some users to improve the experience of others. You can argue that is not the best choice, but at the very least The Witcher 3 will let you disable the HairWorks feature completely, removing it from the performance debate altogether.

In a perfect world for consumers, NVIDIA and AMD would walk hand-in-hand through the fields and develop hardware and software in tandem, making sure all users get the best possible experience with all games. But that style of work is only helpful (from a business perspective) for the organization attempting to gain market share, not the one with the lead. NVIDIA doesn't have to do it and chooses not to. If you don't want to support that style, vote with your wallet.

Another similar controversy surrounded the recent release of Project Cars. AMD GPU performance was significantly lower than that of comparable NVIDIA GPUs, even though this game does not implement any GameWorks technologies. In that case, the game's developer directly blamed AMD's drivers, saying that it was a lack of outreach from AMD that caused the issues. AMD has since recanted its stance that the performance delta was "deliberate" and says a pending driver update will address gamers' performance issues.

wicher3-1.jpg

All arguing aside, this game looks amazing. Can we all agree on that?

The only conclusion I can come to from all of this is that if you don't like what NVIDIA is doing, that's your right - and you aren't necessarily wrong. There will be plenty of readers that see the comments made by NVIDIA above and continue to believe that they are being at best disingenuous and at worst straight up lying. As I mentioned above in my own comments, NVIDIA is still a for-profit company that is responsible to shareholders for profit and growth. And in today's world that sometimes means working against other companies rather than with them, resulting in impressive new technologies for its customers and pushback from competitors' customers. It's not fun, but that's how it works today.

Fans of AMD will point to G-Sync, GameWorks, CUDA, PhysX, FCAT and even SLI as indications of NVIDIA's negative impact on open PC gaming. I would argue that more users would look at that list and see improvements to PC gaming, progress that helps make gaming on a computer so much better than gaming on a console. The truth likely rests somewhere in the middle; there will always be those individuals that immediately side with one company or the other. But it's the much larger group in the middle, that shows no corporate allegiance and instead just wants to have as much fun as possible with gaming, that will impact NVIDIA and AMD the most.

So, since I know it will happen anyway, use the comments page below to vent your opinion. But, for the benefit of us all, try to keep it civil!

NVIDIA SHIELD and SHIELD Pro Show up on Amazon

Subject: Mobile | May 16, 2015 - 01:00 PM |
Tagged: Tegra X1, tegra, shield pro, shield console, shield, nvidia

UPDATE: Whoops! It appears that Amazon took the listing down... No surprise there. I'm sure we'll be seeing them again VERY SOON. :)

Looks like the release of the new NVIDIA SHIELD console device, first revealed back at GDC in March, is nearly here. A listing for "NVIDIA SHIELD" as well as the new "NVIDIA SHIELD Pro" showed up on Amazon.com today.

shield1.jpg

Though we don't know officially what the difference between the SHIELD and SHIELD Pro is, according to Amazon at least the difference appears to be internal storage. The Pro model will ship with 500GB of internal storage, while the non-Pro model will only have 16GB. You'll have to get an SD card for more storage on the base model if you plan on doing anything other than streaming games through NVIDIA GRID, it seems.

shield2.jpg

No pricing is listed yet and there is no release date on the Amazon pages either, but we have always been told this was to be a May or June on-sale date. Both models of the NVIDIA SHIELD will include an HDMI cable, a micro-USB cable and a SHIELD Controller. If you want the remote or stand, you're going to have to pay out a bit more.

shield3.jpg

For those of you that missed out on the original SHIELD announcement from March, here is a quick table detailing the specs, as we knew them at that time. NVIDIA's own Tegra X1 SoC featuring 256 Maxwell GPU cores powers this device using the Android TV operating system, promising 4K video playback, the best performing Android gaming experience and NVIDIA GRID streaming games.

NVIDIA SHIELD Specifications

  • Processor: NVIDIA® Tegra® X1 processor with 256-core Maxwell™ GPU and 3GB RAM
  • Video Features: 4K Ultra-HD Ready with 4K playback and capture up to 60 fps (VP9, H.265, H.264)
  • Audio: 7.1 and 5.1 surround sound pass-through over HDMI; high-resolution audio playback up to 24-bit/192kHz over HDMI and USB; high-resolution audio upsampling to 24-bit/192kHz over USB
  • Storage: 16 GB
  • Wireless: 802.11ac 2x2 MIMO 2.4 GHz and 5 GHz Wi-Fi; Bluetooth 4.1/BLE
  • Interfaces: Gigabit Ethernet; HDMI 2.0; two USB 3.0 (Type A); Micro-USB 2.0; microSD slot (supports 128GB cards); IR receiver (compatible with Logitech Harmony)
  • Gaming Features: NVIDIA GRID™ streaming service; NVIDIA GameStream™
  • SW Updates: SHIELD software upgrades directly from NVIDIA
  • Power: 40W power adapter
  • Weight and Size: 23oz / 654g; height 5.1in / 130mm, width 8.3in / 210mm, depth 1.0in / 25mm
  • OS: Android TV™, Google Cast™ Ready
  • Bundled Apps: PLEX
  • In the Box: NVIDIA SHIELD, NVIDIA SHIELD controller, HDMI cable (High Speed), USB cable (Micro-USB to USB), power adapter (includes plugs for North America, Europe, UK)
  • Requirements: TV with HDMI input, Internet access
  • Options: SHIELD controller, SHIELD remote, SHIELD stand

Source: Amazon.com