Subject: Processors, Mobile | October 20, 2016 - 11:40 AM | Ryan Shrout
Tagged: Nintendo, switch, nvidia, tegra
It's been a hell of a 24 hours for NVIDIA and the Tegra processor. A platform that many considered dead in the water after its failure to find its way into smartphones or into an appreciable number of consumer tablets has had two major design wins revealed. First, it was revealed that NVIDIA is powering the new fully autonomous driving system in the Autopilot 2.0 hardware implementation in Tesla's current Model S, X and upcoming Model 3 cars.
Now, we know that Nintendo's long-rumored portable and dockable gaming system, called Switch, is also powered by a custom NVIDIA Tegra SoC.
We don't know much about the hardware that gives the Switch life, but NVIDIA did post a short blog with some basic information worth looking at. Based on it, we know that the Tegra processor powering this Nintendo system is completely custom and likely uses Pascal-architecture CUDA cores, though we don't know how many or how powerful it will be. It will likely exceed the performance of the Nintendo Wii U, which offered only 0.35 TFLOPS from 320 AMD-based stream processors. How much faster we just don't know yet.
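For reference, that 0.35 TFLOPS figure falls out of the standard peak-FP32 arithmetic: shader cores x 2 FMA ops per clock x clock speed. Here is a minimal sketch, assuming the commonly cited ~550 MHz GPU clock for the Wii U (the Switch's core count and clocks are still unknown, so there's nothing to plug in for it yet):

```python
def peak_tflops(cores, clock_ghz, ops_per_clock=2):
    """Theoretical single-precision throughput in TFLOPS:
    cores x ops per clock (FMA = 2) x clock in GHz, scaled to TFLOPS."""
    return cores * ops_per_clock * clock_ghz / 1000.0

# Wii U GPU: 320 AMD stream processors at an assumed ~550 MHz.
wii_u = peak_tflops(cores=320, clock_ghz=0.55)
print(f"Wii U GPU: ~{wii_u:.2f} TFLOPS")  # ~0.35 TFLOPS
```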
On the CPU side we assume that this is built using ARM-based cores, most likely off-the-shelf designs to keep things simple. Basing it on a custom design like Denver might not be necessary for this type of platform.
Nintendo has traditionally used custom operating systems for its consoles and that seems to be what is happening with the Switch as well. NVIDIA mentions a couple of times how much work the technology vendor put into custom APIs, custom physics engines, new libraries, etc.
The Nintendo Switch’s gaming experience is also supported by fully custom software, including a revamped physics engine, new libraries, advanced game tools and libraries. NVIDIA additionally created new gaming APIs to fully harness this performance. The newest API, NVN, was built specifically to bring lightweight, fast gaming to the masses.
We’ve optimized the full suite of hardware and software for gaming and mobile use cases. This includes custom operating system integration with the GPU to increase both performance and efficiency.
The system itself looks pretty damn interesting, with the ability to switch (get it?) between a docked-to-your-TV configuration and a mobile one with attached or wireless controllers. Check out the video below for a preview.
I've asked both NVIDIA and Nintendo for more information on the hardware side, but these guys tend to be tight-lipped about the custom silicon going into console hardware. Hopefully one or the other is excited to tell us about the technology so we can have some interesting specifications to discuss and debate!
UPDATE: A story on The Verge claims that Nintendo "took the chip from the Shield" and put it in the Switch. This is more than likely completely false; the Shield is a significantly dated product and that kind of statement could undersell the power and capability of the Switch and NVIDIA's custom SoC quite dramatically.
Subject: Graphics Cards | October 19, 2016 - 08:08 PM | Scott Michaud
Tagged: amd, nvidia, gtx 1060, rx 480, dx12, dx11, battlefield 1
Battlefield 1 is just a few days from launching. In fact, owners of the Deluxe Edition had the game unlock yesterday. It's interesting that multiple publishers are using release date as a special edition bonus these days, including Microsoft's recent Windows Store releases. I'm not going to say whether that's interesting in a good or bad way, though; I'll leave that up to the reader to decide.
Anywho, DigitalFoundry is doing their benchmarking thing, and they wanted to see what GPU could provide a solid 60FPS when everything is maxed out (at 1080p). They start off with a DX12-to-DX12 comparison between the GTX 1060 and the RX 480. This is a relatively fair comparison, because the 3GB GTX 1060 and the 4GB RX 480 both come in at about $200, while upgrading to 6GB for the 1060 or 8GB for the 480 bumps each respective SKU up to the ~$250 price point. In this test, NVIDIA has a few dips slightly below 60 FPS in complex scenes, while AMD stays above that beloved threshold.
They also compare the two cards in DX11 and DX12 mode, with both cards using a Skylake-based Core i5 CPU. In this test, AMD's card saw a nice increase in frame rate when switching to DirectX 12, while NVIDIA had a performance regression in the new API. This raises two questions, one of which is potentially pro-NVIDIA and the other pro-AMD. First, would the original test, if NVIDIA's card were allowed to use DirectX 11, show the GTX 1060 as more competitive against the DX12-running RX 480? That brings me to the second question: what would the user see? A major draw of Mantle-based graphics APIs is that the application has more control over traditionally driver-level tasks. Would 60 FPS in DX12 be smoother than 60 FPS in DX11?
I don't know. It's something we'll need to test.
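One way to get at that smoothness question is to look at per-frame render times rather than average FPS, the same idea behind Frame Rating-style testing. A quick sketch with entirely made-up frame-time data, just to show how two runs can both average 60 FPS while one of them stutters noticeably:

```python
import statistics

def summarize(frame_times_ms):
    """Return average FPS and 99th-percentile frame time from per-frame timings."""
    avg_fps = 1000.0 / statistics.mean(frame_times_ms)
    p99 = sorted(frame_times_ms)[int(len(frame_times_ms) * 0.99) - 1]
    return avg_fps, p99

# Hypothetical data: both runs average ~16.7 ms per frame (~60 FPS),
# but the second mixes fast frames with occasional long stalls.
smooth = [16.7] * 100
stutter = [14.0] * 90 + [41.0] * 10

for name, data in (("smooth", smooth), ("stutter", stutter)):
    fps, p99 = summarize(data)
    print(f"{name}: {fps:.1f} FPS avg, 99th-percentile frame time {p99:.1f} ms")
```

Same average frame rate, very different experience, which is why the DX11 vs. DX12 question can't be settled by an FPS counter alone.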
Subject: General Tech | October 18, 2016 - 09:01 AM | Scott Michaud
Tagged: pascal, nvidia, GTX 1050 Ti, gtx 1050
NVIDIA has just announced that the GeForce GTX 1050 ($109) and GeForce GTX 1050 Ti ($139) will launch on October 25th. Both of these Pascal-based cards target the 75W thermal point, which allows them to be powered by the PCIe bus without being tethered directly to the power supply. Like the GTX 750 Ti before them, this allows users to drop one into many existing desktops, upgrading them with discrete graphics.
Most of NVIDIA's press deck focuses on the standard GTX 1050. This $109 SKU contains 2GB of GDDR5 memory and 640 CUDA cores, although the core frequency has not been announced at the time of writing. Instead, NVIDIA has provided a handful of benchmarks, comparing the GTX 1050 to the earlier GTX 650 and the Intel Core i5-4760k integrated graphics.
It should be noted that, to hit their >60FPS targets, Gears of War 4 and Grand Theft Auto V needed to be run at medium settings, and Overwatch was set to high. (DOTA2 and World of Warcraft were maxed out, though.) As you might expect, NVIDIA reminded the press about GeForce Experience's game optimization setting just a few slides later. The implication seems to be that, while it cannot max out these games at 1080p, NVIDIA will at least make it easy for users to experience its best-case scenario, while maintaining 60FPS.
So yes, while it's easy to claim 60 FPS when you're able to choose the settings that fit that target, it's a much better experience than the alternative parts they list. On the GTX 650, none of these titles is able to hit an average of 30 FPS, and integrated graphics cannot even hit 15 FPS. This card seems to be intended for users who are interested in playing eSports titles maxed out at 1080p60, and enjoying newer blockbusters, albeit at reduced settings, but who have an old, non-gaming machine they can salvage.
Near the end of their slide deck, they also mention that the GTX 1050 Ti exists. It's basically the same use case as above, with its 75W TDP and all, but with $30 more performance. The VRAM doubles from 2GB to 4GB, which should allow higher texture resolutions and more mods, albeit still targeting 1080p. It also adds another 128 CUDA cores, a 20% increase, although, again, that is somewhat meaningless until we find out what the card is clocked at.
Update: Turns out we did find clock speeds! The GTX 1050 will have a base clock of 1354 MHz and a Boost clock of 1455 MHz while the GTX 1050 Ti will run at 1290/1392 MHz respectively.
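With clocks in hand, we can at least pencil in theoretical FP32 throughput (CUDA cores x 2 ops per clock x Boost clock). Treat these as back-of-the-envelope numbers; real performance depends on memory bandwidth and how far GPU Boost actually pushes the clocks:

```python
def peak_tflops(cuda_cores, boost_mhz):
    """Theoretical FP32 TFLOPS: cores x 2 FMA ops per clock x Boost clock."""
    return cuda_cores * 2 * boost_mhz / 1_000_000

print(f"GTX 1050:    ~{peak_tflops(640, 1455):.2f} TFLOPS")  # ~1.86
print(f"GTX 1050 Ti: ~{peak_tflops(768, 1392):.2f} TFLOPS")  # ~2.14
```

On paper, then, the Ti's 20% extra cores net out to roughly 15% more raw throughput once its lower clocks are factored in.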
NVIDIA's promotional video
Obviously, numbers from a vendor are one thing, and a third-party benchmark is something else entirely (especially when the vendor benchmarks do not compare their product to the latest generation of their competitor). Keep an eye out for reviews.
Subject: General Tech | October 13, 2016 - 03:50 PM | Jeremy Hellstrom
Tagged: shadow warrior 2, nvidia, geforce, geforce experience 3.0, giveaway
Shadow Warrior 2 is out today, bringing Lo-brow humour and procedural gore back to PC gaming. For those of you who have created an account at GeForce.com, you have a chance to win a copy of the game for free; all you need to do is install GeForce Experience 3.0 on your machine and you are entered to win. If you haven't the desire, you can pick the game up on GOG or Steam, but you will have to pay for it. This new incarnation adds four-player co-op, and the levels are described as procedural, so theoretically places you have previously visited will not be the same if you head back. More info on the contest is in the PR below.
Shadow Warrior 2 launched today, and GeForce gamers using GeForce Experience may be getting it free. We will be giving away $50,000 worth of codes for the over the top first person shooter Shadow Warrior 2 to random gamers registered with GeForce Experience 3.0. This marks the second game code giveaway this month and more are coming soon. Just download and log in to the new GeForce Experience 3.0 to be eligible. Shadow Warrior 2 is highly anticipated first person shooter that is focused on fun. But not to be missed behind the numerous weapon choices, over-the-top gore, and edgy sense of humor is an indie release that is loaded with next generation technology thanks in part to a collaboration between NVIDIA and Flying Wild Hog, the game’s independent Polish developer.
The developer of Shadow Warrior 2, Flying Wild Hog, along with Devolver sister company Gambitious, were a part of the NVIDIA Indie Spotlight Program launch with their game Hard Reset: Redux. So naturally working with them on Shadow Warrior 2 to expand the indie game partnership between Devolver and NVIDIA seemed like the natural next step.
PC Gamers count on GeForce Experience to get the most from their games. It keeps drivers up to date. It automatically optimizes game settings for more than 300 games. And it’s the easiest way to capture gameplay video, stream it to Twitch or YouTube, or share it with another player over the Internet using the easy-to-use in-game overlay tool.
And now it rewards you for playing on GeForce. Dating back to July, NVIDIA has thanked their loyal GeForce Experience gamers by giving away: MSI VR-Ready Notebooks, HTC Vive Systems, GeForce GTX 1080s, SHIELD Android TVs, alpha access codes to the game LawBreakers and $200,000 worth of codes for Dead by Daylight.
More than 75 million gamers can’t be wrong--GeForce Experience is the gateway to great PC gaming.
Subject: Editorial | October 13, 2016 - 11:22 AM | Ryan Shrout
Tagged: XG-U2008, western digital, video, stratix, ssd, podcast, nvidia, msi, kaby lake, iPhone 7 Plus, iPhone 7, iphone, Intel, drobo, asus, apple, 5c
PC Perspective Podcast #421 - 10/13/16
Join us this week as we discuss our review of the iPhone 7, the Drobo 5C, Intel FPGAs and more!
The URL for the podcast is: http://pcper.com/podcast - Share with your friends!
- iTunes - Subscribe to the podcast directly through the iTunes Store (audio only)
- Google Play - Subscribe to our audio podcast directly through Google Play!
- RSS - Subscribe through your regular RSS reader (audio only)
- MP3 - Direct download link to the MP3 file
Hosts: Ryan Shrout, Allyn Malventano, Josh Walrath, Jeremy Hellstrom, and Sebastian Peak
Program length: 1:22:35
Week in Review:
Today’s episode is brought to you by Casper!
News items of interest:
Hardware/Software Picks of the Week
Subject: Graphics Cards | October 8, 2016 - 07:01 AM | Scott Michaud
Tagged: nvidia, graphics drivers, geforce
On Thursday, NVIDIA released their latest graphics drivers to align with Gears of War 4, Mafia 3, and Shadow Warrior 2. The drivers were published before each of these games launched, which allows gamers to optimize their PCs ahead of time. Graphics vendors work with many big-budget studios during their development cycles, and any tweaks found over those months and years will be rolled into this release, as usual.
Beyond tweaking for these games, NVIDIA has also announced a couple of fixes. If you were experiencing issues in Overwatch, then these new drivers fix how decals are drawn. The major fix is claimed to reduce inconsistent performance in multiple VR titles, which is very useful for these applications.
You can get these drivers from their website, or just install them from GeForce Experience.
Subject: Graphics Cards | October 6, 2016 - 03:17 PM | Tim Verry
Tagged: windforce, pascal, nvidia, GTX 1080, gigabyte
Gigabyte is launching a new graphics card with a blower-style cooler that it is calling the GTX 1080 TT. The card, which is likely based on the NVIDIA reference PCB, uses a single lateral-blower-style "WindForce Turbo Fan." The orange-and-black shrouded fan takes design cues from the company's higher end Xtreme Gaming cards, and it has a very Mass Effect / Halo Forerunner vibe to it.
The GV-N1080TTOC-8GD is powered by a single 8-pin PCI-E power connector and has a 180W TDP. Despite not using more than one external power connector, the card still has a bit of overclocking headroom (a total of 225W is available under the PCI-E spec: 75W from the slot plus 150W from the 8-pin, and overdrawing the 8-pin has been done before when a card's BIOS isn't locked to prevent it). External video outputs include one DVI, one HDMI, and three DisplayPorts. I wish the DVI port had been cut so that the blower cooler could have a much larger vent to exhaust air out of the case, but it is what it is.
Out of the box, the Gigabyte GTX 1080 TT runs its Pascal-based, 2560-CUDA-core GPU at 1632 MHz base and 1772 MHz boost. In OC Mode the GPU runs at 1657 MHz base and 1797 MHz boost. The 8 GB of GDDR5X memory is left untouched at the stock 10 GHz in either case. For comparison, reference clock speeds are 1607 MHz base and 1733 MHz boost. As far as factory overclocks go, these are not bad (factory overclocks are usually at least this conservative).
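Put in percentage terms (and with the caveat that GPU Boost 3.0 will move real-world clocks around regardless), the factory bump over reference works out to roughly 1.5 to 2.5% out of the box and 3 to 4% in OC Mode:

```python
# GTX 1080 clocks in MHz: NVIDIA reference vs. Gigabyte's GTX 1080 TT modes.
reference = {"base": 1607, "boost": 1733}
tt_modes = {"out of the box": (1632, 1772), "OC Mode": (1657, 1797)}

for mode, (base, boost) in tt_modes.items():
    base_pct = (base / reference["base"] - 1) * 100
    boost_pct = (boost / reference["boost"] - 1) * 100
    print(f"{mode}: +{base_pct:.1f}% base, +{boost_pct:.1f}% boost vs. reference")
```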
The heatsink uses three direct-contact 6mm copper heat pipes for the GPU, and aluminum plates on the VRM and memory chips transfer heat to the aluminum fin channels that the blower fan at the back of the card uses to push case air over and out of the case. It may be possible to push the card beyond the OC Mode clocks, though it is not clear how stable boost clocks will be under load (or how loud the fan will be). We will have to wait for reviews on that. If you have a cramped case, this may be a decent GTX 1080 option that is cheaper than the Founders Edition design.
There is no word on pricing or an exact release date yet, but I would estimate it at around $640 at launch.
Subject: Graphics Cards | October 2, 2016 - 12:12 PM | Sebastian Peak
Tagged: rumor, report, pascal, nvidia, GTX 1050 Ti, graphics card, gpu, GP107, geforce
A report published by VideoCardz.com (via Baidu) contains pictures of an alleged NVIDIA GeForce GTX 1050 Ti graphics card, which is apparently based on a new Pascal GP107 GPU.
Image credit: VideoCardz
The card shown is also equipped with 4GB of GDDR5 memory, and contains a 6-pin power connector - though such a power requirement might be specific to this particular version of the upcoming GPU.
Image credit: VideoCardz
Specifications for the GTX 1050 Ti were previously reported by VideoCardz, with a reported GPU-Z screenshot. The card will apparently feature 768 CUDA cores and a 128-bit memory bus, with clock speeds (for this particular sample) of 1291 MHz base, 1392 MHz boost (with some room to overclock, from this screenshot).
Image credit: VideoCardz
An official announcement for the new GPU has not been made by NVIDIA, though if these PCB photos are real it probably won't be far off.
Subject: Processors | October 1, 2016 - 06:11 PM | Tim Verry
Tagged: xavier, Volta, tegra, SoC, nvidia, machine learning, gpu, drive px 2, deep neural network, deep learning
Earlier this week at its first GTC Europe event in Amsterdam, NVIDIA CEO Jen-Hsun Huang teased a new SoC code-named Xavier that will be used in self-driving cars and feature the company's newest custom ARM CPU cores and Volta GPU. The new chip will begin sampling at the end of 2017 with product releases using the future Tegra (if they keep that name) processor as soon as 2018.
NVIDIA's Xavier is promised to be the successor to the company's Drive PX 2 system, which uses two Tegra X2 SoCs and two discrete Pascal MXM GPUs on a single water-cooled platform. These claims are even more impressive considering that NVIDIA is not only promising to replace those four processors with a single chip, but reportedly to do so at 20W, less than a tenth of the TDP!
The company has not revealed all the nitty-gritty details, but it did tease out a few bits of information. The new processor will feature 7 billion transistors and will be based on a refined 16nm FinFET process while consuming a mere 20W. It can process two 8K HDR video streams and can hit 20 TOPS (NVIDIA's own rating for INT8 deep learning operations).
Specifically, NVIDIA claims that the Xavier SoC will use eight custom ARMv8 (64-bit) CPU cores (it is unclear whether these cores will be a refined Denver architecture or something else) and a GPU based on its upcoming Volta architecture with 512 CUDA cores. Also, in an interesting twist, NVIDIA is including a "Computer Vision Accelerator" on the SoC as well, though the company did not go into many details. This bit of silicon may explain how the ~300mm2 die with 7 billion transistors is able to match the 7.2-billion-transistor Pascal-based Tesla P4 (2560 CUDA cores) graphics card at deep learning (tera-operations per second) tasks, in addition, of course, to the incremental improvements from moving to Volta and a new ARMv8 CPU architecture on a refined 16nm FF+ process.
|              | Drive PX                                    | Drive PX 2                                | NVIDIA Xavier                              | Tesla P4                 |
|--------------|---------------------------------------------|-------------------------------------------|--------------------------------------------|--------------------------|
| CPU          | 2 x Tegra X1 (8 x A57 total)                | 2 x Tegra X2 (8 x A57 + 4 x Denver total) | 1 x Xavier SoC (8 x custom ARM + 1 x CVA)  | N/A                      |
| GPU          | 2 x Tegra X1 (Maxwell, 512 CUDA cores total)| 2 x Tegra X2 GPUs + 2 x Pascal GPUs       | 1 x Xavier SoC GPU (Volta, 512 CUDA cores) | 2560 CUDA cores (Pascal) |
| TFLOPS       | 2.3                                         | 8                                         | ?                                          | 5.5                      |
| DL TOPS      | ?                                           | 24                                        | 20                                         | 22                       |
| TDP          | ~30W (2 x 15W)                              | 250W                                      | 20W                                        | up to 75W                |
| Process Tech | 20nm                                        | 16nm FinFET                               | 16nm FinFET+                               | 16nm FinFET              |
| Transistors  | ?                                           | ?                                         | 7 billion                                  | 7.2 billion              |
For comparison, the currently available Tesla P4, based on the Pascal architecture, has a TDP of up to 75W and is rated at 22 TOPS. This would suggest that Volta is a much more efficient architecture (at least for deep learning and half precision)! I am not sure how NVIDIA is able to match its GP104 with only 512 Volta CUDA cores, though its definition of a "core" could have changed and/or the CVA processor may be responsible for closing that gap. Unfortunately, NVIDIA did not disclose what it rates Xavier at in TFLOPS, so it is difficult to compare, and it may not match GP104 at higher precision workloads. It could be wholly optimized for INT8 operations rather than floating point performance. Beyond that I will let Scott dive into those particulars once we have more information!
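The efficiency claim is easiest to see as TOPS per watt, using the vendor-quoted INT8 figures from the table above (these are marketing numbers, not measured results):

```python
# Vendor-quoted INT8 deep-learning throughput and TDP figures.
chips = {
    "Drive PX 2": {"tops": 24, "tdp_w": 250},
    "Tesla P4":   {"tops": 22, "tdp_w": 75},
    "Xavier":     {"tops": 20, "tdp_w": 20},
}

for name, spec in chips.items():
    print(f"{name}: {spec['tops'] / spec['tdp_w']:.2f} TOPS/W")
```

On paper that is roughly a 3x jump over the Tesla P4 and 10x over the Drive PX 2, which is why the "less than a tenth of the TDP" claim stands out.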
Xavier is more of a teaser than anything, and the chip could very well change dramatically and/or not hit the claimed performance targets. Still, it sounds promising, and it is always nice to speculate over road maps. It is an intriguing chip and I am ready for more details, especially on the Volta GPU and just what exactly that Computer Vision Accelerator is (and whether it will be easy to program for). I am a big fan of the "self-driving car" and I hope that it succeeds. The momentum certainly looks set to continue as Tesla, VW, BMW, and other automakers push the envelope of what is possible and plan future cars that will include smart driving assists and even cars that can drive themselves. The more local computing power we can throw at automobiles the better, and while massive datacenters can be used to train the neural networks, local hardware to run them and make decisions is necessary (you don't want internet latency contributing to the decision of whether to brake or not!).
I hope that NVIDIA's self-proclaimed "AI Supercomputer" turns out to be at least close to the performance they claim! Stay tuned for more information as it gets closer to launch (hopefully more details will emerge at GTC 2017 in the US).
What are your thoughts on Xavier and the whole self-driving car future?
- NVIDIA Teases Xavier, a High-Performance ARM SoC for Drive PX & AI @ AnandTech
- Tegra Related News @ PC Perspective
- Tesla P4 Specifications @ NVIDIA
- CES 2016: NVIDIA Launches DRIVE PX 2 With Dual Pascal GPUs Driving A Deep Neural Network @ PC Perspective
Subject: General Tech | September 29, 2016 - 03:14 PM | Jeremy Hellstrom
Tagged: nvidia, competition, jen-hsun huang, Founder's Edition
When Microsoft launched the Surface there were negative reactions from vendors who saw this as new competition from what was previously their partner. Today DigiTimes reports that certain unnamed GPU vendors have similar feelings about NVIDIA's Founder's Edition cards. Jen-Hsun responded to these comments today, stating that the Founders Editions were "purely to solve problems in graphics card design".
While he did not say that NVIDIA would not consider continuing the practice with future cards, he does correctly point out that they shared everything about the design and results with the vendors. Those vendors are still somewhat upset about the month in which only Founder's Edition cards were available for sale, as they feel they lost some possible profits by not being able to sell their custom-designed GPUs. Then again, considering the limited supply on the market, the number of sales they could have made in that extra month would certainly have been limited. It will be interesting to see if we hear more about this directly from the vendors in the coming weeks.
"Since Nvidia has restricted its graphics card brand partners from releasing in-house designed graphics cards within a month after the releases of its Founders Edition card, the graphics card vendors are displeased with the decision as it had given Nvidia time to earn early profits without competition."
Here is some more Tech News from around the web:
- HP Inc: No DRM in our 3D printers, we swear (unlike our 2D ones) @ The Register
- HP offers optional patch to de-bork its printers after EFF rant @ The Inquirer
- macOS 10.12 Sierra vs. Ubuntu 16.04 Linux Benchmarks @ Phoronix
- Surprise! Leading 4-socket server vendor isn’t Dell or HPE @ The Register
- D-Link DWR-932 B owner? Trash it, says security bug-hunter @ The Register
- Microsoft hails pointless Privacy Shield status for its cloud services @ The Register
- Polish car mechanic is still load-balancing with a Commodore 64 after 25 years @ The Inquirer