Manufacturer: NVIDIA

A Look Back and Forward

Although NVIDIA's new GPU architecture, now officially revealed as Turing, has been the subject of speculation for what seems like an eternity at this point, we finally have our first look at exactly what NVIDIA is positioning as the future of gaming.

geforce-rtx-2080.png

Unfortunately, we can't talk about this card just yet, but we can talk about what powers it.

First, though, let's take a look at the journey to get here over the past 30 months or so.

Unveiled in early 2016 and marked by the launch of the GTX 1070 and GTX 1080, Pascal was NVIDIA's long-awaited 16nm successor to Maxwell. Constrained by the oft-delayed 16nm process node, Pascal refined the shader unit design originally found in Maxwell while lowering power consumption and increasing performance.

Next, in May 2017, came Volta, the next (and last) GPU architecture outlined in NVIDIA's public roadmaps since 2013. However, instead of the traditional launch with a new GeForce gaming card, Volta took a different approach.

Click here to continue reading our analysis of NVIDIA's Turing Graphics Architecture

NVIDIA and Arrow Electronics New Jetson Xavier AI Computer

Subject: General Tech, Systems | September 10, 2018 - 04:59 PM |
Tagged: jetson xavier, nvidia, arrow electronics

Looking to do a little bit of black box programming but need new hardware to do it?  NVIDIA has partnered with Arrow Electronics to produce the newest Jetson system, the Xavier.

Xavier-White_Cropped.jpg

The Xavier supports the JetPack and DeepStream SDKs, as well as the CUDA, cuDNN, and TensorRT software libraries.  The 512-core Volta GPU with Tensor Cores offers 10 TFLOPS at FP16 and 20 TOPS at INT8, with the two NVDLA engines adding another 5 TOPS each.  It is not just the processing power that has been upgraded: running flat out, the Xavier is rated at 30W, with the option to reduce that maximum to 10W or 15W if efficiency is more important than raw speed.
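For a quick sense of how those headline figures add up, here is a minimal back-of-the-envelope sketch in Python using only the numbers quoted above; it is illustrative arithmetic, not a benchmark, and sustained throughput at the lower power caps will naturally fall below the peak ratings.

```python
# Back-of-the-envelope view of Jetson Xavier's quoted INT8 compute.
# All figures come from the announcement above; peak ratings only.

gpu_int8_tops = 20           # 512-core Volta GPU with Tensor Cores
nvdla_int8_tops = 5          # per NVDLA engine
nvdla_count = 2
power_caps_w = (10, 15, 30)  # selectable power budgets; 30 W is full speed

total_int8_tops = gpu_int8_tops + nvdla_count * nvdla_int8_tops
print(f"Peak INT8 throughput: {total_int8_tops} TOPS")                # 30 TOPS
print(f"Peak efficiency at 30 W: {total_int8_tops / 30:.1f} TOPS/W")  # 1.0 TOPS/W
print(f"Available power caps (W): {power_caps_w}")
```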

If you are currently using the Jetson TX2 you have some thinking to do, as this unit's pin-out will not be compatible; however, many of the signals are.  The units are in pre-order right now, with the Dev Kit selling for $2500 (USD), or $1300 if you are an NVIDIA Developer Program member.

Check out the specs and PR below.

specs.PNG

SANTA CLARA, Calif., Sept. 10, 2018 (GLOBE NEWSWIRE) -- NVIDIA and Arrow Electronics, Inc. today announced they are bringing NVIDIA Jetson Xavier, a first-of-its-kind computer designed for AI, robotics and edge computing, to companies worldwide to create next-generation autonomous machines.

The collaboration combines NVIDIA’s world-leading AI capabilities with Arrow’s global roster of industrial customers and its broad support network of engineers and designers. This opens the door to the development and deployment of AI solutions for manufacturing, logistics, smart cities, healthcare and more.

“We are entering a new era of intelligent machines that will supercharge industries from manufacturing to healthcare,” said Deepu Talla, vice president and general manager of Autonomous Machines at NVIDIA. “NVIDIA and Arrow are working together to ensure that the unmatched AI capabilities of the Jetson Xavier platform reach deep into the global marketplace with first-class technical support and design.”

“At Arrow, we focus on connecting our global customers and developers to the right technology to enable transformative digital business,” said Aiden Mitchell, vice president and general manager, IoT Global Solutions at Arrow. “NVIDIA’s AI platform and Jetson Xavier is at that point, and our industrial customers can secure the Xavier developer kit from Arrow.com today.”

Jetson Xavier — available as a developer kit that customers can use to prototype designs — is supported by comprehensive software for building AI applications.

This includes the NVIDIA JetPack and DeepStream SDKs, as well as CUDA, cuDNN and TensorRT™ software libraries. At its heart is the new NVIDIA Xavier processor, which provides more computing capability than a powerful workstation and comes in three energy-efficient operating modes.

“Edge intelligence in modern robotics is a critical component in driving new use cases and increasing adoption. This relationship is primed to showcase the value of robotics in new areas and help drive continued innovation in the space,” said John Santagate, research director of Worldwide Robotics at IDC.

The NVIDIA Jetson Xavier developer kit is now available for purchase through Arrow’s website at https://www.arrow.com/nvidia.

 

Source: NVIDIA

Board shorts, the GPU market shrinks a bit

Subject: General Tech, Graphics Cards | September 7, 2018 - 01:36 PM |
Tagged: jon peddie, gpu market share, amd, nvidia

Last week we had a peek at the overall GPU market, including APUs, and the news was not great.  This week Jon Peddie released details on the discrete GPU market, which also saw contractions.  Quarter over quarter, sales dropped by 28%, and they are down 5.7% from this time last year, similar to the trend we saw with the total market.  If you look back over time, Q2 tends to be a bad quarter for GPU sales, and the current market is actually larger in total volume than it was two years ago, before the mining craze was fully underway. 

You can see the details of AMD and NVIDIA's quarter below.

unnamed.png

"The market shares for the desktop discrete GPU suppliers shifted in the quarter, Nvidia increased market share from last quarter, while AMD enjoyed an increase in share year-to-year."

Here is some more Tech News from around the web:

Tech Talk

 

Source: Jon Peddie

Podcast #511 - IFA 2018, StoreMI, and more!

Subject: General Tech | August 30, 2018 - 12:58 PM |
Tagged: podcast, xps13, StoreMI, Samsung, radeon pro, nvidia, Intel, ifa 2018, freesync, Azulle, amd, acer

PC Perspective Podcast #511 - 08/30/18

Join us this week for discussion on IFA 2018, StoreMI, and more!

You can subscribe to us through iTunes and you can still access it directly through the RSS page HERE.

The URL for the podcast is: http://pcper.com/podcast - Share with your friends!

Hosts: Jeremy Hellstrom, Josh Walrath, Allyn Malventano

Peanut Gallery: Alex Lustenberg

Program length: 1:24:43

Podcast topics of discussion:
  1. Week in Review:
  2. News items of interest:
    1. IFA 2018
  3. Picks of the Week:
    1. 1:27:09 Jeremy: NordVPN deal
    2. 1:29:00 Josh: 3 free games!
    3. 1:33:25 Alex: http://paletton.com/
  4. Closing/outro
 
 
Manufacturer: AMD

Your Mileage May Vary

One of the most interesting things going around in the computer hardware communities this past weekend was the revelation from a Reddit user named bryf50 that they had somehow gotten their FreeSync display working with their NVIDIA GeForce GPU.

For those of you that might not be familiar with the particular ins-and-outs of these variable refresh technologies, getting FreeSync displays to work on NVIDIA GPUs is potentially a very big deal.

While NVIDIA GPUs support the NVIDIA G-SYNC variable refresh rate standard, they are not compatible with Adaptive Sync (the technology on which FreeSync is based) displays. Despite Adaptive Sync being an open standard, and an optional extension to the DisplayPort specification, NVIDIA so far has chosen not to support these displays.

However, this situation presents some major downsides for consumers looking to purchase displays and graphics cards. Due to the lack of interoperability, consumers can get locked into a GPU vendor if they want to continue to use the variable refresh functionality of their display. Plus, Adaptive-Sync/FreeSync monitors generally seem to be significantly less expensive for similar specifications.

01.jpg

Click here to continue reading our exploration into FreeSync support on NVIDIA GPUs!

 

Join the Battlefield V Open Beta with the new GeForce Game Ready 399.07 WHQL drivers

Subject: General Tech | August 27, 2018 - 05:12 PM |
Tagged: Switchblade, Strange Brigade, Pro Evolution Soccer 2019, open beta, nvidia, Immortal: Unchained, geforce 399.07, F1 2018, battlefield V

The open beta for Battlefield V begins on September 4th for those who want to ruin everything or who are Origin Access Premier, Origin Access Basic, or EA Access members. The rest of us, especially those who have learned the evils of pre-ordering, have to wait until September 6th to try it out.

battlefield-v-nvidia-rtx-ray-tracing-screenshot-004.jpg

What you don't have to wait for is the GeForce Game Ready 399.07 WHQL driver, which you can snag through GeForce Experience or from the driver page here.  This driver gets you Game Ready for the BFV open beta as well as offering optimized performance in F1 2018, Immortal: Unchained, Pro Evolution Soccer 2019, Strange Brigade, and Switchblade.  There are other fixes as well (PDF), including a resolution for those experiencing stuttering in windowed G-SYNC games after ye olde Win10 Spring Update. 

Today NVIDIA released a new Game Ready Driver for the Battlefield V Open Beta. This driver will also provide the best game play experience for F1 2018, Immortal: Unchained, Pro Evolution Soccer 2019, Strange Brigade, and Switchblade.

Ahead of Battlefield V’s general release on October 19th, you can participate in an open beta featuring Conquest on the Rotterdam and Arctic Fjord maps, and Grand Operations on Arctic Fjord, with Airborne mode on Day 1, and Breakthrough mode on Day 2. The Battlefield V Open Beta begins on September 4th for those who pre-ordered, and gamers who are Origin Access Premier, Origin Access Basic and EA Access members. And then on September 6th for everyone else. GeForce gamers are Game Ready today.

Battlefield V will be one of the first games enhanced with NVIDIA RTX Real-Time Ray Tracing on our newly-unveiled GeForce RTX graphics cards, bringing a new level of fidelity and realism to the already-stunning game.

Available on or before launch day, NVIDIA Game Ready Drivers provide the best experience for GeForce gamers because NVIDIA engineers work up until the last possible minute to optimize performance and perfect gameplay. And as an additional verification of quality, every Game Ready Driver is WHQL-certified by Microsoft.

Source: NVIDIA

Podcast #510 - NVIDIA 2080 Launch, blockchain gaming, and more!

Subject: General Tech | August 23, 2018 - 03:54 PM |
Tagged: Volta, video, turing, Threadripper, rtx, podcast, nzxt, nvidia, logitech, arm, amd

PC Perspective Podcast #510 - 08/23/18

Join us this week for discussion on NVIDIA 2080 Launch, blockchain gaming, and more!

You can subscribe to us through iTunes and you can still access it directly through the RSS page HERE.

The URL for the podcast is: http://pcper.com/podcast - Share with your friends!

Hosts: Jeremy Hellstrom, Josh Walrath, Allyn Malventano

Peanut Gallery: Alex Lustenberg

Program length: 1:24:43

Podcast topics of discussion:
  1. Week in Review:
  2. News items of interest:
  3. Picks of the Week:
    1. 1:14:15 Jeremy: I love 14cm fans!
  4. Closing/outro
 
 

NVIDIA teases RTX 2080 performance and features

Subject: General Tech, Graphics Cards, Shows and Expos | August 22, 2018 - 02:06 PM |
Tagged: turing, RTX 2080, nvidia, geforce, ansel

NVIDIA has been showing off a slideshow in Germany, offering a glimpse at the new features Turing brings to the desktop as well as in-house performance numbers.  As you can see below, their testing shows a significant increase in performance over Pascal; it will be interesting to see how the numbers match up once reviewers get their hands on these cards.

TuringVsPascal_EditorsDay_Aug22_2-.png

While those performance numbers should be taken with a grain of salt or three, the various features the new generation of chips brings to the table will appear as presented.  For fans of Ansel, you will be able to upscale your screenshots to 8K with Ansel AI UpRes, which offers an impressive implementation of anti-aliasing.  They also showed off a variety of filters you can utilize to make your screenshots even more impressive.

up yer rez.PNG

The GigaRays of real-time ray tracing capability on Turing look very impressive, but with Ansel your card has a lot more time to process reflections, refractions, and shadows, which means your screenshots will look significantly more impressive than what the game shows while you are playing.  In the example below you can see how much more detail a little post-processing can add.
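To put that time budget in perspective, here is a rough bit of Python arithmetic. It assumes the roughly 10 GigaRays/s figure NVIDIA quotes for its top Turing card (see the spec table further down the page) and a 4K, 60 FPS target; the exact numbers will vary by card and scene.

```python
# Rough arithmetic on the real-time ray budget for Turing.
# Assumes ~10 GRays/s (the figure quoted for the RTX 2080 Ti) at 4K / 60 FPS.

rays_per_second = 10e9
width, height, fps = 3840, 2160, 60

rays_per_pixel_per_frame = rays_per_second / (width * height * fps)
print(f"{rays_per_pixel_per_frame:.0f} rays per pixel per frame")  # ~20

# An Ansel screenshot has no 16.7 ms frame budget, so the card can spend
# orders of magnitude more rays per pixel on reflections, refractions,
# and shadows than it can during live gameplay.
```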

rt scale.PNG

There are a wide variety of released and upcoming games which will support these features; 22 were listed by name at the conference.  A few of the titles only support some of the new features, such as NVIDIA Highlights; however, the games below should offer full support, as well as framerates high enough to play at 4K with HDR enabled.

TuringVsPascal_EditorsDay_Aug22.png

Keep your eyes peeled for more news from NVIDIA and GamesCom.

Source: NVIDIA

Turing vs Volta: Two Chips Enter. No One Dies.

Subject: Graphics Cards | August 21, 2018 - 08:43 PM |
Tagged: nvidia, Volta, turing, tu102, gv100

In the past, when NVIDIA launched a new GPU architecture, they would make a few designs to span their market segments. Every SKU would be one of those chips, with varying portions disabled or re-clocked to hit multiple price points. The mainstream enthusiast (GTX -70/-80) chip of each generation is typically around 300mm2, and the high-end enthusiast (Titan / -80 Ti) chip is often around 600mm2.

nvidia-2016-gtc-pascal-banner.png

Kepler used quite a bit of its die space for FP64 calculations, but that did not happen with consumer versions of Pascal. Instead, GP100 supported 1:2:4 FP64:FP32:FP16 performance ratios. This is great for the compute community, such as scientific researchers, but games are focused on FP32. Shortly thereafter, NVIDIA released GP102, which had the same number of FP32 cores (3840) as GP100 but with much-reduced 64-bit performance… and much-reduced die area. GP100 was 610mm2, but GP102 was just 471mm2.
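For a sense of what that 1:2:4 ratio means in raw throughput, here is a minimal sketch using the 3840-core figure above; the ~1.48 GHz boost clock is an illustrative assumption, and peak FLOPS is taken as cores × 2 (one fused multiply-add per clock) × clock.

```python
# Illustrative peak-throughput math for a GP100-style 1:2:4 FP64:FP32:FP16 part.
# Core count (3840) is from the text; the boost clock is an assumed example.

fp32_cores = 3840
boost_clock_ghz = 1.48          # assumption for illustration only

fp32_tflops = fp32_cores * 2 * boost_clock_ghz / 1000   # FMA = 2 FLOPs/clock
fp64_tflops = fp32_tflops / 2                           # half-rate FP64
fp16_tflops = fp32_tflops * 2                           # double-rate FP16

print(f"FP32: {fp32_tflops:.1f} TFLOPS")   # ~11.4
print(f"FP64: {fp64_tflops:.1f} TFLOPS")   # ~5.7
print(f"FP16: {fp16_tflops:.1f} TFLOPS")   # ~22.7
```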

At this point, I’m thinking that NVIDIA is pulling scientific computing chips away from the common user to increase the value of their Tesla parts. There was no reason to make a cheap 6XXmm2 card available to the public when a 471mm2 part could take the performance crown, so why not reap extra dies from your wafer (and be able to clock them higher because of better binning)?

nvidia-2017-sc17-japanaisuper.jpg

And then Volta came out. And it was massive (815mm2).

At this point, you really cannot manufacture a larger integrated circuit. You are at the reticle limit of what TSMC (and other fabs) can expose onto your silicon. Again, it’s a 1:2:4 FP64:FP32:FP16 ratio. Again, there is no consumer version in sight. Again, it looked as if NVIDIA was going to fragment their market and leave consumers behind.

And then Turing was announced. Apparently, NVIDIA still plans on making big chips for consumers… just not with 64-bit performance. The big draw of this 754mm2 chip is its dedicated hardware for raytracing. We knew this technology was coming, and we knew that the next generation would have hardware to make it useful. I figured that meant consumer Volta, and that NVIDIA had somehow found a way to use Tensor cores to cast rays. Apparently not… but, don’t worry, Turing has Tensor cores too… they’re just for machine-learning gaming applications. Those are above and beyond the raytracing ASICs, and the CUDA cores, and the ROPs, and the texture units, and so forth.

nvidia-2018-geforce-rtx-turing-630-u.jpg

But, raytracing hype aside, let’s think about the product stack:

  1. NVIDIA now has two ~800mm2-ish chips… and
  2. They serve two completely different markets.

In fact, I cannot see either FP64 or raytracing going anywhere any time soon. As such, it’s my assumption that NVIDIA will maintain two different architectures of GPUs going forward. The only way that I can see this changing is if they figure out a multi-die solution, because neither design can get any bigger. And even then, what workload would it even perform? (Moment of silence for 10km x 10km video game maps.)

What do you think? Will NVIDIA keep two architectures going forward? If not, how will they serve all of their customers?

Asus Announces ROG Strix, Dual, and Turbo Series RTX 2080 Ti and RTX 2080 Graphics Cards

Subject: Graphics Cards | August 20, 2018 - 03:08 PM |
Tagged: turing, RTX 2080 Ti, RTX 2080, nvidia, geforce, asus

Following Jensen Huang's reveal of the RTX family of Turing-based graphics cards, Asus announced that it will have graphics cards from its ROG Strix, Dual, and Turbo product lines available in mid-September. The new graphics cards will be based around the NVIDIA GeForce RTX 2080 Ti and GeForce RTX 2080 GPUs.

Asus ROG Strix RTX 2080.jpg

According to Asus, their new Turing-based graphics cards will be built using their Auto-Extreme technology and with redesigned coolers to increase card-to-card product consistency and cooling efficiency. The triple fan ROG Strix and dual fan Dual series cards use a new 2.7 slot design that results in 20% and 50% increases (respectively) in cooling array surface area versus their 1000 series predecessors. The ROG Strix card uses Axial fans that reportedly offer better airflow and IP5X dust resistance while the Dual series cards use Wing Blade fans that also offer dust resistance along with being allegedly quieter while pushing more air. Meanwhile, the Turbo series uses a blower-style cooler that has been redesigned and uses an 80mm dual ball bearing fan with a new shroud that allows for more airflow even in small cases or when cards are sandwiched together in a multi-GPU setup.

The ROG Strix RTX 2080 Ti and RTX 2080 cards will have one USB Type-C (VirtualLink), two HDMI 2.0b, and two DisplayPort 1.4a outputs. The Dual RTX 2080 Ti and RTX 2080 cards will have one USB Type-C, one HDMI 2.0b, and three DisplayPort 1.4 outputs. Finally, the Turbo series RTX 2080 Ti and RTX 2080 cards will have one USB Type-C, one HDMI 2.0b, and two DisplayPort 1.4 ports.

                     RTX 2080 Ti               RTX 2080
GPU                  TU102                     TU104
GPU Cores            4352                      2944
Base Clock           1350 MHz (Turbo model)    1515 MHz (Turbo model)
Boost Clock          1545 MHz (Turbo model)    1710 MHz (Turbo model)
Tensor Cores         576                       384
Ray Tracing Speed    10 GRays/s                8 GRays/s
Memory               11 GB                     8 GB
Memory Clock         14000 MHz                 14000 MHz
Memory Interface     352-bit GDDR6             256-bit GDDR6
Memory Bandwidth     616 GB/s                  448 GB/s
TDP                  ?                         ?
Process Tech         12nm                      12nm

Exact specifications are still unknown, though Asus did reveal clock speeds for the Turbo models, which are listed above. The clock speeds for the Dual and ROG Strix cards should be quite a bit higher thanks to their much beefier coolers, and the OC Editions in particular should be clocked higher than reference specs.
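As a quick sanity check, the bandwidth figures in the table follow directly from the memory data rate and bus width (the 14000 MHz entry is the effective 14 Gbps GDDR6 data rate); a minimal Python sketch:

```python
# Peak memory bandwidth = effective data rate (Gbps per pin) * bus width / 8.

def peak_bandwidth_gbps(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Returns peak memory bandwidth in GB/s."""
    return data_rate_gbps * bus_width_bits / 8

print(peak_bandwidth_gbps(14, 352))  # RTX 2080 Ti: 616.0 GB/s
print(peak_bandwidth_gbps(14, 256))  # RTX 2080:    448.0 GB/s
```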

Asus Turbo RTX 2080 Ti.jpg

Asus did not disclose exact MSRP pricing, but it did state that several models will be available for pre-order starting today and will be officially available in the middle of September. It appears that a couple of RTX 2080 Ti and RTX 2080 cards have already appeared on Newegg, but not all of them have shown up yet. The models slated to be available for preorder include the Dual GeForce RTX 2080 Ti OC Edition, Turbo RTX 2080 Ti, ROG Strix GeForce RTX 2080 OC Edition, and the Dual RTX 2080 OC Edition.

Related reading:

Source: Asus