Author:
Manufacturer: NVIDIA

A preview of potential Volta gaming hardware

This is a multi-part story for the NVIDIA Titan V:

As a surprise to most of us in the media community, NVIDIA launched a new graphics card to the world, the TITAN V. No longer sporting the GeForce brand, NVIDIA has returned the Titan line of cards to where it began – clearly targeted at the world of developers and general purpose compute. And if that branding switch isn’t enough to drive that home, I’m guessing the $2999 price tag will be.

Today’s article is going to look at the TITAN V from the angle that is likely most interesting to the majority of our readers, which also happens to be the angle NVIDIA is least interested in us discussing. Though targeted at machine learning and the like, there is little doubt in my mind that some crazy people will want to take on the $3000 price to see what kind of gaming power this card can provide. After all, this marks the first time that a Volta-based GPU from NVIDIA has shipped in a form consumers can actually get their hands on, and the first time one has shipped with display outputs. (That’s kind of important if you want to build a PC around it…)


From a scientific standpoint, we wanted to look at the Titan V for the same reasons we tested the AMD Vega Frontier Edition cards upon their launch: using it to estimate how future consumer-class cards will perform in gaming. And, just as we had to do then, we purchased this Titan V from NVIDIA.com with our own money. (If anyone wants to buy this from me to recoup the costs, please let me know! Ha!)

|  | Titan V | Titan Xp | GTX 1080 Ti | GTX 1080 | GTX 1070 Ti | GTX 1070 | RX Vega 64 Liquid | Vega Frontier Edition |
|---|---|---|---|---|---|---|---|---|
| GPU Cores | 5120 | 3840 | 3584 | 2560 | 2432 | 1920 | 4096 | 4096 |
| Base Clock | 1200 MHz | 1480 MHz | 1480 MHz | 1607 MHz | 1607 MHz | 1506 MHz | 1406 MHz | 1382 MHz |
| Boost Clock | 1455 MHz | 1582 MHz | 1582 MHz | 1733 MHz | 1683 MHz | 1683 MHz | 1677 MHz | 1600 MHz |
| Texture Units | 320 | 240 | 224 | 160 | 152 | 120 | 256 | 256 |
| ROP Units | 96 | 96 | 88 | 64 | 64 | 64 | 64 | 64 |
| Memory | 12GB | 12GB | 11GB | 8GB | 8GB | 8GB | 8GB | 16GB |
| Memory Clock | 1700 MHz | 11400 MHz | 11000 MHz | 10000 MHz | 8000 MHz | 8000 MHz | 1890 MHz | 1890 MHz |
| Memory Interface | 3072-bit HBM2 | 384-bit G5X | 352-bit G5X | 256-bit G5X | 256-bit | 256-bit | 2048-bit HBM2 | 2048-bit HBM2 |
| Memory Bandwidth | 653 GB/s | 547 GB/s | 484 GB/s | 320 GB/s | 256 GB/s | 256 GB/s | 484 GB/s | 484 GB/s |
| TDP | 250 watts | 250 watts | 250 watts | 180 watts | 180 watts | 150 watts | 345 watts | 300 watts |
| Peak Compute | 12.2 (base) / 14.9 (boost) TFLOPS | 12.1 TFLOPS | 11.3 TFLOPS | 8.2 TFLOPS | 7.8 TFLOPS | 5.7 TFLOPS | 13.7 TFLOPS | 13.1 TFLOPS |
| MSRP (current) | $2999 | $1299 | $699 | $499 |  | $399 | $699 | $999 |
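As a quick sanity check on the peak-compute row, single-precision throughput is just shader count × 2 FLOPs per clock (one fused multiply-add) × clock speed. A minimal sketch using figures from the table above (small differences from the quoted numbers come down to rounding):

```python
# Peak FP32 throughput = cores * 2 FLOPs per clock (one FMA) * clock (GHz)
# Core counts and boost clocks taken from the spec table above.
cards = {
    "Titan V":     (5120, 1.455),
    "Titan Xp":    (3840, 1.582),
    "GTX 1080 Ti": (3584, 1.582),
    "RX Vega 64":  (4096, 1.677),
}

for name, (cores, boost_ghz) in cards.items():
    tflops = cores * 2 * boost_ghz / 1000
    print(f"{name}: {tflops:.1f} TFLOPS")  # 14.9, 12.1, 11.3, 13.7
```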

The Titan V is based on the GV100 GPU, though with some tweaks that slightly lower performance and capability compared to the Tesla-branded equivalent hardware. Though our add-in card iteration has the full 5120 CUDA cores enabled, the HBM2 memory bus is reduced from 4096-bit to 3072-bit, with one of the four stacks on the package disabled. This also drops the memory capacity from 16GB to 12GB, and memory bandwidth to 652.8 GB/s.
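That 652.8 GB/s figure falls straight out of the bus width and data rate; a minimal sketch of the arithmetic:

```python
# Memory bandwidth = (bus width in bits / 8) * per-pin data rate in Gbps
bus_bits = 3072        # three of the four HBM2 stacks enabled (1024 bits each)
data_rate = 1.7        # Gbps per pin: 850 MHz memory clock, double data rate
bandwidth = bus_bits / 8 * data_rate
print(f"{bandwidth:.1f} GB/s")  # 652.8 GB/s
```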

Continue reading our gaming review of the NVIDIA Titan V!!

Ti-ny bubbles in my card, watercooling the GTX 1080 Ti

Subject: Graphics Cards | December 13, 2017 - 02:05 PM |
Tagged: gtx 1080 ti, phanteks, G1080, watercooling

Phanteks announced their G1080 water block for the GTX 1080 Ti a while back, but we hadn't seen it in action until now.  [H]ard|OCP installed the cooler on a Founders Edition card and created a video of the process.  Not only do they show how to properly install the water block, they also cover a few of the possible issues you might encounter while doing so.  They also made a video showing how the coolant flows through the water block, which is not only pretty but can help you determine where to place the GPU in your watercooling loop.


"Phanteks recently sent us its Glacier series water block for our Founders Edition GTX 1080 Ti. We take you through the full process of getting it installed. We check out the mating surfaces of the GPU, capacitors, and MOSFETs and show you just how well it all fits together. Then finally we show exactly how the coolant flows in 4K!"


Source: [H]ard|OCP

Video: What does a $3000 GPU look like? NVIDIA TITAN V Unboxing and Teardown!

Subject: Graphics Cards | December 12, 2017 - 07:51 PM |
Tagged: nvidia, titan, titan v, Volta, video, teardown, unboxing

NVIDIA launched the new Titan V graphics card last week, a $2999 part targeted not at gamers (thankfully) but instead at developers of machine learning applications. Based on the GV100 GPU and packing 12GB of HBM2 memory, the Titan V is an incredibly powerful graphics card. We have every intention of looking at the gaming performance of this card as a "preview" of potential consumer Volta cards that may come out next year. (This is identical to our stance on testing the Vega Frontier Edition cards.)

But for now, enjoy this unboxing and teardown video that takes apart the card to get a good glimpse of that GV100 GPU.

A couple of quick interesting notes:

  • This implementation has 25% of the memory and ROPs disabled, giving us 12GB of HBM2, a 3072-bit bus, and 96 ROPs.
  • Clock speeds in our testing look to be much higher than the base AND boost ratings.
  • So far, even though the price takes this out of the gaming segment completely, we are impressed with some of the gaming results we have found.
  • The cooler might LOOK the same, but it is definitely heavier than the cooler built for the Titan Xp.
  • Champagne. It's champagne colored.
  • Double precision performance is insanely good, spanking the Titan Xp and Vega so far in many tests.
  • More soon!


Source: NVIDIA
Author:
Manufacturer: AMD

The flower, not the hormone

It was way back in December of 2014 that AMD and the Radeon group first started down the path of major driver updates on an annual cadence. The Catalyst Omega release marked the beginning of a recommitment to the needs of gamers (and now professionals) with more frequent, and more dramatic, software updates and improvements. Cognizant of the reputation the company had earned for its drivers and software, often a distant second to the success NVIDIA had created with its GeForce drivers, AMD promised Radeon users continuous improvement.

And make no mistake, the team at AMD had an uphill battle. But with releases like Omega, Crimson, ReLive, and now Adrenalin, it’s clear that the leadership has received the message and put emphasis on the portion of its product that can have the most significant impact on experience.

AMD joins us at the PCPer offices to talk through all the new features and capabilities!

Named after the adrenalin rose, rather than the drug that flows through your body when being chased by feral cats, this latest major software release for Radeon users includes a host of new features, and upgrades to existing ones, that should bring a fresh coat of paint to any existing GPU. Two big features will steal the show: the new Radeon Overlay and a mobile app called AMD Link. But expansions to ReLive, Wattman, Enhanced Sync, and Chill are equally compelling.

Let’s start with what I think will get the most attention, and deservedly so: the Radeon Overlay. As the name suggests, the overlay can be brought up with a hotkey in-game, and allows the gamer to access graphics card monitoring tools and many driver settings without leaving the game, alt-tabbing, or closing the game to apply changes. By hitting Alt-R, a menu appears on the right-hand side of the display, with the game continuing to run in the background. The user can interact with the menu via mouse or keyboard, and then hit the same hotkey or Esc to return.


Continue reading our look at the new AMD Radeon Software Adrenalin Edition driver!!

NVIDIA Launches Titan V, the World's First Consumer Volta GPU with HBM2

Subject: Graphics Cards | December 7, 2017 - 11:44 PM |
Tagged: Volta, titan, nvidia, graphics card, gpu

NVIDIA made a surprising move late Thursday with the simultaneous announcement and launch of the Titan V, the first consumer/prosumer graphics card based on the Volta architecture.


Like recent flagship Titan-branded cards, the Titan V will be available exclusively from NVIDIA for $2,999. Labeled "the most powerful graphics card ever created for the PC," Titan V sports 12GB of HBM2 memory, 5120 CUDA cores, and a 1455MHz boost clock, giving the card 110 teraflops of maximum compute performance. Check out the full specs below:

6 Graphics Processing Clusters
80 Streaming Multiprocessors
5120 CUDA Cores (single precision)
320 Texture Units
640 Tensor Cores
1200 MHz Base Clock
1455 MHz Boost Clock
850 MHz Memory Clock
1.7 Gbps Memory Data Rate
4608 KB L2 Cache Size
12288 MB HBM2 Total Video Memory
3072-bit Memory Interface
652.8 GB/s Total Memory Bandwidth
384 GigaTexels/sec Texture Rate (Bilinear)
12 nm Fabrication Process (TSMC 12nm FFN High Performance)
21.1 Billion Transistor Count
3 x DisplayPort, 1 x HDMI Connectors
Dual Slot Form Factor
One 6-pin, One 8-pin Power Connectors
600 Watts Recommended Power Supply
250 Watts Thermal Design Power (TDP)

The NVIDIA Titan V's 110 teraflops of compute performance compares to a maximum of about 12 teraflops on the Titan Xp, a greater than 9X increase in a single generation. Note that this is a very specific claim, though: it references the AI compute capability of the Tensor Cores rather than what we traditionally measure for GPUs (single-precision FLOPS). By that metric, the Titan V only offers a jump to roughly 15 TFLOPS at boost clocks. The addition of expensive HBM2 memory also adds to the high price compared to its predecessor.
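To make the distinction concrete, here is a rough sketch of where both numbers come from. The FP32 figure is the usual cores × 2 FLOPs × clock; the tensor figure assumes each of the 640 Tensor Cores executes a 4×4×4 matrix multiply-accumulate (64 FMAs, or 128 FLOPs) per clock, which is how NVIDIA describes Volta's Tensor Cores. NVIDIA doesn't publish the clock behind the 110 TFLOPS figure, so treat the tensor number as approximate:

```python
cuda_cores, tensor_cores, boost_ghz = 5120, 640, 1.455

# Traditional single-precision: one FMA (2 FLOPs) per CUDA core per clock
fp32_tflops = cuda_cores * 2 * boost_ghz / 1000          # ~14.9 TFLOPS

# Tensor: 4x4x4 matrix FMA = 64 FMAs = 128 FLOPs per Tensor Core per clock
tensor_tflops = tensor_cores * 128 * boost_ghz / 1000    # ~119 TFLOPS at full boost

print(f"FP32: {fp32_tflops:.1f} TFLOPS, Tensor: {tensor_tflops:.0f} TFLOPS")
# NVIDIA's quoted 110 TFLOPS implies a somewhat lower sustained clock (~1.34 GHz)
```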


The Titan V is available now from NVIDIA.com for $2,999, with a limit of 2 per customer. And hey, there's free shipping too.

Source: NVIDIA

Gigabyte Launches GTX 1070 Ti Aorus

Subject: Graphics Cards | December 7, 2017 - 01:31 AM |
Tagged: pascal, GTX 1070Ti, GP104, gigabyte, aorus

Gigabyte is jumping into the custom GTX 1070 Ti fray with the Aorus branded GeForce GTX 1070 Ti Aorus. The new custom graphics card measures 280 x 111 x 38mm and features a WindForce 3 cooler with backplate and a custom 6+2 power phase.


Backlit by RGB Fusion LEDs, the Aorus logo sits on the side of the card and can be configured to the color of your choice. The shroud is black with orange accents and sharp, stealthy angles, giving it a more minimal look than other custom cards. Gigabyte is using a fairly beefy heatsink with this card: three 80mm fans push air over three fin stacks connected by four composite heatpipes. Further, the cooler uses direct contact between the heatpipes and the GPU, and a metal plate with thermal pads covers the GDDR5 memory chips. The rightmost fin stack cools the MOSFETs. Additionally, the full-cover backplate adds rigidity to the card and has a copper plate to draw excess heat from the underside of the GPU.

The Aorus graphics card is powered by a single 8-pin PCI-E power connector that feeds a 6+2 power phase. External video outputs include one DVI, one HDMI 2.0b, and three DisplayPort 1.4 ports.

The Pascal-based GP104-300 GPU (2432 CUDA cores, 152 TMUs, and 64 ROPs) is clocked at 1607 MHz base and 1683 MHz boost, which is the maximum vendors can clock the cards out of the box. Gigabyte does offer 1-click overclocking using its Aorus Graphics Engine software and guarantees overclocks of at least 88 MHz, to 1683 MHz base and 1771 MHz boost. Users can, of course, use other software like MSI Afterburner or EVGA Precision X if they wish, but will need to use the Gigabyte tool if they want the single-click automatic overclock. The 8GB of GDDR5 memory is stock clocked at 8008 MHz and sits on a 256-bit bus.

Looking online, the GTX 1070 Ti Aorus doesn’t appear to be available for sale quite yet, but it should be coming soon. With the Gigabyte GTX 1070 Ti Gaming card coming in at $469, I’m betting the Aorus card with its guaranteed overclock will have an MSRP around $500.


Source: Gigabyte

EVGA Launches GTX 1070 Ti FTW Ultra Silent Graphics Card

Subject: Graphics Cards | December 6, 2017 - 06:50 PM |
Tagged: evga, ftw, gtx 1070 ti, pascal, overclocking

EVGA is launching a new Pascal-based graphics card with a thicker 2.5 slot cooler in the form of the GeForce GTX 1070 Ti FTW Ultra Silent. The new graphics card has a sleek gray and black shroud with two large black fans in standard ACX 3.0 cooler styling, but with a much thicker cooler that EVGA claims enables more overclocking headroom or a nearly silent fan profile on stock settings.


The GTX 1070 Ti FTW Ultra Silent is powered by two 8-pin PCI-E power connectors that feed a 10+2 power phase and enable the card's 235W TDP (reference TDP is 180 watts). The 2432 Pascal GPU cores are clocked at 1607 MHz base and 1683 MHz boost, which aligns with NVIDIA's reference specifications. While there is no guaranteed factory overclock here, EVGA is bundling the dual-BIOS card with its EVGA Precision XOC and Precision OC Scanner X software for one-click overclocking that dynamically pushes the clocks up to find the optimal overclock for that specific card. The 8GB of GDDR5 memory is also stock clocked at 8008 MHz. Other features include a backplate, white LEDs, and 2-way SLI support.

Display outputs include one HDMI 2.0b, three DisplayPort 1.4, and one DVI port.


The new FTW series graphics card is available now from the EVGA website for $499.99 and comes with a three year warranty.

The graphics card appears to be rather tall, and I am curious how well the beefier heatsink performs and just how "ultra silent" those fans are! Hopefully we can get one in for testing! The $499.99 MSRP is interesting because it lines up with the MSRP of the GTX 1080, but with the state of the GPU market as it is, the price is not bad and actually lands about in the middle of other GTX 1070 Ti cards. My guess is they will be snatched up pretty quickly, so it's hard to say whether it will stay at that price, especially on third-party sites.

Source: EVGA

ASUS Launches ROG Strix RX Vega 64 and 56 OC Edition Graphics Cards

Subject: Graphics Cards | December 4, 2017 - 10:10 PM |
Tagged: vega 64, vega 56, rx vega, ROG Strix, ASUS ROG, asus

ASUS is launching two new factory-overclocked graphics cards with all of the RGB, in the form of the ROG Strix RX Vega 64 and ROG Strix RX Vega 56. Measuring 11.73" x 2.58" x 2.07", these graphics cards are beastly 2.5-slot designs with large triple-fan coolers. On the outside, the cards have a black shroud with RGB LEDs around the fans, on the Strix side logo, and on the ROG backplate logo. ASUS is using a massive heatsink divided into two aluminum fin stacks, each connected to a copper baseplate by five heatpipes. The baseplate is reportedly 10% flatter for improved contact with the GPU. Three fans of the dust-resistant Wing-Blade variety push air over the heatsinks.


The cards have two 8-pin PCI-E power connectors feeding ASUS' Super Alloy Power II VRMs. Other connectors include hybrid fan headers for system fans and an Aurora Sync RGB LED header. Display outputs are "VR Ready" and include two HDMI, two DisplayPort, and a single DVI output.

While ASUS has not yet revealed clock speeds for the RX Vega 56 card, eTeknix has gotten their hands on the ROG Strix RX Vega 64 and figured out the clocks for that card. Specifically, the Vega 64 card clocks its 4096 GPU cores at 1298 MHz base and 1590 MHz boost. The site further lists the memory clock at 945 MHz, which doesn't appear to be overclocked, as it matches the reference Vega 64 HBM2 effective speed of 1890 MHz. Users can push the card further on their own with the GPU Tweak II software, though.
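For context, 945 MHz is the physical clock; HBM2 transfers data on both clock edges, so the effective rate doubles, and the resulting bandwidth matches the reference card. A quick check, assuming reference Vega 64 figures:

```python
hbm2_clock_mhz = 945
effective_mhz = hbm2_clock_mhz * 2                  # double data rate -> 1890 MHz
bus_bits = 2048                                     # two HBM2 stacks, 1024 bits each
bandwidth = bus_bits / 8 * effective_mhz / 1000     # bytes per pin-cycle * Gbps
print(effective_mhz, f"{bandwidth:.1f} GB/s")       # 1890, ~483.8 GB/s
```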


ASUS has not yet revealed pricing or exact availability dates, but expect these cards to sell out fast, and over MSRP, when they do surface, thanks to the resurgence of GPU mining! With that said, it is promising that we are finally seeing factory-overclocked Vega cards being announced!


Source: eTeknix

AMD Working on GDDR6 Memory Controller For Future Graphics Cards

Subject: General Tech, Graphics Cards | December 4, 2017 - 05:47 PM |
Tagged: navi, HBM2, hbm, gddr6, amd

WCCFTech reports that AMD is working on a GDDR6 memory controller for its upcoming graphics cards. Starting with an AMD Technical Engineer listing GDDR6 on his portfolio, the site claims to have verified through sources familiar with the matter that AMD is, in fact, supporting the new graphics memory standard and will be using their own controller to support it (rather than licensing one).


AMD is not abandoning HBM2 memory, though. The company is sticking to its previously released roadmaps, and Navi will still utilize HBM2 memory – at least on the high-end SKUs. While AMD has so far only released RX Vega 64 and RX Vega 56 graphics cards, the company may well release lower-end Vega-based cards with GDDR5 at some point, although for now the Polaris architecture is handling the lower end.

AMD supporting GDDR6 is a good thing, and it should enable cheaper mid-range cards that are not limited by supply shortages of the more expensive (albeit much higher bandwidth) High Bandwidth Memory that have seemingly plagued both NVIDIA and AMD at various points in time. GDDR6 offers several advantages over GDDR5: almost twice the per-pin speed (up to 16 Gbps versus 9 Gbps) at a lower voltage (1.35V versus 1.5V), along with higher density and more underlying technology optimizations than even GDDR5X. While G5X memory is theoretically capable of hitting the same 16 Gbps speeds GDDR6 is launching at, the newer memory technology offers up to 32Gb dies* versus 16Gb and a two-channel design (which ends up being a bit more efficient and easier to produce / for GPU manufacturers to wire up).

GDDR6 will represent a nice speed bump for mid-range cards (the very low end may well stick with GDDR5, save for mobile parts, which could benefit from the lower-power GDDR6) while letting AMD keep a bit better profit margins on these lower-margin SKUs and produce more cards to satisfy demand. HBM2 is nice to have, but right now it is better suited to compute-oriented cards for workstation and data center usage than to gaming, and GDDR6 can offer more price-to-performance for consumer gaming cards.
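To put those per-pin rates in perspective, here is a rough comparison of the bandwidth each memory type would deliver on a typical 256-bit mid-range bus (per-pin speeds as discussed above; treat them as ballpark figures):

```python
# Bandwidth on a 256-bit bus = 32 bytes per pin-cycle * per-pin rate in Gbps
bus_bytes = 256 / 8
for mem, gbps in [("GDDR5", 8), ("GDDR5 (max)", 9), ("GDDR5X", 11.4), ("GDDR6", 16)]:
    print(f"{mem}: {bus_bytes * gbps:.0f} GB/s")
# GDDR5: 256 GB/s ... GDDR6: 512 GB/s -- doubled without exotic packaging
```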

As for the question of why AMD would want to design their own GDDR6 memory controller rather than license one, I think that comes down to AMD thinking long-term. It will be more expensive up front to design their own controller, but AMD will be able to more fully integrate it and tune it to work with their graphics cards such that it can be more power efficient. Also, having their own GDDR6 memory controller means they can use it in other areas such as their APUs and SoCs offered through their Semi Custom Business Unit (e.g. the SoCs used in gaming consoles). Being able to offer that controller to other companies in their semi-custom SoCs free of third party licensing fees is a good thing for AMD.


With GDDR6 becoming readily available early next year, there is a good chance AMD will be ready to use the new memory technology as soon as Navi, though likely not until closer to the end of 2018 or early 2019, when AMD launches new low- and mid-range (consumer-level) gaming cards based on Navi and/or Vega.

*At launch it appears that GDDR6 from the big three (Micron, Samsung, and SK Hynix) will use 16Gb dies, but the standard allows for up to 32Gb dies. The G5X standard allows for up to 16Gb dies.


Source: WCCFTech

AMD Is Very Pleased To Participate in Blockchain Technology

Subject: General Tech, Graphics Cards | December 3, 2017 - 04:26 PM |
Tagged: bitcoin, cryptocurrency, mining, gaming, lisa su, amd, Vega

AMD’s CEO Lisa Su recently appeared on CNBC’s Power Lunch for an exclusive interview segment, where she answered questions about bitcoin, blockchain technology, the tax reform bill, and sexual harassment in the workplace.


Of particular interest to PC Perspective readers, Dr. Lisa Su shared several interesting bits of information on cryptocurrency mining and how it is affecting the company’s graphics cards. Surprisingly, she stated that cryptocurrency miners were a "very small percentage" of sales, specifically a mid-single-digit percentage of buyers (~4 to 6 percent). I find this number hard to believe, as I expected it to be significantly higher with the prices of graphics cards continuing to climb well above MSRP. (Prices weren’t too bad when we were writing our gift guide and shortly after, but just as I was about to commit, I looked and they had shot back up again, coinciding with a resurgence in mining popularity as cryptocurrency prices rose and improved ROI.)

Further, the AMD president and CEO stated that the company is interested in this market, but is mainly waiting to see how businesses and industries adopt blockchain technologies. AMD is “very pleased to participate in blockchain” and believes it is a “very important foundational product”. Dr. Lisa Su did not seem very big on bitcoin specifically, but did seem interested in the underlying blockchain technologies and future cryptocurrencies.

Beyond bitcoin, altcoins, and the GPU mining craze, AMD believes that gaming is, and continues to be, a tremendous growth market for the company. AMD has reportedly launched 10 new product families and saw sizeable year-over-year increases in sales on Amazon and Newegg, with processor sales tripling and double-digit percentage increases in graphics sales in 2017. AMD also managed to be in two of the three gaming towers at Best Buy for the holiday buying season.

Speaking for AMD, Dr. Su also had a few other interesting bits of information to share. The interview is fairly short and worth watching. Thankfully, Kyle over at HardOCP managed to record it, and you can watch it here. If you aren't able to stream the video, PCGamer has transcribed most of the major statements.

What are your thoughts on the interview? Will we ever see GPU prices return to normal so I can upgrade? And do you agree with AMD’s assessment that miners are such a small percentage of their sales, and not as much of an influence on pricing as we thought (perhaps it’s a supply problem rather than a demand problem, or the comment was only taking their mining-specific cards into account)?

Source: HardOCP