Samsung Begins Mass Production Of 18 Gbps 16-Gigabit GDDR6 Memory

Subject: Memory | January 18, 2018 - 12:34 AM |
Tagged: Samsung, graphics memory, graphics cards, gddr6, 19nm

Samsung is now mass producing new higher density GDDR6 memory built on its 10nm-class process technology that it claims offers twice the speed and density of its previous 20nm GDDR5. Samsung's new GDDR6 memory uses 16 Gb dies (2 GB) featuring pin speeds of 18 Gbps (gigabits-per-second) and is able to hit data transfer speeds of up to 72 GB/s per chip.

Samsung GDDR6_PhotoFs.png

According to Samsung, its new GDDR6 uses a new circuit design that allows it to run on a mere 1.35 volts. Also good news for Samsung and for memory supply (and thus the pricing and availability of products) is that the company is seeing a 30% gain in manufacturing productivity cranking out its 16Gb GDDR6 versus its 20nm GDDR5.

Running at 18 Gbps, the new GDDR6 offers up quite a bit of bandwidth and will allow for graphics cards with much higher amounts of VRAM. Per package, Samsung's 16Gb GDDR6 offers 72 GB/s, which is twice the density, pin speed, and bandwidth of its 8Gb GDDR5 running at 8 Gbps and 1.5V with data transfers of 32 GB/s. (Note that SK Hynix has announced plans to produce 9 Gbps and 10 Gbps GDDR6 dies, which max out at 40 GB/s.) GDDR5X gets closer to this mark and in theory can hit up to 16 Gbps per pin and 64 GB/s per die, but so far the G5X used in real-world products has been much slower (the Titan Xp runs at 11.4 Gbps, for example). The Titan Xp runs twelve 8Gb (1GB) dies at 11.4 Gbps on a 384-bit memory bus for a maximum memory bandwidth of 547 GB/s. Moving to GDDR6 would enable that same graphics card to have 24 GB of memory (with the same number of dies) and up to 864 GB/s of bandwidth, which approaches High Bandwidth Memory levels of performance (though it still falls short of newer HBM2, and in practice a graphics card would likely run more conservative memory speeds). Still, it is an impressive jump in memory performance that widens the gap between GDDR6 and GDDR5X.

I am curious how the GPU memory market will shake out in 2018 and 2019 with GDDR5, GDDR5X, GDDR6, HBM, HBM2, and eventually HBM3 all in play for graphics cards, and where each memory type will land, especially on mid-range and high-end consumer cards (HBM2/3 still holds the performance crown and is ideal for the HPC market).
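
For anyone who wants to sanity-check those figures, here is a minimal sketch of the bandwidth arithmetic (peak bandwidth = pin speed x interface width / 8); the helper function name is just illustrative, and the card-level numbers assume twelve 32-bit packages on a 384-bit bus as described above.

```python
# Peak memory bandwidth (GB/s) = pin speed (Gbps) * bus width (bits) / 8 bits per byte
def peak_bandwidth_gbs(pin_speed_gbps: float, bus_width_bits: int) -> float:
    return pin_speed_gbps * bus_width_bits / 8

# Per-package figures (each GDDR5/GDDR5X/GDDR6 package exposes a 32-bit interface)
print(peak_bandwidth_gbs(8.0, 32))    # GDDR5 at 8 Gbps  -> 32.0 GB/s
print(peak_bandwidth_gbs(18.0, 32))   # GDDR6 at 18 Gbps -> 72.0 GB/s

# Card-level figures on a 384-bit bus (12 x 32-bit packages, Titan Xp style)
print(peak_bandwidth_gbs(11.4, 384))  # GDDR5X at 11.4 Gbps -> 547.2 GB/s
print(peak_bandwidth_gbs(18.0, 384))  # GDDR6 at 18 Gbps    -> 864.0 GB/s
```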

Samsung is aiming its new 18Gbps 16Gb memory at high performance graphics cards, game consoles, vehicles, and networking devices. Stay tuned for more information on GDDR6 as it develops!

Source: Samsung

Samsung Mass Producing Second Generation "Aquabolt" HBM2: Better, Faster, and Stronger

Subject: Memory | January 12, 2018 - 05:46 PM |
Tagged: supercomputing, Samsung, HPC, HBM2, graphics cards, aquabolt

Samsung recently announced that it has begun mass production of its second-generation HBM2 memory, which it is calling “Aquabolt”. Samsung has refined the design of its 8GB HBM2 packages, allowing them to achieve an impressive 2.4 Gbps per-pin data transfer rate without needing more power than its first-generation 1.2V HBM2.

SAMSUNG-HBM2_C.jpg

Reportedly, Samsung is using new TSV (through-silicon via) design techniques and adding additional thermal bumps between dies to improve clocks and thermal control. Each 8GB HBM2 “Aquabolt” package comprises eight 8Gb dies, each of which is vertically interconnected using 5,000 TSVs, which is a huge number especially considering how small and tightly packed these dies are. Further, Samsung has added a new protective layer at the bottom of the stack to reinforce the package’s physical strength. While the press release did not go into detail, it does mention that Samsung had to overcome challenges relating to “collateral clock skewing” as a result of the sheer number of TSVs.

On the performance front, Samsung claims that Aquabolt offers a 50% increase in per-package performance versus its first-generation “Flarebolt” memory, which ran at 1.6 Gbps per pin and 1.2V. Interestingly, Aquabolt is also faster than Samsung’s 2.0 Gbps per-pin HBM2 product (which needed 1.35V) without requiring additional power. Thanks to the 2.4 Gbps per-pin speed, Aquabolt offers 307 GB/s of bandwidth per package, and with four packages, products such as graphics cards can take advantage of 1.2 TB/s of bandwidth. Samsung also compares Aquabolt to GDDR5, stating that a single package of HBM2 at 307 GB/s offers 9.6 times the bandwidth of a GDDR5 chip at 32 GB/s.
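
As a rough sanity check on those numbers, the sketch below works through the per-stack HBM2 arithmetic (each HBM2 package exposes a 1024-bit interface); the function is illustrative only.

```python
# Peak bandwidth of one HBM2 stack: pin speed (Gbps) * 1024-bit interface / 8
def hbm2_stack_bandwidth_gbs(pin_speed_gbps: float) -> float:
    return pin_speed_gbps * 1024 / 8

aquabolt  = hbm2_stack_bandwidth_gbs(2.4)   # 307.2 GB/s per package
flarebolt = hbm2_stack_bandwidth_gbs(1.6)   # 204.8 GB/s per package

print(aquabolt / flarebolt)   # 1.5  -> the claimed 50% per-package uplift
print(aquabolt / 32.0)        # 9.6  -> versus a 32 GB/s GDDR5 chip
print(4 * aquabolt / 1000)    # ~1.23 -> roughly 1.2 TB/s with four packages
```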

This second-generation HBM2 memory is a decent step up in performance (first-generation HBM hit 128 GB/s per package and first-generation HBM2 hit 256 GB/s per package, or 512 GB/s and 1 TB/s respectively with four packages), but the interesting bit is that it is faster without needing more power. The increased bandwidth and data transfer speeds will be a boon to the HPC and supercomputing market and useful for working with massive databases, simulations, neural networks and AI training, and other “big data” tasks.

Aquabolt looks particularly promising for the mobile market, though, with future products succeeding the current mobile Vega GPU in Kaby Lake-G processors, Ryzen Mobile APUs, and eventually discrete Vega mobile graphics cards getting a nice performance boost (it’s likely too late for AMD to go with this new HBM2 on those specific products, but future refreshes or generations may be able to take advantage of it). I’m sure it will also see use in the SoCs used in Intel’s and NVIDIA’s driverless car projects as well.

Source: Samsung

CES 2018: NVIDIA Opens Up GeForce NOW Beta To PC Gamers

Subject: General Tech, Graphics Cards | January 8, 2018 - 11:35 PM |
Tagged: pc game streaming, nvidia, geforce now, game streaming, cloud gaming, CES 2018, CES

NVIDIA is opening up its GeForce NOW cloud gaming service to PC gamers, who join Mac users (who got access last year) in the free beta. The service uses GeForce GTX graphics cards in high-powered servers to store and run games at high settings, then streams the output over the internet back to just about any desktop or laptop, old or new (so long as you have at least a 25 Mbps internet connection and can meet the basic requirements to run the GeForce NOW application, of course - see below). Currently, NVIDIA supports over 160 games that can be installed on its virtual GeForce NOW gaming PCs, and a select number of optimized titles can even be played at 120 FPS for a smoother gaming experience that is closer to playing locally (allegedly).

GeForce NOW.jpg

GeForce NOW is a bring-your-own-games service in the sense that you install the GeForce NOW app on your local machine and validate the games you have purchased and have the rights to play on Steam and Ubisoft's Uplay PC stores. You are then able to install those games on the cloud-based GeForce NOW machines. The game installations reportedly take around 30 seconds, with game patching, configuration, and driver updates handled by NVIDIA's GeForce NOW platform. Gamers will be glad to know that the infrastructure also supports syncing with the games' respective stores; save games, achievements, and settings are synced, allowing potentially seamless transitions between local and remote play sessions.

You can find a list of currently supported games here, but highlights spanning older and newer titles include: Borderlands 2, Bioshock Remastered, various Call of Duty titles, League of Legends, Left 4 Dead 2, Kerbal Space Program, Just Cause 3, StarCraft II, Resident Evil 7, KOTOR, Tomb Raider, Metal Gear Solid, Dirt 4 (just for Josh), Project Cars 2, Fallout 4, XCOM 2 (a personal favorite), PUBG, WoW, Civilization VI, and more.

While many of the titles may need to be tweaked to get the best performance, some games have been certified and optimized by NVIDIA to come pre-configured with the best graphics settings for optimum performance including running them at maximum settings at 1920 x 1080 and 120 Hz.

If you are interested in the cloud-based game streaming service, you can sign up for the GeForce NOW beta here and join the waiting list! According to AnandTech, users will need a Windows 7 (or OS X equivalent) PC with at least a Core i3 clocked at 3.1 GHz, 4GB of RAM, and a DirectX 9 GPU (AMD HD 3000 series / NVIDIA 600 series / Intel HD 2000 series) or better. Beta users are limited to 4 hours per gaming session. There is no word on when the paid GeForce NOW tiers will resume or what the pricing for the rented virtual gaming desktops will be.

I signed up (not sure I'll get in though, maybe they need someone to test with old hardware hah) and am interested to try it as their past streaming attempts (e.g. to the Shield Portable) seemed to work pretty well for what it was (something streamed over the internet).

Hopefully they have managed to make it better and quicker to respond to inputs. Have you managed to get access, and if so, what are your thoughts? Is GeForce NOW the way it's meant to be played? It would be cool to see them add Space Engineers and Sins of a Solar Empire: Rebellion, as while my brother and I have fun playing them, they are quite demanding resource-wise, especially Space Engineers post-planets update!

Source: NVIDIA

CES 2018: HTC announces Vive Pro and Vive Wireless Adapter

Subject: General Tech, Graphics Cards | January 8, 2018 - 06:22 PM |
Tagged: wigig, VR, vive wireless adapter, vive pro, vive, steamvr, Oculus, htc, CES 2018, CES

As it turns out, HTC's teasing of a new Vive product late last week was well warranted. Today, at their press conference before CES 2018, HTC announced two new hardware products—the Vive Pro and the Vive Wireless adapter.

Copy of VIVE-Pro_KV-B_FA.jpg

Just like their teaser indicated, one of the major features of this new Vive hardware is an increased display resolution. The Vive Pro's resolution is 2880×1600 (combined), a 78% increase from the standard 2160×1200 resolution shared by the original Vive and the Oculus Rift.
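
As a quick check of that 78% figure, the math below compares combined pixel counts (illustrative only):

```python
# Combined (both-eye) pixel counts
vive_pro = 2880 * 1600    # 4,608,000 pixels
original = 2160 * 1200    # 2,592,000 pixels (original Vive / Oculus Rift)
print((vive_pro / original - 1) * 100)   # ~77.8%, i.e. roughly the quoted 78% increase
```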

Currently, there are no details about upgraded optics in the form of new lenses, but considering Valve's announcement of new lenses for future VR headsets, I would expect these to be included in the Vive Pro.

Copy of Vive Pro Profile.png

In addition to the display improvements, there are also some design changes in the Vive Pro that aim to let users quickly put on the headset and adjust it for maximum comfort. The Vive Pro now features a dial on the back of the head strap to adjust the headset rather than requiring you to fiddle with Velcro straps. This setup is very reminiscent of the PSVR headset, which is widely regarded as one of the most comfortable VR headsets currently on the market.

Copy of Vive Pro Sizing Dial.png

While we've already seen some of these design changes, like integrated headphones, in the currently shipping Deluxe Audio Strap for the Vive, the Vive Pro is built from the ground up with this new strap rather than offering it as a replacement option.

Vive Pro Head on.png

HTC was very quiet about the change from a single front-facing camera on the standard Vive to dual front cameras on the Vive Pro. Having stereo cameras on the device has the potential to provide a lot of utility, ranging from a stereo view of your surroundings when you are nearing the chaperone boundaries to potential AR applications.

vivepro-tracking.jpg

The Vive Pro will work with the current 1.0 base stations for positional tracking, as well as Valve's previously announced but unreleased 2.0 base stations. When using SteamVR 2.0 tracking, the Vive Pro supports up to 4 base stations, allowing for a significantly larger play area of up to 10m x 10m.

Initially, the Vive Pro is slated to ship this quarter as a headset-only upgrade for customers who already have the original Vive with its 1.0 base stations. The full Vive Pro kit with 2.0 tracking is said to ship in the summer time frame. Pricing for both configurations has yet to be announced.

In addition to new headset hardware, HTC also announced their first official solution for wireless VR connectivity. 

Copy of VIVE_Pro_Wireless_KV_Square.jpg

Built in partnership with Intel, the Vive Wireless Adapter will use 60 GHz WiGig technology to provide a low-latency experience for wirelessly streaming video to the HMD. Both the original Vive and the Vive Pro will support the adapter, which is set to be available this summer. We also have no indication of pricing on the Vive Wireless Adapter.

HTC's announcements today are impressive and should help push PC VR forward. We have yet to get hands-on experience with either the Vive Pro or the Vive Wireless adapter, but we have a demo appointment tomorrow, so keep checking PC Perspective for our updated impressions of the next generation of VR!

Source: HTC

CES 2018: NVIDIA announces Big Format Gaming Display initiative with 65-in G-SYNC

Subject: Graphics Cards, Displays | January 8, 2018 - 12:30 AM |
Tagged: SHIELD TV, nvidia, hp, hdr, g-sync, DCI-P3, bgfd, asus, android tv, acer

Although their keynote presentation tonight at CES is all about automotive technology, that hasn't stopped NVIDIA from providing us with a few gaming-related announcements this week. The most interesting of these is what NVIDIA is calling "Big Format Gaming Displays" or BFGDs (get it?!).

Along with partners ASUS, Acer, and HP, NVIDIA has developed what seems to be the ultimate living room display solution for gamers.

Based on an HDR-enabled 65" 4K 120Hz panel, these displays integrate both NVIDIA G-SYNC variable refresh rate technology for smooth gameplay, as well as a built-in NVIDIA SHIELD TV set-top box.

In addition to G-SYNC technology, these displays will also feature a full direct-array backlight capable of a peak luminance of 1,000 nits and conform to the DCI-P3 color gamut, both necessary features for a quality HDR experience. These specifications put the BFGDs in line with the current 4K HDR TVs on the market.

Unlike traditional televisions, these BFGDs are expected to have very low input latencies, a significant advantage for both PC and console gamers.

Integration of the SHIELD TV means that these displays will be more than just an extremely large PC monitor, but rather capable of replacing the TV in your living room. The Android TV operating system means you will get access to a lot of the most popular streaming video applications, as well as features like Google Assistant and NVIDIA GameStream.

BFGD KV.jpg

Personally, I am excited at the idea of what is essentially a 65" TV optimized for things like low input latency. The current crop of high-end TVs on the market caters very little to gamers, with game modes that don't turn off all of the image processing effects and still carry significant latency.

It's also interesting to see companies like ASUS, Acer, and HP who are well known in the PC display market essentially entering the TV market with these BFGD products.

Stay tuned for eyes-on impressions of the BFGD displays as part of our CES 2018 coverage!

Update: ASUS has officially announced their BFGD offering, the aptly named PG65 (pictured below). We have a meeting with ASUS this week, and we hope to get a look at this upcoming product!

asus-angled-1920x1080-screenshot.jpg

Source: NVIDIA

CES 2018: AMD teases 7nm Vega for machine learning in 2018

Subject: Graphics Cards | January 8, 2018 - 12:00 AM |
Tagged: Vega, CES 2018, CES, amd, 7nm

Though it was just the most basic of teases, AMD confirmed at CES that it will have a 7nm-based Vega product sampling sometime in 2018. No mention of a shipping timeline, performance, or consumer variants was to be found.

01_0.jpg

This product will target the machine learning market, with hardware and platform optimizations key to that segment. AMD mentions “new DL Ops”, or deep learning operations, but the company didn’t expand on that. It could mean AMD will integrate Tensor Core-style compute units (as NVIDIA did with the Volta architecture), or it may be something more novel. AMD will also integrate new I/O, likely to compete with NVLink, and MxGPU support for dividing resources efficiently for virtualization.

02_0.jpg

AMD did present a GPU “roadmap” at the tech day as well. I put that word in quotes because it is incredibly, and intentionally, vague. You might assume that Navi is being placed into the 2019 window, but it's possible it might show in late 2018. AMD was also unable to confirm whether a 7nm Vega variant would arrive for gaming and consumer markets in 2018.

Source: PCPer

NVIDIA addresses Spectre vulnerabilities

Subject: General Tech, Graphics Cards | January 5, 2018 - 02:59 PM |
Tagged: meltdown, spectre, geforce, quadro, NVS, nvidia, tesla, security

If you were wondering whether NVIDIA products are vulnerable to some of the latest security threats, the answer is yes. Your Shield device or GPU is not vulnerable to CVE-2017-5754, aka Meltdown; however, the two variants of Spectre could theoretically be exploited against you.

  • Variant 1 (CVE-2017-5753): Mitigations are provided with the security update included in this bulletin. NVIDIA expects to work together with its ecosystem partners on future updates to further strengthen mitigations.

  • Variant 2 (CVE-2017-5715): Mitigations are provided with the security update included in this bulletin. NVIDIA expects to work together with its ecosystem partners on future updates to further strengthen mitigations.

  • Variant 3 (CVE-2017-5754): At this time, NVIDIA has no reason to believe that Shield TV/tablet is vulnerable to this variant.

The Android-based Shield tablet should be updated to Shield Experience 5.4, which should arrive before the end of the month. Your Shield TV, should you actually still have a working one, will receive Shield Experience 6.3 in the same time frame.

The GPU side is a little more complex, as there are several product lines and OSes which need to be dealt with. There should be a new GeForce driver appearing early next week for gaming GPUs, with HPC cards receiving updates on the dates you can see below.

nvidia patch.PNG

There is no reason to expect Radeon and Vega GPUs to suffer from these issues at this time. Intel could learn a bit from NVIDIA's response, which has been very quick and includes their older hardware.

Source: NVIDIA

ASUS Announces XG Station Pro, a Thunderbolt 3 External GPU Enclosure for Windows and Mac

Subject: Graphics Cards | January 4, 2018 - 10:15 AM |
Tagged: external graphics, external gpu, CES 2018, CES, ASUS ROG, asus

ASUS today announced the XG Station Pro, a Thunderbolt 3-based external GPU enclosure tailored for both gamers and professionals. The XG Station Pro can accommodate full-size GPUs up to 2.5 slots wide, including large cards such as the ROG Strix 1080 Ti and Radeon RX Vega 64.

Featuring a "contemporary design with clean lines and subtle styling," the XG Station Pro has a footprint of 4.3-inches x 14.8-inches, thanks to ASUS's decision to use an external power supply. In order to provide enough juice for high-end graphics cards, ASUS is borrowing the power supply design from its GX800 gaming laptop, which puts out up to 330 watts.

xgstationpro-side.jpg

The XG Station Pro's chassis, designed by case maker In Win, has a smooth dark gray finish with a black PCB and sleeved PCIe power cables. It features a soft white internal glow that can be controlled by ASUS's Aura software, including Aura Sync to synchronize lighting with your compatible ASUS and ROG graphics cards and laptops.

xgstationpro-inside.jpg

Inside the XG Station Pro, dual 120mm PWM fans provide exhaust out of the right side of the chassis. The fans automatically ramp down and even shut off below certain temperatures, but users can also manually control the fans with the ASUS GPU Tweak II application.

xgstationpro-standing.jpg

Around back, users will find an extra USB Type-C 3.1 Gen 2 port, which can supply up to 15 watts of power to compatible devices such as smartphones and external storage. Finally, ASUS notes that it includes the required Thunderbolt 3 cable in the box, something that many Thunderbolt-based devices seem to lack.

The ASUS XG Station Pro will launch later this month for $329 with support for both AMD and NVIDIA GPUs in Windows 10, and just AMD Vega-based GPUs in macOS Sierra and newer.

Source: ASUS

Gigabyte's Aorus GTX 1070, the GPU you don't unbox

Subject: Graphics Cards | December 28, 2017 - 03:32 PM |
Tagged: external gpu, gigabyte, aorus, gtx 1070, thunderbolt 3, nvidia, gaming box

Have a laptop with Thunderbolt 3 and a mobile GPU that just doesn't cut it anymore? Gigabyte now offers an incredibly easy way to upgrade your laptop, with no screwdriver required! The Aorus GTX 1070 Gaming Box contains an external desktop-class GTX 1070 and a separate PSU, giving you a dock with some serious gaming prowess. The Tech Report's benchmarks compare this external GPU against the GTX 1060 installed in their Alienware gaming laptop and against Alienware's own external GPU enclosure, on both the internal display and an external monitor. The results are somewhat mixed and worth reading through fully; however, if you are on an integrated GPU, then this solution is an incredible upgrade.

front34.jpg

"Gigabyte's Aorus GTX 1070 Gaming Box offers us a look into a future where a big shot of graphics performance is just a single cable away for ultraportable notebook PCs. We plugged the Gaming Box into a test notebook and gave it a spin to see just how bright that future looks."

SAPPHIRE Releases NITRO+ Radeon RX Vega (64 & 56)

Subject: Graphics Cards | December 15, 2017 - 09:00 AM |
Tagged: vega 64 liquid, vega 64, vega 56, Vega, sapphire, radeon, amd

SAPPHIRE has just launched a pair of custom cooled, factory overclocked, RX Vega-based graphics cards. As you might guess: the SAPPHIRE NITRO+ Radeon RX Vega 64 uses the Vega 64 chip with its 4096 stream processors, while the SAPPHIRE NITRO+ Radeon RX Vega 56 uses the Vega 56 chip and its 3584 stream processors. Both cards have 8GB of HBM2 memory (two stacks of 4GB). The cooler design uses three fans and vapor chambers, with separate heat pipes for the GPU+Memory (six pipes) and VRMs (two pipes).

11275-00_VEGA64_Nitro_Plus_2DP2HDMI_C03.jpg

It also has a back plate!

11275-00_VEGA64_Nitro_Plus_2DP2HDMI_C05.jpg

The clock rate is where it gets interesting. The NITRO+ RX Vega 64 will have a boost clock of 1611 MHz out of the box. This is above the RX Vega 64 Air’s boost clock (1546 MHz) but below the RX Vega 64 Liquid’s boost clock (1677 MHz). The liquid-cooled Radeon RX Vega 64 still has the highest clocks, but this product sits almost exactly half-way between the liquid-cooled and air-cooled RX Vega 64.
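
A quick check of that half-way claim, using the boost clocks quoted above (illustrative only):

```python
air, liquid, nitro = 1546, 1677, 1611   # boost clocks in MHz
print((air + liquid) / 2)               # 1611.5 MHz -> the NITRO+'s 1611 MHz is essentially the midpoint
```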

The NITRO+ Radeon RX Vega 56, with its 1572 MHz boost clock, is well above the stock RX Vega 56’s 1471 MHz boost clock, though. It’s a clear win.

fancontrol.png

As for enthusiast features, this card has quite a few ways to keep itself cool. First, it will operate fanless below 56°C. Second, the card accepts a 4-pin fan connector, which allows it to adjust the speed of two case fans based on the temperature readings from the card. I am a bit curious whether it’s better to let the GPU control the fans, or whether having them all attached to the same place allows them to work together more effectively. Either way, if you have run out of fan headers, I’m guessing this feature will be useful regardless.

The SAPPHIRE NITRO+ Radeon RX Vega 64 and 56 are available now.

Source: SAPPHIRE

Ti-ny bubbles in my card, watercooling the GTX 1080 Ti

Subject: Graphics Cards | December 13, 2017 - 02:05 PM |
Tagged: gtx 1080 ti, phanteks, G1080, watercooling

Phanteks announced their G1080 water block for the GTX 1080 Ti a while back, but we hadn't seen it in action until now. [H]ard|OCP installed the cooler on a Founders Edition card and created a video of the process. Not only do they show how to properly install the water block, they also cover a few of the possible issues you might encounter while doing so. They also made a video showing how the coolant flows through the water block, which is not only pretty but can help you determine where to insert the GPU into your watercooling loop.

15125351650bqfuz1m4i_1_2_l.jpg

"Phanteks recently sent us its Glacier series water block for our Founders Edition GTX 1080 Ti. We take you through the full process of getting it installed. We check out the mating surfaces of the GPU, capacitors, and MOSFETs and show you just how well it all fits together. Then finally we show exactly how the coolant flows in 4K!"

Source: [H]ard|OCP

Video: What does a $3000 GPU look like? NVIDIA TITAN V Unboxing and Teardown!

Subject: Graphics Cards | December 12, 2017 - 07:51 PM |
Tagged: nvidia, titan, titan v, Volta, video, teardown, unboxing

NVIDIA launched the new Titan V graphics card last week, a $2,999 part targeted not at gamers (thankfully) but instead at developers of machine learning applications. Based on the GV100 GPU and 12GB of HBM2 memory, the Titan V is an incredibly powerful graphics card. We have every intention of looking at the gaming performance of this card as a "preview" of potential consumer Volta cards that may come out next year. (This is identical to our stance on testing the Vega Frontier Edition cards.)

But for now, enjoy this unboxing and teardown video that takes apart the card to get a good glimpse of that GV100 GPU.

A couple of quick interesting notes:

  • This implementation has 25% of the memory and ROPs disabled, giving us 12GB of HBM2, a 3072-bit bus, and 96 ROPs (see the quick math sketch after this list).
  • Clock speeds in our testing look to be much higher than the base AND boost ratings.
  • So far, even though the price takes this out of the gaming segment completely, we are impressed with some of the gaming results we have found.
  • The cooler might LOOK the same, but it is definitely heavier than the cooler and build used for the Titan Xp.
  • Champagne. It's champagne colored.
  • Double precision performance is insanely good, spanking the Titan Xp and Vega so far in many tests.
  • More soon!
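
Here is the quick math sketch referenced in the first bullet; it assumes the full GV100 configuration of 16GB HBM2, a 4096-bit bus, and 128 ROPs, with a quarter of each disabled on the Titan V.

```python
# Quick math behind the first bullet: 25% of the memory and ROP resources are
# disabled, i.e. 3/4 of the assumed full GV100 configuration remains enabled.
full_memory_gb, full_bus_bits, full_rops = 16, 4096, 128
enabled = 0.75   # 25% disabled

print(full_memory_gb * enabled)  # 12.0 GB of HBM2
print(full_bus_bits * enabled)   # 3072.0-bit memory bus
print(full_rops * enabled)       # 96.0 ROPs
```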

gv100.png

Source: NVIDIA

NVIDIA Launches Titan V, the World's First Consumer Volta GPU with HBM2

Subject: Graphics Cards | December 7, 2017 - 11:44 PM |
Tagged: Volta, titan, nvidia, graphics card, gpu

NVIDIA made a surprising move late Thursday with the simultaneous announcement and launch of the Titan V, the first consumer/prosumer graphics card based on the Volta architecture.

NVIDIA_TITAN V_KV.jpeg

Like recent flagship Titan-branded cards, the Titan V will be available exclusively from NVIDIA for $2,999. Labeled "the most powerful graphics card ever created for the PC," Titan V sports 12GB of HBM2 memory, 5120 CUDA cores, and a 1455MHz boost clock, giving the card 110 teraflops of maximum compute performance. Check out the full specs below:

Graphics Processing Clusters: 6
Streaming Multiprocessors: 80
CUDA Cores (single precision): 5120
Texture Units: 320
Tensor Cores: 640
Base Clock: 1200 MHz
Boost Clock: 1455 MHz
Memory Clock: 850 MHz
Memory Data Rate: 1.7 Gbps
L2 Cache Size: 4608 KB
Total Video Memory: 12288 MB HBM2
Memory Interface: 3072-bit
Total Memory Bandwidth: 652.8 GB/s
Texture Rate (Bilinear): 384 GigaTexels/sec
Fabrication Process: 12 nm (TSMC 12nm FFN High Performance)
Transistor Count: 21.1 Billion
Connectors: 3 x DisplayPort, 1 x HDMI
Form Factor: Dual Slot
Power Connectors: One 6-pin, One 8-pin
Recommended Power Supply: 600 Watts
Thermal Design Power (TDP): 250 Watts

The NVIDIA Titan V's 110 teraflops of compute performance compares to a maximum of about 12 teraflops on the Titan Xp, a greater than 9X increase in a single generation. Note that this is a very specific claim, though, and references the AI compute capability of the Tensor cores rather than what we traditionally measure for GPUs (single precision FLOPS). In that metric, the Titan V only truly offers a jump to around 14 TFLOPS. The addition of expensive HBM2 memory also adds to the high price compared to its predecessor.
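
For reference, here is a rough sketch of where the more traditional numbers come from, using only figures from the spec list above (theoretical peaks, illustrative only):

```python
# FP32 throughput: 2 FLOPs (fused multiply-add) per CUDA core per clock
cuda_cores = 5120
boost_ghz  = 1.455
print(2 * cuda_cores * boost_ghz / 1000)   # ~14.9 TFLOPS single precision at boost,
                                           # in the ballpark of the ~14 TFLOPS quoted above

# Memory bandwidth: data rate (Gbps per pin) * interface width (bits) / 8
print(1.7 * 3072 / 8)                      # 652.8 GB/s, matching the listed figure
```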

titan-v-stylized-photography-6.jpeg

The Titan V is available now from NVIDIA.com for $2,999, with a limit of 2 per customer. And hey, there's free shipping too.

Source: NVIDIA

Gigabyte Launches GTX 1070 Ti Aorus

Subject: Graphics Cards | December 7, 2017 - 01:31 AM |
Tagged: pascal, GTX 1070Ti, GP104, gigabyte, aorus

Gigabyte is jumping into the custom GTX 1070 Ti fray with its Aorus-branded GeForce GTX 1070 Ti. The new custom graphics card measures 280 x 111 x 38mm and features a WindForce 3 cooler with a backplate and a custom 6+2 power phase.

GIGABYTE-AORUS-GTX1070Ti-2.jpg

Backlit by RGB Fusion LEDs, the Aorus logo sits on the side of the card and can be configured to the color of your choice. The shroud is black with orange accents and sharp stealth angles, and is minimal by comparison with other cards. Gigabyte is using a fairly beefy heatsink with this card. Specifically, three 80mm fans push air over a heatsink that consists of three fin stacks connected by four composite heatpipes. Further, the cooler uses direct contact for the heatpipes above the GPU and a metal plate with thermal pads to cover the GDDR5 memory chips. The rightmost fin stack cools the MOSFETs. Additionally, the full-cover backplate adds rigidity to the card and has a copper plate to draw excess heat from the underside of the GPU.

The Aorus graphics card is powered by a single 8-pin PCI-E power connector that feeds a 6+2 power phase. External video outputs include one DVI, one HDMI 2.0b, and three DisplayPort 1.4 ports.

The Pascal-based GP104-300 GPU (2432 CUDA cores, 152 TMUs, and 64 ROPs) is clocked at 1607 MHz base and 1683 MHz boost, which is the maximum vendors can clock these cards out of the box. Gigabyte does offer 1-click overclocking using its Aorus Graphics Engine software and guarantees overclocks to at least 1683 MHz base and 1771 MHz boost (an 88 MHz bump on the boost clock). Users can, of course, use other software like MSI Afterburner or EVGA Precision X if they wish, but will need to use the Gigabyte tool if they want the single-click automatic overclock. The 8GB of GDDR5 memory is stock clocked at 8008 MHz and sits on a 256-bit bus.

Looking online, the Aorus GTX 1070 Ti doesn’t appear to be available for sale quite yet, but it should be coming soon. With the Gigabyte GTX 1070 Ti Gaming card coming in at $469, I’m betting the Aorus card with its guaranteed overclock will have an MSRP around $500.

Source: Gigabyte

EVGA Launches GTX 1070 Ti FTW Ultra Silent Graphics Card

Subject: Graphics Cards | December 6, 2017 - 06:50 PM |
Tagged: evga, ftw, gtx 1070 ti, pascal, overclocking

EVGA is launching a new Pascal-based graphics card with a thicker 2.5 slot cooler in the form of the GeForce GTX 1070 Ti FTW Ultra Silent. The new graphics card has a sleek gray and black shroud with two large black fans in standard ACX 3.0 cooler styling, but with a much thicker cooler that EVGA claims enables more overclocking headroom or a nearly silent fan profile on stock settings.

08G-P4-6678-KR_1.jpg

The GTX 1070 Ti FTW Ultra Silent is powered by two 8-pin PCI-E power connectors that feed a 10+2 power phase and enable the card's 235W TDP (the reference TDP is 180 watts). The 2432 Pascal GPU cores are clocked at 1607 MHz base and 1683 MHz boost, which aligns with NVIDIA's reference specifications. While there is no guaranteed factory overclock here, EVGA is bundling the dual-BIOS card with its EVGA Precision XOC and Precision OC Scanner X software for one-click overclocking that dynamically pushes the clocks up to find the optimal overclock for that specific card. The 8GB of GDDR5 memory is also stock clocked at 8008 MHz. Other features include a backplate, white LEDs, and 2-way SLI support.

Display outputs include one HDMI 2.0b, three DisplayPort 1.4, and one DVI port.

08G-P4-6678-KR_6.jpg

The new FTW series graphics card is available now from the EVGA website for $499.99 and comes with a three year warranty.

The graphics card appears to be rather tall, and I am curious how well the beefier heatsink performs and just how "ultra silent" those fans are! Hopefully we can get one in for testing! The $499.99 MSRP is interesting, though, because it lines up with the MSRP of the GTX 1080; but with the state of the GPU market as it is, the price is not bad and actually lands about in the middle of where other GTX 1070 Ti cards sit. My guess is they will be snatched up pretty quickly, so it's hard to say if it will stay at that price, especially on third-party sites.

Source: EVGA

ASUS Launches ROG Strix RX Vega 64 and 56 OC Edition Graphics Cards

Subject: Graphics Cards | December 4, 2017 - 10:10 PM |
Tagged: vega 64, vega 56, rx vega, ROG Strix, ASUS ROG, asus

ASUS is launching two new factory-overclocked graphics cards with all of the RGB in the form of the ROG Strix RX Vega 64 and ROG Strix RX Vega 56. Measuring 11.73" x 2.58" x 2.07", these graphics cards are beastly 2.5-slot designs with large triple-fan coolers. On the outside, the cards have a black shroud with RGB LEDs around the fans, on the Strix side logo, and on the ROG backplate logo. ASUS is using a massive heatsink that is divided into two aluminum fin stacks, each connected to a copper baseplate using five heatpipes. The baseplate is reportedly 10% flatter for improved contact with the GPU. Three fans of the dust-resistant Wing-Blade variety push air over the heatsinks.

ASUS ROG Strix RX Vega 64 with RGB LEDs.png

The cards have two 8-pin PCI-E power connectors feeding ASUS' Super Alloy Power II VRMs. Other connectors include hybrid fan headers for system fans and an Aura Sync RGB LED header. Display outputs are "VR Ready" and include two HDMI, two DisplayPort, and a single DVI output.

While ASUS has not yet revealed clockspeeds for the RX Vega 56 card, eTeknix has gotten their hands on the ROG Strix RX Vega 64 graphics card and figured out the clocks for that card. Specifically, the Vega 64 card clocks its 4096 GPU cores at 1298 MHz base and 1590 MHz boost. The site further lists the memory clockspeed at 945 MHz, which doesn't appear to be overclocked since, with HBM2 being double data rate, that works out to the reference Vega 64 effective speed of 1890 MHz. Users can use the GPU Tweak II software to push the card further on their own, though.
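
For context, here is a small sketch of how the 945 MHz reading relates to the 1890 MHz reference spec and to overall bandwidth; it assumes Vega 64's 2048-bit interface (two 1024-bit HBM2 stacks).

```python
memory_clock_mhz = 945
effective_mtps   = memory_clock_mhz * 2            # HBM2 is double data rate -> 1890 MT/s (the "1890 MHz" spec)
bandwidth_gbs    = effective_mtps * 2048 / 8 / 1000
print(effective_mtps, bandwidth_gbs)               # 1890, ~483.8 GB/s at reference memory clocks
```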

overview.png

ASUS has not yet revealed pricing or exact availability dates, but expect these cards to sell out fast and above MSRP when they do surface, thanks to the resurgence of GPU mining! With that said, it is promising that we are finally seeing factory-overclocked cards being announced!

Source: eTeknix

AMD Working on GDDR6 Memory Controller For Future Graphics Cards

Subject: General Tech, Graphics Cards | December 4, 2017 - 05:47 PM |
Tagged: navi, HBM2, hbm, gddr6, amd

WCCFTech reports that AMD is working on a GDDR6 memory controller for its upcoming graphics cards. Starting with an AMD technical engineer listing GDDR6 on his portfolio, the site claims to have verified through sources familiar with the matter that AMD is, in fact, supporting the new graphics memory standard and will be using its own controller to support it (rather than licensing one).

roadmap.jpg

AMD is not abandoning HBM2 memory, though. The company is sticking to its previously released roadmaps, and Navi will still utilize HBM2 memory – at least on the high-end SKUs. While AMD has so far only released RX Vega 64 and RX Vega 56 graphics cards, the company may well release lower-end Vega-based cards with GDDR5 at some point, although for now the Polaris architecture is handling the lower end.

AMD supporting GDDR6 is a good thing and should enable cheaper mid-range cards that are not limited by supply shortages of the more expensive (albeit much higher bandwidth) High Bandwidth Memory that have seemingly plagued both NVIDIA and AMD at various points in time. GDDR6 further offers several advantages over GDDR5, with almost twice the speed (16 Gbps versus 9 Gbps) at lower power (1.35V versus 1.5V), plus more density and underlying technology optimizations than even GDDR5X. While G5X memory is in theory capable of hitting the same 16 Gbps speeds GDDR6 is launching at, the newer memory technology offers up to 32Gb dies* versus 16Gb and a two-channel design (which ends up being a bit more efficient and easier for GPU manufacturers to wire up). GDDR6 will represent a nice speed bump for mid-range cards (the very low end may well stick with GDDR5, save for mobile parts which could benefit from the lower-power GDDR6) while letting AMD earn somewhat better profit margins on these lower-end SKUs and produce more cards to satisfy demand. HBM2 is nice to have, but right now it is better suited to compute-oriented cards for workstation and data center usage than to gaming, and GDDR6 can offer more price-to-performance for consumer gaming cards.
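
To put the per-device numbers in that comparison side by side, here is a minimal sketch (each GDDR-family device exposes a 32-bit interface; GDDR6 splits it into two 16-bit channels, which simplifies routing but does not change the peak math):

```python
def device_bandwidth_gbs(pin_speed_gbps: float, interface_bits: int = 32) -> float:
    return pin_speed_gbps * interface_bits / 8

print(device_bandwidth_gbs(9.0))    # GDDR5 at 9 Gbps          -> 36.0 GB/s
print(device_bandwidth_gbs(16.0))   # GDDR6 at 16 Gbps (launch) -> 64.0 GB/s
print(device_bandwidth_gbs(18.0))   # GDDR6 at 18 Gbps          -> 72.0 GB/s
```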

As for the question of why AMD would want to design their own GDDR6 memory controller rather than license one, I think that comes down to AMD thinking long-term. It will be more expensive up front to design their own controller, but AMD will be able to more fully integrate it and tune it to work with their graphics cards such that it can be more power efficient. Also, having their own GDDR6 memory controller means they can use it in other areas such as their APUs and SoCs offered through their Semi Custom Business Unit (e.g. the SoCs used in gaming consoles). Being able to offer that controller to other companies in their semi-custom SoCs free of third party licensing fees is a good thing for AMD.

Micron GDDR5X.png

With GDDR6 becoming readily available early next year, there is a good chance AMD will be ready to use the new memory technology as soon as Navi, though likely not until closer to the end of 2018 or early 2019, when AMD launches new lower-end and mid-range consumer gaming cards based on Navi and/or Vega.

*At launch it appears that GDDR6 from the big three (Micron, Samsung, and SK Hynix) will use 16Gb dies, but the standard allows for up to 32Gb dies. The G5X standard allows for up to 16Gb dies.

Source: WCCFTech

AMD Is Very Pleased To Participate in Blockchain Technology

Subject: General Tech, Graphics Cards | December 3, 2017 - 04:26 PM |
Tagged: bitcoin, cryptocurrency, mining, gaming, lisa su, amd, Vega

AMD’s CEO Lisa Su recently appeared on CNBC’s Power Lunch exclusive interview segment, where she answered questions about bitcoin, blockchain technology, the tax reform bill, and sexual harassment in the workplace.

AMD CNBC.png

Of particular interest to PC Perspective readers, Dr. Lisa Su shared several interesting bits of information on cryptocurrency mining and how it is affecting the company’s graphics cards. Surprisingly, she stated that cryptocurrency miners were a "very small percentage" of sales, specifically a mid-single-digit percentage of buyers (~4 to 6 percent). I find that number hard to believe, as I expected it to be significantly higher with the prices of graphics cards continuing to climb well above MSRP (prices weren’t too bad when writing our gift guide and shortly after, but just as I was about to commit, I looked and prices had shot back up again, coinciding with a resurgence in mining popularity as cryptocurrency prices rose and improved the ROI).

Further, the AMD president and CEO states that the company is interested in this market, but they are mainly waiting to see how businesses and industries adopt blockchain technologies. AMD is “very pleased to participate in blockchain” and believes it is a “very important foundational product”. Dr. Lisa Su did not seem very big on bitcoin specifically, but did seem interested in the underlying blockchain technologies and future cryptocurrencies.

Beyond bitcoin, altcoins, and the GPU mining craze, AMD believes that gaming is and continues to be a tremendous growth market for the company. AMD has reportedly launched 10 new product families and saw sizeable increases in sales on Amazon and Newegg versus last year, with processor sales tripling and double-digit percentage increases in graphics sales in 2017. AMD also managed to be in two of the three gaming towers at Best Buy for the holiday buying season.

Speaking for AMD, Dr. Su also had a few other interesting bits of information to share. The interview is fairly short and worth watching. Thankfully, Kyle over at HardOCP managed to record it, and you can watch it here. If you aren't able to stream the video, PCGamer has transcribed most of the major statements.

What are your thoughts on the interview? Will we ever see GPU prices return to normal so I can upgrade? And do you agree with AMD’s assessment that miners are such a small percentage of their sales and not as much of an influence on pricing as we thought (perhaps it’s a supply problem rather than a demand problem, or the comment was only taking their mining-specific cards into account)?

Source: HardOCP

XFX Teases Custom RX Vega 56 and RX Vega 64 Double Edition Graphics Cards

Subject: Graphics Cards | December 1, 2017 - 02:48 PM |
Tagged: xfx, vega 10, Vega, RX VEGA 64, RX Vega 56, double edition, amd

Not content to let ASUS have all the fun with X-shaped products, graphics card manufacturer XFX is prepping two new Vega graphics cards that feature a cut-out backplate and a cooler shroud that resembles a stretched-out X. XFX has, so far, only released a few pictures of the card, but they do show off most of it, including the top edge, cooler, and backplate.

XFX RX Vega Double Edition Cooler.jpg

XFX has opted for a short PCB that extends only slightly past the first cooling fan. The card is a dual-slot design with a large heatsink and two large red fans, and a bit less than half of the cooler extends past the PCB as a result. Cooling should not be an issue thanks to the liberal use of heat pipes (there appear to be five main copper heat pipes), but the cooler hanging so far past the PCB has resulted in the two 8-pin PCI-E power connectors ending up in the middle of the cooler (the middle of the X shape), which is not ideal for cable management (I'm still waiting for someone to put the PCI-E power connectors on the back edge closest to the motherboard!). With a bit of modding, though, it might be possible to hide the wires under the shroud and route them around the card, as in one of the photos it looks like there is a bit of a gap between the heatsink and the shroud/backplate.

The design is sure to be divisive, with some people loving it and others hating it, but XFX has put quite a bit of work into it. The red fans are surrounded by a stylized black shroud with a carbon fiber texture, while the top edge holds the red XFX logo. The backplate in particular looks great, with a black and grey design, red accents, and numerous cutouts for extra ventilation.

Display outputs are standard with three DisplayPort and one HDMI out.

TechPowerUp along with Videocardz are reporting that the card will come in both RX Vega 56 and RX Vega 64 variants. Unfortunately, while XFX has gone all out on the custom cooling and backplate, it is not pushing any of the clockspeeds past factory settings: the RX Vega 56 Double Edition clocks in at 1156 MHz base and 1471 MHz boost on the GPU and 1600 MHz on the 8GB of HBM2 memory, while the XFX RX Vega 64 Double Edition is also stock clocked at 1247 MHz base, 1546 MHz boost, and 1890 MHz memory. It is not all bad news though, because with such a beefy cooler, enthusiasts should be able to overclock the chips themselves at least a bit (depending on how lucky they are in the silicon lottery), but it does mean that XFX isn’t guaranteeing anything. Also, the top-end overclock might be more limited on the Vega 64 version versus other custom cards due to it only including two 8-pin power connectors (which does make me wonder what, if anything, they have done with the VRMs versus reference).

XFX RX Vega Double Edition Backplate.jpg

XFX has not yet revealed pricing or availability for their custom RX Vega cards.

What are your thoughts on the X design? 

Source: TechPowerUp

Nvidia GeForce 388.43 WHQL driver

Subject: Graphics Cards | November 30, 2017 - 01:31 PM |
Tagged: GeForce 388.43, nvidia, whql, DOOM VFR, NV Tray

While it might sound like an experimental Nazi weapon from WWII, DOOM VFR has little to do with Wolfenstein and is instead a DOOM virtual reality game for the HTC Vive (pre-purchasing is bad, m'kay). NVIDIA's new game ready driver, the GeForce 388.43 WHQL release, is made to improve the performance of your GTX card in this game.

vfr.PNG

This release also marks the return of the NVTray, much to the delight of the hordes of users who mourned the loss of the utility. You can grab the drivers here, or through the GeForce Experience app if you have it installed.

Source: NVIDIA