EVGA Launches GTX 1070 Ti FTW Ultra Silent Graphics Card

Subject: Graphics Cards | December 6, 2017 - 06:50 PM |
Tagged: evga, ftw, gtx 1070 ti, pascal, overclocking

EVGA is launching a new Pascal-based graphics card with a thicker 2.5-slot cooler in the form of the GeForce GTX 1070 Ti FTW Ultra Silent. The card has a sleek gray and black shroud with two large black fans in standard ACX 3.0 styling, and EVGA claims the extra-thick heatsink enables more overclocking headroom or a nearly silent fan profile at stock settings.

[Image: EVGA GeForce GTX 1070 Ti FTW Ultra Silent (08G-P4-6678-KR)]

The GTX 1070 Ti FTW Ultra Silent is powered by two 8-pin PCI-E power connectors that feed a 10+2 power phase design and enable the card's 235W TDP (reference TDP is 180 watts). The 2432 Pascal GPU cores are clocked at 1607 MHz base and 1683 MHz boost, which aligns with NVIDIA's reference specifications. While there is no guaranteed factory overclock here, EVGA is bundling the dual-BIOS card with its EVGA Precision XOC and OC Scanner X software for one-click overclocking that dynamically pushes the clocks up to find the optimal overclock for that specific card. The 8GB of GDDR5 memory is also stock clocked at 8008 MHz. Other features include a backplate, white LEDs, and 2-way SLI support.
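EVGA's scanner is closed source, but one-click scanners of this sort generally work by stepping the clock offset upward and stress testing at each step until instability appears, then backing off. Below is a minimal conceptual sketch of such a loop, not EVGA's actual algorithm; apply_offset and passes_stress_test are hypothetical placeholders for the vendor-specific driver calls.

```python
# Hypothetical sketch of a one-click overclock scan loop.
# apply_offset() and passes_stress_test() stand in for
# vendor-specific driver calls that are not public here.

STEP_MHZ = 13          # Pascal adjusts clocks in roughly 13 MHz bins
MAX_OFFSET_MHZ = 260   # safety ceiling for the scan

def find_stable_offset(apply_offset, passes_stress_test):
    """Walk the core clock offset upward until the stress test fails,
    then back off and return the last stable offset."""
    stable = 0
    offset = STEP_MHZ
    while offset <= MAX_OFFSET_MHZ:
        apply_offset(offset)            # set the GPU core clock offset
        if not passes_stress_test():    # render/compute stability check
            break                       # instability found, stop scanning
        stable = offset
        offset += STEP_MHZ
    apply_offset(stable)                # settle on the last good offset
    return stable
```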

Display outputs include one HDMI 2.0b, three DisplayPort 1.4, and one DVI port.


The new FTW series graphics card is available now from the EVGA website for $499.99 and comes with a three-year warranty.

The graphics card appears to be rather tall, and I am curious how well the beefier heatsink performs and just how "ultra silent" those fans are! Hopefully we can get one in for testing! The $499.99 MSRP is interesting because it lines up with the MSRP of the GTX 1080, but given the current state of the GPU market it is not a bad price, landing about in the middle of other GTX 1070 Ti cards. My guess is these will be snatched up quickly, so it is hard to say whether the price will hold, especially at third-party retailers.

Source: EVGA

ASUS Launches ROG Strix RX Vega 64 and 56 OC Edition Graphics Cards

Subject: Graphics Cards | December 4, 2017 - 10:10 PM |
Tagged: vega 64, vega 56, rx vega, ROG Strix, ASUS ROG, asus

ASUS is launching two new factory overclocked graphics cards with all of the RGB in the form of the ROG Strix RX Vega 64 and ROG Strix RX Vega 56. Measuring 11.73" x 2.58" x 2.07", these graphics cards are beastly 2.5-slot designs with large triple-fan coolers. On the outside, the cards have a black shroud with RGB LEDs around the fans, on the Strix side logo, and on the ROG backplate logo. ASUS is using a massive heatsink divided into two aluminum fin stacks, each connected to a copper baseplate by five heatpipes. The baseplate is reportedly 10% flatter for improved contact with the GPU, and three of ASUS' dust-resistant Wing-Blade fans push air over the heatsinks.

[Image: ASUS ROG Strix RX Vega 64 with RGB LEDs]

The cards have two 8-pin PCI-E power connectors feeding ASUS' Super Alloy Power II VRMs. Other connectors include hybrid fan headers for system fans and an Aurora Sync RGB LED header. Display outputs are "VR Ready" and include two HDMI, two DisplayPort, and a single DVI output.

While ASUS has not yet revealed clockspeeds for the RX Vega 56 card, eTeknix has gotten its hands on the ROG Strix RX Vega 64 graphics card and figured out the clocks for that card. Specifically, the Vega 64 card clocks its 4096 GPU cores at 1298 MHz base and 1590 MHz boost. The site further lists the memory clockspeed at 945 MHz, which does not appear to be overclocked since that physical clock works out to the reference Vega 64 HBM2 effective speed of 1890 MHz (HBM2 transfers data on both clock edges). Users can push the card further on their own using the GPU Tweak II software, though.
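As a quick sanity check on those memory numbers, bandwidth is simply the effective data rate times the bus width. A minimal sketch, assuming the reference RX Vega 64's 2048-bit HBM2 bus:

```python
# Memory bandwidth = effective data rate (transfers/s) x bus width (bits) / 8
hbm2_clock_mhz = 945                 # physical clock reported by eTeknix
effective_mhz = hbm2_clock_mhz * 2   # double data rate -> 1890 MHz effective
bus_width_bits = 2048                # reference RX Vega 64 HBM2 bus width

bandwidth_gbs = effective_mhz * 1e6 * bus_width_bits / 8 / 1e9
print(f"{bandwidth_gbs:.1f} GB/s")   # ~483.8 GB/s, matching AMD's reference spec
```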


ASUS has not yet revealed pricing or exact availability dates, but expect these cards to sell out fast and over MSRP when they do surface thanks to the resurgence of GPU mining! With that said, it is promising that we are finally seeing factory overclocked cards being announced!

Source: eTeknix

AMD Working on GDDR6 Memory Controller For Future Graphics Cards

Subject: General Tech, Graphics Cards | December 4, 2017 - 05:47 PM |
Tagged: navi, HBM2, hbm, gddr6, amd

WCCFTech reports that AMD is working on a GDDR6 memory controller for its upcoming graphics cards. Starting from an AMD technical engineer listing GDDR6 in his portfolio, the site claims to have verified through sources familiar with the matter that AMD is, in fact, supporting the new graphics memory standard and will be using its own controller (rather than licensing one).

[Image: AMD graphics roadmap]

AMD is not abandoning HBM2 memory, though. The company is sticking to its previously released roadmaps, and Navi will still utilize HBM2 memory – at least on the high-end SKUs. While AMD has so far only released RX Vega 64 and RX Vega 56 graphics cards, the company may well release lower-end Vega-based cards with GDDR5 at some point, although for now the Polaris architecture is handling the lower end.

AMD supporting GDDR6 is a good thing and should enable cheaper mid-range cards that are not limited by the supply shortages of the more expensive (albeit much higher bandwidth) High Bandwidth Memory that have seemingly plagued both NVIDIA and AMD at various points in time. GDDR6 offers several advantages over GDDR5, with almost twice the speed (16 Gbps versus 9 Gbps) at lower voltage (1.35V versus 1.5V), plus more density and underlying technology optimizations than even GDDR5X. While G5X memory is capable of hitting the same 16 Gbps launch speeds as GDDR6, the newer memory technology offers up to 32Gb dies* versus 16Gb and a two-channel design (which ends up being a bit more efficient and easier to produce and for GPU manufacturers to wire up).

GDDR6 will represent a nice speed bump for mid-range cards (the very low end may well stick with GDDR5, save for mobile parts which could benefit from GDDR6's lower power) while letting AMD earn somewhat better profit margins on these lower-margin SKUs and produce more cards to satisfy demand. HBM2 is nice to have, but right now it is better suited to compute-oriented cards for workstation and data center usage than to gaming, and GDDR6 can offer more price-to-performance for consumer gaming cards.
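To put those per-pin speeds in perspective, here is a rough back-of-the-envelope comparison of aggregate bandwidth on a 256-bit bus (a typical mid-range width; the bus width is my assumption for illustration, not something from the report):

```python
# Aggregate bandwidth (GB/s) = per-pin data rate (Gbps) x bus width (bits) / 8
def bandwidth_gbs(data_rate_gbps: float, bus_width_bits: int) -> float:
    return data_rate_gbps * bus_width_bits / 8

for name, rate in [("GDDR5 @ 8 Gbps", 8), ("GDDR5 @ 9 Gbps", 9),
                   ("GDDR6 @ 16 Gbps", 16)]:
    print(f"{name}: {bandwidth_gbs(rate, 256):.0f} GB/s")
# GDDR5 @ 8 Gbps: 256 GB/s
# GDDR5 @ 9 Gbps: 288 GB/s
# GDDR6 @ 16 Gbps: 512 GB/s
```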

As for the question of why AMD would want to design their own GDDR6 memory controller rather than license one, I think that comes down to AMD thinking long-term. It will be more expensive up front to design their own controller, but AMD will be able to more fully integrate it and tune it to work with their graphics cards such that it can be more power efficient. Also, having their own GDDR6 memory controller means they can use it in other areas such as their APUs and SoCs offered through their Semi Custom Business Unit (e.g. the SoCs used in gaming consoles). Being able to offer that controller to other companies in their semi-custom SoCs free of third party licensing fees is a good thing for AMD.

[Image: Micron GDDR5X memory]

With GDDR6 becoming readily available early next year, there is a good chance AMD will be ready to use the new memory technology as soon as Navi, though likely not until closer to the end of 2018 or early 2019, when AMD launches new lower-end and mid-range consumer gaming cards based on Navi and/or Vega.

*At launch it appears that GDDR6 from the big three (Micron, Samsung, and SK Hynix) will use 16Gb dies, but the standard allows for up to 32Gb dies. The G5X standard allows for up to 16Gb dies.

Source: WCCFTech

AMD Is Very Pleased To Participate in Blockchain Technology

Subject: General Tech, Graphics Cards | December 3, 2017 - 04:26 PM |
Tagged: bitcoin, cryptocurrency, mining, gaming, lisa su, amd, Vega

AMD’s CEO Lisa Su recently appeared on CNBC’s Power Lunch for an exclusive interview segment where she answered questions about bitcoin, blockchain technology, the tax reform bill, and sexual harassment in the workplace.

[Image: AMD CEO Lisa Su on CNBC]

Of particular interest to PC Perspective readers, Dr. Lisa Su shared several interesting bits of information on cryptocurrency mining and how it is affecting the company’s graphics cards. Surprisingly, she stated that cryptocurrency miners were a "very small percentage" of sales, specifically a mid-single-digit percentage of buyers (~4 to 6 percent). I find this number hard to believe, as I expected it to be significantly higher with the prices of graphics cards continuing to climb well above MSRP. (Prices weren’t too bad when we were writing our gift guide and shortly after, but just as I was about to commit to an upgrade they had shot back up, coinciding with a resurgence in mining popularity as cryptocurrency prices rose and improved ROI.)

Further, the AMD president and CEO stated that the company is interested in this market but is mainly waiting to see how businesses and industries adopt blockchain technologies. AMD is “very pleased to participate in blockchain” and believes it is a “very important foundational product”. Dr. Lisa Su did not seem very big on bitcoin specifically, but did seem interested in the underlying blockchain technologies and future cryptocurrencies.

Beyond bitcoin, altcoins, and the GPU mining craze, AMD believes that gaming is, and continues to be, a tremendous growth market for the company. AMD has reportedly launched 10 new product families and saw sizeable increases in sales at Amazon and Newegg versus last year, with processor sales tripling and double-digit percentage increases in graphics sales in 2017. AMD also managed to be in two of the three gaming towers at Best Buy for the holiday buying season.

Speaking for AMD, Dr. Su also had a few other interesting bits of information to share, and the interview is fairly short and worth watching. Thankfully, Kyle over at HardOCP managed to record it, and you can watch it here. If you aren't able to stream the video, PCGamer has transcribed most of the major statements.

What are your thoughts on the interview? Will we ever see GPU prices return to normal so I can upgrade? And do you agree with AMD’s assessment that miners are such a small percentage of sales and not as much of an influence on pricing as we thought? (Perhaps it is a supply problem rather than a demand problem, or the comment was only taking AMD’s mining-specific cards into account.)

Source: HardOCP

XFX Teases Custom RX Vega 56 and RX Vega 64 Double Edition Graphics Cards

Subject: Graphics Cards | December 1, 2017 - 02:48 PM |
Tagged: xfx, vega 10, Vega, RX VEGA 64, RX Vega 56, double edition, amd

Not content to let ASUS have all the fun with X-shaped products, graphics card manufacturer XFX is prepping two new Vega graphics cards that feature a cut-out backplate and cooler shroud resembling a stretched-out X. XFX has so far only released a few pictures, but they show off most of the card, including the top edge, cooler, and backplate.

[Image: XFX RX Vega Double Edition cooler]

XFX has opted for a short PCB that extends only slightly past the first cooling fan. The card is a dual-slot design with a large heatsink and two large red fans, and as a result a bit less than half of the cooler hangs past the PCB. Cooling should not be an issue thanks to the liberal use of heatpipes (I count five main copper heatpipes), but the overhanging cooler means the two 8-pin PCI-E power connectors end up in the middle of the card (the middle of the X shape), which is not ideal for cable management. (I am still waiting for someone to put the PCI-E power connectors on the back edge closest to the motherboard!) With a bit of modding it might be possible to hide the wires under the shroud and route them around the card, though, as one of the photos suggests there is a bit of a gap between the heatsink and the shroud/backplate.

The design is sure to be divisive, with some people loving it and others hating it, but XFX has put quite a bit of work into it. The red fans are surrounded by a stylized black shroud with a carbon fiber texture, while the top edge holds the red XFX logo. The backplate in particular looks great, with a black and grey design with red accents that features numerous cutouts for extra ventilation.

Display outputs are standard with three DisplayPort and one HDMI out.

TechPowerUp along with Videocardz are reporting that the card will come in both RX Vega 56 and RX Vega 64 variants. Unfortunately, while XFX has gone all out on the custom cooling and backplate, it is not pushing any of the clockspeeds past factory settings, with the RX Vega 56 Double Edition clocking in at 1156 MHz base and 1471 MHz boost on the GPU and 1600 MHz on the 8GB of HBM2 memory. The XFX RX Vega 64 Double Edition is also stock clocked at 1247 MHz base, 1546 MHz boost, and 1890 MHz memory. It is not all bad news, though: with such a beefy cooler, enthusiasts should be able to overclock the chips themselves at least a bit (depending on how lucky they are in the silicon lottery), but it does mean that XFX isn’t guaranteeing anything. Also, the top-end overclock may be more limited on the Vega 64 version than on other custom cards because it includes only two 8-pin power connectors (which does make me wonder what, if anything, XFX has done with the VRMs versus reference).

[Image: XFX RX Vega Double Edition backplate]

XFX has not yet revealed pricing or availability for their custom RX Vega cards.

What are your thoughts on the X design? 

Source: TechPowerUp

Nvidia GeForce 388.43 WHQL driver

Subject: Graphics Cards | November 30, 2017 - 01:31 PM |
Tagged: GeForce 388.43, nvidia, whql, DOOM VFR, NV Tray

While it might sound like an experimental Nazi weapon from WWII, DOOM VFR has little to do with Wolfenstein; it is instead a DOOM virtual reality game for the HTC Vive (pre-purchasing is bad, m'kay). NVIDIA's new game ready driver, the GeForce 388.43 WHQL release, is made to improve the performance of your GTX card in this game.


This release also marks the return of the NV Tray utility, much to the delight of the hordes of users who mourned its loss. You can grab the drivers here, or through the GeForce Experience app if you have it installed.

Source: NVIDIA

A look at the latest Radeon graphics driver stack

Subject: Graphics Cards | November 29, 2017 - 03:20 PM |
Tagged: windows 10, vega 64, RX 580, microsoft, linux 4.15, linux, amd

With a new Linux kernel out, Phoronix revisited the performance of two of AMD's new cards running on that kernel as well as on the current version of Windows 10. GPU testing on Linux has gotten more interesting thanks to the upsurge in compatible games; this review encompasses the recent Deus Ex, Shadow of Mordor, F1 2017, and GRID Autosport. The tests show there is still work to be done on the Mesa Radeon graphics driver stack, as in all cases performance lagged behind on Linux even though the hardware was exactly the same.


"As we end out November, here is a fresh look at the current Windows 10 Pro Fall Creator's Update versus Ubuntu 17.10 with the latest Linux 4.15 kernel and Mesa 17.4-dev Radeon graphics driver stack as we see how various games compete under Windows 10 and Linux with these latest AMD drivers on the Radeon RX 580 and RX Vega 64 graphics cards."

Source: Phoronix

XSPC Razor Neo Waterblock is pretty, effective

Subject: Graphics Cards | November 23, 2017 - 01:30 PM |
Tagged: watercooler, gtx 1080 ti, nvidia, XSPC, Razor Neo

It seems a shame to hide the XSPC Razor Neo waterblock for the GTX 1080 Ti, as you will not easily see the polished nickel-plated copper waterblock and tempered glass window XSPC used. [H]ard|OCP found the design to be very scratch resistant, and the glass lets you completely avoid the cracks which acrylic inevitably develops as it ages. This waterblock is not just decorative: [H] found the card would hit and remain at 2100.5 MHz in game, with temperatures never exceeding 33C, with or without the Frag Harder Disco Lights going.


"If you are thinking about delving in water cooling your high end NVIDIA GTX 1080 or 1080 Ti video card, the XSPC Razor Neo is certainly worthy of being on your short list. Outside of its incredibly good looks, Frag Harder Disco Lights, and easy install process, does it work well when it comes to overclocking and cooling your GTX 1080 Ti?"

Source: [H]ard|OCP

A New Frontier

Console game performance has always been an area we've been interested in here at PC Perspective, but it has been mostly out of our reach to evaluate with any kind of scientific tilt. Our Frame Rating methodology for PC-based game analysis relies on running an overlay application during screen capture, which is later analyzed by a series of scripts. Obviously, we cannot take this approach with consoles, as we cannot install our own code on them to run that overlay.

A few other publications such as Eurogamer with their Digital Foundry subsite have done fantastic work developing their internal toolsets for evaluating console games, but this type of technology has mostly remained out of reach of the everyman.

[Image: trdrop screenshot]

Recently, we came across an open source project which aims to address this. trdrop is open source software built upon OpenCV, a stalwart library in the world of computer vision. Using OpenCV, trdrop can analyze the frames of ordinary gameplay footage (without an overlay), detecting whether successive frames differ and looking for dropped frames and tears to come up with a real-time frame rate.
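To give a sense of the underlying technique, here is our own minimal sketch of frame-difference counting in Python with OpenCV; it is not trdrop's actual code, and the video file name is hypothetical:

```python
import cv2
import numpy as np

def effective_fps(video_path: str, diff_threshold: float = 1.0) -> float:
    """Estimate the effective frame rate of a capture by counting
    frames that actually differ from their predecessor."""
    cap = cv2.VideoCapture(video_path)
    capture_fps = cap.get(cv2.CAP_PROP_FPS)  # e.g. 60 for a 1080p60 capture
    ok, prev = cap.read()
    unique, total = 1, 1
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        total += 1
        # Mean absolute pixel difference; near zero means a duplicated frame
        if np.mean(cv2.absdiff(frame, prev)) > diff_threshold:
            unique += 1
        prev = frame
    cap.release()
    return capture_fps * unique / total

print(effective_fps("gameplay_capture.mp4"))  # hypothetical capture file
```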

[Image: trdrop screenshot]

This means that trdrop can analyze gameplay footage from any source, be it console, PC, or anything in between from which you can get a direct video capture feed. Now that PC capture cards capable of 1080p60, and even 4K60, are coming down in price, software like this is allowing more gamers to peek at the performance of their games, which we think is always a good thing.

It's worth noting that trdrop is still listed as "alpha" software on its GitHub repo, but we have found it to be very stable and flexible in its current iteration.

|                  | Xbox One S              | Xbox One X            | PS4                 | PS4 Pro             |
|------------------|-------------------------|-----------------------|---------------------|---------------------|
| CPU              | 8x Jaguar @ 1.75 GHz    | 8x Jaguar @ 2.3 GHz   | 8x Jaguar @ 1.6 GHz | 8x Jaguar @ 2.1 GHz |
| GPU CUs          | 12x GCN @ 914 MHz       | 40x Custom @ 1172 MHz | 18x GCN @ 800 MHz   | 36x GCN @ 911 MHz   |
| GPU Compute      | 1.4 TF                  | 6.0 TF                | 1.84 TF             | 4.2 TF              |
| Memory           | 8 GB DDR3 + 32 MB ESRAM | 12 GB GDDR5           | 8 GB GDDR5          | 8 GB GDDR5          |
| Memory Bandwidth | 219 GB/s                | 326 GB/s              | 176 GB/s            | 218 GB/s            |
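As a quick check on the GPU compute row, single-precision throughput for these GCN-based parts works out to CUs x 64 shaders per CU x 2 FLOPs per clock (one fused multiply-add) x clock speed. A short verification:

```python
# Single-precision TFLOPS = CUs x 64 shaders/CU x 2 FLOPs/clock x clock (GHz) / 1000
consoles = {
    "Xbox One S": (12, 914),
    "Xbox One X": (40, 1172),
    "PS4":        (18, 800),
    "PS4 Pro":    (36, 911),
}
for name, (cus, clock_mhz) in consoles.items():
    tflops = cus * 64 * 2 * (clock_mhz / 1000) / 1000
    print(f"{name}: {tflops:.2f} TFLOPS")
# Matches the table above: 1.40, 6.00, 1.84, and 4.20 TF respectively
```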

Now that the Xbox One X is out, we figured it would be a good time to take a look at the current generation of consoles and their performance in a few games, as a way to get our feet wet with this new software and method. We are only testing at 1080p here, but we now have our hands on a 4K HDMI capture card capable of 60Hz for some future testing! (More on that soon.)

Continue reading our look at measuring performance of the Xbox One X!

State of the GPU market for Q3 2017; miners and gamers putting a smile on manufacturers' faces

Subject: Graphics Cards | November 20, 2017 - 06:57 PM |
Tagged: jon peddie, q3 2017

The latest results from Jon Peddie Research are out, and it looks like it has been a good quarter for discrete GPU vendors, though not so much for APUs. When JPR looks at the graphics market, they include all silicon with graphics capabilities (discrete GPUs, APUs, and IGPs), giving a broad overview of the current state of the market.

[Image: JPR global GPU market share chart]

It seems the market shares of Matrox and S3 have finally disappeared into the noise, leaving only Intel, AMD, and NVIDIA represented in the breakdown of global GPU market share. In all cases total sales have gone up, which fits with seasonal patterns and demonstrates that while the PC market may be wounded, it is far from dead. Intel's total GPU sales increased by 5% from last quarter, which nonetheless translated to a loss of 3.2% of total market share. AMD saw a total increase of 7.6%, with desktop GPUs alone up 16.1%, but that was only enough to keep them at the same ~13% of the global GPU market. NVIDIA saw the biggest increase, a 29.5% jump in sales, which gives them just under 20% of the GPU market to call their own.

[Image: JPR total PC market over time]

A very interesting data point from JPR's latest report shows how the overall PC market has changed over time. We have never recovered from the highs of the end of 2010, for a wide variety of reasons ranging from the long-term impact of the global recession to a certain company's decision to switch from a lively two-step to a stately waltz. The market is showing signs that the long decline may be slowing: instead of dropping by several million units during the traditionally sluggish beginning of the year, it only dropped by about one million units. Considering the lack of driving reasons to do a complete upgrade of a computer this year until AMD's Ryzen and Threadripper appeared, it is quite possible we may see sales hold steady or perhaps even rise over 2018.

We won't know for a while yet, but the signs are more encouraging than they have been in a long time.