DigitalFoundry Compares GTX 1060 & RX 480 in Battlefield 1

Subject: Graphics Cards | October 19, 2016 - 08:08 PM |
Tagged: amd, nvidia, gtx 1060, rx 480, dx12, dx11, battlefield 1

Battlefield 1 is just a few days from launching. In fact, owners of the Deluxe Edition saw the game unlock yesterday. It's interesting that multiple publishers are using an earlier release date as a special edition bonus these days, including Microsoft's recent Windows Store releases. I'm not going to say whether that's interesting in a good way or a bad way, though, because I'll leave that up to the reader to decide.

Anywho, DigitalFoundry is doing their benchmarking thing, and they wanted to see which GPU could provide a solid 60 FPS with everything maxed out at 1080p. They start off with a DX12-to-DX12 comparison between the GTX 1060 and the RX 480. This is a relatively fair comparison, because the 3GB GTX 1060 and the 4GB RX 480 both come in at about $200, while upgrading to 6GB for the 1060 or 8GB for the 480 bumps each respective SKU up to the ~$250 price point. In this test, NVIDIA dips slightly below 60 FPS a few times in complex scenes, while AMD stays above that beloved threshold.

They also compare the two cards in DX11 and DX12 mode, with both cards paired with a Skylake-based Core i5 CPU. In this test, AMD's card saw a nice increase in frame rate when switching to DirectX 12, while NVIDIA had a performance regression in the new API. This raises two questions, one of which is potentially pro-NVIDIA, and the other, pro-AMD. First, if NVIDIA's card were allowed to use DirectX 11 in the original test, would the GTX 1060 be more competitive against the DX12-running RX 480? This brings me to the second question: what would the user see? A major draw of Mantle-based graphics APIs is that the application has more control over traditionally driver-level tasks. Would 60 FPS in DX12 be smoother than 60 FPS in DX11?

I don't know. It's something we'll need to test.
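
If you're wondering why "smooth" is even a separate question from "fast", here's a minimal sketch with made-up frame times (not measurements from either card): two runs with the same average frame rate, but very different delivery.

    # Two hypothetical 10-frame runs, frame times in milliseconds.
    # Both average ~16.7 ms (60 FPS), but only one of them is smooth.
    steady = [16.7] * 10
    uneven = [10.0, 23.4] * 5

    def describe(frame_times_ms):
        avg = sum(frame_times_ms) / len(frame_times_ms)
        print(f"avg {avg:.1f} ms ({1000 / avg:.0f} FPS), "
              f"worst frame {max(frame_times_ms):.1f} ms")

    describe(steady)  # avg 16.7 ms (60 FPS), worst frame 16.7 ms
    describe(uneven)  # avg 16.7 ms (60 FPS), worst frame 23.4 ms

Both runs report "60 FPS," but the second alternates between 10 ms and 23 ms frames, which the eye reads as judder.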

The MSI GTX 1070 GAMING X 8G has extra power

Subject: Graphics Cards | October 14, 2016 - 01:27 PM |
Tagged: msi, gtx 1070, GTX 1070 GAMING X 8G, factory overclocked

The most noticeable feature of this GTX 1070 from MSI is that it has an additional 6-pin power connector intended to ensure smooth power delivery.  The most confusing part is the branding: a GAMING Z is better than a GAMING X, which is better than a GAMING, which is better than a non-GAMING 1070.  The factory overclock on the card pushes the boost clock to 1771MHz, and [H]ard|OCP also tested it at the best overclock they could manage, a base clock of 1692MHz and a boost clock of 1882MHz.  Check out the effect that had on gameplay in their full review.

"We have MSI’s new GeForce GTX 1070 GAMING X 8G video card to evaluate today. We will push this GPU as high as we can, and see how the overclock compares to the default factory overclock, and a Founders Edition NVIDIA GeForce GTX 1070. This video card is a fully custom retail video card with the Twin Frozr VI cooling system. "

Source: [H]ard|OCP

NVIDIA Releases GeForce 373.06 Drivers

Subject: Graphics Cards | October 8, 2016 - 07:01 AM |
Tagged: nvidia, graphics drivers, geforce

On Thursday, NVIDIA released their latest graphics drivers to align with Gears of War 4, Mafia III, and Shadow Warrior 2. The drivers were published before each of these games launched, which allows gamers to optimize their PCs ahead of time. Graphics vendors work with many big-budget studios during their development cycles, and any tweaks found over those months and years land in this release, as usual.

Beyond tweaks for these games, NVIDIA has also announced a couple of fixes. If you were experiencing issues in Overwatch, these new drivers fix how decals are drawn. The major fix reportedly reduces inconsistent performance in multiple VR titles, which is very useful for those applications.

You can get these drivers from their website, or just install them from GeForce Experience.

Source: NVIDIA

Gigabyte Launches GTX 1080 TT With Blower Style Cooler

Subject: Graphics Cards | October 6, 2016 - 03:17 PM |
Tagged: windforce, pascal, nvidia, GTX 1080, gigabyte

Gigabyte is launching a new graphics card with a blower-style cooler that it is calling the GTX 1080 TT. The card, which is likely based on the NVIDIA reference PCB, uses a single lateral blower-style “WindForce Turbo Fan.” The orange and black shrouded fan takes design cues from the company’s higher-end Xtreme Gaming cards, and it has a very Mass Effect / Halo Forerunner vibe to it.

The GV-N1080TTOC-8GD is powered by a single 8-pin PCI-E power connector and has a 180W TDP. Despite using only one external power connector, the card still has a bit of overclocking headroom (the slot plus a single 8-pin allow a total of 225W within the PCI-E spec, though cards have been known to overdraw the 8-pin when the BIOS does not prevent it, heh). External video outputs include one DVI, one HDMI, and three DisplayPorts. I wish that the DVI port had been cut so that the blower cooler could have a much larger vent to exhaust air out of the case, but it is what it is.
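
For reference, here's the back-of-the-envelope arithmetic behind that 225W figure, as a quick sketch using the PCI-E spec limits:

    # In-spec power budget for a card fed by the slot plus one 8-pin.
    slot_w = 75    # PCI-E x16 slot limit per spec
    pin8_w = 150   # single 8-pin connector limit per spec
    tdp_w  = 180   # Gigabyte's rated TDP for the GTX 1080 TT

    budget_w = slot_w + pin8_w       # 225 W total within spec
    headroom_w = budget_w - tdp_w    # 45 W left for overclocking
    print(f"{budget_w} W budget, {headroom_w} W of in-spec headroom")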

[Image: Gigabyte GTX 1080 TT blower-style cooled graphics card]

Out of the box, the Gigabyte GTX 1080 TT runs the Pascal-based 2560 CUDA core GPU at 1632 MHz base and 1772 MHz boost. In OC Mode, the GPU runs at 1657 MHz base and 1797 MHz boost. The 8 GB of GDDR5X memory is left untouched at the stock 10 GHz in either case. For comparison, reference clock speeds are 1607 MHz base and 1733 MHz boost. As far as factory overclocks go, these are not bad; they are usually at least this conservative.
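
Put in percentage terms, a quick sketch with the clocks above shows just how conservative these factory bumps are:

    # Factory overclock sizes relative to NVIDIA reference clocks.
    ref_base, ref_boost = 1607, 1733   # reference GTX 1080 (MHz)
    modes = {"Default": (1632, 1772), "OC Mode": (1657, 1797)}

    for name, (base, boost) in modes.items():
        print(f"{name}: +{100 * (base / ref_base - 1):.1f}% base, "
              f"+{100 * (boost / ref_boost - 1):.1f}% boost")
    # Default: +1.6% base, +2.3% boost
    # OC Mode: +3.1% base, +3.7% boost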

The heatsink uses three direct-contact 6mm copper heat pipes for the GPU, plus aluminum plates on the VRM and memory chips that transfer heat to aluminum fin channels; the blower fan at the back of the card pushes case air over those fins and out of the case. It may be possible to push the card beyond the OC Mode clocks, though it is not clear how stable boost clocks will be under load (or how loud the fan will be). We will have to wait for reviews on that. If you have a cramped case, this may be a decent GTX 1080 option that is cheaper than the Founders Edition design.

There is no word on pricing or an exact release date yet, but I would estimate it at around $640 at launch. 

Also read:

The GeForce GTX 1080 8GB Founders Edition Review - GP104 Brings Pascal to Gamers

CES 2016: Gigabyte Xtreme Gaming Series GPU with RGB LED Fans (video) @ PC Perspective

Source: TechPowerUp

AMD Discusses Multi-GPU Frame Pacing in DirectX 12

Subject: Graphics Cards | October 5, 2016 - 09:01 PM |
Tagged: amd, frame pacing, DirectX 12

When I first read this post, it was on the same day that AMD released their Radeon Software Crimson Edition 16.10.1 drivers, although it was apparently posted the day prior. As a result, I thought that their reference to 16.9.1 was a typo, but it apparently wasn't. These changes have been in the driver for a month, at least internally, but it's unclear how much of it was enabled before today. (The Scott Wasson video suggests 16.10.1.) It would have been nice to see this in their release notes as a new feature, but at least they made up for it with a blog post and a video.

If you don't recognize him, Scott Wasson used to run The Tech Report, and he shared notes with Ryan while we were developing our Frame Rating testing methodology. He focused on benchmarking GPUs by frame time, rather than frame rate, because the number of frames the user sees matters less than how smoothly those frames are presented. Our sites diverged on implementation, though: The Tech Report focused on software, while Ryan determined that capturing and analyzing output frames, intercepted between the GPU and the monitor, would tell a more complete story. Regardless, Scott Wasson left his site last year to work for AMD, leading their User Experience efforts.
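
As a toy illustration of the frame-time idea (hypothetical numbers, not data from either site's methodology): the tail of the frame-time distribution is what stutter feels like, even when the average looks fine.

    # Frame times are the deltas between per-frame timestamps;
    # the worst percentiles capture stutter that averages hide.
    timestamps_ms = [0.0, 16.5, 33.1, 50.0, 90.2, 106.8, 123.4]  # hypothetical
    frame_times = [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]

    frame_times.sort()
    avg = sum(frame_times) / len(frame_times)
    p99 = frame_times[min(len(frame_times) - 1, int(0.99 * len(frame_times)))]
    print(f"avg {avg:.1f} ms, 99th percentile {p99:.1f} ms")
    # avg 20.6 ms, 99th percentile 40.2 ms -- one hitch dominates the tail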

We're now seeing AMD announce frame pacing for DirectX 12 Multi-GPU.

This feature is particularly interesting because, depending on the multi-adapter mode, a lot of that control should be in the hands of the game developers. It seems like the three titles they announced, 3DMark: Time Spy, Rise of the Tomb Raider, and Total War: Warhammer, would be using implicit linked multi-adapter, which basically maps to CrossFire. I'd be interested to see whether they can affect this in explicit mode via driver updates as well, but we'll need to wait and see on that (and there aren't many explicit-mode titles anyway -- basically just Ashes of the Singularity for now).

If you're interested to see how multi-GPU load-balancing works, we published an animation a little over a month ago that explains three different algorithms, and how explicit APIs differ from OpenGL and DirectX 11. It is also embedded above.
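
And for a rough intuition of what frame pacing buys you in AFR-style multi-GPU, here's a minimal sketch with invented completion times; this illustrates the general idea, not AMD's actual algorithm.

    # Two GPUs alternating frames (AFR) tend to finish work in pairs;
    # pacing holds each present until the target interval has elapsed.
    completions_ms = [10, 12, 30, 32, 50, 52]  # bursty arrival of finished frames
    target_interval_ms = 10.0                  # desired cadence (~100 FPS here)

    present_ms = 0.0
    for done in completions_ms:
        present_ms = max(done, present_ms + target_interval_ms)
        print(f"frame ready at {done:4.1f} ms, presented at {present_ms:4.1f} ms")
    # Without pacing, presents land at 10/12/30/32/... (micro-stutter);
    # with pacing, they land at an even 10/20/30/40/50/60.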

Source: AMD

AMD Releases Radeon Software Crimson Edition 16.10.1

Subject: Graphics Cards | October 5, 2016 - 08:37 PM |
Tagged: graphics drivers, amd

Earlier today, AMD released their Radeon Software Crimson Edition 16.10.1 drivers. These continue AMD's trend of releasing drivers alongside major titles, which, this time, are Mafia III (October 7th) and Gears of War 4 (October 11th). Both of these titles are still multiple days out, apart from a handful of insiders with advance copies, which lets gamers optimize their machines ahead of time, on their own schedule, before launch.

The driver also includes a handful of interesting fixes. First, several games, such as Overwatch, Battlefield 1, and Paragon, should no longer flicker when set to CrossFire mode. Also, performance issues in The Crew should be fixed with this release.

You can download AMD Radeon Software Crimson Edition 16.10.1 from their website.

Source: AMD

Report: NVIDIA GeForce GTX 1050 Ti Based on Pascal GP107

Subject: Graphics Cards | October 2, 2016 - 12:12 PM |
Tagged: rumor, report, pascal, nvidia, GTX 1050 Ti, graphics card, gpu, GP107, geforce

A report published by VideoCardz.com (via Baidu) contains pictures of an alleged NVIDIA GeForce GTX 1050 Ti graphics card, which is apparently based on a new Pascal GP107 GPU.

[Image: alleged NVIDIA GTX 1050 Ti PCB. Image credit: VideoCardz]

The card shown is also equipped with 4GB of GDDR5 memory and contains a 6-pin power connector, though such a power requirement might be specific to this particular version of the upcoming card.

[Image: a second view of the alleged GTX 1050 Ti PCB. Image credit: VideoCardz]

Specifications for the GTX 1050 Ti were previously reported by VideoCardz alongside a purported GPU-Z screenshot. The card will apparently feature 768 CUDA cores and a 128-bit memory bus, with clock speeds (for this particular sample) of 1291 MHz base and 1392 MHz boost; the screenshot suggests some room to overclock.
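
As a quick aside, the 128-bit bus caps memory bandwidth at a level you can sketch from the memory data rate; the 7 GT/s GDDR5 speed below is a hypothetical, not something the report confirms.

    # Peak memory bandwidth = (bus width / 8) * effective data rate.
    bus_width_bits = 128   # from the reported GPU-Z screenshot
    data_rate_gtps = 7.0   # hypothetical GDDR5 speed, NOT confirmed

    bandwidth_gbps = bus_width_bits / 8 * data_rate_gtps
    print(f"{bandwidth_gbps:.0f} GB/s peak")  # 112 GB/s at 7 GT/s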

[Image: GPU-Z screenshot of the reported GTX 1050 Ti specifications. Image credit: VideoCardz]

An official announcement for the new GPU has not been made by NVIDIA, though if these PCB photos are real it probably won't be far off.

Source: VideoCardz

EVGA Adds Water Cooled GTX 1070 FTW Hybrid To Lineup

Subject: Graphics Cards | September 27, 2016 - 10:04 PM |
Tagged: water cooling, pascal, hybrid cooler, gtx 1070, GP104, evga

EVGA is preparing to launch the GTX 1070 FTW Hybrid, a water-cooled card that pairs NVIDIA's GTX 1070 GPU with EVGA's Hybrid cooler and a custom FTW PCB. The factory overclocked graphics card is currently up for pre-order for $500 on EVGA's website.

[Image: EVGA GTX 1070 FTW Hybrid]

The GTX 1070 FTW Hybrid uses EVGA's custom PCB that features two 8-pin power connectors that drive a 10+2 power phase and dual BIOS chips. The Hybrid cooler includes a shrouded 100mm axial fan and a water block that directly touches both the GPU and the memory chips. The water block connects to an external 120mm radiator and a single fan that can be swapped out and/or powered by a motherboard using a standard four pin connector. Additionally, the cooler has a metal back plate and RGB LED back-lit EVGA logos on the side and windows on the front. Display outputs include one DVI, one HDMI, and three DisplayPort connectors.

As far as specifications go, EVGA did not get too crazy with the factory overclock, but users should be able to push it quite far on their own, assuming they get a decent chip from the silicon lottery. The GP104 GPU has 1920 CUDA cores clocked at 1607 MHz base and 1797 MHz boost, while the 8 GB of memory is clocked at the stock 8,000 MHz. For comparison, reference clock speeds are 1506 MHz base and 1683 MHz boost.

Interestingly, EVGA rates the GTX 1070 FTW Hybrid at 215 watts versus the reference card's 150 watts. It is also the same TDP rating as the GTX 1080 FTW Hybrid card.

The table below outlines the specifications of EVGA's water cooled card compared to the GTX 1070 reference GPU and the GTX 1080 FTW Hybrid.

                     GTX 1070          GTX 1070 FTW Hybrid   GTX 1080 FTW Hybrid
GPU                  GP104             GP104                 GP104
GPU Cores            1920              1920                  2560
Rated Clock          1506 MHz          1607 MHz              1721 MHz
Boost Clock          1683 MHz          1797 MHz              1860 MHz
Memory               8GB               8GB                   8GB
Memory Clock         8000 MHz          8000 MHz              10000 MHz
TDP                  150 watts         215 watts             215 watts
Max Temperature      80°C              45°C                  42°C
MSRP (current)       $379 ($449 FE)    $500                  $730

According to EVGA, the Hybrid cooler brings GPU and memory temperatures down to 45°C and 57°C respectively, compared to reference temperatures of 80°C and 85°C. Keeping in mind that these are EVGA's own numbers (you can see our Founders Edition temperature results here), the Hybrid cooler seems to be well suited for keeping Pascal GPUs in check even when overclocked. In reviews of the GTX 1080 FTW Hybrid, reviewers found that the cooler allowed stable 2GHz+ GPU clock speeds that let the card hit its maximum boost clocks and stay there under load. Hopefully the GTX 1070 version will have similar results. I am interested to see whether the memory chips used here can hit at least the 10 GHz of the 1080 cards, if not more, since they are cooled by the water loop.

You can find more information on the factory overclocked water cooled graphics card on EVGA's website. The card is available for pre-order at $500 with a 3 year warranty.

Pricing does seem a bit high at first glance, but looking around at other custom GTX 1070 cards, it carries only about a $50 premium, which is not too bad in my opinion. I will wait for actual reviews before I believe it, but if I had to guess, the upcoming card should have a lot of headroom for overclocking, and I'm interested to see how far people are able to push it!

Source: EVGA

More VR testing, Trickster VR on the Vive

Subject: Graphics Cards | September 27, 2016 - 01:57 PM |
Tagged: VR, trickster vr, amd, nvidia, htc vive

[H]ard|OCP continues their look into the performance of VR games on NVIDIA's Titan X, GTX 1080, 1070, 1060, and 970, as well as AMD's Fury X and RX 480.  This particular title allowed AMD to shine: they saw the RX 480 come within a hair of matching the GTX 1060, a first in this series, which shows that AMD can be a contender in the VR market.  Pop by to see their review in full.

"Arm yourself with a bow and arrows, a magic sword that flies, or if you prefer, a handful of throwing darts. Then get ready to take on the procedurally generated fantasy world full of cartoonish Orcs, and more Orcs, and some other Orcs. Headshots count as well as chaining your shots so aim is critical. Did I mention the Orcs?"

Source: [H]ard|OCP

NVIDIA Releases GeForce 372.90 Drivers

Subject: Graphics Cards | September 21, 2016 - 05:54 PM |
Tagged: nvidia, graphics drivers

With Forza Horizon 3 coming out for Ultimate Edition SKU users in a little over a day, NVIDIA has released their new Game Ready drivers. GeForce 372.90 drivers roll in all of NVIDIA's fixes for the game that have been discovered during its development.

Thankfully, unlike the slippage I've witnessed from them recently in this regard, the release notes for 372.90 are quite verbose (PDF). For instance, and this probably affects a few of our readers, NVIDIA has finally fixed the issue with the HTC Vive over DisplayPort. Their description makes it sound like the headset wasn't failing to connect, as users believed, but rather failing to light up the display. Of course, from a user's standpoint, a black screen is a black screen, but it's interesting to see an honest admission of what exactly a given error was.

So, TL;DR: HTC Vive users should be able to use it over DisplayPort with Pascal again.

Also, they announced that the driver contains security updates. They don't elaborate on what specifically was fixed, which makes sense given that it will take a while for users to update, but it sounds like NVIDIA was in bug-fixing mode with this driver, which I appreciate.

You can get GeForce 372.90 from GeForce Experience and their website.

Source: NVIDIA