X-Factor versus Delta Force: does your DX version matter right now?

Subject: General Tech | March 22, 2017 - 02:31 PM |
Tagged: gaming, dx11, dx12

We are finally starting to see a diverse enough field of games capable of running in both DX11 and DX12, which makes it much easier to spot patterns in performance differences. [H]ard|OCP tested Rise of the Tomb Raider, Hitman, Deus Ex: Mankind Divided, BF1, The Division, Sniper Elite 4 and AotS on AMD's RX 480 and NVIDIA's GTX 1080 and 1080 Ti. In almost all cases the difference between the two APIs was negligible, and neither offers significant performance benefits to owners of these cards. The one exception was Sniper Elite 4, which did see some performance deltas, especially on the RX 480. Check out the full review to see for yourself.


"We play latest games with DX12 support and find out which is faster, DX12 or DX11? We use the latest drivers from NVIDIA and AMD to find any advantages in this GPU focused review. We’ll get to the bottom of the question, "Should I be running this game in DX12 or DX11 in order to get the best real world gaming performance?"

Source: [H]ard|OCP

March 22, 2017 | 03:19 PM - Posted by pessimistic_observer (not verified)

tldr version: it's 2010 all over again

March 22, 2017 | 03:21 PM - Posted by Anonymous (not verified)

I wonder whether broken DX12 implementations are the result of incompetence or are intentional.
The difference should show up as increased efficiency. Pity that [H]ard|OCP decided not to measure power consumption.

March 23, 2017 | 11:42 AM - Posted by Anonymous (not verified)

Or it may simply be that game developers and device driver developers are not incompetent and are not 'leaving performance on the table' under DX11. Developers have had decades of experience working with the requirement to batch draw calls together.

March 22, 2017 | 03:53 PM - Posted by Anonymous (not verified)

"AMD fans just need to keep the faith for a few months, and that soon Ryzen's full power will be revealed."

Could those few months mean games that require Scorpio and whatever Sony has in two years' time?

Sounds like buy now and wait 30 months.

March 22, 2017 | 03:59 PM - Posted by Anonymous (not verified)

btw, got that Z77 NVMe hack going on 8x PCIe, and copying and extracting what used to take 4 seconds now takes 1 second. Wow... that was sure worth the money for those 2000 seconds I saved a year. Oh well, I have money to burn and I can brag about the hack and complain about the idiocy of all of this. I just got back $1 in happiness with this complaining.

March 22, 2017 | 04:28 PM - Posted by Anonymous (not verified)

"Sounds like buy now and wait 30 months."
Or buy smart now and not have to worry even 30 months later.

March 22, 2017 | 05:31 PM - Posted by Jeremy Hellstrom

Where the hell did that quote come from?  No Ryzen involved in this testing at all.

March 22, 2017 | 04:37 PM - Posted by Nezarn

M$ peasant devs should start using Vulkan instead of the crap DX12.

March 23, 2017 | 01:46 AM - Posted by Carlos Leyva (not verified)

I forgot where I saw the video, but it basically explained how devs are forced into DX12 without a viable option to develop in Vulkan.
Something about big brother Microsoft controlling revenue for game developers and how it would not be in their interest to use Vulkan.
Sorry, this is a horrible post. But my point is that Vulkan might be a superior API, yet I don't think we will get to see its full potential.

March 23, 2017 | 11:44 AM - Posted by Anonymous (not verified)

Or, more simply, that MS provides plentiful support for developers working on DX12 (and DX11), while for Vulkan the extent of support from Khronos is generally "uh, iunno, read the docs? They might even be up to date and legible!"

March 23, 2017 | 06:04 AM - Posted by Anonymous (not verified)

Who would have guessed that an API intended to alleviate CPU bottlenecks would show no gains when there was no CPU bottleneck?

March 23, 2017 | 01:10 PM - Posted by Anonymous (not verified)

Yeah, it was a pretty idiotic test; the only resolutions tested were 4K and 1440p, which put too much stress on the GPU (and which I can't even run, simply because all my monitors are still 1080p, and the same is true for many other people).
