[H]ard|OCP have spent a lot of time with Watch Dogs 2 recently, enough to create three articles covering the game, two of which are now published. The first article focuses on performance at ultra settings and on finding the highest playable settings each tested GPU was capable of, without the high resolution texture pack installed. As it turns out, the game is much more graphically demanding than many other recent releases, so much so that only the Titan X and GTX 1080 were able to perform at 4K resolution; the GTX 1070 and 1060, as well as the RX 480 and 470, only feature at lower resolutions.
The second article looks at performance with the texture pack installed, which did not have much effect on overall performance but significantly increased VRAM usage. Even the mighty Titan X struggled with this game; we will need a new generation of GPUs to make use of all the graphics features it offers. The last review will be up soon and will focus on the effect each graphical setting has on the visual appearance of the game.
"Watch Dogs 2 has been released on the PC. We will have a three part evaluation of performance and image quality starting today with performance comparisons. We will also find the highest playable settings for each graphics card and the gameplay experience delivered. Finally, a graphically demanding game."
Here is some more Tech News from around the web:
- Ubisoft Giving Away Yet Another Free Game @ [H]ard|OCP
- Dishonored 2 update 1.3 brings performance boosts @ Rock, Paper, SHOTGUN
- Tobii Tech 4C eye tracker for gaming @ Kitguru
- Wot I Think: Tyranny @ Rock, Paper, SHOTGUN
- Gears of War 4 DirectX 12 Graphics Performance @ eTeknix
- Sniper Ghost Warrior 3 is an off-brand Far Cry game @ Rock, Paper, SHOTGUN
- The Last Guardian Is Finally Here—and Yes, It Was Worth the Wait @ Wired
- Dead Rising 4 shambles onto Windows 10 @ Rock, Paper, SHOTGUN
- Nvidia launches GeForce GTX 1050 and 1060 Indie Bundle @ HEXUS
- Deus Ex: Mankind Divided Graphics Performance Analysis @ eTeknix
- Mugs and mayhem: eight minutes of Prey @ Rock, Paper, SHOTGUN
- Tenebra is a free horror game inspired by silent films @ Rock, Paper, SHOTGUN
Who wants to play the video game Watch Dogs, where you sit and watch dogs all day?
I’m wondering what GPUs devs use… is there an article about this?
Depends on the developer.
Depends on the developer. Some have whatever is the best available, others render up on a server farm, still others might have multiple PCs for unit testing.
But, remember they are developing a lot of these games for console first. What that means in practice is they are targeting either 30 or 60 fps (more often than not, 30), and sometimes, when they can’t meet the target, they might go with an uncapped frame rate, even on console.
It boils down to the type of game. Driving games and first person shooters, you definitely want 60 fps. A game like Watch_Dogs 2 on console, it’s definitely 30.
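To make the cap part concrete: a frame rate cap is basically just sleeping out whatever is left of the frame budget after the work is done. Here is a minimal sketch in Python; the function and parameter names are mine, purely for illustration, and not from any real engine:

```python
import time

def run_capped(render_frame, target_fps=30, frames=1):
    """Call render_frame repeatedly, padding each iteration out to at
    least 1 / target_fps seconds (illustrative sketch only)."""
    frame_budget = 1.0 / target_fps
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()  # the actual per-frame work
        elapsed = time.perf_counter() - start
        if elapsed < frame_budget:
            # Finished early: wait out the rest of the budget.
            time.sleep(frame_budget - elapsed)
```

Note the asymmetry: if the work takes longer than the budget, the cap does nothing and the frame rate simply drops below the target, which is why an uncapped rate is sometimes shipped instead.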
So, when they go to test on PC, they have the target in mind and offer up a bunch of additional settings to put the power in the hands of the users (and sometimes this has the added benefit of obscuring why performance and visuals differ when comparing PC to consoles).
It’s not like they are all sitting there and saying to themselves “Let’s target a GTX 1080 or TITAN X.” No, they are trying to make their artistic vision mesh with the available technology and put out something that pleases everyone. I don’t envy them at all, the developers.
They are developing a game based on a vision of what that game looks like artistically, built using either an in-house developed engine or, more often, one sourced from a third party, and then working hard to make that vision fit within the context of the available hardware resources.
The argument I keep seeing is “devs didn’t optimize for PC.” Often, the real issue is PC users expecting too much.
Sure, they can tweak the visuals (more often than not, this means degrading the quality) so that better performance can be had.
Other times it’s a back-end issue, and no amount of hardware is going to overcome it in this generation.
This is a good thing, in my opinion. It means developers are pushing the envelope, and hardware needs to catch up.
But that’s not to say some of them aren’t guilty of boneheaded moves, such as Forza Horizon 3 on PC depending on one CPU thread that always seems to hover around 100%; frame rates tend to suffer as a result in the more polygon-rich areas such as Surfers Paradise.
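A toy model shows why one saturated thread caps the frame rate no matter how many cores are available: if the engine’s work includes one big chunk that can’t be split, the frame can never finish faster than that chunk. This is just my illustration (all names made up), not how any real engine schedules work:

```python
import heapq

def frame_time_ms(tasks_ms, n_threads):
    """Toy scheduler: greedily assign each task to the least loaded
    thread; the frame is done when the busiest thread finishes."""
    loads = [0.0] * n_threads
    heapq.heapify(loads)
    for t in sorted(tasks_ms, reverse=True):  # place biggest tasks first
        least = heapq.heappop(loads)          # least loaded thread so far
        heapq.heappush(loads, least + t)
    return max(loads)

frame_time_ms([30, 5, 5, 5], 4)  # 30.0 ms — the one 30 ms task dominates
frame_time_ms([30, 5, 5, 5], 8)  # still 30.0 ms — extra threads don't help
```

A 30 ms floor is ~33 fps, so no amount of hardware parallelism gets you past it until that one task is broken up or moved off the critical path.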
Sometimes it’s the software that’s important. Other times it’s the hardware.
There’s an ebb and flow. It’s nice when they line up.
thanks for the info.
maybe you’re right, they mostly go for console first. how much are console devkits? could a low budget indie dev afford to get one?
let’s talk about pc game devs. i can understand if, given the complexity of a single aaa game today, each build takes a lot of time. do they run the build on a single (best) machine? couldn’t they run the build on different machines with varying cpus and gpus to do, like, early optimization? or maybe when nearing beta or before doing finishing touches. at least they’d know before release if a combination of cpu/gpu is not well optimized. just curious.