NVIDIA GeForce GTX 1660 (non-Ti) Review - Featuring EVGA and MSI
1920x1080 Game Benchmarks
| PC Perspective GPU Test Platform | |
| --- | --- |
| Processor | Intel Core i7-8700K |
| Motherboard | ASUS ROG STRIX Z370-H Gaming |
| Memory | Corsair Vengeance LED 16GB (8GBx2) DDR4-3000 |
| Storage | Samsung 850 EVO 1TB |
| Power Supply | CORSAIR RM1000x 1000W |
| Operating System | Windows 10 64-bit (Version 1803) |
We begin benchmark results with DX12 games, and I will point out that these three tests are not neutral; each is an example of the very real effects of GPU optimization. Ashes of the Singularity heavily favors AMD, Far Cry 5 favors AMD to a lesser degree, and Shadow of the Tomb Raider is firmly in the NVIDIA camp, with AMD cards falling far behind in that test specifically. It may not be fair to run just one of these, but the hope is that by running both AMD- and NVIDIA-optimized benchmarks we get a clearer picture of what a given GPU will do with real games, which often favor one side or the other.
Ashes of the Singularity: Escalation is first up, run at the "high" preset using DirectX 12.
With AotS Escalation we find the stock GTX 1660 barely edging out the GTX 1060 6GB, with a mere ~2.5% increase over the older card. This is the least impressive showing for the 1660 in this group of tests, with the game representing a "worst-case" scenario for NVIDIA cards in general.
Far Cry 5 is next, and this was also run using the default "high" preset settings.
Here the GTX 1660 jumps up the chart to finish ahead of the GTX 980 Ti, with gains improving to ~17% over the GTX 1060 6GB.
And now for the NVIDIA-friendly Shadow of the Tomb Raider benchmark result, run at the "high" preset as well.
The GTX 1660 holds the same position here relative to the other NVIDIA cards on the lower half of the chart, though its lead over the GTX 1060 6GB rises to more than 23%. The AMD cards, of course, drop significantly in this test, and here the GTX 1660 outperforms a Vega 64. This disadvantage for AMD cards is part and parcel with this game, just as the reverse is true with AotS, though that game penalizes NVIDIA to a lesser extreme.
Now we move on to some DirectX 11 tests, beginning with F1 2018, run using (you guessed it) the default "high" settings.
The GTX 1660's increase over the GTX 1060 6GB is up to nearly 27% here, and it finishes just behind the older GTX 980 Ti in this game (though with a slightly more consistent framerate).
Middle Earth: Shadow of War is our next DX11 title, run at the default "high" settings as well:
The GTX 1660 enjoys an increase of ~23% over the GTX 1060 6GB in this benchmark, and this marks the third result in a row to exceed 20% in performance gains vs. the older card.
Next up are a pair of "canned" benchmarks, beginning with the standalone Final Fantasy XV test, run here using the "standard" preset to help level the playing field with the AMD cards.
The GTX 1660 offers a ~11% increase over the GTX 1060 6GB here, which is the second-lowest gain we've seen at 1080p so far.
World of Tanks enCore is next, a standalone DX11 test run here using the "ultra" preset, as it is a less demanding benchmark.
Here the GTX 1660 holds a 9% advantage over the GTX 1060 6GB.
To sum up, the GTX 1660 is capable of gains exceeding 20% compared to the GTX 1060 6GB, though its advantage varies quite a bit by title and in some instances falls to the low single digits.
But how will this Turing card fare vs. its Pascal predecessor when the resolution is bumped up to 2560x1440? Is the GTX 1660 a viable 1440p gaming option at $219? We will find out on the next page.