
NVIDIA GeForce RTX 2060 Review Part One: Initial Testing

Manufacturer: NVIDIA

Power, Temperatures, Noise, and Conclusion


Total power draw at the wall with our test platform (Core i7-8700K system with 16GB of 3000 MHz DDR4) places the card between a GTX 1070 and RTX 2070 at load, which is in line with its performance level. Once we have a chance to try out overclocking, we will revisit power consumption with the RTX 2060.

Temperatures with this Founders Edition cooler were excellent, with temps in the low 60s under load a testament to the efficiency of this dual-fan cooler design, though board partners will of course offer their own solutions.


What about noise levels while reaching these low load temps? I measured the RTX 2060 at 32.5 dBA at idle and just 35 dBA under full load. This is a cool and quiet card.

Conclusion


As previously mentioned, the full story of the RTX 2060 has not been told here, but these initial findings should at least provide a good idea of the card's capabilities. A follow-up covering such omissions as 2560x1440 game testing, ray tracing performance, and overclocking results is planned, so look for that in the coming weeks.

As things stand, the GeForce RTX 2060 is an impressive product, bringing performance that often rivals a GTX 1070 and even a GTX 1080, above what might be expected from a "mid-range" offering. And while $349 represents a sizable investment for the mainstream 1080p gaming segment, this card is really more of a QHD solution that also delivers very high FHD performance. What the various board partner versions will retail for when the card goes on sale remains to be seen, so it would be premature to make a price/performance argument either way.

Based on our first round of testing, the RTX 2060 provides impressive performance beyond 1080p, proving itself more than capable in games at higher resolutions and detail settings, and it adds (of course) the ray tracing capabilities of the Turing architecture. The RTX 2060 is more than just a standard midrange GPU, to be sure, and as we revisit the card post-CES and finish our testing we will offer a more definitive verdict.


January 7, 2019 | 04:22 PM - Posted by Killergran (not verified)

This actually looks interesting. I will have to see what the prices end up at once it reaches me here in Sweden, but provided it stays on the lower side of 4000 SEK it might actually be the card I buy to replace my 970. That card is over 3 years old now, but still mostly good for 1920x1080 gaming.
Given the numbers in this review, it seems the 2060 offers roughly 2x the performance for only slightly more money.

January 7, 2019 | 07:17 PM - Posted by Power (not verified)

It looks good at the moment, but 6GB will limit the RTX 2060 to 1080p in just a year or so. For 1080p gaming, $350 seems a bit too much.

January 11, 2019 | 08:54 AM - Posted by Spunjji

Citation needed. There are no indications that 6GB of RAM will limit this card at 2.5k resolutions any time soon.

January 11, 2019 | 08:39 PM - Posted by Wall Street (not verified)

Once the next-gen consoles are released, it could start being an issue in games sooner rather than later.

January 7, 2019 | 04:23 PM - Posted by Jgr9 (not verified)

Hammering requested, so... Highlight the 2060 in the charts.

(pls)

January 7, 2019 | 04:34 PM - Posted by Sebastian Peak

Yes! Highlighting or a different color to make it easier to spot, and better colors for the charts in general. It will be done.

January 14, 2019 | 03:51 PM - Posted by Sander Bouwhuis

That is good to hear. It's literally why I registered at this site.

BTW, Congratulations on taking over the site. I wish you all the best!

January 7, 2019 | 04:51 PM - Posted by beagley (not verified)

"it also helps keep things looking tidy in that trendy tempered glass enclosure you have your eye on (you know the one)."

Hit the nail on the head. I'm shopping right now between 2-3 white ITX tempered glass chassis to hold a new VR-on-the-go setup.

January 7, 2019 | 06:22 PM - Posted by Jann5s

I heard NVIDIA has unlocked adaptive sync on this card. I'd be very curious to see some tests on FreeSync monitors if that would be possible.

January 8, 2019 | 02:36 AM - Posted by Pholostan

There will be a driver in a week or so that will let GTX 10-series and RTX 20-series cards support certain monitors for adaptive sync. They have a list: out of 400 tested screens, 12 qualified. There will also be an option to turn it on for any screen, but no promises as to how it will work out.

https://www.nvidia.com/en-us/geforce/news/g-sync-ces-2019-announcements/

January 8, 2019 | 03:44 AM - Posted by Isaac Johnson

I like this review format so far, maybe you won't ruin the site after all... (kidding...(not kidding...))

January 8, 2019 | 11:03 AM - Posted by P (not verified)

So... we need to wait for an eventual RTX 2050 for a real mid-range card at a real mid-range price.

January 8, 2019 | 02:02 PM - Posted by Sebastian Peak

That's basically my thinking: 1050/1050 Ti successors will be very interesting if we see similar gains. This feels like a step above "midrange" for sure.

January 14, 2019 | 03:54 PM - Posted by Sander Bouwhuis

Price, yes. Performance, maybe.

It's just a clear sign that there isn't enough competition for Nvidia. Not good for consumers.

I was hoping for a 2060 GTX. I would MUCH prefer a $250 2060 GTX over a $350 2060 RTX.

January 8, 2019 | 11:14 AM - Posted by TheFane (not verified)

I have an MSI GTX 1070 Gaming X that I bought for around $290 when the mining craze ended, after selling my Aorus Extreme GTX 1060 for the exact same amount.

Would there be any advantage for me in upgrading? I'm gaming on a full HD 60Hz monitor.

January 8, 2019 | 12:26 PM - Posted by Tony Morrow (not verified)

Will you be performing testing using Ryan/Ken's frame rating technique, or will you rely on in-game benchmarks and other API tools? I always liked the frametime and variance graphs since they helped visualize screen stutter.

January 8, 2019 | 09:44 PM - Posted by Sebastian Peak

I have been using OCAT, which will allow for frame rating testing similar to the FCAT results of the past. I will continue to work on a solution that is easy to follow visually when I get back from CES.
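
For those curious what that post-processing can look like, here is a minimal Python sketch of the kind of frametime summary an OCAT capture enables. The column name ("MsBetweenPresents", typical of PresentMon-style CSV output) and the file name are assumptions for illustration, not the exact workflow used for these charts.

import csv
import statistics

def summarize(path):
    # Read per-frame present intervals (ms) from an OCAT/PresentMon-style CSV.
    with open(path, newline="") as f:
        frametimes = [float(row["MsBetweenPresents"]) for row in csv.DictReader(f)]

    avg_ms = statistics.mean(frametimes)
    p99_ms = sorted(frametimes)[int(0.99 * (len(frametimes) - 1))]
    # Frame-to-frame variance: how much each frame differs from the previous one.
    deltas = [abs(b - a) for a, b in zip(frametimes, frametimes[1:])]

    print(f"Average FPS:              {1000.0 / avg_ms:.1f}")
    print(f"99th percentile frametime: {p99_ms:.2f} ms")
    print(f"Mean frame-to-frame delta: {statistics.mean(deltas):.2f} ms")

if __name__ == "__main__":
    summarize("ocat_capture.csv")  # hypothetical capture file name

Large 99th percentile frametimes or big frame-to-frame deltas are exactly what shows up as stutter in the variance graphs.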

January 8, 2019 | 01:05 PM - Posted by JohnGR

At $350 it's NOT a mainstream card. A mainstream card is at $200 - $250. This is an expensive card, out of the price range of the mainstream category. The fact that it offers performance at GTX 1070/Ti level is an excuse for its price, NOT for putting it in the mainstream category.

Anyway, I see many sites and bloggers playing Nvidia's song. Nvidia is trying to move all prices up, and unfortunately tech sites are helping. Soon we will be reading about a 2030 at $150 being a low-end card, and the excuse will be "look, it scores better than the 1050."

PS: The GTX 970 was a high-end card at $330.

January 8, 2019 | 09:17 PM - Posted by raytracemyfacebaby (not verified)

A gallon of gas used to cost 25 cents. Cell phones used to cost $300.

Trolling aside, point well made and well taken. My 7970, the king of cards at the time and the most expensive one could purchase... $500. Woof.

Hey, at least SSDs and DDR4 are coming down. MyDigital BPX Pro 480 GB FTW; look at those speeds and endurance ratings, Mr. Samsung 970 Pro hehe ;)

January 9, 2019 | 03:27 AM - Posted by TonySaxton

It's cheaper than other cards.

January 9, 2019 | 10:45 AM - Posted by Matt (not verified)

Sebastian, isn't making sure you're not CPU bound the first step in testing a GPU accurately? It seems like you're severely CPU bound on Ashes of the Singularity @ 1080p and Final Fantasy XV @ 1080p. How is a Vega 64 faster than a 2080? Maybe it's time to start testing with an overclocked 9700K/9900K. RAM speeds have a great impact on certain engines as well, so using DDR4-3400 or DDR4-3600 should help in those situations too.

January 9, 2019 | 11:47 AM - Posted by Sebastian Peak

Sure, 1080p presents some issues when testing CPU bound games, and I could just drop lower-res testing for them. The alternative is to use higher detail settings for games that present those issues and verify that GPUs scale as expected. I'm planning to re-test because of that; it was hard to retest from the airport when I made those charts.

I don't really think a Core i7-8700K with 3000 MHz DDR4 is going to be a bottleneck for most gamers, and this setup is intended to be more realistic, even though we could use a super high-end platform instead.
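
To illustrate the kind of scaling sanity check I mean, here's a rough Python sketch. The GPU names, FPS numbers, and 5% margin below are purely hypothetical placeholders, not our actual results.

# Flag a benchmark result as likely CPU bound when a GPU that should be
# clearly faster fails to pull ahead by more than a small margin.
def looks_cpu_bound(fps_by_gpu, order_slow_to_fast, margin=0.05):
    for slower, faster in zip(order_slow_to_fast, order_slow_to_fast[1:]):
        if fps_by_gpu[faster] < fps_by_gpu[slower] * (1 + margin):
            return True
    return False

# Hypothetical 1080p averages, purely illustrative:
fps = {"RTX 2060": 95.0, "RTX 2070": 97.0, "RTX 2080": 98.0}
print(looks_cpu_bound(fps, ["RTX 2060", "RTX 2070", "RTX 2080"]))  # True -> re-test at higher settings

If a result trips that check, the fix is what I described above: raise the detail settings (or resolution) until the GPUs separate as expected.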

January 13, 2019 | 08:19 PM - Posted by David Lamma (not verified)

This is exciting. I was really holding off on buying another 10XX card and did not want an expensive 2070 or 2080. $350 doesn't seem so bad. Cannot wait to order. Wish they allowed pre-orders or something.

January 13, 2019 | 10:54 PM - Posted by Anonymo (not verified)

I totally agree. People are so funny in here. Seems like a very solid card. Enjoy!

January 14, 2019 | 04:25 AM - Posted by othertomperson (not verified)

This is far too expensive. Approaching three years since the release of the GTX 1070, Nvidia has released a follow-up that offers... the same price and performance as a GTX 1070. That represents a complete halt of any measurable progress, like the entire Turing product line.

January 15, 2019 | 01:42 PM - Posted by semiconductorslave

"I totally agree with anyone who suggests that we need to run tests as 2560x1440 as well, and this is something that will be addressed." -Sebastian

I totally agree with you totally agreeing with me.
