Yes that's right, if you felt Ryan and Al somehow missed something in our review of the new GTX 1080, or you felt the obvious pro-Matrox bios was showing, here are the other reviews you can pick and choose from. Start off with [H]ard|OCP, who also tested Ashes of the Singularity and Doom as well as the old favourite Battlefield 4. Doom really showed itself off as a next-generation game, its Nightmare mode scoffing at any GPU with less than 5GB of VRAM available and pushing the single 1080 hard. Read on to see how the competition stacked up … or wait for the 1440 to come out some time in the future.
"NVIDIA's next generation video card is here, the GeForce GTX 1080 Founders Edition video card based on the new Pascal architecture will be explored. We will compare it against the GeForce GTX 980 Ti and Radeon R9 Fury X in many games to find out what it is capable of."
Here are some more Graphics Card articles from around the web:
- In the lab: Nvidia's GeForce GTX 1080 graphics card @ The Tech Report
- FCAT GeForce GTX 1080 Framepacing @ Guru of 3D
- NVIDIA GeForce GTX 1080 Review: A Look At 4K & Ultra-wide Gaming @ Techgage
- NVIDIA GeForce GTX 1080 Review – The Advent of Pascal @ HiTech Legion
- NVIDIA GeForce GTX 1080 Founders Edition Review @ OCC
- NVIDIA GeForce GTX 1080 Founders Edition Review @ Neoseeker
- Nvidia GTX 1080 @ Kitguru
- NVIDIA GeForce GTX 1080 8 GB @ techPowerUp
- The NVIDIA GeForce GTX 1080 Review @ Hardware Canucks
This is concerning for a $699 card. Not to mention these tests are being conducted on an open-air test bed; consumers will likely see worse performance once it's in a case.
http://images.hardwarecanucks.com/image//skymtl/GPU/GTX-1080-REVIEWS/GTX-1080-REVIEWS-83.jpg
In this case Rise of the Tomb Raider goes from running at 75FPS to hovering around 72FPS, so we’re looking at an approximate 4% reduction in real-world performance.
http://tpucdn.com/reviews/NVIDIA/GeForce_GTX_1080/images/clock_analysis.jpg
I did some testing of Boost 3.0 on the GeForce GTX 1080 (not using Furmark). At first the card is idle; once a game is started, clocks shoot up to 1885 MHz. As GPU temperature climbs, we immediately see Boost 3.0 reducing clocks – with Boost 2.0, clocks stayed at their maximum until a certain temperature was reached. Once the card reaches around 83°C, the clocks level out at around 1705 MHz.
Tom’s Hardware
The 1080 hits its temperature target by dropping the GPU’s clock rate. During a gaming loop, it falls all the way down to its base frequency, leaving nothing left of GPU Boost. This gets even worse during our stress test, where the core clock dips below the 1607MHz that is supposed to be the GeForce GTX 1080’s floor.
http://media.bestofmicro.com/W/3/581763/gallery/01-Clock-Rate_w_600.png
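If anyone wants to reproduce these clock-versus-temperature plots at home, here's a minimal Python sketch of the idea; it just polls nvidia-smi once a second while a game or stress test runs and writes a CSV you can graph afterwards. It assumes nvidia-smi is on your PATH and a single-GPU system, so treat it as a starting point rather than anything polished.

# boost_log.py – rough sketch: poll nvidia-smi once a second, log clock vs. temperature.
# Assumes the NVIDIA driver's nvidia-smi tool is on PATH and a single GPU is installed.
import csv
import subprocess
import time

QUERY = "clocks.gr,temperature.gpu,utilization.gpu"

def sample():
    """Return (graphics clock MHz, GPU temp C, GPU load %) as reported by nvidia-smi."""
    out = subprocess.check_output(
        ["nvidia-smi",
         "--query-gpu=" + QUERY,
         "--format=csv,noheader,nounits"],
        universal_newlines=True)
    clock, temp, load = (v.strip() for v in out.strip().split(","))
    return int(clock), int(temp), int(load)

def main(duration_s=600, interval_s=1.0, path="boost_log.csv"):
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["elapsed_s", "clock_mhz", "temp_c", "load_pct"])
        start = time.time()
        while time.time() - start < duration_s:
            writer.writerow([round(time.time() - start, 1), *sample()])
            time.sleep(interval_s)

if __name__ == "__main__":
    main()

Run a gaming loop or a stress test alongside it and the clocks levelling out around the temperature target should show up clearly in the log.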
The chip looks pretty hot… over 50 °C (first-degree burn limit).
Do they spell bias differently in Canada or has Jeremy’s spellcheck learned too many computer terms now to spot it?
bios? That's a joke based on commenters in the past claiming PC Per is "biosed", heh.
Is it really a joke when they keep proving it true?
Yes.
http://www.smbc-comics.com/index.php?id=4105
I’ve read that the partner cards will now be available at the same time as the Founders Edition – 27th May.
The Tagged field for this story is kinda odd 😉
I was thinking that too, that’s one hell of a news post tag Jeremy 🙂
well dang, that is not what I intended to copy in there.
I haven’t been this excited about buying a new GPU in almost 8 years!
Where are the Company of Heroes 2 benchmarks? A great single-card benchmark.
Jeremy, why didn’t Ryan run The Division benchmark? The Fury X obtained the highest FPS with 73???
I found that HardOCP is covering NVIDIA's weak point, and its BF4 benchmark is totally different from the others.
As usual do not answer the right questions
1. HDR – what kind ? Dolby Vision ? Standard Dynamic Range (SDR) ?
or HDR10 ?
the HDR have in the card 1000 nits or 4000 nits ?
Unregistered As usual what supports the card’s hardware or software ?
2. in the picture
http://www.guru3d.com/index.php?ct=articles&action=file&id=21784
Write Contras over 10,000 :1 with nvidia 1080
In other words previous video cards we received the Contras 2000:1 ???
3. What color system supports 1080p video card ?
Rec.2020 color gamut or rec P3 ? or only rec 709 ?
>1. HDR – what kind ? Dolby Vision ? Standard Dynamic Range (SDR) ?
>or HDR10 ?
New HDR standards are software defined: the HDR parameters (primaries, etc) are provided as metadata as part of the bitstream. This is a requirement of the connection standard, both HDMI and DP.
>the HDR have in the card 1000 nits or 4000 nits ?
>Unregistered As usual what supports the card’s hardware or software ?
LOL. Display brightness is handled by the display.
>2. in the picture
>http://www.guru3d.com/index.php?ct=articles&action=file&id=21784
>Write Contras over 10,000 :1 with nvidia 1080
>In other words previous video cards we received the Contras 2000:1 ???
Again, achievable contrast is determined by the display.
>3. What color system supports 1080p video card ?
>Rec.2020 color gamut or rec P3 ? or only rec 709 ?
Once more, this is determined by the display. A GPU can output whatever the hell colourspace you want, even on today’s GPUs. The hard part is in telling the display what colourspace you’re outputting: today the solution to that is “have the user do it, assume sRGB otherwise”. The change with newer interconnect standards is to transport the colourspace information as metadata.
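For anyone wondering what that metadata actually contains: for HDR10 it is a small static block (mastering display primaries and luminance per SMPTE ST 2086, plus MaxCLL/MaxFALL light levels). Here is a purely illustrative Python sketch of the fields – the names are descriptive, not taken from any real driver or HDMI API.

# Illustrative only: static HDR10 metadata fields (SMPTE ST 2086 + CTA-861.3 light levels).
# Field names are descriptive, not from any actual GPU, driver or HDMI library API.
from dataclasses import dataclass

@dataclass
class Chromaticity:
    x: float  # CIE 1931 x
    y: float  # CIE 1931 y

@dataclass
class HDR10StaticMetadata:
    red_primary: Chromaticity
    green_primary: Chromaticity
    blue_primary: Chromaticity
    white_point: Chromaticity
    max_mastering_luminance_nits: float   # e.g. 1000 or 4000 – this is where "1000 vs 4000 nits" lives
    min_mastering_luminance_nits: float
    max_content_light_level_nits: int     # MaxCLL
    max_frame_avg_light_level_nits: int   # MaxFALL

# Example: content mastered to 1000 nits with BT.2020 primaries and a D65 white point.
example = HDR10StaticMetadata(
    red_primary=Chromaticity(0.708, 0.292),
    green_primary=Chromaticity(0.170, 0.797),
    blue_primary=Chromaticity(0.131, 0.046),
    white_point=Chromaticity(0.3127, 0.3290),
    max_mastering_luminance_nits=1000.0,
    min_mastering_luminance_nits=0.005,
    max_content_light_level_nits=1000,
    max_frame_avg_light_level_nits=400,
)

The GPU's job is only to pass a block like this along with the signal; how bright the picture gets and how much contrast you see is, as said above, the display's problem.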
Not entirely true. NVIDIA artificially limited output of bit depth and chroma compression on GeForce cards while offering more options on Quadro cards. I’m glad that they are coming around. AMD allowed whatever the hardware was capable of.
HDR10 and Dolby Vision both carry a metadata stream and use the SMPTE ST 2084 EOTF. Currently DV offers a larger range, 0-10,000 nits, but HDR10 is expandable. Both HDR technologies will conform to the capabilities of your display. Only now are some LCDs reaching 1,200 nits (all white), but they still can’t do pure black, so it’s somewhat relative. Likewise, OLEDs can do pure black, but max out around 800 nits for all white. So technically, OLEDs have a greater HDR range, despite not getting as bright.
If you do not know, don't write nonsense.
If you have evidence, provide links.
Do not reply to me without proof.
He doesn’t need links. You are just wrong.
The original post seems to be in relatively bad English. Perhaps there are some misunderstandings due to that.
“LOL. Display brightness is handled by the display.”
There seem to be many things that the original poster does not understand, but replies like the one above are not explaining the issues. I don’t have that much knowledge of this, but I will try to explain simply. If any of this is wrong, please correct if you can offer a good explanation and not just LOL.
We have had 10-bit color for a while; I have a Dell U3011 display from 2010 that supports 10-bit color. With SDR video, the bits (8, 10, 12, or more) are interpreted to be within the range of 0 to 100 nits. This is because old CRTs were not very bright. Modern displays are capable of much higher brightness than 100 nits, so there has been a drive to expand that range. Modern cameras can capture a wider dynamic range too, but due to all of the legacy systems in place, this is usually compressed into the 0 to 100 nits range somewhere in production or in transfer to the viewer.

HDR attempts to preserve the dynamic range all of the way through the system. HDR uses the rec.2020 color space, which is either 10 or 12 bits per color component, and it also expands the reference white level such that those color values are interpreted to be between 0 and 10,000 nits instead of 0 and 100 nits. This allows much better quality on modern displays capable of much higher brightness.

The video card does not care how the bits are interpreted though; it just needs to be able to communicate to the display which interpretation is being used, and the display needs to be able to map those values onto what it can actually display. The 1000 or 4000 nits and contrast questions therefore do not make any sense for the card itself. Hopefully it is understandable why this is so.
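To make the "0 to 10,000 nits" interpretation concrete, that mapping is the SMPTE ST 2084 (PQ) EOTF. A small Python sketch of it follows; the constants are the published ST 2084 ones, the rest is just my illustration.

# SMPTE ST 2084 (PQ) EOTF: turn a code value into absolute luminance in nits.
M1 = 2610 / 16384        # 0.1593017578125
M2 = 2523 / 4096 * 128   # 78.84375
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875

def pq_eotf_nits(code_value, bit_depth=10):
    """Map an integer code value (0 .. 2**bit_depth - 1) to luminance in nits (cd/m^2)."""
    n = code_value / (2 ** bit_depth - 1)   # normalize to 0..1
    p = n ** (1 / M2)
    return 10000.0 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

# Full-range 10-bit examples: code 0 maps to 0 nits, code 1023 to 10,000 nits,
# and a code around 769 lands near 1000 nits.
for cv in (0, 512, 769, 1023):
    print(cv, round(pq_eotf_nits(cv), 2))

An SDR display would interpret the same 10-bit values against a roughly 100-nit reference instead, which is exactly why the card only has to signal which interpretation applies.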
This doesn’t get into whether or not all of the software and hardware infrastructure supports HDR. If I downloaded an HEVC Main 10 HDR-encoded video and tried to play it with the display set to a supposedly HDR-capable TV or other display, would it actually work with this video card? Are there video players that support this with hardware acceleration? HEVC is sufficiently processing-intensive that it probably cannot be played without hardware acceleration. Does Windows itself have any issues getting HDR to the screen? I am running Linux, and I still occasionally run into things where 10-bit color does not seem to be supported properly, which results in washed-out or over-saturated colors. Having a standard panel and a deep-color panel connected at the same time seems to cause confusion in some cases.
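I can't answer for the whole playback chain, but the file side at least is easy to check. Something like this (assuming ffprobe from FFmpeg is installed) will tell you whether a clip really is 10-bit HEVC with the PQ transfer and BT.2020 primaries flagged:

# Rough sketch: check whether a video file looks like 10-bit HEVC with HDR10-style signalling.
# Assumes ffprobe (part of FFmpeg) is installed; the colour fields may be absent on some files.
import json
import subprocess
import sys

def probe_first_video_stream(path):
    out = subprocess.check_output(
        ["ffprobe", "-v", "error", "-select_streams", "v:0",
         "-show_entries",
         "stream=codec_name,profile,pix_fmt,color_transfer,color_primaries",
         "-of", "json", path],
        universal_newlines=True)
    return json.loads(out)["streams"][0]

def looks_like_hdr10(stream):
    return (stream.get("codec_name") == "hevc"
            and "10" in stream.get("pix_fmt", "")             # e.g. yuv420p10le
            and stream.get("color_transfer") == "smpte2084"   # PQ / ST 2084
            and stream.get("color_primaries") == "bt2020")

if __name__ == "__main__":
    info = probe_first_video_stream(sys.argv[1])
    print(info)
    print("HDR10-ish:", looks_like_hdr10(info))

Whether the player, compositor and driver then keep those 10 bits and the PQ signalling intact all the way to the panel is the part that still seems hit and miss, on Linux especially.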
I just like seeing all those naked circuits! [6502 overloads]
“Chaps make working 6502 CPU by hand. Because why not?
Hand-carved, hipster, artisanal ~~micro~~ macroprocessor”
http://www.theregister.co.uk/2016/05/18/chaps_make_6502_by_hand/