Summary of Events
In January of 2013 I revealed a new testing methodology for graphics cards that I dubbed Frame Rating. At the time I was only able to talk about the process, which uses capture hardware to record the output directly from the DVI connections on graphics cards, but over the course of a few months I started to release data and information gathered with this technology. I followed up the initial story in January with a collection of videos that showed some of the captured footage and the kinds of performance issues and anomalies we were able to find with it.
My first full test results were published in February to quite a bit of stir, and then finally in late March I released Frame Rating Dissected: Full Details on Capture-based Graphics Performance Testing, which dramatically changed the way graphics cards and gaming performance were discussed and evaluated.
Our testing proved that AMD CrossFire was not improving gaming experiences in the same way that NVIDIA SLI was. We also showed that existing testing tools like FRAPS were inadequate for exposing this problem. If you are at all unfamiliar with this testing process or the results it produced, please check out the Frame Rating Dissected story above.
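To give a flavor of what capture-based analysis looks at, here is a minimal, hypothetical sketch (my own illustration, not our actual scripts) that turns display-output timestamps into per-frame intervals and a crude consistency number. Tools that sample earlier in the pipeline, like FRAPS, never see these output-side gaps.

```python
# Hypothetical sketch of capture-based frame analysis (not PC Perspective's
# actual tooling): given the timestamps at which frames arrived at the
# display output, compute per-frame intervals and a simple consistency
# metric. FRAPS-style tools sample earlier in the pipeline and can miss
# what actually reaches the screen.

def frame_intervals_ms(timestamps_ms):
    """Per-frame intervals (frame times) from output-side timestamps."""
    return [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]

def percentile(values, pct):
    """Nearest-rank percentile; good enough for an illustration."""
    ordered = sorted(values)
    idx = min(int(round(pct / 100 * (len(ordered) - 1))), len(ordered) - 1)
    return ordered[idx]

# A CrossFire-like alternation of short and long frames:
stamps = [0.0, 3.0, 33.3, 36.2, 66.7, 69.9, 100.0]
times = frame_intervals_ms(stamps)
print(times)                                          # alternating ~3 ms / ~30 ms
print(percentile(times, 95) - percentile(times, 50))  # a large gap suggests stutter
```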
At the time, we tested 5760x1080 resolution using AMD Eyefinity and NVIDIA Surround but found there were too many issues with our scripts and the results they produced to give reasonably assured performance metrics. Running CrossFire with Eyefinity was obviously causing problems, but I wasn't quite able to pinpoint what they were or how severe they might be. Instead I posted graphs like this:
We were able to show GeForce GTX 680 performance and scaling in SLI at 5760x1080, but we could only give results for the Radeon HD 7970 GHz Edition in a single-GPU configuration.
Since those stories were released, AMD has been very active. At first the company was hesitant to believe our results, calling into question both our processes and the ability of gamers to actually see the frame rate issues we were describing. However, after months of work and pressure from quite a few press outlets, AMD released a 13.8 beta driver with a Frame Pacing option in its 3D controls that evenly spaces out frames in multi-GPU configurations, producing a smoother gaming experience.
The results were great! The new AMD driver produced very consistent frame times and put CrossFire on a similar playing field to NVIDIA's SLI technology. There were limitations, though: the driver only fixed DX10/DX11 games and only addressed resolutions of 2560x1440 and below.
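Conceptually, frame pacing is simple: rather than flipping each frame to the screen the instant a GPU finishes it, the driver holds frames briefly so they land at roughly even intervals. A minimal sketch of the idea (my own illustration, not AMD's actual driver logic):

```python
# Conceptual sketch of frame pacing (not AMD's driver code): instead of
# presenting each rendered frame immediately, delay presentation so frames
# are spaced close to a target interval. This trades a tiny bit of latency
# for much more even on-screen animation.

def pace(render_done_ms, target_interval_ms):
    """Return paced presentation times for a list of render-complete times."""
    presented = []
    next_slot = render_done_ms[0]
    for done in render_done_ms:
        slot = max(done, next_slot)   # never present before the frame is ready
        presented.append(slot)
        next_slot = slot + target_interval_ms
    return presented

# Two GPUs in AFR often finish in short/long pairs:
done = [0.0, 3.0, 33.3, 36.2, 66.7, 69.9]
print(pace(done, 16.7))  # -> [0.0, 16.7, 33.4, 50.1, 66.8, 83.5], evenly spaced
```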
But the story doesn't end there. CrossFire and Eyefinity are still very important to a lot of gamers, and with the constant price drops on 1920x1080 panels, more and more gamers are taking (or thinking of taking) the plunge into the world of Eyefinity and Surround. As it turns out, though, there are more problems and complications with Eyefinity and high-resolution gaming (multi-head 4K) cropping up that deserve discussion.
Subject: General Tech, Graphics Cards | August 31, 2013 - 04:30 PM | Scott Michaud
Tagged: geforce, F2P, bundle
If you read Jeremy's post yesterday and are a fan of in-game currency, then how about a new bundle? If you pick up a GeForce GTX 650 or a participating GTX 700M-based notebook, you will receive a total of $75 split between three Free-to-Play (F2P) titles. One of the titles, Warframe, is also optimized for NVIDIA hardware with the inclusion of PhysX support, primarily for particle effects it would seem.
The bundle includes:
- Warframe - 465 Platinum (Normally $25)
- Dungeons and Dragons: Neverwinter - 1,000,000 Astral Diamonds (Normally $25)
- Marvel Heroes - 2600 Gold (Normally $25)
NVIDIA stresses that you must make your purchase (be it a system containing a GTX 650, a discrete add-in GTX 650, or a laptop containing a GTX 700M) from one of the participating merchants. Codes will not be provided if the retailer, or 'e-tailer', is not a partner for this program.
A New TriFrozr Cooler
Graphics cards are by far the most interesting topic we cover at PC Perspective. Between the battles of NVIDIA and AMD, as well as the competition between board partners like EVGA, ASUS, MSI and Galaxy, there is very rarely a moment when we don't have a GPU product of some kind on an active test bed. Both NVIDIA and AMD release reference cards (for the most part) with each new product launch, and it then takes some time for board partners to really put their own stamp on the designs. Other than the figurative stamp that is the sticker on the fan.
One of the companies that has become well known for very custom, non-reference graphics card designs is MSI, and the pinnacle of the company's engineering falls under the Lightning brand. From as far back as the MSI GTX 260 Lightning to as recently as the MSI HD 7970 Lightning, these cards have combined unique cooling, custom power design and a good amount of over-engineering to produce cards with few rivals.
Today we are looking at the brand new MSI GeForce GTX 780 Lightning, a complete revamp of the GTX 780 that was released in May. Based on the same GK110 GPU as the GTX Titan, with two fewer SMX units, the GTX 780 is easily the second fastest single-GPU card on the market. MSI is hoping to make enthusiasts even more excited about the card with a Lightning design that brings a brand new TriFrozr cooler, an impressive power design and overclocking capabilities that both basic users and LN2 junkies can take advantage of. Just what DO you get for $750 these days?
Subject: Graphics Cards | August 2, 2013 - 02:50 AM | Tim Verry
Tagged: graphics drivers, nvidia, shield, pc game streaming, gaming, geforce
NVIDIA recently released a new set of beta GeForce graphics card drivers targeted at 400, 500, 600, and 700 series GPUs. The new version 326.41 beta drivers feature the same performance tweaks as the previous 326.19 drivers while baking in beta support for streaming PC games to NVIDIA’s Shield gaming portable from a compatible GeForce graphics card (GTX 650 or better). The new beta release is also the suggested version for those running the Windows 8.1 Preview.
NVIDIA has included the same performance tweaks as version 326.19. The tweaks offer up to 19% performance increases, depending on the particular GPU setup. For example, users running a GTX 770 will see as much as 15% better performance in Dirt: Showdown and 6% in Tomb Raider. Performance improvements are even higher for GTX 770 SLI setups, with boosts in Dirt: Showdown and F1 2012 of 19% and 11% respectively. NVIDIA has also added SLI profiles for Splinter Cell: Blacklist and Batman: Arkham Origins.
The NVIDIA Shield launched recently and reviews are making the rounds around the Internet. One of the exciting features of the Shield gaming handheld is the ability to stream PC games from a PC with an NVIDIA graphics card to the Shield over Wi-Fi.
The 326.41 drivers improve performance across several games on the GTX 770.
The other major change is improved support for tiled 4K displays: monitors that reach 4K resolution by combining two panels that are driven, and even appear to the OS, as two separate displays despite sitting in a single physical enclosure. Using DisplayPort MST with tiled displays allows monitor manufacturers to deliver 4K at higher refresh rates.
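As a back-of-the-envelope illustration (my own arithmetic, assuming 24-bit color and ignoring blanking overhead), splitting the panel into two tiles halves the pixel data each stream has to carry, which is what made 60 Hz practical on these early 4K monitors:

```python
# Rough arithmetic (assumes 24-bit color, ignores blanking overhead) for
# why early 4K monitors were driven as two tiles over DisplayPort MST:
# each tile carries half the pixels, halving the per-stream data rate.

def pixel_data_rate_gbps(width, height, refresh_hz, bits_per_pixel=24):
    return width * height * refresh_hz * bits_per_pixel / 1e9

print(pixel_data_rate_gbps(3840, 2160, 60))  # ~11.9 Gbps for one 4K60 stream
print(pixel_data_rate_gbps(1920, 2160, 60))  # ~6.0 Gbps per 1920x2160 tile
```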
Interested GeForce users can grab the latest beta drivers from the NVIDIA website or via the links below:
Overclocked GTX 770 from Galaxy
When NVIDIA launched the GeForce GTX 770 at the very end of May, we started to get in some retail samples from companies like Galaxy. While our initial review looked at the reference models, other add-in card vendors are putting their own unique touch on the latest GK104 offering and Galaxy was kind enough to send us their GeForce GTX 770 2GB GC model that uses a unique, more efficient cooler design and also runs at overclocked frequencies.
If you haven't yet read up on the GTX 770 GPU, you should probably stop by my first review of the GTX 770 to see what information you're missing. Essentially, the GTX 770 is a full-spec GK104 Kepler GPU running at higher clocks (both core and memory) than the original GTX 680. The reference clocks for the GTX 770 are a 1046 MHz base clock, a 1085 MHz Boost clock and a nice increase to a 7.0 GHz effective memory speed.
Galaxy GeForce GTX 770 2GB GC Specs
The Galaxy GC model is overclocked with a new base clock of 1111 MHz and a higher Boost clock of 1163 MHz; both are about 6-7% higher than the reference clocks. Galaxy has left the memory speed alone, though, keeping it at an effective 7.0 GHz.
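The factory offsets are easy to check against the reference clocks quoted earlier (a trivial sketch using the article's numbers):

```python
# Quick check of the Galaxy GC factory overclock against NVIDIA's
# reference GTX 770 clocks (values from the article).
ref_base, ref_boost = 1046, 1085   # MHz, reference
gc_base, gc_boost = 1111, 1163     # MHz, Galaxy GC

print(f"base:  +{(gc_base / ref_base - 1) * 100:.1f}%")    # ~6.2%
print(f"boost: +{(gc_boost / ref_boost - 1) * 100:.1f}%")  # ~7.2%
```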
Another Wrench – GeForce GTX 760M Results
Just recently, I evaluated some of the current processor-integrated graphics options using our new Frame Rating performance metric. The results were very interesting, proving Intel has done some great work with its new HD 5000 graphics option for Ultrabooks. You might have noticed that the MSI GE40 didn’t just come with integrated HD 4600 graphics but also included a discrete NVIDIA GeForce GTX 760M on board. While that previous article focused on the integrated graphics of Haswell, Trinity, and Richland, I did find some noteworthy results with the GTX 760M that I wanted to investigate and present.
The MSI GE40 is a new Haswell-based notebook that includes the Core i7-4702MQ quad-core processor and Intel HD 4600 graphics. Along with it MSI has included the Kepler architecture GeForce GTX 760M discrete GPU.
This GPU offers 768 CUDA cores running at a 657 MHz base clock but can stretch higher with GPU Boost technology. It is configured with 2GB of GDDR5 memory running at 2.0 GHz.
If you didn’t read the previous integrated graphics article, linked above, some of the data presented there will be spoiled below, so you might want to get a baseline of information by reading through that first. Also, remember that we are using our Frame Rating performance evaluation system for this testing - a key differentiator from most other mobile GPU testing. In fact, it is that difference that allowed us to spot an interesting issue with the configuration we are showing you today.
If you are not familiar with the Frame Rating methodology, and how we had to change some things for mobile GPU testing, I would really encourage you to read this page of the previous mobility Frame Rating article for the scoop. The data presented below depends on that background knowledge!
Okay, you’ve been warned – on to the results.
Subject: Graphics Cards | June 25, 2013 - 01:28 PM | Jeremy Hellstrom
Tagged: geforce, GK104, gtx 760, nvidia, msi, MSI N760 TF 2GD5/OC
To start off with the good news: the GTX 760 is now available, and the MSI model that [H]ard|OCP reviewed sells for $250 to $260. This is no paper launch, nor another $400+ card for you to dream about, but a solid performing card at a decent price. Power is provided by an 8-pin and a 6-pin PCIe power connector, perhaps a little more than the card needs but perfect for overclockers who want the extra juice. Performance-wise the card trumps the GTX 660 Ti and matches the GTX 670 and HD 7950 Boost in almost every test, for a good $50-75 less. Even better news is that certain sites testing Frame Rating and SLI performance saw great scaling in real performance.
"Today NVIDIA is launching the GeForce GTX 760. The GeForce GTX 760 will be replacing a video card and offering what use to be high-end memory performance, at a mainstream price. We will evaluate a retail MSI N760 TF 2GD5/OC video card with comparisons to find out whether or not this is a true value."
Here are some more Graphics Card articles from around the web:
- Nvidia's GeForce GTX 760 graphics card reviewed @ The Tech Report
- Gigabyte GTX 760 WindForce OC 2 GB @ techPowerUp
- MSI GeForce GTX 760 Gaming OC 2GB Video Card Review in SLI @ Legit Reviews
- MSI GTX 760 TwinFrozr Gaming 2 GB @ techPowerUp
- Gainward GeForce GTX 760 Phantom @ Legion Hardware
- EVGA GTX 760 SC w/ ACX Cooler 2 GB @ techPowerUp
- ASUS GTX 760 DirectCU II OC 2 GB @ techPowerUp
- NVIDIA GTX 760 2GB Review @ Hardware Canucks
- NVIDIA GTX 760 Review @ OCC
- Nvidia GTX760 @ Kitguru
- Palit GTX 760 JetStream 2 GB @ techPowerUp
- NVIDIA GeForce GTX 760 2GB Video Card Review @ Legit Reviews
- Gigabyte GTX 760 OC Video Card Review @ Ninjalane
- Nvidia GeForce GTX 760 review: boost for the mid-range segment @ Hardware.info
- NVIDIA GeForce GTX 760 2 GB @ techPowerUp
- NVIDIA GeForce GTX 760 Launch Review @ Neoseeker
- GeForce GTX 760 @ TechSpot
- MSI GTX 770 TwinFrozr Gaming 2 GB @ techPowerUp
- Nvidia GeForce GTX 780 3 GB @ X-bit Labs
- EVGA GTX 770 Super Clocked w/ ACX Video Card Review @ Ninjalane
- EVGA GeForce GTX 780 ACX SC Review @ Hardware Canucks
- MSI GTX 770 Gaming @ Bjorn3D
- MSI GTX 770 Lightning 2 GB @ techPowerUp
- Gigabyte GeForce GTX 770 WindForce 3X 2GB Video Card Review @ Legit Reviews
- ASUS GeForce GTX 770 DirectCU II @ [H]ard|OCP
- Gigabyte GTX 780 WindForce 3X OC Review @ Hardware Canucks
- ASUS GTX 780 Direct CU II OC 3 GB @ techPowerUp
- Gainward GeForce GTX 650 Ti BOOST 2GB "Golden Sample" Review @ Madshrimps
- Zotac GeForce GTX TITAN AMP! Edition 6 GB Graphics Card and TITAN in 2-Way SLI Configuration @ X-bit Labs
- Prolimatech MK-26 Multi-VGA Cooler @ eTeknix
- Arctic Accelero Hybrid 7970 VGA Cooler @ eTeknix
- Intel Haswell HD Graphics 4600 vs. AMD Radeon Graphics On Linux @ Phoronix
- Choice of Champions: Asus ROG MATRIX 7970 3 GB @ X-bit Labs
- PowerColor Radeon HD 7850 SCS3 1GB Passive Video Card Review @ Legit Reviews
- XFX Radeon HD R7790 Video Card @ Benchmark Reviews
- Gigabyte Radeon HD7790 @ Funky Kit
- PowerColor TurboDuo HD7790 Review @ Neoseeker
- PowerColor HD 7850 1GB SCS3 Fanless Video Card Review @ HiTech Legion
- HIS Radeon HD 7790 Turbo 1GB @ eTeknix
Getting even more life from GK104
Have you guys heard about this new GPU from NVIDIA? It’s called GK104 and it turns out that the damn thing has found its way into yet another graphics card this year - the new GeForce GTX 760. Yup, you read that right: what NVIDIA says will be the last update to the GeForce lineup through Fall 2013 is based on the same GK104 design that we have previously discussed in reviews of the GTX 680, GTX 670, GTX 660 Ti, GTX 690 and, more recently, the GTX 770. This isn’t a bad thing though! GK104 has done a fantastic job in every field and market segment that NVIDIA has tossed it into, with solid performance and even better performance per watt than the competition. It does mean, however, that talking up the architecture is kind of mind numbing at this point…
If you are curious about the Kepler graphics architecture and GK104 in particular, I’m not going to stop you from going back and reading my initial review of the GTX 680 from March of 2012. The new GTX 760 takes the same GPU, adds the new and improved version of GPU Boost (the same one we saw on the GTX 770) and lowers the specifications a bit to let NVIDIA hit a new price point. The GTX 760 will be replacing the GTX 660 Ti - that card will fall into the ether, but the GTX 660 will remain, as will everything below it, including the GTX 650 Ti Boost, 650 Ti and plain old 650. The GTX 670 went the way of the dodo with the release of the GTX 770.
Even though the GTX 690 isn't on this list, NVIDIA says it isn't EOL
As for the GeForce GTX 760, it will ship with 1152 CUDA cores running at a base clock of 980 MHz and a typical Boost clock of 1033 MHz. The memory speed remains at 6.0 GHz effective on a 256-bit bus, and you can expect to find both 2GB and 4GB frame buffer options from retail partners at launch. The 1152 CUDA cores are spread over 6 SMX units, which means you’ll see some parts with 3 GPCs and others with 4 - NVIDIA claims any performance delta between them will be negligible.
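Those memory numbers are worth a quick sanity check: 6.0 GHz effective on a 256-bit bus works out to the same 192 GB/s of bandwidth as the GTX 670 and GTX 680 (a one-liner using the figures above):

```python
# Memory bandwidth implied by the GTX 760's specs: 6.0 GHz effective
# GDDR5 on a 256-bit bus (values from the article).
effective_clock_ghz = 6.0
bus_width_bits = 256

bandwidth_gb_s = effective_clock_ghz * bus_width_bits / 8
print(bandwidth_gb_s)  # 192.0 GB/s, matching the GTX 670/680
```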
Subject: Graphics Cards | June 20, 2013 - 04:05 PM | Ryan Shrout
Tagged: radeon, nvidia, geforce, frame rating, fcat, crossfire, amd
Well, the date has been set. AMD publicly stated on its @AMDRadeon Twitter account that a new version of the prototype driver we originally previewed with the release of the Radeon HD 7990 in April will be released to the public on July 31st - all for a problem that many in the industry didn't think existed.
Big news for CrossFire! We plan to release our driver that delivers improved multi-GPU frame pacing on July 31. More info soon.
— AMD Radeon Graphics (@AMDRadeon) June 20, 2013
Since that April release AMD has been very quiet about its driver changes and has actually refused to send me updated prototypes over the spring. Either they have it figured out or they are worried they haven't - but it looks like we'll find out at the end of next month, and I feel pretty confident that the team will be able to address the issues we brought to light.
For those of you that might have missed the discussion, our series of Frame Rating stories will tell you all about the issues with frame pacing and stutter in regards to AMD's CrossFire multi-GPU technology.
- Frame Rating Dissected: Full Details on Capture-based Graphics Performance Testing
- Frame Rating: Visual Effects of Vsync on Gaming Animation
- Frame Rating: AMD Improves CrossFire with Prototype Driver
AMD gave the media a prototype driver in April to test with the Radeon HD 7990, a card that depends on CrossFire to work correctly, and the improvements were pretty drastic.
So what can we expect on July 31st? A driver that will give users the option to enable or disable the frame pacing technology AMD is developing - though I am still of the mindset that disabling it is never advantageous. More to come in the next 30 days!
Subject: Graphics Cards | June 18, 2013 - 03:39 PM | Ryan Shrout
Tagged: radeon, nvidia, geforce, frostbite 3, ea, dice, amd
The original source article at IGN.com has been updated with some new information. Now they are saying the agreement between AMD and EA is "non-exclusive and gamers using other components will be supported."
The quote from an EA representative reads as follows:
"DICE has a partnership with AMD specifically for Battlefield 4 on PC to showcase and optimize the game for AMD hardware," an EA spokesperson said. "This does not exclude DICE from working with other partners to ensure players have a great experience across a wide set of PCs for all their titles."
END UPDATE #3
This could be a huge deal for NVIDIA and AMD in the coming months - according to a story at IGN.com, AMD has entered into an agreement with EA that gives it exclusive rights to optimization for all games based on the Frostbite 3 engine. That includes Battlefield 4, Mirror's Edge 2, Need for Speed Rivals and many more games due out this year and in 2014. Here is the quote that is getting my attention:
Starting with the release of Battlefield 4, all current and future titles using the Frostbite 3 engine — Need for Speed Rivals, Mirror's Edge 2, etc. — will ship optimized exclusively for AMD GPUs and CPUs. While Nvidia-based systems will be supported, the company won't be able to develop and distribute updated drivers until after each game is released.
Battlefield 4 will be exclusively optimized for AMD hardware.
This is huge news for AMD, as the Frostbite 3 engine will be used for all EA-published games going forward, with the exception of sports titles. The three mentioned above are huge, but this also includes Star Wars Battlefront, Dragon Age and even the next Mass Effect, so I can't really emphasize enough how big of a win this could be for AMD's marketing and developer relations teams.
I am particularly interested in this line as well:
While Nvidia-based systems will be supported, the company won't be able to develop and distribute updated drivers until after each game is released.
The world of PC optimizations and partnerships has been around for a long time, so this isn't a huge surprise for anyone who follows PC gaming. What is bothersome to me is that both EA and AMD are rumored to have agreed that NVIDIA won't get access to the game as it is being developed - something that is CRUCIAL for day-of driver releases and performance tweaks for GeForce card owners. In most cases, both AMD and NVIDIA developer relations teams get early access to game builds for PC titles in order to validate compatibility and improve performance for the public release. Without these builds, NVIDIA would be at a big disadvantage. This is exactly what happened with the recent Tomb Raider release.
AMD called me to reiterate their stance that competition does not automatically mean cutting out the other guy. In the Tomb Raider story linked above, Neil Robison, AMD's Senior Director of Consumer and Graphics Alliances, states quite plainly: "The thing that angers me the most is when I see a request to debilitate a game. I understand winning, I get that, and I understand aggressive companies, I get that. Why would you ever want to introduce a feature on purpose that would make a game not good for half the gaming audience?"
So what do we take away from that statement, made in a story published in March, and today's rumor? We have to take AMD at its word until we see solid evidence otherwise, or until enough cases of this occur that I feel I am being duped. AMD wants us all to know that it is playing the game the "right way" - a stance that just happens to run counter to this rumor.
NVIDIA had performance and compatibility issues with Tomb Raider upon release.
The irony in all of this is that AMD has been accusing NVIDIA of doing this exact thing for years - though without any public statements from developers, publishers or NVIDIA. When Batman: Arkham Asylum launched, AMD basically said that NVIDIA had locked it out of supporting antialiasing. In 2008, Assassin's Creed dropped DX 10.1 support, supposedly because NVIDIA, which didn't yet support DX 10.1 on GeForce cards, asked for it. There was even the claim that NVIDIA crippled multi-core CPU support in PhysX to help prop up GeForce sales. At the time, AMD PR spun all of this as the worst possible thing a company could do in the name of gamers, that it was bad for the industry, etc. But times change as opportunities change.
The cold truth is that this is why AMD decided to take the chance NVIDIA was allegedly unwilling to take, picking up the console design wins that are often described as "bad business." If settling for razor-thin margins on the consoles is the risk, the reward AMD is hoping for is exactly this: benefits in other markets thanks to better relationships with game developers.
Will the advantage be with AMD thanks to PS4 and Xbox One hardware?
At E3 I spoke in depth with both NVIDIA and AMD executives about this debate, and as you might expect both have very different opinions about what will transpire over the next 12-24 months. AMD views its position in the consoles as the big bet that is going to pay off in the more profitable PC space. NVIDIA thinks that AMD still doesn't have what it takes to truly support developers in the long run and doesn't have the engineers to innovate on the technology side. In my view, having Radeon-based processors in the Xbox One and PlayStation 4 (as well as the Wii U, I guess) gives AMD a head start but won't win it the race for the hearts and minds of PC gamers. There is still a lot of work to be done for that.
Before this story broke I was planning on outlining another editorial on this subject and it looks like it just got promoted to a top priority. There appear to be a lot of proverbial shoes left to drop in this battle, but it definitely needs more research and discussion.
Remember the issues with Batman: Arkham Asylum? I do.
I asked both NVIDIA and AMD for feedback on this story but only AMD has replied thus far. Robert Hallock, PR manager for gaming and graphics at AMD's Graphics Business Unit, sent me this:
It makes sense that game developers would focus on AMD hardware with AMD hardware being the backbone of the next console generation. At this time, though, our relationship with EA is exclusively focused on Battlefield 4 and its hardware optimizations for AMD CPUs, GPUs and APUs.
Not much there, but he is also not denying the original report from IGN. It might just be too early for a more official statement. I will update this story with information from NVIDIA if I hear anything else.
What do YOU think about this announcement? Is this good news for AMD and bad news for NVIDIA? Is it good or bad for the gamer, and in particular the PC gamer? Your input will help guide our upcoming talks with NVIDIA and AMD on the subject.
Just so we all have some clarification on this and on the potential for validity of the rumor, this is where I sourced the story from this afternoon:
END UPDATE #2