Quick Performance Comparison
Earlier this week, we posted a brief story that looked at the performance of Middle-earth: Shadow of Mordor on the latest GPUs from both NVIDIA and AMD. Last week also marked the release of the v1.11 patch for Sniper Elite 3 that introduced an integrated benchmark mode as well as support for AMD Mantle.
I decided that this was worth a quick look with the same lineup of graphics cards we used to test Shadow of Mordor. Let's see how the NVIDIA/AMD battle stacks up here.
For those unfamiliar with the Sniper Elite series, it focuses on the impact of an individual sniper on a particular conflict, and Sniper Elite 3 doesn't change that formula much. If you have ever seen video of a bullet slowly passing through a body, letting you see the bones and muscle of the enemy being killed... you've probably been watching the Sniper Elite games.
Gore and such aside, the game is fun and combines sniper action with stealth and puzzles. It's worth a shot if you are the kind of gamer that likes to use the sniper rifles in other FPS titles.
But let's jump straight to performance. You'll notice that in this story we are not using our Frame Rating capture performance metrics. That is a direct result of wanting to compare Mantle to DX11 rendering paths - since we have no way to create an overlay for Mantle, we have resorted to using FRAPS and the integrated benchmark mode in Sniper Elite 3.
Our standard GPU test bed was used with a Core i7-3960X processor, an X79 motherboard, 16GB of DDR3 memory, and the latest drivers for both parties involved. That means we installed Catalyst 14.9 for AMD and 344.16 for NVIDIA. We'll be comparing the GeForce GTX 980 to the Radeon R9 290X, and the GTX 970 to the R9 290. We will also look at SLI/CrossFire scaling at the high end.
Subject: Mobile | October 7, 2014 - 01:26 PM | Jeremy Hellstrom
Tagged: nvidia, msi, maxwell, GTX 980M, GTX 970M, gt dominator, gs stealth, gs ghost
You've heard about NVIDIA's new GTX 900M Series, and MSI has released two new families of gaming laptops that contain the new GPUs. The GS Stealth and Ghost series are the thinner, lighter, and more mobile of the laptops, while the Dominator Series is more of a desktop replacement and should also give you a good workout while you are carting it around. The base model will run you $1600, with more expensive options available, such as the limited edition Crimson Red Stealth Pro model. Before you ask: no, the integrated displays are not G-SYNC; however, since the mobile GPUs are based on Maxwell, you may be able to output to a G-SYNC monitor with variable refresh. Stay tuned for more.
City of Industry, Calif. – October 7, 2014 – MSI Computer Corp, a leading manufacturer of computer hardware products and solutions, announces the immediate availability of the critically acclaimed GS Stealth/Ghost Series and GT Dominator Series gaming laptops with NVIDIA’s latest GTX 900M Series graphics. Armed with unprecedented power and an array of cutting-edge features, including MSI’s new SHIFT technology, the new gaming notebooks deliver up to 28% more graphics performance for a mobile gaming experience without barriers.
“MSI’s newest gaming laptops showcase breakthroughs in mobile gaming technology that improve graphics performance, increase gaming comfort, and transport gamers into an unbelievable gaming journey,” says Andy Tung, president of MSI Pan America. “NVIDIA’s GTX 900M Series GPU delivers mobile graphics that are up to par with desktop graphics card models and superior to anything we’ve ever seen.”
SHIFT Your Speed
MSI’s newest gaming laptops feature their exclusive SHIFT power adjustment technology that enables easy tweaking of CPU and GPU performance to best suit the gamers’ needs. SHIFT comes with three proprietary modes: Sport to maximize CPU and GPU usage for extreme performance, Comfort for a smooth and balanced ride, and Green, which enables the lowest power consumption of both CPU and GPU while maintaining the coolest constant temperatures for both.
MSI provides unprecedented customization in all NVIDIA GeForce GTX 900M Series graphics equipped laptops via the Dragon Gaming Center and SteelSeries Engine. Gamers can SHIFT CPU and GPU usage through the Dragon Gaming Center as well as monitor system performance, temperature, network speed, power consumption, fan speed and more. The SteelSeries Engine gives gamers the ability to personalize playing style with over a billion customization options, program individual keys for unlimited configurations, determine key color and lighting patterns, save and share configurations, and learn gaming patterns with key usage statistics.
Cutting Edge Components
All revamped gaming laptops come with 4th Gen Intel Core i7 processors, Killer E2200 Game Networking, Sound Blaster Cinema, Dynaudio Technology, XSplit Gamecaster, 4K HDMI Output, Matrix Display and NVIDIA Surround View. NVIDIA Surround is now supported on all next gen models, allowing gamers to immerse themselves in the ultimate gaming experience. Select models come with MSI’s Super RAID technology which supports up to 4x M.2 SATA SSDs in RAID 0, ultra-high resolution 3K displays, and Killer DoubleShot Pro combining Killer E2200 Game Networking with Killer N1525 Wireless AC.
MSI’s latest update applies to the ultra-thin and light GS70 Stealth Pro and GS60 Ghost Pro models, and the potent GT72 Dominator, GT70 Dominator and GT60 Dominator gaming laptops. All GS and GT gaming laptops equipped with NVIDIA’s GTX 900M graphics are available now starting at $1,599.99.
In addition, MSI is launching a special edition GS70 Stealth Pro in Crimson Red, catering to the demands of gamers who want more choices in color and style. The GS70 Stealth Pro Crimson Red edition will be available through online retailers with next gen graphics.
If there is one message that I get from NVIDIA's GeForce GTX 900M-series announcement, it is that laptop gaming is a first-class citizen in their product stack. Before even mentioning the products, the company provided relative performance differences between high-end desktops and laptops. Most of the rest of the slide deck is showing feature-parity with the desktop GTX 900-series, and a discussion about battery life.
First, the parts. Two products have been announced: the GeForce GTX 980M and the GeForce GTX 970M. Both are based on the 28nm Maxwell architecture. In terms of shading performance, the GTX 980M has a theoretical maximum of 3.189 TFLOPS, and the GTX 970M is calculated at 2.365 TFLOPS (at base clock). On the desktop, this is very close to the GeForce GTX 770 and the GeForce GTX 760 Ti, respectively. This metric is most useful when you're shader compute-bound, such as at high resolution with complex shaders.
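For reference, those theoretical numbers fall out of a simple formula: CUDA cores × 2 FLOPs per clock (one fused multiply-add) × clock rate. A quick sketch, assuming the published base specs of 1536 cores at 1038 MHz for the GTX 980M and 1280 cores at 924 MHz for the GTX 970M:

```python
# Theoretical peak FP32 throughput: cores x 2 FLOPs per clock (FMA) x clock rate.
def peak_tflops(cuda_cores, base_clock_mhz):
    return cuda_cores * 2 * base_clock_mhz * 1e6 / 1e12

# Assumed base specs: GTX 980M -> 1536 cores @ 1038 MHz,
#                     GTX 970M -> 1280 cores @ 924 MHz.
print(round(peak_tflops(1536, 1038), 3))  # 3.189
print(round(peak_tflops(1280, 924), 3))   # 2.365
```

Both results match the figures NVIDIA quotes, which suggests the marketing numbers are straight base-clock calculations rather than boost-clock ones.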
The full specifications are:
| | GTX 980M | GTX 970M |
| --- | --- | --- |
| Memory | Up to 4GB | Up to 3GB |
| Memory Rate | 2500 MHz | 2500 MHz |
As for the features, it should be familiar for those paying attention to both desktop 900-series and the laptop 800M-series product launches. From desktop Maxwell, the 900M-series is getting VXGI, Dynamic Super Resolution, and Multi-Frame Sampled AA (MFAA). From the latest generation of Kepler laptops, the new GPUs are getting an updated BatteryBoost technology. From the rest of the GeForce ecosystem, they will also get GeForce Experience, ShadowPlay, and so forth.
For VXGI, DSR, and MFAA, please see Ryan's discussion for the desktop Maxwell launch. Information about these features is basically identical to what was given in September.
BatteryBoost, on the other hand, is a bit different. NVIDIA claims that the biggest change is just raw performance and efficiency, giving you more headroom to throttle. Perhaps more interesting though, is that GeForce Experience will allow separate one-click optimizations for both plugged-in and battery use cases.
The power efficiency demonstrated with the Maxwell GPU in Ryan's original GeForce GTX 980 and GTX 970 review is even more beneficial for the notebook market, where thermal designs are physically constrained. Longer battery life, as well as thinner and lighter gaming notebooks, will see tremendous advantages from a GPU that can run at near peak performance on the maximum power output of an integrated battery. In NVIDIA's presentation, they mention that while notebooks on AC power can use as much as 230 watts, batteries tend to peak around 100 watts. Given that a full-speed, desktop-class GTX 980 has a TDP of 165 watts, compared to the 250 watts of a Radeon R9 290X, notebook GPU performance should more closely mirror that of its desktop brethren.
Of course, you probably will not buy your own laptop GPU; rather, you will be buying devices which integrate these. There are currently five designs across four manufacturers that are revealed (see image above). Three contain the GeForce GTX 980M, one has a GTX 970M, and the other has a pair of GTX 970Ms. Prices and availability are not yet announced.
In what can most definitely be called the best surprise of the fall game release schedule, the open-world action game set in the Lord of the Rings world, Middle-earth: Shadow of Mordor has been receiving impressive reviews from gamers and the media. (GiantBomb.com has a great look at it if you are new to the title.) What also might be a surprise to some is that the PC version of the game can be quite demanding on even the latest PC hardware, pulling in frame rates only in the low-60s at 2560x1440 with its top quality presets.
Late last week I spent a couple of days playing around with Shadow of Mordor as well as the integrated benchmark found inside the Options menu. I wanted to get an idea of the performance characteristics of the game to determine if we might include this in our full-time game testing suite update we are planning later in the fall. To get some sample information I decided to run through a couple of quality presets with the top two cards from NVIDIA and AMD and compare them.
Without a doubt, the visual style of Shadow of Mordor is stunning – with the game settings cranked up high, the world, characters and fighting scenes look and feel amazing. To be clear, in the build-up to this release we had really not heard anything from the developer or NVIDIA (there is an NVIDIA splash screen at the beginning) about the title, which is out of the ordinary. If you are looking for a game that is both fun to play (I am 4+ hours in myself) and can provide a "wow" factor to show off your PC rig, then this is definitely worth picking up.
Subject: General Tech | October 2, 2014 - 02:05 PM | Ken Addison
Tagged: X99 Classified, X99, video, tlc, tegra k1, ssd, Samsung, podcast, nvidia, micron, M600, iphone 6, g-sync, freesync, evga, broadwell-u, Broadwell, arm, apple, amd, adaptive sync, a8, 840 evo, 840
PC Perspective Podcast #320 - 10/02/2014
Join us this week as we discuss the Micron M600 SSD, NVIDIA and Adaptive Sync, Windows 10 and more!
The URL for the podcast is: http://pcper.com/podcast - Share with your friends!
- iTunes - Subscribe to the podcast directly through the Store
- RSS - Subscribe through your regular RSS reader
- MP3 - Direct download link to the MP3 file
Hosts: Ryan Shrout, Jeremy Hellstrom, Josh Walrath, and Allyn Malventano
Program length: 1:27:21
Subject: General Tech | October 1, 2014 - 01:09 PM | Jeremy Hellstrom
Tagged: nvidia, maxwell, GTX 980, GTX 970, GM204, geforce, dx12, dsr
Move over Super Best Friends, the Dynamic Super Resolution Duo is here to slay the evil Jaggies! Ryan covered NVIDIA's new DSR in his review of the new Maxwell cards: it renders a game at a resolution much higher than your monitor's native 2560x1440 or lower, then scales the image down using a process similar to supersampling that is in fact a 13-tap gaussian filter. The distinction matters because straight supersampling would face some interesting challenges rendering 2560x1440 down to a 1080p monitor. DSR gives you a much wider choice of resolutions, as you can see in the Guild Wars screenshot below, letting you apply a variety of multipliers to your display's native resolution for a much smoother look. The Tech Report has assembled a variety of screenshots from games with different DSR and AA settings which you can examine with your own eyeballs to see what you think.
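NVIDIA has not published the exact filter coefficients, but the core idea of a Gaussian-weighted downsample can be sketched in a few lines. This is an illustrative simplification (one grayscale row, hypothetical sigma, clamped edges), not NVIDIA's actual implementation:

```python
import math

def gaussian_kernel(taps=13, sigma=2.0):
    # Symmetric 1D Gaussian weights, normalized so they sum to 1.
    half = taps // 2
    w = [math.exp(-(i * i) / (2.0 * sigma * sigma)) for i in range(-half, half + 1)]
    total = sum(w)
    return [v / total for v in w]

def downsample_row(pixels, kernel, factor=2):
    # Filter, then decimate one row of values -- the basic mechanism for
    # resolving a higher-resolution render onto a lower-resolution display.
    half = len(kernel) // 2
    out = []
    for center in range(0, len(pixels), factor):
        acc = 0.0
        for k, w in enumerate(kernel):
            idx = min(max(center + k - half, 0), len(pixels) - 1)  # clamp at edges
            acc += pixels[idx] * w
        out.append(acc)
    return out
```

Because each output pixel blends many rendered samples, hard stair-step edges are averaged away, which is why DSR output looks smoother than a naive nearest-neighbor rescale.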
"One of the more intriguing capabilities Nvidia introduced with the GeForce GTX 970 and 980 is a feature called Dynamic Super Resolution, or DSR, for short. Nvidia bills it as a means of getting 4K quality on a 2K display. How good is it? We take a look."
Here is some more Tech News from around the web:
- There's more to Windows 10 than miscounting @ The Inquirer
- Microsoft WINDOWS 10: Seven ATE Nine. Or Eight did really @ The Register
- AMD demonstrates NFV tool using 64-bit ARM-based SoC codenamed 'Hierofalcon' @ The Inquirer
- Hong Kong Protesters Use Mesh Networks To Organize @ Slashdot
- Mozilla might add Tor encryption to its Firefox web browser @ The Inquirer
- Lenovo becomes the biggest x86 server provider in China as acquisition of IBM x86 server business completes, says IDC @ DigiTimes
- Supercomputers: The Next Generation – Cray puts burst buffer tech, Intel Haswell inside @ The Register
- Competition: Win One of Three BioStar Motherboards @ eTeknix
Subject: Graphics Cards | September 27, 2014 - 07:24 PM | Ryan Shrout
Tagged: nvidia, maxwell, gsync, g-sync, freesync, adaptive sync
During an interview that we streamed live with NVIDIA's Tom Petersen this past Thursday, it was confirmed that NVIDIA is not currently working on, nor has any current plans to add, support for the VESA-based and AMD-pushed Adaptive Sync portion of the DisplayPort 1.2a specification. To quote directly:
"There is no truth [to that rumor of NVIDIA Adaptive Sync support] and we have made no official comments about Adaptive Sync. One thing I can say is that NVIDIA as a company is 100% dedicated to G-Sync. We are going to continue to invest in G-Sync and it is a way we can make the gaming experience better. We have no need for Adaptive Sync. We have no intention of [implementing it]."
To be clear, the Adaptive Sync part of DP 1.2a and 1.3+ is an optional portion of the VESA spec that is not required for future graphics processors or even future display scaler chips. That means that upcoming graphics cards from NVIDIA could still be DisplayPort 1.3 compliant without implementing support for the Adaptive Sync feature. Based on the comments above, I fully expect that to be the case.
The ASUS ROG Swift PG278Q G-Sync monitor
With that new information, you can basically assume that the future of variable refresh monitors is going to be divided: one set for users of GeForce cards and one set for users with Radeon cards. (Where Intel falls into this is up in the air.) Clearly that isn't ideal for a completely open ecosystem, but NVIDIA has made the point, over and over, that what it has developed with G-Sync is difficult, and not something that could be solved with the blunt instrument that is Adaptive Sync. NVIDIA has a history of producing technologies and then keeping them in-house, focusing on development specifically for GeForce owners and fans. The dream of having a VRR monitor that will run on both vendors' GPUs appears to be dead.
When asked about the possibility of seeing future monitors that can support both NVIDIA G-Sync technology as well as Adaptive Sync technology, Petersen stated that while not impossible, he "would not expect to see such a device."
The future of G-Sync is still in development. Petersen stated:
"Don't think that we're done. G-Sync is not done. Think of G-Sync as the start of NVIDIA solving the problems for gamers that are related to displays...G-Sync is our first technology that makes games look better on displays. But you can start looking at displays and make a lot of things better."
Diagram showing how G-Sync affects monitor timings
So now we await the first round of prototype FreeSync / Adaptive Sync monitors to hit our labs. AMD has put a lot of self-inflicted pressure on itself for this release by claiming, numerous times, that FreeSync will be just as good an experience as G-Sync, and I am eager to see if they can meet that goal. Despite any ill feelings that some users might have about NVIDIA and some of its policies, it typically does a good job of maintaining a high quality user experience with these custom technologies. AMD will have to prove that what it has developed is on the same level. We should know more about that before we get too much further into fall.
You can check out our stories and reviews covering G-Sync here:
- PCPer Live! NVIDIA Maxwell, GTX 980, GTX 970 Discussion with Tom Petersen, Q&A
- Acer XB280HK 28-in 4K G-Sync Monitor Review
- NVIDIA G-Sync Surround Impressions: Using 3 ASUS ROG Swift Displays
- PCPer Live! Recap - NVIDIA G-Sync Surround Demo and Q&A
- ASUS ROG Swift PG278Q 27-in Monitor Review - NVIDIA G-Sync at 2560x1440
Subject: Graphics Cards | September 26, 2014 - 12:14 PM | Ryan Shrout
Tagged: vxgi, video, tom petersen, nvidia, mfaa, maxwell, livestream, live, GTX 980, GTX 970, dsr
UPDATE: If you missed the live stream yesterday, I have good news: the interview and all the information/demos provided are available to you on demand right here. Enjoy!
Last week NVIDIA launched GM204, otherwise known as Maxwell and now branded as the GeForce GTX 980 and GTX 970 graphics cards. You should, of course, have already read the PC Perspective review of these two GPUs, but undoubtedly there are going to be questions and thoughts circulating through the industry.
To help the community get a better grasp and to offer them an opportunity to ask some questions, NVIDIA's Tom Petersen is stopping by our offices on Thursday afternoon where he will run through some demonstrations and take questions from the live streaming audience.
Be sure to stop back at PC Perspective on Thursday, September 25th at 4pm ET / 1pm PT to discuss the new Maxwell GPU, the GTX 980 and GTX 970, new features like Dynamic Super Resolution, MFAA, VXGI and more! You'll find it all on our PC Perspective Live! page on Monday but you can sign up for our "live stream mailing list" as well to get notified in advance!
NVIDIA Maxwell Live Stream
1pm PT / 4pm ET - September 25th
We also want your questions! The easiest way to get them answered is to leave them for us here in the comments of this post. That will give us time to filter through the questions and get the answers you need from Tom. We'll take questions via the live chat and via Twitter (follow me @ryanshrout) during the event, but often there is a lot of noise to deal with.
So be sure to join us on Thursday afternoon!
UPDATE: We have confirmed at least a handful of prizes for those of you that tune into the live stream today. We'll give away an NVIDIA SHIELD as well as several of the brand new SLI LED bridges that were announced for sale this week!
Subject: General Tech, Graphics Cards | September 26, 2014 - 02:03 AM | Scott Michaud
Tagged: steam, precisionx 16, precisionx, overclocking, nvidia, evga
If you were looking to download EVGA Precision X recently, you were likely disappointed. For a few months now, the software was unavailable because of a disagreement between the add-in board (AIB) partner and Guru3D (and the RivaTuner community). EVGA maintains that it was a completely original work, and references to RivaTuner are a documentation error. As a result, they pulled the tool just a few days after launching X 15.
This new version, besides probably cleaning up all of the existing issues mentioned above, adds support for the new GeForce GTX 900-series cards, a new interface, an "OSD" for inside applications, and Steam Achievements (??). You can get a permanent badge on your Steam account for breaking 1200 MHz on your GPU, taking a screenshot, or restoring settings to default. I expect that latter badge is meant to be one of shame, like the Purple Heart from Battlefield, but pressing it is not actually a bad thing and says nothing about your overclocking skills. Seriously, save yourself some headache and just press default if things do not seem right.
PrecisionX 16 is free, available now, and doesn't require an EVGA card (just a site sign-up).
Subject: General Tech, Mobile | September 26, 2014 - 01:45 AM | Scott Michaud
Tagged: tablet, Nexus, google, nexus 9, nvidia, tegra k1
The Nexus line is due for an update, with each product having been on the market for at least a year. They are devices which embody Google's vision... for their own platform. You can fall on either side of that debate - whether the line guides OEM partners or is simply another shard of the fragmentation issue, if you even believe that fragmentation is bad - but they are easy to recommend and a good benchmark for Android.
We are expecting a few new entries in the coming months, one of which is the Nexus 9. Of note, it is expected to mark the return of HTC to the Nexus brand. They were the launch partner with the Nexus One and then promptly exited stage left as LG, Samsung, and ASUS performed the main acts.
We found this out because NVIDIA spilled the beans in its lawsuit filing against Qualcomm and Samsung. Apparently, "the HTC Nexus 9, expected in the third quarter of 2014, is also expected to use the Tegra K1". The filing has since been revised to remove the reference. While the K1 has a significant GPU to back it up, it will likely be driving a very high resolution display. The Nexus 6 is expected to launch at around the same time, along with Android 5.0 itself, and the 5.2-inch phone is rumored to have a 1440p display. It seems unlikely that a larger, tablet display will be lower resolution than the phone it launches alongside -- and there's not much room above it.
The Google Nexus 9 is expected for "Q3".