In what is easily the best surprise of the fall game release schedule, Middle-earth: Shadow of Mordor, the open-world action game set in the Lord of the Rings universe, has been receiving impressive reviews from gamers and the media. (GiantBomb.com has a great look at it if you are new to the title.) What might also surprise some is that the PC version of the game can be quite demanding on even the latest PC hardware, pulling in frame rates only in the low 60s at 2560x1440 with its top quality presets.
Late last week I spent a couple of days playing around with Shadow of Mordor as well as the integrated benchmark found inside the Options menu. I wanted to get an idea of the performance characteristics of the game to determine if we might include this in our full-time game testing suite update we are planning later in the fall. To get some sample information I decided to run through a couple of quality presets with the top two cards from NVIDIA and AMD and compare them.
Without a doubt, the visual style of Shadow of Mordor is stunning – with the settings cranked up, the world, characters, and fighting scenes look and feel amazing. To be clear, in the build-up to this release we had heard essentially nothing from the developer or from NVIDIA (there is an NVIDIA splash screen at the beginning), which is out of the ordinary. If you are looking for a game that is both fun to play (I am 4+ hours in myself) and can provide a “wow” factor to show off your PC rig, then this is definitely worth picking up.
Installation and Overview
While once a very popular way to cool a PC, the art of the custom water loop tapered off in the early 2000s as the benefits of better cooling, and of overclocking in general, met with diminishing returns. In its place grew a host of companies offering closed-loop systems: individually sealed coolers for processors and even graphics cards that offered some of the benefits of traditional water cooling (noise, performance) without the hassle of setting up a loop manually.
A bit of a resurgence has occurred in the last year or two, though, as the art and styling of custom water loop cooling begins to reassert itself in the PC enthusiast mindset. Some companies never left (EVGA being one of them), but many users appear to be returning to it. Consider me part of that crowd.
During a live stream we held with EVGA's Jacob Freeman, the very first prototype of the EVGA Hydro Copper was shown and discussed. Lucky for us, I was able to coerce Jacob into leaving the water block with me for a few days to do some of our testing and see just how much capability we could pull out of the GM204 GPU and a GeForce GTX 980.
Our performance preview today will look at the water block itself, installation, performance and temperature control. Keep in mind that this is a very early prototype, the first one to make its way to US shores. There will definitely be some changes and updates (in both the hardware and the software support for overclocking) before final release in mid to late October. Should you consider this ~$150 Hydro Copper water block for your GTX 980?
Subject: General Tech | October 1, 2014 - 01:09 PM | Jeremy Hellstrom
Tagged: nvidia, maxwell, GTX 980, GTX 970, GM204, geforce, dx12, dsr
Move over Super Best Friends, the Dynamic Super Resolution Duo is here to slay the evil Jaggies! Ryan covered NVIDIA's new DSR in his review of the new Maxwell cards: it renders a game at a resolution higher than your monitor's native 2560x1440 (or lower) resolution, then scales the image back down using a process similar to supersampling, though the downscale is in fact a 13-tap Gaussian filter. That matters because plain supersampling would face some interesting challenges displaying a 2560x1440 render on a 1080p monitor. DSR gives you a much wider choice of resolutions, as you can see in the Guild Wars screenshot below, letting you pick from a variety of multipliers of your display's native resolution to give your game a much smoother look. The Tech Report has assembled a variety of screenshots from games with different DSR and AA settings which you can examine with your own eyeballs to see what you think.
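To make the idea concrete, here is a minimal sketch of the render-high-then-filter-down process DSR is described as using. NVIDIA has not published the exact kernel weights, so the `sigma` value and the 2x factor below are illustrative assumptions; only the 13-tap Gaussian shape comes from the description above.

```python
import numpy as np

def gaussian_kernel(taps=13, sigma=2.0):
    """1D Gaussian kernel, normalized to sum to 1.
    sigma is an assumed value; NVIDIA's actual weights are unpublished."""
    x = np.arange(taps) - (taps - 1) / 2
    k = np.exp(-0.5 * (x / sigma) ** 2)
    return k / k.sum()

def dsr_downsample(frame, factor):
    """Blur a high-res frame with a separable 13-tap Gaussian,
    then decimate down to the native resolution."""
    k = gaussian_kernel()
    # Separable blur: filter each row, then each column.
    blurred = np.apply_along_axis(
        lambda r: np.convolve(r, k, mode="same"), 1, frame)
    blurred = np.apply_along_axis(
        lambda c: np.convolve(c, k, mode="same"), 0, blurred)
    # Keep every factor-th sample to land on the native grid.
    return blurred[::factor, ::factor]

# Pretend we rendered at 2x native resolution (tiny stand-in frame here),
# then filter down to the panel's native size.
hi_res = np.random.rand(216, 384).astype(np.float64)
native = dsr_downsample(hi_res, 2)
print(native.shape)
```

The Gaussian pre-filter is what distinguishes this from naive decimation: it averages neighboring high-resolution samples before they are dropped, which is where the smoother, less aliased look comes from.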
"One of the more intriguing capabilities Nvidia introduced with the GeForce GTX 970 and 980 is a feature called Dynamic Super Resolution, or DSR, for short. Nvidia bills it as a means of getting 4K quality on a 2K display. How good is it? We take a look."
Here is some more Tech News from around the web:
- There's more to Windows 10 than miscounting @ The Inquirer
- Microsoft WINDOWS 10: Seven ATE Nine. Or Eight did really @ The Register
- AMD demonstrates NFV tool using 64-bit ARM-based SoC codenamed 'Hierofalcon' @ The Inquirer
- Hong Kong Protesters Use Mesh Networks To Organize @ Slashdot
- Mozilla might add Tor encryption to its Firefox web browser @ The Inquirer
- Lenovo becomes the biggest x86 server provider in China as acquisition of IBM x86 server business completes, says IDC @ DigiTimes
- Supercomputers: The Next Generation – Cray puts burst buffer tech, Intel Haswell inside @ The Register
- Competition: Win One of Three BioStar Motherboards @ eTeknix
Subject: Graphics Cards | September 19, 2014 - 02:17 PM | Jeremy Hellstrom
Tagged: vr direct, video, nvidia, mfaa, maxwell, GTX 980, GTX 970, GM204, geforce, dx12, dsr
The answers to the two most important questions are as follows: the GTX 980 will cost you around $560, compared to $500 for an R9 290X, while the GTX 970 sits at an attractive $330 against $380 for an R9 290. Availability is hard to predict, but the cards will be shipping soon and you can pre-order your choice of card by following the links on the last page of Ryan's review. Among all the new features added to this GPU, one of the most impressive is the power draw: as you can see in [H]ard|OCP's review, this card pulls 100W less than the 290X at full load, although it did run warmer than the 290X Double Dissipation card [H] compared it to, something that may change once a 980 bears a custom cooler. Follow those links to see the benchmarking results of this card, both synthetic and in-game.
"Today NVIDIA launches its newest Maxwell GPU. There will be two new GPUs, the GeForce GTX 980 and GeForce GTX 970. These next generation GPUs usher in new features and performance that move the gaming industry forward. We discuss new features, architecture, and evaluate the gameplay performance against the competition."
Here are some more Graphics Card articles from around the web:
- Nvidia's GeForce GTX 980 and 970 @ The Tech Report
- Maxwell is here! GTX 980 is now revealed! @ Bjorn3D.com
- Nvidia unveils second-gen Maxwell GPUs, the Geforce GTX 970 and GTX 980 @ The Inquirer
- NVIDIA GeForce GTX 980 4GB Review @ HiTech Legion
- NVIDIA GeForce GTX 980 Review @ Neoseeker
- NVIDIA GTX 980 Review @ OCC
- NVIDIA GeForce GTX 980 Maxwell Video Card Review @ Legit Reviews
- MSI GTX 970 Gaming 4G GPU Review @ Modders-Inc
- GeForce GTX 980 @ HardwareHeaven
- NVIDIA GeForce GTX 980 @ Benchmark Reviews
- NVIDIA GeForce GTX 980 Review: Does Maxwell Bring Maximum Gameplay? @ Techgage
- ASUS Strix GTX 970 OC 4 GB @ techPowerUp
- NVIDIA GeForce GTX 980 Performance Review @ Hardware Canucks
- EVGA GTX 970 SC ACX 4 GB @ techPowerUp
- NVIDIA GeForce GTX 970 SLI @ techPowerUp
- NVIDIA GeForce GTX 980 4 GB @ techPowerUp
- Gigabyte G1 Gaming GTX 970 & GTX 980 @ Legion Hardware
- Nvidia Geforce GTX 980 @ Kitguru
- Asus GTX970 STRIX OC @ Kitguru
- MSI GTX 970 Gaming 4G @ Kitguru
The GM204 Architecture
James Clerk Maxwell's equations are the foundation of our society's knowledge of optics and electrical circuits. It is a fitting tribute for NVIDIA to use Maxwell as the code name for a GPU architecture, and NVIDIA hopes the features, performance, and efficiency it has built into the GM204 GPU are something Maxwell himself would be impressed by. Without giving away the surprise conclusion here in the lead, I can tell you that I have never seen a GPU perform as well as this one has this week, all while changing the power efficiency discussion in dramatic fashion.
To be fair though, this isn't our first experience with the Maxwell architecture. With the release of the GeForce GTX 750 Ti and its GM107 GPU, NVIDIA put the industry on watch and let us all ponder if they could possibly bring such a design to a high end, enthusiast class market. The GTX 750 Ti brought a significantly lower power design to a market that desperately needed it, and we were even able to showcase that with some off-the-shelf PC upgrades, without the need for any kind of external power.
That was GM107 though; today's release is the GM204, indicating that we are seeing not only the larger cousin of the GTX 750 Ti but also at least some moderate GPU architecture and feature changes from the first run of Maxwell. The GeForce GTX 980 and GTX 970 are going to be taking on the best products from the GeForce lineup as well as the AMD Radeon family of cards, with aggressive pricing and performance levels to match. And, for those who understand the technology at a fundamental level, you will likely be surprised by how little power it requires to achieve these goals. Toss in support for a new AA method, Dynamic Super Resolution, and even improved SLI performance, and you can see why doing it all on the same process technology is impressive.
The NVIDIA Maxwell GM204 Architecture
The NVIDIA Maxwell GM204 graphics processor was built from the ground up with an emphasis on power efficiency. As it was stated many times during the technical sessions we attended last week, the architecture team learned quite a bit while developing the Kepler-based Tegra K1 SoC and much of that filtered its way into the larger, much more powerful product you see today. This product is fast and efficient, but it was all done while working on the same TSMC 28nm process technology used on the Kepler GTX 680 and even AMD's Radeon R9 series of products.
The fundamental structure of GM204 is set up like the GM107 product shipped as the GTX 750 Ti. There is an array of GPCs (Graphics Processing Clusters), each comprised of multiple SMs (Streaming Multiprocessors, also called SMMs for this Maxwell derivative), along with external memory controllers. The GM204 chip (the full implementation of which is found on the GTX 980) consists of 4 GPCs, 16 SMMs, and four 64-bit memory controllers.
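The configuration above is enough to work out the headline totals for the full chip. The 128 CUDA cores per SMM is NVIDIA's published Maxwell figure; the arithmetic below is just a back-of-the-envelope check, not anything beyond what the spec sheet implies.

```python
# Totals for the full GM204 configuration described above.
GPCS = 4
SMMS = 16
CORES_PER_SMM = 128            # published Maxwell SMM width
MEM_CONTROLLERS = 4
CONTROLLER_WIDTH_BITS = 64

cuda_cores = SMMS * CORES_PER_SMM                     # 16 * 128 = 2048
bus_width = MEM_CONTROLLERS * CONTROLLER_WIDTH_BITS   # 4 * 64 = 256-bit
smms_per_gpc = SMMS // GPCS                           # 4 SMMs per GPC

print(cuda_cores, bus_width, smms_per_gpc)
```

Cut-down parts like the GTX 970 disable some SMMs from this full layout, which is why their core counts come in below the 2048 of the GTX 980.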
Subject: General Tech, Graphics Cards | September 15, 2014 - 08:02 PM | Scott Michaud
Tagged: nvidia, geforce, GTX 980
Details and photographs of the GeForce GTX 980 are leaking on various forums and websites. Based on the Maxwell architecture, it is supposed to be faster and more efficient than Kepler while being manufactured on an identical, 28nm fab process. While we were uncertain before, it now looks like the GTX 980 will be its formal name, as seen in leaked photographs, below.
Image Credit: Videocardz
As expected, the cooler is a continuation of NVIDIA's reference design, as seen on recent high-end graphics cards (such as the GeForce Titan). Again, this is not a surprise. The interesting part is that it is rated for about 250W, whereas Maxwell is rumored to draw 180W. The reference card has two six-pin PCIe power connectors, and I am curious to see whether the excess cooling capacity will lead to interesting overclocks. That is not even mentioning what the AIB partners can do.
Image Credit: Videocardz
Beyond its over-engineering for Maxwell's TDP, it also includes a back plate.
Its display connectors have been hotly anticipated. As you can see above, the GTX 980 has five outputs: three DisplayPort, one HDMI, and one DVI. Which version of HDMI? Which version of DisplayPort? No clue at the moment. There has been some speculation regarding HDMI 2.0, and the DisplayPort 1.3 standard was just released to the public today, but I would be pleasantly surprised if even one of these made it in.
Check out Videocardz for a little bit more.
Subject: Graphics Cards, Displays | August 22, 2014 - 08:05 PM | Ryan Shrout
Tagged: video, gsync, g-sync, tom petersen, nvidia, geforce
Earlier today we had NVIDIA's Tom Petersen in studio to discuss the retail availability of G-Sync monitors as well as to get hands on with a set of three ASUS ROG Swift PG278Q monitors running in G-Sync Surround! It was truly an impressive sight and if you missed any of it, you can catch the entire replay right here.
Even if seeing the ASUS PG278Q monitor again doesn't interest you (we have our full review of the monitor right here), you won't want to miss the very detailed Q&A that occurs, answering quite a few reader questions about the technology. Covered items include:
- Potential added latency of G-Sync
- Future needs for multiple DP connections on GeForce GPUs
- Upcoming 4K and 1080p G-Sync panels
- Can G-Sync Surround work through an MST Hub?
- What happens to G-Sync when the frame rate exceeds the panel refresh rate? Or drops below minimum refresh rate?
- What does that memory on the G-Sync module actually do??
- A demo of the new NVIDIA SHIELD Tablet capabilities
- A whole lot more!
Another big thank you to NVIDIA and Tom Petersen for stopping out our way and for spending the time to discuss these topics with our readers. Stay tuned here at PC Perspective as we will have more thoughts and reactions to G-Sync Surround very soon!!
Subject: General Tech | August 13, 2014 - 12:26 PM | Jeremy Hellstrom
Tagged: borderlands, nvidia, geforce
Santa Clara, CA — August 12, 2014 — Get ready to shoot ‘n’ loot your way through Pandora’s moon. Starting today, gamers who purchase select NVIDIA GeForce GTX TITAN, 780 Ti, 780, and 770 desktop GPUs will receive a free copy of Borderlands: The Pre-Sequel, the hotly anticipated new chapter to the multi-award winning Borderlands franchise from 2K and Gearbox Software.
Discover the story behind Borderlands 2’s villain, Handsome Jack, and his rise to power. Taking place between the original Borderlands and Borderlands 2, Borderlands: The Pre-Sequel offers players a whole lotta new gameplay in low gravity.
“If you have a high-end NVIDIA GPU, Borderlands: The Pre-Sequel will offer higher fidelity and higher performance hardware-driven special effects including awesome weapon impacts, moon-shatteringly cool cryo explosions and ice particles, and cloth and fluid simulation that blows me away every time I see it," said Randy Pitchford, CEO and president of Gearbox Software.
With NVIDIA PhysX technology, you will feel deep space like never before. Get high in low gravity and use new ice and laser weapons to experience destructible levels of mayhem. Check out the latest trailer here: http://youtu.be/c9a4wr4I1hk that just went live this morning!
Borderlands: The Pre-Sequel will also stream to your NVIDIA SHIELD tablet or portable. For the first time ever, you can play Claptrap anywhere by using NVIDIA GameStream technologies. You can even livestream and record every fist punch with GeForce ShadowPlay.
Borderlands: The Pre-Sequel will be available on October 14, 2014 in North America and on October 17, 2014 internationally. Borderlands: The Pre-Sequel is not yet rated by the ESRB.
The GeForce GTX and Borderlands: The Pre-Sequel bundle is available starting today from leading e-tailers including Amazon, NCIX, Newegg, and Tiger Direct and system builders including Canada Computers, Digital Storm, Falcon Northwest, Maingear, Memory Express, Origin PC, V3 Gaming, and Velocity Micro. For a full list of participating partners, please visit: www.GeForce.com/GetBorderlands.
Subject: Displays | August 12, 2014 - 03:36 PM | Jeremy Hellstrom
Tagged: asus, g-sync, geforce, gsync, nvidia, pg278q, Republic of Gamers, ROG, swift, video
Ryan was not the only one to test the ASUS ROG Swift PG278Q G-Sync monitor; Overclockers Club also received a unit. Their impressions of the 27" 2560x1440 TN panel were very similar: once they saw this monitor in action, going back to their 30-inch 60Hz IPS display was not as enjoyable as it once was. The only bad thing they could say about the display was the MSRP; $800 is steep for any monitor and makes it rather difficult to even consider getting two or more of them for a multi-display setup.
”When you get down to it, the facts are that even with a TN panel being used for the high refresh rate, the ASUS ROG Swift PG278Q G-Sync monitor delivers great picture quality and truly impressive gaming. I could go on all day long about how smooth each of the games played while testing this monitor, but ultimately not be able to show you without having you sit at the desk with me. No stuttering, no tearing, no lag; it's like getting that new car and having all the sales hype end up being right on the money. When I flip back and forth between my 60Hz monitor and the PC278Q, its like a night and day experience.”
Here are some more Display articles from around the web:
- AOC G2460PG G-Sync 144Hz 1ms Gaming Monitor @ Kitguru
- Asus ROG Swift PG278Q 144hz G-Sync Monitor @ Kitguru
- 6400×1080: Testing Mixed-Resolution AMD Eyefinity @ eTeknix
- Demystifying NTSC Color And Progressive Scan @ Hack a Day
The Waiting Game
NVIDIA G-Sync was announced at a media event held in Montreal way back in October, and promised to revolutionize the way the display and graphics card worked together to present images on the screen. It was designed to remove hitching, stutter, and tearing -- almost completely. Since that fateful day in October of 2013, we have been waiting. Patiently waiting. We were waiting for NVIDIA and its partners to actually release a monitor that utilizes the technology and that can, you know, be purchased.
In December of 2013 we took a look at the ASUS VG248QE monitor, the display for which NVIDIA released a mod kit allowing users that already owned it to upgrade to G-Sync compatibility. It worked, and I even came away impressed. I noted in my conclusion that “there isn't a single doubt that I want a G-Sync monitor on my desk” and “my short time with the NVIDIA G-Sync prototype display has been truly impressive…”. That was nearly seven months ago, and I don’t think anyone at the time really believed it would be THIS LONG before real monitors began to show up in the hands of gamers around the world.
Since NVIDIA’s October announcement, AMD has been on a marketing path with a technology they call “FreeSync” that claims to be a cheaper, standards-based alternative to NVIDIA G-Sync. They first previewed the idea of FreeSync on a notebook device during CES in January and then showed off a prototype monitor in June during Computex. Even more recently, AMD has posted a public FAQ that gives more details on the FreeSync technology and how it differs from NVIDIA’s creation; it has raised something of a stir with its claims on performance and cost advantages.
That doesn’t change the product we are reviewing today, of course. The ASUS ROG Swift PG278Q 27-in WQHD display with a 144 Hz refresh rate is truly an awesome monitor. What has changed is the landscape between NVIDIA's original announcement and now.