Subject: Graphics Cards | July 23, 2015 - 10:52 AM | Ryan Shrout
Tagged: nvidia, geforce, gtx, bundle, metal gear solid, phantom pain
NVIDIA continues with its pattern of flagship game bundles with today's announcement. Starting today, GeForce GTX 980 Ti, 980, 970 and 960 GPUs from select retailers will include a copy of Metal Gear Solid V: The Phantom Pain, due out September 15th. (Bundle is live on Amazon.com.) Also, notebooks that use the GTX 980M or 970M GPU qualify.
From NVIDIA's marketing on the bundle:
Only GeForce GTX gives you the power and performance to game like the Big Boss. Experience the METAL GEAR SOLID V: THE PHANTOM PAIN with incredible visuals, uncompromised gameplay, and advanced technologies. NVIDIA G-SYNC™ delivers smooth and stutter-free gaming, GeForce Experience™ provides optimal playable settings, and NVIDIA GameStream™ technology streams your game to any NVIDIA SHIELD™ device.
It appears that Amazon.com already has its landing page up and ready for the MGS V bundle program, so if you are hunting for a new graphics card stop there and see what they have in your range.
Let's hope that this game's release goes a bit more smoothly than Batman: Arkham Knight...
SLI and CrossFire
Last week I sat down with a set of three AMD Radeon R9 Fury X cards, our sampled review card as well as two retail cards purchased from Newegg, to see how the reports of pump whine noise from the cards were shaping up. I'm not going to dive into that debate again here, as I think we have covered it pretty well in that story and on our various podcasts, but rest assured we are continuing to look into revisions of the Fury X to see whether AMD and Cooler Master were actually able to fix the issue.
What we have to cover today is something very different, and likely much more interesting for a wider range of users. When you have three AMD Fury X cards in your hands, you of course have to do some multi-GPU testing with them. With our set I was able to run both 2-Way and 3-Way CrossFire with the new AMD flagship card and compare them directly to the comparable NVIDIA offering, the GeForce GTX 980 Ti.
There isn't much else I need to do to build up this story, is there? If you are curious how well the new AMD Fury X scales in CrossFire with two and even three GPUs, this is where you'll find your answers.
Subject: Graphics Cards | June 16, 2015 - 12:53 AM | Sebastian Peak
Tagged: the phantom pain, nvidia, metal gear solid, graphics, gpus, geforce, gameworks
A blog post on NVIDIA's site indicates that Konami's upcoming game Metal Gear Solid V: The Phantom Pain will make use of NVIDIA technologies, a move that will undoubtedly rankle AMD graphics users who can't always see the full benefit of GameWorks enhancements.
"The world of Metal Gear Solid V: The Phantom Pain is going to be 200 times larger than the one explored in Ground Zeroes. Because so much of this game’s action depends on stealth, graphics are a key part of the gameplay. Shadows, light, and terrain have to be rendered perfectly. That’s a huge challenge in a game where the hero is free to find his own way from one point to another. Our engineers are signed up to work closely with Konami to get the graphics just right and to add special effects."
Now technically this quote doesn't confirm the use of any proprietary NVIDIA technology, though it sounds like that's exactly what will be taking place. In the wake of the Witcher 3 HairWorks controversy, any such enhancements will certainly be looked upon with interest (especially as the next piece of big industry news will undoubtedly be coming with AMD's announcement later today at E3).
It's hard to argue with better graphical quality in high profile games such as the latest Metal Gear Solid installment, but there is certainly something to be said for adherence to open standards to ensure a more unified experience across GPUs. The debate over inclusion through adherence to standards vs. proprietary solutions has already been heated with the FreeSync/G-Sync monitor refresh battle, and GameWorks is a set of tools that serves to further divide gamers, even as it provides an enhanced experience on GeForce GPUs.
Such advantages will likely matter less once DirectX 12 mitigates some of the differences with efficiency gains in the vein of AMD's Mantle API, and if AMD's rumored Fiji cards offer superior performance and arrive competitively priced, they will matter even less. For now, even though details are nonexistent, expect an NVIDIA GeForce GPU to have the advantage in at least some graphical aspects of the latest Metal Gear title when it arrives on PC.
Subject: Graphics Cards | June 9, 2015 - 08:33 AM | Ryan Shrout
Tagged: video, tom petersen, nvidia, maxwell, live, GTX 980 Ti, gtx, gsync, gm200, giveaway, geforce, g-sync, contest
UPDATE: Did you miss the event? No worries, you can still learn all about the GTX 980 Ti, G-Sync changes and even how NVIDIA is changing VR! Once again, a HUGE thanks to NVIDIA and Tom Petersen for coming out to visit.
Even though it's a week after the official release, we are hosting a live stream from the PC Perspective offices with NVIDIA's Tom Petersen to discuss the new GeForce GTX 980 Ti graphics card as well as the changes and updates the company has made to the G-Sync brand. Why would NVIDIA undercut the GTX TITAN X by such a wide margin? Are they worried about AMD's Fiji GPU? Now that we are seeing new form factors and screen types of G-Sync monitors, will prices come down? How does G-Sync for notebooks work without a module?
You'll be able to learn all of this and more on Tuesday, June 9th.
And what's a live stream without a prize? One lucky live viewer will win an EVGA GeForce GTX 980 Ti 6GB graphics card of their very own! That's right - all you have to do is tune in for the live stream Tuesday afternoon and you could win a 980 Ti!
NVIDIA GeForce GTX 980 Ti / G-Sync Live Stream and Giveaway
12pm PT / 3pm ET - June 9th
Need a reminder? Join our live mailing list!
The event will take place Tuesday, June 9th at 12pm PT / 3pm ET at http://www.pcper.com/live. There you’ll be able to catch the live video stream as well as use our chat room to interact with the audience. To win the prize you will have to be watching the live stream, with exact details of the methodology for handing out the goods coming at the time of the event.
Tom has a history of being both informative and entertaining, and these live streaming events are always full of fun and technical information that you can get nowhere else.
If you have questions, please leave them in the comments below and we'll look through them just before the start of the live stream. Of course you'll be able to tweet us questions @pcper, and we'll be keeping an eye on the IRC chat as well for more inquiries. What do you want to know and hear from Tom or me?
So join us! Set your calendar for this coming Tuesday at 12pm PT / 3pm ET and be here at PC Perspective to catch it. If you are a forgetful type of person, sign up for the PC Perspective Live mailing list that we use exclusively to notify users of upcoming live streaming events including these types of specials and our regular live podcast. I promise, no spam will be had!
Subject: Graphics Cards | June 2, 2015 - 04:40 PM | Jeremy Hellstrom
Tagged: video, nvidia, maxwell, GTX 980 Ti, gsync, gm200, geforce, gameworks vr, g-sync, dx12, 6Gb
Hopefully by now you have familiarized yourself with Ryan's review of the new GTX 980 Ti, and perhaps even some of the other reviews below. One review you should not miss is Scott's over at The Tech Report, as they used an X99 system for benchmarking and covered a slightly different suite of games. The games both sites tested show very similar results and, in the case of BF4 and Crysis 3, show that the R9 295X2 is still a force to be reckoned with, especially when it is on sale at a price similar to the 980 Ti's. In The Witcher 3 and Project CARS, the 980 Ti delivered smoother performance with impressive minimum frame times. Overall, The Tech Report gives the nod to the new GTX 980 Ti for more fluid gameplay, but does offer the necessary reminder that AMD will be launching its new products very soon and could bring fresh competition.
"You knew it was coming. When Nvidia introduced the GeForce Titan X, it was only a matter of time before a slightly slower, less expensive version of that graphics card hit the market. That's pretty much how it always happens, and this year is no exception."
Here are some more Graphics Card articles from around the web:
- NVIDIA GeForce GTX 980 Ti @ [H]ard|OCP
- NVIDIA GEFORCE GTX 980 Ti @ NitroWare
- EVGA GeForce GTX 980 Ti SC Review @ HiTech Legion
- NVIDIA GeForce GTX 980 Ti Review @ HiTech Legion
- NVIDIA GeForce GTX 980 Ti Review @ Neoseeker
- NVIDIA GeForce GTX 980 Ti Review @ OCC
- The New King Of High-end: NVIDIA GeForce GTX 980 Ti Review @ Techgage
- Nvidia GeForce GTX 980 Ti Video Card Preview @ Hardware Asylum
- Nvidia GeForce GTX 980 Ti @ Legion Hardware
- The New Nvidia GeForce GTX 980 Ti: Features and Tech Overview @ Bjorn3d
- The NVIDIA GTX 980 Ti Performance Review @ Hardware Canucks
- Nvidia GTX 980 Ti @ KitGuru
- EVGA GeForce GTX 960 SuperSC Graphics Card Review @ Techgage
- NVIDIA GeForce GTX 980 Ti 6 GB @ techPowerUp
- EVGA GTX 980 HYBRID Review @ Hardware Canucks
- Benchmarking The Latest AMD & NVIDIA Graphics Cards On Ubuntu Linux @ Phoronix
When NVIDIA launched the GeForce GTX Titan X back in March of this year, I knew immediately that a GTX 980 Ti would be close behind. The Titan X was so different from the GTX 980 when it came to pricing and memory capacity (12GB, really??) that NVIDIA had set up the perfect gap in which to place the newly minted GTX 980 Ti. Today we get to take the wraps off of that new graphics card, and I think you'll be impressed with what you find, especially when you compare its value to the Titan X.
Based on the same Maxwell architecture and GM200 GPU, with some minor changes to GPU core count, memory size, and boost speeds, the GTX 980 Ti finds itself in a unique spot in the GeForce lineup. Performance-wise it's basically identical in real-world game testing to the GTX Titan X, yet it is priced $350 less than that 12GB behemoth. Couple that with a modest $50 price drop on the GTX 980 and you have all the markers of an enthusiast graphics card that will sell as well as any we have seen in recent generations.
The devil is in all the other details, of course. AMD has its own plans for this summer, but the Radeon R9 290X is still sitting there at a measly $320, undercutting the GTX 980 Ti by roughly half. NVIDIA seems to be pricing its own GPUs as if it isn't even concerned with what AMD and the Radeon brand are doing. That could be dangerous if it goes on too long, but for today, can the R9 290X and its aging Hawaii XT GPU put up enough of a fight to make the value case to gamers on the fence?
Will the GeForce GTX 980 Ti be the next high-end GPU to make a splash in the market, or will it make a thud at the bottom of the GPU gene pool? Let's dive into it, shall we?
Subject: Graphics Cards | May 15, 2015 - 11:36 PM | Scott Michaud
Tagged: windows 10, geforce, graphics drivers, nvidia, whql
The last time NVIDIA released a graphics driver for Windows 10, they added a download category to their website for the pre-release operating system. Since about January, graphics driver updates had been pushed through Windows Update and, before that, you needed to use Windows 8.1 drivers. Receiving drivers from Windows Update also meant that add-ons, such as the PhysX runtimes and GeForce Experience, would not be bundled with them. I know that some have installed them separately, but I didn't.
The 352.84 release, which is their second Windows 10 driver to be released outside of Windows Update, is also certified by WHQL. NVIDIA has recently been touting Microsoft certification for many of their drivers. Historically, they released a large number of Beta drivers that were stable, but did not wait for Microsoft to vouch for them. For one reason or another, they have put a higher priority on that label, even for “Game Ready” drivers that launch alongside a popular title.
For some reason, the driver is only available via GeForce Experience and NVIDIA.com, but not GeForce.com. I assume NVIDIA will publish it there soon, too.
Subject: Graphics Cards, Processors, Displays, Systems | May 15, 2015 - 03:02 PM | Scott Michaud
Tagged: Oculus, oculus vr, nvidia, amd, geforce, radeon, Intel, core i5
Today, Oculus published a list of the hardware they believe should drive their VR headset. The Oculus Rift will obviously run on lesser hardware; their minimum specifications, published last month and focused on the Development Kit 2, did not even list a specific CPU or GPU -- just a DVI-D or HDMI output. They then went on to say that you really should use a graphics card that can handle your game at 1080p with at least 75 FPS.
The current list is a little different:
- NVIDIA GeForce GTX 970 / AMD Radeon R9 290 (or higher)
- Intel Core i5-4590 (or higher)
- 8GB RAM (or higher)
- A compatible HDMI 1.3 output
- 2x USB 3.0 ports
- Windows 7 SP1 (or newer)
I am guessing that, unlike the previous list, Oculus has a clearer vision of its development target. They were a little unclear about whether this refers to the consumer version or to the current needs of developers. In either case, it will likely serve as a guide for what they believe developers should target when the consumer version launches.
This post also coincides with the release of the Oculus PC SDK 0.6.0. This version pushes distortion rendering to the Oculus Server process, rather than the application. It also allows multiple canvases (layers) to be sent to the SDK, which means developers can render text and other noticeable content at full resolution, but scale back in places the user is less likely to notice. Layers can also be updated at different frequencies, for instance sleeping the HUD redraw unless a value changes.
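The layer idea is easier to see in code. The snippet below is a rough conceptual sketch in Python (the real SDK exposes a C API; the class, function names, and resolutions here are invented for illustration): the application hands the runtime several canvases, each rendered at its own resolution and only when its content changes, and the server process handles distortion and final composition.

```python
# Conceptual sketch only -- not the actual Oculus SDK API.

class Layer:
    """One canvas handed to the compositor, with its own resolution and update pattern."""
    def __init__(self, name, resolution):
        self.name = name
        self.resolution = resolution   # render-target size for this canvas

    def render_if_needed(self, content_changed):
        # A HUD-style layer can sleep until one of its values actually changes.
        if content_changed:
            print(f"rendering {self.name} at {self.resolution[0]}x{self.resolution[1]}")

def submit_frame(layers):
    # In 0.6.0 the Oculus Server process, not the application, applies distortion
    # and composites the submitted layers for the headset.
    print("submitting", [layer.name for layer in layers])

# The 3D world can be rendered at a reduced resolution to hold frame rate, while a
# text/HUD layer stays at full resolution so it remains sharp and legible.
world = Layer("world", (1332, 1586))   # illustrative sizes, not Oculus-specified
hud = Layer("hud", (1920, 1080))

for frame in range(3):
    world.render_if_needed(content_changed=True)        # the scene changes every frame
    hud.render_if_needed(content_changed=(frame == 0))  # HUD sleeps unless a value changes
    submit_frame([world, hud])
```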
The Oculus PC SDK (0.6.0) is now available at the Oculus Developer Center.
Subject: Graphics Cards | May 5, 2015 - 12:46 PM | Sebastian Peak
Tagged: The Witcher 3, nvidia, GTX 980, GTX 970, gtx, geforce, batman arkham knight
NVIDIA has announced a new game bundle for GeForce graphics cards starting now, and it’s a doozy: Purchase a qualifying GTX card and receive download codes for both Witcher 3: Wild Hunt and Batman: Arkham Knight!
Needless to say, both of these titles have been highly anticipated, and neither has been released just yet, with Witcher 3: Wild Hunt due on May 19 and Batman: Arkham Knight arriving on June 23. So which cards qualify? Amazon has a page created specifically for this new offer here, and depending on which card you select you'll be eligible for either both upcoming games or just Witcher 3. NVIDIA has this chart on their promotion page for reference:
So GeForce GTX 980 and GTX 970 cards qualify for both games, while GTX 960 cards and the mobile GPUs qualify for Witcher 3 alone. Regardless of which game or card you choose, these PC versions of the new games will feature graphics that can't be matched by consoles, and NVIDIA points out the advantages in their post:
"Both Batman and The Witcher raise the bar for graphical fidelity, rendering two very different open worlds with a level of detail we could only dream of last decade. And on PC, each is bolstered by NVIDIA GameWorks effects that increase fidelity, realism, and immersion. But to use those effects in conjunction with the many other available options, whilst retaining a high frame rate, it’s entirely possible that you’ll need an upgrade."
NVIDIA also posted this Witcher 3: Wild Hunt behind-the-scenes video:
In addition to the GameWorks effects present in Witcher 3, NVIDIA worked with developer Rocksteady on Batman: Arkham Knight “to create new visual effects exclusively for the game’s PC release.” The post elaborates:
"This is in addition to integrating the latest and greatest versions of our GPU-accelerated PhysX Destruction, PhysX Clothing and PhysX Turbulence technologies, which work in tandem to create immersive effects that react realistically to external forces. In previous Batman: Arkham games, these technologies created immersive fog and ice effects, realistic destructible scenery, and game-enhancing cloth effects that added to the atmosphere and spectacle. Expect to see similar effects in Arkham Knight, along with new features and effects that we'll be talking about in depth as we approach Arkham Knight’s June 23rd release."
Of note, mobile GPUs included in the promotion will only receive download codes if the seller of the notebook is participating in the "Two Times The Adventure" offer, so be sure to check that out when looking for a gaming laptop that qualifies!
The promotion is going on now and is available “for a limited time or while supplies last”.
It's more than just a branding issue
As part of my look at the first wave of AMD FreeSync monitors hitting the market, I wrote an analysis of how the competing technologies of FreeSync and G-Sync differ from one another. It was a complex topic that I tried to cover as succinctly as possible given the time constraints and the fact that the article was focused on FreeSync specifically. I'm going to include a portion of that discussion here, to recap:
First, we need to look inside the VRR window, the zone in which the monitor and AMD claim that variable refresh should work without tears and without stutter. On the LG 34UM67, for example, that range is 48-75 Hz, so frame rates between 48 FPS and 75 FPS should be smooth. Next we want to look above the window, at frame rates above the window's 75 Hz maximum refresh rate. Finally, and maybe most importantly, we need to look below the window, at frame rates under the minimum rated variable refresh target; in this example that is 48 FPS.
AMD FreeSync offers the gamer more flexibility than G-Sync around this VRR window. Both above and below the variable refresh range, AMD allows gamers to continue to select a VSync enabled or disabled setting, and that setting is handled just as it is today whenever the game's frame rate extends outside the VRR window. So, for our 34UM67 example, if your game renders at 85 FPS you will either see tearing on your screen (if you have VSync disabled) or get a static frame rate of 75 FPS, matching the top refresh rate of the panel itself. If your game renders at 40 FPS, below the minimum of the VRR window, you will again see tearing (with VSync off) or the potential for stutter and hitching (with VSync on).
But what happens with this FreeSync monitor, and a theoretical G-Sync monitor, below the window? AMD's implementation means that you get the option of disabling or enabling VSync. For the 34UM67, as soon as your game's frame rate drops under 48 FPS you will either see tearing on your screen or begin to see hints of stutter and judder as the typical (and previously mentioned) VSync concerns crop up again. At these lower frame rates (below the window) the artifacts actually impact your gaming experience much more dramatically than they do at higher frame rates (above the window).
G-Sync treats this "below the window" scenario very differently. Rather than reverting to VSync on or off, the module in the G-Sync display is responsible for auto-refreshing the screen when the frame rate dips below the minimum refresh of the panel, which would otherwise be affected by flicker. So, on a 30-144 Hz G-Sync monitor, we have measured that when the frame rate drops to 29 FPS, the display is actually refreshing at 58 Hz, each frame being "drawn" one extra time to avoid pixel flicker while still maintaining tear-free and stutter-free animation. If the frame rate dips to 25 FPS, the screen draws at 50 Hz. If the frame rate drops to something more extreme like 14 FPS, we actually see the module drawing each frame four times, taking the refresh rate back up to 56 Hz. It's a clever trick that preserves the goals of variable refresh and prevents a degradation of the gaming experience. But this method requires a local frame buffer and logic on the display controller to work, hence the current implementation in a G-Sync module.
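To make both sides of that recap concrete, here is a minimal Python sketch of the behaviors described above. The function names and structure are mine, and the ~45 Hz floor in the G-Sync model is an assumption (NVIDIA has not published the module's exact heuristic) chosen only so the output lines up with the refresh rates we measured.

```python
def freesync_behavior(fps, vrr_min=48, vrr_max=75, vsync_on=True):
    """What the gamer sees on a 48-75 Hz FreeSync panel like the LG 34UM67."""
    if vrr_min <= fps <= vrr_max:
        return "variable refresh: the panel tracks the frame rate, no tearing or stutter"
    if fps > vrr_max:
        # Above the window FreeSync falls back to ordinary fixed-refresh behavior.
        return "capped at the 75 Hz panel maximum" if vsync_on else "tearing"
    # Below the window the same VSync fallback applies, but the artifacts hurt more.
    return "stutter and judder as frames wait on fixed refreshes" if vsync_on else "tearing"

def gsync_effective_refresh(fps, panel_min=30, panel_max=144, target_floor=45):
    """Redraws per frame and effective refresh for a 30-144 Hz G-Sync panel (assumed floor)."""
    if fps >= panel_min:
        # Inside the VRR window the panel simply refreshes at the game's frame rate.
        return 1, min(fps, panel_max)
    redraws = 1
    while fps * redraws < target_floor:
        redraws += 1               # the module draws the same frame one more time
    return redraws, fps * redraws

for fps in (85, 60, 40):
    print(f"FreeSync at {fps} FPS (VSync on): {freesync_behavior(fps)}")
for fps in (29, 25, 14):
    n, hz = gsync_effective_refresh(fps)
    print(f"G-Sync at {fps} FPS: each frame drawn {n}x, panel refreshes at {hz} Hz")
# G-Sync output: 29 FPS -> 2x / 58 Hz, 25 FPS -> 2x / 50 Hz, 14 FPS -> 4x / 56 Hz,
# matching the oscilloscope measurements described above.
```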
As you can see, the topic is complicated. So Allyn and I (and an aging analog oscilloscope) decided to take it upon ourselves to try and understand and teach the implementation differences with the help of some science. The video below is where the heart of this story is focused, though I have some visual aids embedded after it.
Still not clear on what this means for frame rates and refresh rates on current FreeSync and G-Sync monitors? Maybe this will help.