NVIDIA GameWorks Enhancements Coming in Upcoming Metal Gear Solid Game

Subject: Graphics Cards | June 16, 2015 - 12:53 AM |
Tagged: the phantom pain, nvidia, metal gear solid, graphics, gpus, geforce, gameworks

A blog post on NVIDIA's site indicates that Konami's upcoming Metal Gear Solid V: The Phantom Pain will make use of NVIDIA technologies, a move that will undoubtedly rankle AMD graphics users, who can't always see the full benefit of GameWorks enhancements.

"The world of Metal Gear Solid V: The Phantom Pain is going to be 200 times larger than the one explored in Ground Zeroes. Because so much of this game’s action depends on stealth, graphics are a key part of the gameplay. Shadows, light, and terrain have to be rendered perfectly. That’s a huge challenge in a game where the hero is free to find his own way from one point to another.  Our engineers are signed up to work closely with Konami to get the graphics just right and to add special effects."

Now, technically this quote doesn't confirm the use of any proprietary NVIDIA technology, though it sounds like that's exactly what will be taking place. In the wake of The Witcher 3 HairWorks controversy, any such enhancements will certainly be looked upon with interest (especially as the next piece of big industry news will undoubtedly come with AMD's announcement later today at E3).

It's hard to argue with better graphical quality in high-profile games such as the latest Metal Gear Solid installment, but there is certainly something to be said for adherence to open standards to ensure a more unified experience across GPUs. The dialog about inclusion through adherence to standards vs. proprietary solutions has been very heated in the FreeSync/G-Sync monitor refresh debate, and GameWorks is another set of tools that serves to further divide gamers, even as it provides an enhanced experience on GeForce GPUs.

Such advantages will likely matter less once DirectX 12 mitigates some of the differences with efficiency gains in the vein of AMD's Mantle API, and less still if the rumored Fiji cards from AMD offer superior performance and arrive priced competitively. For now, even though details are nonexistent, expect an NVIDIA GeForce GPU to have the advantage in at least some graphical aspects of the latest Metal Gear title when it arrives on PC.

Source: NVIDIA

PCPer Live! GeForce GTX 980 Ti, G-Sync Live Stream and Giveaway!

Subject: Graphics Cards | June 9, 2015 - 08:33 AM |
Tagged: video, tom petersen, nvidia, maxwell, live, GTX 980 Ti, gtx, gsync, gm200, giveaway, geforce, g-sync, contest

UPDATE: Did you miss the event? No worries, you can still learn all about the GTX 980 Ti, G-Sync changes and even how NVIDIA is changing VR! Once again, a HUGE thanks to NVIDIA and Tom Petersen for coming out to visit.

Even though it's a week after the official release, we are hosting a live stream from the PC Perspective offices with NVIDIA's Tom Petersen to discuss the new GeForce GTX 980 Ti graphics card as well as the changes and updates the company has made to the G-Sync brand. Why would NVIDIA undercut the GTX TITAN X by such a wide margin? Are they worried about AMD's Fiji GPU? Now that we are seeing new form factors and screen types for G-Sync monitors, will prices come down? How does G-Sync for notebooks work without a module?

All of this and more will be covered on Tuesday, June 9th.

And what's a live stream without a prize? One lucky live viewer will win an EVGA GeForce GTX 980 Ti 6GB graphics card of their very own! That's right - all you have to do is tune in for the live stream Tuesday afternoon and you could win a 980 Ti!!

NVIDIA GeForce GTX 980 Ti / G-Sync Live Stream and Giveaway

12pm PT / 3pm ET - June 9th

PC Perspective Live! Page

Need a reminder? Join our live mailing list!

The event will take place Tuesday, June 9th at 12pm PT / 3pm ET at http://www.pcper.com/live. There you’ll be able to catch the live video stream as well as use our chat room to interact with the audience. To win the prize you will have to be watching the live stream, with exact details of the methodology for handing out the goods coming at the time of the event.

Tom has a history of being both informative and entertaining and these live streaming events are always full of fun and technical information that you can get literally nowhere else.

If you have questions, please leave them in the comments below and we'll look through them just before the start of the live stream. Of course, you'll also be able to tweet us questions @pcper, and we'll be keeping an eye on the IRC chat as well for more inquiries. What do you want to know and hear from Tom or me?

So join us! Set your calendar for this coming Tuesday at 12pm PT / 3pm ET and be here at PC Perspective to catch it. If you are a forgetful type of person, sign up for the PC Perspective Live mailing list that we use exclusively to notify users of upcoming live streaming events including these types of specials and our regular live podcast. I promise, no spam will be had!

Rounding up the GTX 980 Ti reviews

Subject: Graphics Cards | June 2, 2015 - 04:40 PM |
Tagged: video, nvidia, maxwell, GTX 980 Ti, gsync, gm200, geforce, gameworks vr, g-sync, dx12, 6Gb

Hopefully by now you have familiarized yourself with Ryan's review of the new GTX 980 Ti and perhaps even some of the other reviews below.  One review that you should not miss is Scott's over at The Tech Report, as they used an X99 system for benchmarking and covered a slightly different suite of games.  The games both sites tested show very similar results and, in the case of BF4 and Crysis 3, show that the R9 295X2 is still a force to be reckoned with, especially when it is on sale at a price similar to the 980 Ti's.  In testing The Witcher 3 and Project CARS, the 980 Ti showed smoother performance with impressive minimum frame times.  Overall, The Tech Report gives the nod to the new GTX 980 Ti for more fluid gameplay but offers the necessary reminder that AMD will be launching its new products very soon and could offer new competition.

"You knew it was coming. When Nvidia introduced the GeForce Titan X, it was only a matter of time before a slightly slower, less expensive version of that graphics card hit the market. That's pretty much how it always happens, and this year is no exception."

Author:
Manufacturer: NVIDIA

Specifications

When NVIDIA launched the GeForce GTX Titan X back in March of this year, I knew immediately that the GTX 980 Ti would be close behind. The Titan X was so different from the GTX 980 when it came to pricing and memory capacity (12GB, really??) that NVIDIA had set up the perfect gap in which to place the newly minted GTX 980 Ti. Today we get to take the wraps off of that new graphics card, and I think you'll be impressed with what you find, especially when you compare its value to the Titan X.

Based on the same Maxwell architecture and GM200 GPU, with some minor changes to GPU core count, memory size and boost speeds, the GTX 980 Ti finds itself in a unique spot in the GeForce lineup. Performance-wise it's basically identical to the GTX Titan X in real-world game testing, yet it is priced $350 less than that 12GB behemoth. Couple that with a modest $50 price drop on the GTX 980 and you have all the markers of an enthusiast graphics card that will sell as well as any we have seen in recent generations.

The devil is in all the other details, of course. AMD has its own plans for this summer, but the Radeon R9 290X is still sitting there at a measly $320, less than half the price of the GTX 980 Ti. NVIDIA seems to be pricing its own GPUs as if it isn't even concerned with what AMD and the Radeon brand are doing. That could be dangerous if it goes on too long, but for today, can the aging Hawaii XT GPU in the R9 290X put up enough of a fight to make its value case to gamers on the fence?

Will the GeForce GTX 980 Ti be the next high-end GPU to make a splash in the market, or will it make a thud at the bottom of the GPU gene pool? Let's dive into it, shall we?

Continue reading our review of the new NVIDIA GeForce GTX 980 Ti 6GB Graphics Card!!

NVIDIA Releases 352.84 WHQL Drivers for Windows 10

Subject: Graphics Cards | May 15, 2015 - 11:36 PM |
Tagged: windows 10, geforce, graphics drivers, nvidia, whql

The last time NVIDIA released a graphics driver for Windows 10, they added a download category to their website for the pre-release operating system. Since about January, graphics driver updates had been pushed through Windows Update and, before that, you needed to use Windows 8.1 drivers. Receiving drivers from Windows Update also meant that add-ons, such as the PhysX runtimes and GeForce Experience, would not be bundled with them. I know that some have installed them separately, but I didn't.

The 352.84 release, which is their second Windows 10 driver to be released outside of Windows Update, is also certified by WHQL. NVIDIA has recently been touting Microsoft certification for many of their drivers. Historically, they released a large number of Beta drivers that were stable, but did not wait for Microsoft to vouch for them. For one reason or another, they have put a higher priority on that label, even for “Game Ready” drivers that launch alongside a popular title.

For some reason, the driver is only available via GeForce Experience and NVIDIA.com, but not GeForce.com. I assume NVIDIA will publish it there soon, too.

Source: NVIDIA

Oculus Rift "Full Rift Experience" Specifications Released

Subject: Graphics Cards, Processors, Displays, Systems | May 15, 2015 - 03:02 PM |
Tagged: Oculus, oculus vr, nvidia, amd, geforce, radeon, Intel, core i5

Today, Oculus published a list of what they believe should drive their VR headset. The Oculus Rift will obviously run on lower-end hardware. Their minimum specifications, published last month and focused on the Development Kit 2, did not even list a specific CPU or GPU -- just a DVI-D or HDMI output. They then went on to say that you really should use a graphics card that can handle your game at 1080p with at least 75 FPS.

The current list is a little different:

  • NVIDIA GeForce GTX 970 / AMD Radeon R9 290 (or higher)
  • Intel Core i5-4590 (or higher)
  • 8GB RAM (or higher)
  • A compatible HDMI 1.3 output
  • 2x USB 3.0 ports
  • Windows 7 SP1 (or newer)

I am guessing that, unlike the previous list, Oculus has a clearer vision for a development target. They were a little unclear about whether this refers to the consumer version or the current needs of developers. In either case, it will likely serve as a guide for what they believe developers should target when the consumer version launches.

This post also coincides with the release of the Oculus PC SDK 0.6.0. This version moves distortion rendering into the Oculus Server process, rather than the application. It also allows multiple canvases (layers) to be sent to the SDK, which means developers can render text and other noticeable content at full resolution but scale back in places that the user is less likely to notice. Layers can also be updated at different frequencies, such as skipping the HUD redraw unless a value changes.
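To illustrate the idea, here is a minimal Python sketch of layered rendering under the assumptions described above. It is not the actual Oculus SDK API, and every name in it is hypothetical; it only models the concept of each layer owning its own render-target resolution and update schedule while the full layer stack is handed off every display frame.

```python
# Illustrative sketch only -- hypothetical names, not the Oculus PC SDK API.
# It models the idea behind SDK 0.6.0's layered submission: each layer has its own
# render target (and resolution) and is re-rendered on its own schedule, while the
# compositor still receives the full stack of layers every display frame.

from dataclasses import dataclass, field

@dataclass
class Layer:
    name: str
    resolution_scale: float      # 1.0 = full resolution, <1.0 = cheaper render target
    dirty: bool = True           # re-render only when marked dirty
    contents: dict = field(default_factory=dict)

    def render_if_needed(self, frame):
        if self.dirty:
            # Pretend to rasterize into this layer's texture at its own resolution.
            self.contents = {"frame": frame, "scale": self.resolution_scale}
            self.dirty = False

def submit_frame(layers):
    # The runtime/compositor would distort and composite every layer each refresh,
    # even layers whose textures were not re-rendered this frame.
    return [(layer.name, layer.contents) for layer in layers]

world = Layer("eye_buffers", resolution_scale=0.8)   # 3D scene: can render below native res
hud   = Layer("hud_text",    resolution_scale=1.0)   # text stays crisp at full resolution

for frame in range(3):
    world.dirty = True             # the 3D view changes every frame
    hud.dirty = (frame == 0)       # HUD re-renders only when its values change (here: once)
    for layer in (world, hud):
        layer.render_if_needed(frame)
    submit_frame([world, hud])
```

The key point from the SDK notes is that the compositor, not the game, owns distortion and final composition, so a layer that skipped its redraw this frame is still presented at the display's refresh rate.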

The Oculus PC SDK (0.6.0) is now available at the Oculus Developer Center.

Source: Oculus

New GeForce Bundle: Witcher 3 and Batman: Arkham Knight

Subject: Graphics Cards | May 5, 2015 - 12:46 PM |
Tagged: The Witcher 3, nvidia, GTX 980, GTX 970, gtx, geforce, batman arkham knight

NVIDIA has announced a new game bundle for GeForce graphics cards starting now, and it’s a doozy: Purchase a qualifying GTX card and receive download codes for both Witcher 3: Wild Hunt and Batman: Arkham Knight!

Needless to say, both of these titles are highly anticipated, and neither has been released just yet: The Witcher 3: Wild Hunt is due on May 19, and Batman: Arkham Knight arrives on June 23. So which cards qualify? Amazon has a page specifically created for this new offer, and depending on which card you select you'll be eligible for either both upcoming games or just The Witcher 3. NVIDIA has this chart on their promotion page for reference:

(Chart: GeForce GPUs eligible for the two-game bundle vs. The Witcher 3 alone)

So GeForce GTX 980 and GTX 970 cards qualify for both games, while GTX 960 cards and the mobile GPUs qualify for The Witcher 3 alone. Regardless of which game or card you choose, these PC versions of the new games will feature graphics that can't be matched by consoles, and NVIDIA points out the advantages in their post:

"Both Batman and The Witcher raise the bar for graphical fidelity, rendering two very different open worlds with a level of detail we could only dream of last decade. And on PC, each is bolstered by NVIDIA GameWorks effects that increase fidelity, realism, and immersion. But to use those effects in conjunction with the many other available options, whilst retaining a high frame rate, it’s entirely possible that you’ll need an upgrade."

NVIDIA also posted a behind-the-scenes video for The Witcher 3: Wild Hunt.

In addition to the GameWorks effects present in Witcher 3, NVIDIA worked with developer Rocksteady on Batman: Arkham Knight “to create new visual effects exclusively for the game’s PC release.” The post elaborates:

"This is in addition to integrating the latest and greatest versions of our GPU-accelerated PhysX Destruction, PhysX Clothing and PhysX Turbulence technologies, which work in tandem to create immersive effects that react realistically to external forces. In previous Batman: Arkham games, these technologies created immersive fog and ice effects, realistic destructible scenery, and game-enhancing cloth effects that added to the atmosphere and spectacle. Expect to see similar effects in Arkham Knight, along with new features and effects that we'll be talking about in depth as we approach Arkham Knight’s June 23rd release."

Of note, mobile GPUs included in the promotion will only receive download codes if the seller of the notebook is participating in the "Two Times The Adventure" offer, so be sure to check that out when looking for a gaming laptop that qualifies!

The promotion is going on now and is available “for a limited time or while supplies last”.

Source: NVIDIA
Author:
Manufacturer: Various

It's more than just a branding issue

As part of my look at the first wave of AMD FreeSync monitors hitting the market, I wrote an analysis of how the competing FreeSync and G-Sync technologies differ from one another. It was a complex topic that I tried to state as succinctly as possible given the time constraints and the fact that the article was focused on FreeSync specifically. I'm going to include a portion of that discussion here, to recap:

First, we need to look inside the VRR window, the zone in which the monitor and AMD claim that variable refresh should work without tears and without stutter. On the LG 34UM67, for example, that range is 48-75 Hz, so frame rates between 48 FPS and 75 FPS should be smooth. Next we want to look above the window, at frame rates above the 75 Hz maximum refresh rate of the window. Finally, and maybe most importantly, we need to look below the window, at frame rates under the minimum rated variable refresh target; in this example that would be 48 FPS.

AMD FreeSync offers more flexibility for the gamer than G-Sync around this VRR window. Both above and below the variable refresh range, AMD allows gamers to select a VSync enabled or disabled setting, and that setting is handled just as you are used to today when your game's frame rate extends outside the VRR window. So, for our 34UM67 example, if your game is capable of rendering at 85 FPS then you will either see tearing on your screen (if you have VSync disabled) or you will get a frame rate capped at 75 FPS, matching the top refresh rate of the panel itself. If your game is rendering at 40 FPS, lower than the minimum of the VRR window, then you will again see either tearing (with VSync off) or the potential for stutter and hitching (with VSync on).

But what happens with this FreeSync monitor, and a theoretical G-Sync monitor with the same panel, below the window? AMD's implementation means that you get the option of disabling or enabling VSync.  For the 34UM67, as soon as your game's frame rate drops under 48 FPS you will either see tearing on your screen or you will begin to see hints of stutter and judder as the typical (and previously mentioned) VSync concerns again rear their heads. At lower frame rates (below the window) these artifacts will actually impact your gaming experience much more dramatically than at higher frame rates (above the window).

G-Sync treats this “below the window” scenario very differently. Rather than reverting to VSync on or off, the module in the G-Sync display is responsible for automatically re-refreshing the screen when the frame rate dips below the panel's minimum refresh, which would otherwise cause flicker. So, on a 30-144 Hz G-Sync monitor, we have measured that when the frame rate drops to 29 FPS, the display is actually refreshing at 58 Hz, with each frame being “drawn” one extra time to avoid pixel flicker while still maintaining tear-free and stutter-free animation. If the frame rate dips to 25 FPS, the screen draws at 50 Hz. If the frame rate drops to something more extreme like 14 FPS, we actually see the module quadruple-drawing each frame, taking the refresh rate back up to 56 Hz. It’s a clever trick that preserves the goals of variable refresh and prevents a degradation of the gaming experience. But this method requires a local frame buffer and logic on the display controller to work - hence the current implementation in a G-Sync module.
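Interestingly, those numbers suggest the module is not simply choosing the first frame-repeat multiple that clears the 30 Hz panel minimum (14 FPS x 3 = 42 Hz would have qualified, yet we measured 56 Hz); it appears to target something higher. As a thought experiment, here is a minimal Python sketch of one heuristic that reproduces our three measurements. The ~45 Hz "comfort floor" is purely my assumption, not a documented NVIDIA parameter, and the FreeSync helper just restates the fall-back-to-VSync behavior described earlier.

```python
# Illustrative sketch only: one heuristic consistent with the measurements quoted above
# (29 FPS -> 58 Hz, 25 FPS -> 50 Hz, 14 FPS -> 56 Hz). The ~45 Hz comfort floor is an
# assumption, not a documented NVIDIA parameter.

def gsync_effective_refresh(fps, panel_min=30, panel_max=144, comfort_floor=45):
    """Frame-repeat multiplier and effective refresh rate for frame rates below the VRR window."""
    if fps >= panel_min:
        return 1, fps                       # inside the window: refresh simply tracks frame rate
    repeats = 1
    while fps * repeats < comfort_floor:    # redraw each frame until the refresh is comfortably high
        repeats += 1
    if fps * repeats > panel_max:           # never push the panel past its maximum refresh
        repeats -= 1
    return repeats, fps * repeats

def freesync_below_window(fps, vsync_on, panel_min=48):
    """This first wave of FreeSync monitors falls back to the plain VSync setting below the window."""
    if fps >= panel_min:
        return "variable refresh"
    return "judder/stutter (VSync on)" if vsync_on else "tearing (VSync off)"

for fps in (29, 25, 14):
    print(fps, "FPS ->", gsync_effective_refresh(fps))   # (2, 58), (2, 50), (4, 56)
```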

As you can see, the topic is complicated. So Allyn and I (and an aging analog oscilloscope) decided to take it upon ourselves to try to understand and explain the implementation differences with the help of some science. The video below is the heart of this story, though I have some visual aids embedded after it.

Still not clear on what this means for frame rates and refresh rates on current FreeSync and G-Sync monitors? Maybe this will help.

Continue reading our story dissecting NVIDIA G-Sync and AMD FreeSync!!

A TITANic roundup of GPUs

Subject: Graphics Cards | March 19, 2015 - 03:20 PM |
Tagged: titan x, nvidia, gtx titan x, gm200, geforce, 4k

You have read Ryan's review of the $999 behemoth from NVIDIA, and now you can take the opportunity to see what other reviewers think of the card.  [H]ard|OCP tested it against the GTX 980, which shares the same cooler and is every bit as long as the TITAN X.  Along the way they found a use for the 12GB of VRAM, as both Watch_Dogs and Far Cry 4 used over 7GB of memory when tested at 4K resolution, though the frame rates were not really playable; you will need at least two TITAN Xs to pull that off.  They will be revisiting this card in the future, providing more tests for a card with incredible performance and an even more incredible price.

"The TITAN X video card has 12GB of VRAM, not 11.5GB, 50% more streaming units, 50% more texture units, and 50% more CUDA cores than the current GTX 980 flagship NVIDIA GPU. While this is not our full TITAN X review, this preview focuses on what the TITAN X delivers when directly compared to the GTX 980."

Source: [H]ard|OCP
Author:
Manufacturer: NVIDIA

GM200 Specifications

With the release of the GeForce GTX 980 back in September of 2014, NVIDIA took the performance lead among single-GPU graphics cards. The GTX 980 and GTX 970 were both impressive options: the GTX 970 offered better performance than the R9 290, as did the GTX 980 compared to the R9 290X, and both did so while running at lower power consumption and while including new features like DX12 feature level support, HDMI 2.0 and MFAA (multi-frame antialiasing). Because of those factors, the GTX 980 and GTX 970 were fantastic sellers, helping to push NVIDIA’s market share over 75% as of the 4th quarter of 2014.

But in the back of our minds, and in the minds of many NVIDIA fans, we knew that the company had another GPU it was holding on to: the bigger, badder version of Maxwell. The only question was WHEN the company would release it and sell us a new flagship GeForce card. In most instances, this decision is based on the competitive landscape, such as when AMD might finally be updating its Radeon R9 290X Hawaii family of products with the rumored R9 390X. Perhaps NVIDIA is tired of waiting, or maybe the strategy is to launch soon, before the Fiji GPUs make their debut. Either way, NVIDIA officially took the wraps off of the new GeForce GTX TITAN X at the Game Developers Conference two weeks ago.

At a session hosted by Epic Games’ Tim Sweeney, NVIDIA CEO Jen-Hsun Huang arrived just as Tim lamented the need for more GPU horsepower for Epic's UE4 content. In his hands he had the first TITAN X card, and he talked about only a couple of specifications: the card would have 12GB of memory and would be based on a GPU with 8 billion transistors.

Since that day, you have likely seen picture after picture, rumor after rumor, about specifications, pricing and performance. Wait no longer: the GeForce GTX TITAN X is here. With a $999 price tag and a GPU with 3072 CUDA cores, we clearly have a new king of the court.

Continue reading our review of the NVIDIA GeForce GTX Titan X 12GB Graphics Card!!