NVIDIA GameWorks Enhancements Coming in Upcoming Metal Gear Solid Game

Subject: Graphics Cards | June 16, 2015 - 12:53 AM |
Tagged: the phantom pain, nvidia, metal gear solid, graphics, gpus, geforce, gameworks

A blog post on NVIDIA's site indicates that Konami's upcoming game Metal Gear Solid 5: The Phantom Pain will make use of NVIDIA technologies, a move that will undoubtedly rankle AMD graphics users who can't always see the full benefit of GameWorks enhancements.

mgsv_1.png

"The world of Metal Gear Solid V: The Phantom Pain is going to be 200 times larger than the one explored in Ground Zeroes. Because so much of this game’s action depends on stealth, graphics are a key part of the gameplay. Shadows, light, and terrain have to be rendered perfectly. That’s a huge challenge in a game where the hero is free to find his own way from one point to another.  Our engineers are signed up to work closely with Konami to get the graphics just right and to add special effects."

Now technically this quote doesn't confirm the use of any proprietary NVIDIA technology, though it sounds like that's exactly what will be taking place. In the wake of the Witcher 3 HairWorks controversy, any such enhancements will certainly be looked upon with interest (especially as the next piece of big industry news will undoubtedly be coming with AMD's announcement later today at E3).

It's hard to argue with better graphical quality in high-profile games such as the latest Metal Gear Solid installment, but there is certainly something to be said for adherence to open standards to ensure a more unified experience across GPUs. The dialog about inclusion through adherence to standards vs. proprietary solutions has been very heated with the FreeSync/G-Sync monitor refresh debate, and GameWorks is a series of tools that serves to further divide gamers, even as it provides an enhanced experience with GeForce GPUs.

NV_Gameworks_blk_V.png

Such advantages will likely matter less with DirectX 12 mitigating some differences by bringing efficiency gains in the vein of AMD's Mantle API, and if the rumored Fiji cards from AMD offer superior performance and arrive priced competitively, this will matter even less. For now, even though details are nonexistent, expect an NVIDIA GeForce GPU to have the advantage in at least some graphical aspects of the latest Metal Gear title when it arrives on PC.

Source: NVIDIA

Introduction and Technical Specifications

Introduction

The measure of a true modder is not in how powerful they can make their system by throwing money at it, but in how well they can innovate to make their components run better with what they have on hand. Some make artistic statements with their truly awe-inspiring cases, while others take the Dremel and clamps to their beloved video cards in an attempt to eke out that last bit of performance. This article serves the latter of the two. Don't get me wrong, the card will look nice once we're done with it, but the point here is to re-use components on hand where possible to minimize the cost while maximizing the performance (and sound) benefits.

EVGA GTX 970 SC Graphics Card

02-evga970sc-full.jpg

Courtesy of EVGA

We started with an EVGA GTX 970 SC card with 4GB of RAM, bundled with the new revision of EVGA's ACX cooler, ACX 2.0. This card is well built, with a slight factory overclock out of the box. The ACX 2.0 cooler is a redesign of the original cooler included with the card, offering better cooling potential, with the fans not activating for active cooling until the GPU block temperature breaches 60C.

03-evga970sc-card-profile.jpg

Courtesy of EVGA

WATERCOOL HeatKiller GPU-X3 Core GPU Waterblock

04-gpu-x3-core-top.jpg

Courtesy of WATERCOOL

For water cooling the EVGA GTX 970 SC GPU, we decided to use the WATERCOOL HeatKiller GPU-X3 Core water block. This block features a POM-based body with a copper core for superior heat transfer from the GPU to the liquid medium. The HeatKiller GPU-X3 Core block is a GPU-only cooler, meaning that the memory and integrated VRM circuitry will not be actively cooled by the block. The decision to use a GPU-only block rather than a full-cover block was twofold - availability and cost. I had a few of these on hand, making for an easy decision cost-wise.

Continue reading our article on Modding the EVGA GTX 970 SC Graphics Card!

Subject: Mobile
Manufacturer: ASUS

Introduction and Design

P4110149.jpg

With the introduction of the ASUS G751JT-CH71, we’ve now got our first look at the newest ROG notebook design revision.  The celebrated design language remains the same, and the machine’s lineage is immediately discernible.  However, unlike the $2,000 G750JX-DB71 unit we reviewed a year and a half ago, this particular G751JT configuration is 25% less expensive at just $1,500.  So first off, what’s changed on the inside?

specs.png

(Editor's Note: This is NOT the recent G-Sync version of the ASUS G751 notebook that was announced at Computex. This is the previously released version, one that I am told will continue to sell for the foreseeable future and one that will come at a lower overall price than the G-Sync enabled model. Expect a review on the G-Sync derivative very soon!)

Quite a lot, as it turns out.  For starters, we’ve moved all the way from the 700M series to the 900M series—a leap which clearly ought to pay off in spades in terms of GPU performance.  The CPU and RAM remain virtually equivalent, while the battery has migrated from external to internal and enjoyed a 100 mAh bump in the process (from 5900 to 6000 mAh).  So what’s with the lower price then?  Well, apart from the age difference, it’s the storage: the G750JX featured both a 1 TB storage drive and a 256 GB SSD, while the G751JT-CH71 drops the SSD.  That’s a small sacrifice in our book, especially when an SSD is so easily added thereafter.  By the way, if you’d rather simply have ASUS handle that part of the equation for you, you can score a virtually equivalent configuration (chipset and design evolutions notwithstanding) in the G751JT-DH72 for $1750—still $250 less than the G750JX we reviewed.

Continue reading our review of the ASUS G751JT!!!

PCPer Live! GeForce GTX 980 Ti, G-Sync Live Stream and Giveaway!

Subject: Graphics Cards | June 9, 2015 - 08:33 AM |
Tagged: video, tom petersen, nvidia, maxwell, live, GTX 980 Ti, gtx, gsync, gm200, giveaway, geforce, g-sync, contest

UPDATE: Did you miss the event? No worries, you can still learn all about the GTX 980 Ti, G-Sync changes and even how NVIDIA is changing VR! Once again, a HUGE thanks to NVIDIA and Tom Petersen for coming out to visit.

Even though it's a week after official release, we are hosting a live stream from the PC Perspective offices with NVIDIA's Tom Petersen to discuss the new GeForce GTX 980 Ti graphics card as well as the changes and updates the company has made to the G-Sync brand. Why would NVIDIA undercut the GTX TITAN X by such a wide margin? Are they worried about AMD's Fiji GPU? Now that we are seeing new form factors and screen types of G-Sync monitors, will prices come down? How does G-Sync for notebooks work without a module?

All of this information and more will be learned on Tuesday, June 9th.

P1020269.JPG

And what's a live stream without a prize? One lucky live viewer will win an EVGA GeForce GTX 980 Ti 6GB graphics card of their very own! That's right - all you have to do is tune in for the live stream Tuesday afternoon and you could win a 980 Ti!!

pcperlive.png

NVIDIA GeForce GTX 980 Ti / G-Sync Live Stream and Giveaway

12pm PT / 3pm ET - June 9th

PC Perspective Live! Page

Need a reminder? Join our live mailing list!

The event will take place Tuesday, June 9th at 12pm PT / 3pm ET at http://www.pcper.com/live. There you’ll be able to catch the live video stream as well as use our chat room to interact with the audience. To win the prize you will have to be watching the live stream, with exact details of the methodology for handing out the goods coming at the time of the event.

Tom has a history of being both informative and entertaining and these live streaming events are always full of fun and technical information that you can get literally nowhere else.

If you have questions, please leave them in the comments below and we'll look through them just before the start of the live stream. Of course you'll be able to tweet us questions @pcper and we'll be keeping an eye on the IRC chat as well for more inquiries. What do you want to know and hear from Tom or me?

So join us! Set your calendar for this coming Tuesday at 12pm PT / 3pm ET and be here at PC Perspective to catch it. If you are a forgetful type of person, sign up for the PC Perspective Live mailing list that we use exclusively to notify users of upcoming live streaming events including these types of specials and our regular live podcast. I promise, no spam will be had!

NVIDIA VR Headset Patent Found

Subject: Graphics Cards, Mobile | June 6, 2015 - 04:05 PM |
Tagged: VR, nvidia, gameworks vr

So I'm not quite sure what this hypothetical patent device is. According to its application, it is a head-mounted display that contains six cameras (??) and two displays, one for each eye. The usage of these cameras is not defined, but two will point forward, two will point down, and the last two will point left and right. The only clue that we have is in the second patent application photo, where unlabeled hands are gesturing in front of a node labeled “input cameras”.

nvidia-2015-vr-patent-1.png

Image Credit: Declassified

The block diagram declares that the VR headset will have its own CPU, memory, network adapter, and “parallel processing subsystem” (GPU). VRFocus believes that this will be based on the Tegra X1, and that it was supposed to be revealed three months ago at GDC 2015. In its place, NVIDIA announced the Titan X at the Unreal Engine 4 keynote, hosted by Epic Games. GameWorks VR was also announced with the GeForce GTX 980 Ti launch, which was mostly described as a way to reduce rendering cost by dropping resolution in areas that will be warped into a lower final, displayed resolution anyway.
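The pixel-cost argument behind that multi-resolution approach is easy to sketch: render a central region at full resolution and the periphery at a reduced scale, since lens warping compresses the edges anyway. The numbers below (per-eye render size, region fraction, periphery scale) are purely illustrative assumptions, not NVIDIA's actual values:

```python
# Illustrative pixel-count savings from multi-resolution rendering.
# All parameters here are hypothetical examples, not GameWorks VR specifics.

def full_render_pixels(width, height):
    """Pixels rendered when the whole eye buffer is full resolution."""
    return width * height

def multires_pixels(width, height, center_frac=0.6, periphery_scale=0.5):
    """Full resolution in a central region covering center_frac of each
    axis; the remaining periphery is rendered at periphery_scale."""
    center = (width * center_frac) * (height * center_frac)
    periphery = (width * height - center) * periphery_scale ** 2
    return int(center + periphery)

full = full_render_pixels(1512, 1680)      # example per-eye render target
reduced = multires_pixels(1512, 1680)
print(f"full: {full}, multi-res: {reduced}, saved: {1 - reduced / full:.0%}")
```

With these example numbers the periphery shrinks the per-eye workload by nearly half, which is the kind of saving that makes the technique attractive for VR's high frame-rate targets.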

nvidia-2015-vr-patent-2.png

Image Credit: Declassified

VRFocus suggests that the reveal could happen at E3 this year. The problem with that theory is that NVIDIA has neither a keynote at E3 this year nor even a place at someone else's keynote as far as we know, just a booth and meeting rooms. Of course, they could still announce it through other channels, but that seems less likely. Maybe they will avoid the E3 hype and announce it later (unless something changes behind the scenes of course)?

Source: VRFocus

Podcast #352 - GTX 980 Ti, News from Computex, AMD Fiji Leaks and more!

Subject: General Tech | June 4, 2015 - 02:22 PM |
Tagged: zotac, video, titan x, thunderbolt 3, SSD 750, podcast, ocz, nvidia, msi, micron, Intel, hbm, g-sync, Fiji, computex, amd, acer, 980 Ti

PC Perspective Podcast #352 - 06/04/2015

Join us this week as we discuss the GTX 980 Ti, News from Computex, AMD Fiji Leaks and more!

You can subscribe to us through iTunes and you can still access it directly through the RSS page HERE.

The URL for the podcast is: http://pcper.com/podcast - Share with your friends!

  • iTunes - Subscribe to the podcast directly through the iTunes Store
  • RSS - Subscribe through your regular RSS reader
  • MP3 - Direct download link to the MP3 file

Hosts: Ryan Shrout, Jeremy Hellstrom, Josh Walrath, and Allyn Malventano

Program length: 2:02:45

  1. Week in Review:
  2. Computex, Dawg
  3. News item of interest:
  4. Closing/outro

Subscribe to the PC Perspective YouTube Channel for more videos, reviews and podcasts!!

Rounding up the GTX 980 Ti reviews

Subject: Graphics Cards | June 2, 2015 - 04:40 PM |
Tagged: video, nvidia, maxwell, GTX 980 Ti, gsync, gm200, geforce, gameworks vr, g-sync, dx12, 6Gb

Hopefully by now you have familiarized yourself with Ryan's review of the new GTX 980 Ti and perhaps even some of the other reviews below.  One review that you should not miss is Scott's over at The Tech Report, as they used an X99 system for benchmarking and covered a slightly different suite of games.  The games both sites tested show very similar results and, in the case of BF4 and Crysis 3, show that the R9 295X2 is still a force to be reckoned with, especially when it is on sale at a price similar to the 980 Ti's.  In testing The Witcher 3 and Project Cars, the 980 Ti showed smoother performance with impressive minimum frame times.  Overall, The Tech Report gives the nod to the new GTX 980 Ti for more fluid gameplay but does offer the necessary reminder: AMD will be launching their new products very soon and could offer new competition.

card-front.jpg

"You knew it was coming. When Nvidia introduced the GeForce Titan X, it was only a matter of time before a slightly slower, less expensive version of that graphics card hit the market. That's pretty much how it always happens, and this year is no exception."

Here are some more Graphics Card articles from around the web:

Graphics Cards

Computex 2015: ASUS 3800R 34″ Ultrawide 3440x1440 IPS 21:9 Curved G-SYNC Monitor

Subject: Displays | June 1, 2015 - 12:21 PM |
Tagged: nvidia, gsync, g-sync, asus, 3800R

At Computex this week ASUS is showing off a prototype of the new ROG 3800R monitor, a 34-in curved display with a 3440x1440 resolution and G-Sync variable refresh rate capability. ASUS claims on its PCDIY blog that the 21:9 aspect ratio was one of the "most requested" specifications for a new ROG monitor, followed by a curved design. The result is a gorgeous display:

ROG-34-inch-curved-gaming-monitor.jpg

Here's a list of specifications:

  • 34” optimal dimension for QHD resolutions with 3440×1440 resolution
  • 21:9 ultra-wide aspect ratio for increased immersion and improved horizontal workflow
  • IPS based panel for superior color reproduction, black levels and reduction of color shifting
  • NVIDIA G-SYNC equipped offering smooth, fluid and tear free gaming with improved motion clarity. Additionally equipped with ULMB operating mode for outstanding motion clarity.
  • Frameless design for seamless surround gaming
  • ASUS exclusive GamePlus feature and Turbo Key
  • Ergonomic adjustment including tilt, swivel and height adjustment

Hot damn, we want one of these and we want it yesterday! There is no mention of the refresh rate of the display here, though we did see information from NVIDIA that ASUS was planning a 34x14 60 Hz screen - but we are not sure this is the same model being shown. And the inclusion of ULMB would normally indicate a refresh rate above 60-75 Hz...

Another interesting note: this monitor appears to include both DisplayPort and HDMI connectivity.

This 34-inch 3800R curved display features wide viewing angles, a 3440 x 1440 native resolution, and a 21:9 aspect ratio. It features NVIDIA® G-SYNC™ display technology to deliver smooth, lag-free visuals. G-SYNC synchronizes the display’s refresh rate to the GPU in any GeForce® GTX™-powered PC to eliminate screen tearing and minimize display stutter and input lag. This results in sharper, more vibrant images, and more fluid and responsive gameplay. It has extensive connectivity options that include DisplayPort and HDMI.

The above information came from ASUS just a few short hours ago, so you can assume that it is accurate. Could this be the start of panels that integrate dual scalers (a G-Sync module plus something else) to offer more connectivity, or has the G-Sync module been updated to support more inputs? We'll find out!

Author:
Manufacturer: NVIDIA

Specifications

When NVIDIA launched the GeForce GTX Titan X card only back in March of this year, I knew immediately that the GTX 980 Ti would be close behind. The Titan X was so different from the GTX 980 when it came to pricing and memory capacity (12GB, really??) that NVIDIA had set up the perfect gap with which to place the newly minted GTX 980 Ti. Today we get to take the wraps off of that new graphics card and I think you'll be impressed with what you find, especially when you compare its value to the Titan X.

Based on the same Maxwell architecture and GM200 GPU, with some minor changes to GPU core count, memory size and boost speeds, the GTX 980 Ti finds itself in a unique spot in the GeForce lineup. Performance-wise it's basically identical in real-world game testing to the GTX Titan X, yet it is priced $350 less than that 12GB behemoth. Couple that with a modest $50 price drop on the GTX 980 cards and you have all the markers of an enthusiast graphics card that will sell as well as any we have seen in recent generations.

P1020283.jpg

The devil is in all the other details, of course. AMD has its own plans for this summer, but the Radeon R9 290X is still sitting there at a measly $320, undercutting the GTX 980 Ti by nearly half its price. NVIDIA seems to be pricing its own GPUs as if it isn't even concerned with what AMD and the Radeon brand are doing. That could be dangerous if it goes on too long, but for today, can the R9 290X and its aging Hawaii XT GPU put up enough of a fight to make the value case to gamers on the fence?

Will the GeForce GTX 980 Ti be the next high-end GPU to make a splash in the market, or will it make a thud at the bottom of the GPU gene pool? Let's dive into it, shall we?

Continue reading our review of the new NVIDIA GeForce GTX 980 Ti 6GB Graphics Card!!

NVIDIA G-Sync Update: New Monitors, Windowed Mode, V-Sync Options

Subject: Displays | May 31, 2015 - 06:00 PM |
Tagged: nvidia, gsync, g-sync, computex 2015, computex

In conjunction with the release of the new GeForce GTX 980 Ti graphics card today, NVIDIA is making a handful of other announcements around the GeForce brand. The most dramatic of the announcements center around the company's variable refresh monitor technology called G-Sync. I assume that any of you reading this are already intimately familiar with what G-Sync is, but if not, check out this story that dives into how it compares with AMD's rival tech called FreeSync.

First, NVIDIA is announcing a set of seven new G-Sync ready monitors that will be available this summer and fall from ASUS and Acer.

980ti-42.jpg

Many of these displays offer configurations of panels we haven't yet seen in a G-Sync display. Take the Acer X34 for example: this 34-in monitor falls into the 21:9 aspect ratio form factor, with a curved screen and a 3440x1440 resolution. The refresh rate will peak at 75 Hz while also offering the color consistency and viewing angles of an IPS screen. This is the first 21:9, the first 34x14 and the first curved monitor to support G-Sync, and with a 75 Hz maximum refresh it should provide a solid gaming experience. ASUS has a similar model, the PG34Q, though it peaks at a refresh rate of 60 Hz.

ASUS will be updating the wildly popular ROG Swift PG278Q display with the PG279Q, another 27-in monitor with a 2560x1440 resolution. Only this time it will run at 144 Hz with an IPS screen rather than TN, again resulting in improved color clarity, viewing angles and lower eye strain. 

Those of you on the lookout for 4K panels with G-Sync support will be happy to find IPS iterations of that configuration, though they will still peak at a 60 Hz refresh - as much a limitation of DisplayPort as anything else.

Another technology addition for G-Sync with the 352-series (353-series, sorry!) driver released today is support for windowed mode variable refresh.

gsync-windows.jpg

By working some magic with the DWM (Desktop Window Manager), NVIDIA was able to allow VRR to operate without requiring a game to be in full screen mode. For gamers who like to play windowed or borderless windowed while using secondary or large displays for other side activities, this is going to be a great addition to the G-Sync portfolio.

Finally, after much harassment and public shaming, NVIDIA is finally going to allow users the choice to enable or disable V-Sync when the game render rate exceeds the maximum refresh rate of the G-Sync monitor it is attached to.

gsyncsettings2.png

One of the complaints about G-Sync has been that it is restrictive on the high side of the VRR window for its monitors. While FreeSync allowed you to selectively enable or disable V-Sync when your frame rate goes above the maximum refresh rate, G-Sync was forcing users into a V-Sync enabled state. The reasoning from NVIDIA was that allowing for horizontal tearing of any kind with G-Sync enabled would ruin the experience and/or damage the technology's reputation. But now, while the default will still be to keep V-Sync on, gamers will be able to manually set the V-Sync mode to off with a G-Sync monitor.

Why is this useful? Many gamers believe that a drawback to V-Sync enabled gaming is the added latency of waiting for a monitor to refresh before drawing a frame that might be ready to be shown to the user immediately. G-Sync fixes this from frame rates of 1 FPS up to the maximum refresh of the G-Sync monitor (144 FPS, 75 FPS, 60 FPS), but now, rather than being stuck with tear-free but latency-added V-Sync when gaming over the max refresh, you'll be able to play with tearing on the screen but lower input latency. This could be especially useful for gamers using 60 Hz G-Sync monitors with 4K resolutions.

Oh, actually one more thing: you'll now be able to enable ULMB (ultra low motion blur) mode in the driver as well without requiring entry into your display's OSD.

gsyncsettings1.png

NVIDIA is also officially announcing G-Sync for notebooks at Computex. More on that in this story!