Rumor: NVIDIA's Next GPU Called GTX 1080, Uses GDDR5X

Subject: Graphics Cards | March 11, 2016 - 05:03 PM |
Tagged: rumor, report, pascal, nvidia, HBM2, gtx1080, GTX 1080, gtx, GP104, geforce, gddr5x

We are expecting news of the next NVIDIA graphics card this spring, and as usual whenever an announcement is imminent we have started seeing some rumors about the next GeForce card.

GeForce-GTX.jpg

(Image credit: NVIDIA)

Pascal is the name we've all been hearing about, and along with this next-gen core we've been expecting HBM2 (second-gen High Bandwidth Memory). This makes today's rumor all the more interesting, as VideoCardz is reporting (via BenchLife) that a card called either the GTX 1080 or GTX 1800 will be announced, using the GP104 GPU core with 8GB of GDDR5X - and not HBM2.

The report also claims that NVIDIA CEO Jen-Hsun Huang will have an announcement for Pascal in April, which leads us to believe a shipping product based on Pascal is finally in the works. Taking in all of the information from the BenchLife report, VideoCardz has created this list to summarize the rumors (taken directly from the source link):

  • Pascal launch in April
  • GTX 1080/1800 launch in May 27th
  • GTX 1080/1800 has GP104 Pascal GPU
  • GTX 1080/1800 has 8GB GDDR5X memory
  • GTX 1080/1800 has one 8pin power connector
  • GTX 1080/1800 has 1x DVI, 1x HDMI, 2x DisplayPort
  • First Pascal board with HBM would be GP100 (Big Pascal)

VideoCardz_Chart.png

Rumored GTX 1080 Specs (Credit: VideoCardz)

The alleged single 8-pin power connector would place the GTX 1080's power ceiling at 225W (75W from the PCIe slot plus 150W from the 8-pin connector), though the card could very well require less. For reference, the GTX 980 is only a 165W part, with the GTX 980 Ti rated at 250W.
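As a quick illustration of how those ceilings add up, here is a minimal sketch using the standard PCI Express power delivery limits (the figures are the spec's per-connector maximums, not anything confirmed about this card):

```python
# Standard PCI Express power delivery limits, in watts.
PCIE_SLOT_W = 75    # x16 slot
SIX_PIN_W = 75      # 6-pin auxiliary connector
EIGHT_PIN_W = 150   # 8-pin auxiliary connector

def board_power_limit(six_pin=0, eight_pin=0):
    """Maximum board power available from the slot plus auxiliary connectors."""
    return PCIE_SLOT_W + six_pin * SIX_PIN_W + eight_pin * EIGHT_PIN_W

print(board_power_limit(eight_pin=1))             # 225 -> rumored GTX 1080
print(board_power_limit())                        # 75  -> slot-only cards like the GTX 750 Ti
print(board_power_limit(six_pin=1, eight_pin=1))  # 300 -> GTX 980 Ti class boards
```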

As always, only time will tell how accurate these rumors are. VideoCardz points out that "BenchLife stories are usually correct", though they are skeptical of the report based on the GTX 1080 name (even though that name would follow the current GeForce naming scheme).

Source: VideoCardz

ZOTAC Introduces ZBOX MAGNUS EN980 VR Ready Mini-PC

Subject: Graphics Cards, Systems | March 10, 2016 - 11:38 AM |
Tagged: zotac, zbox, VR, SFF, nvidia, mini-pc, MAGNUS EN980, liquid cooling, GTX980, GTX 980, graphics, gpu, geforce

ZOTAC is teasing a new mini PC "ready for virtual reality" leading up to Cebit 2016, happening later this month. The ZBOX MAGNUS EN980 supplants the EN970 as the most powerful version of ZOTAC's gaming mini systems, and will come equipped with no less than an NVIDIA GeForce GTX 980.

ZOTAC.jpg

(Image via Guru3D)

Some questions remain ahead of a more formal announcement, and foremost among them is which version of the GTX 980 the system carries. Is this the full desktop variant, or the GTX 980M? It seems to be the former, if we can read into the "factory-installed water-cooling solution", especially if that pertains to the GPU. In any case this will easily be the most powerful mini-PC ZOTAC has released, as even the current MAGNUS EN970 doesn't actually ship with a GTX 970 as the name would imply; rather, a GTX 960 handles discrete graphics duties according to the specs.

The MAGNUS EN980's GTX 980 GPU - mobile or not - will make this a formidable gaming system, paired as it is with a 6th-gen Intel Skylake CPU (the specific model was not mentioned in the press release; the current high-end EN970 with discrete graphics uses the Intel Core i5-5200U). Other details include support for up to four displays via HDMI and DisplayPort, USB 3.0 and 3.1 Type-C inputs, and built-in 802.11ac wireless.

We'll have to wait until Cebit (which runs from March 14 - 18) for more details. Full press release after the break.

Source: ZOTAC

New ASUS GeForce GTX 950 2G Requires No PCIe Power

Subject: Graphics Cards | March 4, 2016 - 04:48 PM |
Tagged: PCIe power, PCI Express, nvidia, GTX 950 2G, gtx 950, graphics card, gpu, geforce, asus, 75W

ASUS has released a new version of the GTX 950 called the GTX 950 2G, and the interesting part isn't what's been added, but what was taken away; namely, the PCIe power requirement.

gtx9502g_1.jpeg

When NVIDIA announced the GTX 950 (which Ryan reviewed here) it carried a TDP of 90W, which prevented it from running without a PCIe power connector. The GTX 950 was (seemingly) the replacement for the GTX 750, which didn't require anything beyond motherboard power via the PCIe slot, and the same held true for the more powerful GTX 750 Ti. Without the need for PCIe power, that GTX 750 Ti became our (and many others') default recommendation for turning any PC into a gaming machine (an idea we just happened to cover in depth here).

Here's a look at the specs from ASUS for the GTX 950 2G:

  • Graphics Engine: NVIDIA GeForce GTX 950
  • Interface: PCI Express 3.0
  • Video Memory: GDDR5 2GB
  • CUDA Cores: 768
  • Memory Clock: 6610 MHz
  • Memory Interface: 128-bit
  • Engine Clock
    • Gaming Mode (Default) - GPU Boost Clock: 1190 MHz, GPU Base Clock: 1026 MHz
    • OC Mode - GPU Boost Clock: 1228 MHz, GPU Base Clock: 1051 MHz
  • Display Outputs: HDMI 2.0, DisplayPort, DVI
  • Power Consumption: Up to 75W, no additional PCIe power required
  • Dimensions: 8.3 x 4.5 x 1.6 inches

gtx9502g_2.jpeg

Whether this model has any relation to the rumored "GTX 950 SE/LP" remains to be seen (other than power, this card appears to have stock GTX 950 specs), but the option of adding a GPU without concern over power requirements makes this a very attractive upgrade proposition for older builds or OEM PCs, depending on cost.

The full model number is ASUS GTX950-2G, and a listing is up on Amazon, though seemingly only a placeholder at the moment. (Link removed. The listing was apparently for an existing GTX 950 product.)

Source: ASUS

NVIDIA Releases GeForce Drivers for Far Cry Primal, Gears of War: Ultimate Edition

Subject: Graphics Cards | March 2, 2016 - 05:30 PM |
Tagged: nvidia, geforce, game ready, 362.00 WHQL

The new Far Cry game is out (Far Cry Primal), and for NVIDIA graphics card owners this means a new GeForce Game Ready driver. The 362.00 WHQL certified driver provides “performance optimizations and a SLI profile” for the new game, and is now available via GeForce Experience as well as the manual driver download page.

FC_Primal.jpg

(Image credit: Ubisoft)

The 362.00 WHQL driver also supports the new Gears of War: Ultimate Edition, a remastered version of the 2007 PC version of the game that includes Windows 10-only enhancements such as 4K resolution support and unlocked frame rates. (Why these "need" to be Windows 10 exclusives can be explained by checking the name of the game’s publisher: Microsoft Studios.)

large_prison2.jpg

(Image credit: Microsoft)

Here’s a list of what’s new in version 362.00 of the driver:

Gaming Technology

  • Added Beta support on GeForce GTX GPUs for external graphics over Thunderbolt 3. GPUs supported include all GTX 900 series, Titan X, and GeForce GTX 750 and 750Ti.

Fermi GPUs:

  • As of Windows 10 November Update, Fermi GPUs now use WDDM 2.0 in single GPU configurations.

For multi-GPU configurations, WDDM usage is as follows:

  • In non-SLI multi-GPU configurations, Fermi GPUs use WDDM 2.0. This includes configurations where a Fermi GPU is used with Kepler or Maxwell GPUs.
  • In SLI mode, Fermi GPUs still use WDDM 1.3.

Application SLI Profiles

Added or updated the following SLI profiles:

  • Assassin's Creed Syndicate - SLI profile changed (with driver code as well) to make the application scale better
  • Bless - DirectX 9 SLI profile added, SLI set to SLI-Single
  • DayZ - SLI AA and NVIDIA Control Panel AA enhance disabled
  • Dungeon Defenders 2 - DirectX 9 SLI profile added
  • Elite Dangerous - 64-bit EXE added
  • Hard West - DirectX 11 SLI profile added
  • Metal Gear Solid V: The Phantom Pain - multiplayer EXE added to profile
  • Need for Speed - profile EXEs updated to support trial version of the game
  • Plants vs Zombies Garden Warfare 2 - SLI profile added
  • Rise of the Tomb Raider - profile added
  • Sebastien Loeb Rally Evo - profile updated to match latest app behavior
  • Tom Clancy's Rainbow Six: Siege - profile updated to match latest app behavior
  • Tom Clancy's The Division - profile added
  • XCOM 2 - SLI profile added (including necessary code change)

release_notes.png

The "beta support on GeForce GTX GPUs for external graphics over Thunderbolt 3" is certainly interesting addition, and one that could eventually lead to external solutions for notebooks, coming on the heels of AMD teasing their own standardization of external GPUs.

The full release notes for the 361 branch (GeForce 362.00) can be viewed here (warning: PDF).

Source: NVIDIA

Report: NVIDIA Preparing GeForce GTX 980MX and 970MX Mobile GPUs

Subject: Graphics Cards | January 19, 2016 - 10:31 AM |
Tagged: rumor, report, nvidia, GTX 980MX, GTX 980M, GTX 970MX, GTX 970M, geforce

NVIDIA is reportedly preparing faster mobile GPUs based on Maxwell, with a GTX 980MX and 970MX on the way.

img-01.jpg

The new GTX 980MX would sit between the GTX 980M and the laptop version of the full GTX 980, with 1664 CUDA cores (compared to 1536 with the 980M), 104 Texture Units (up from the 980M's 96), a 1048 MHz core clock, and up to 8 GB of GDDR5. Memory speed and bandwidth will reportedly be identical to the GTX 980M at 5000 MHz and 160 GB/s respectively, with both GPUs using a 256-bit memory bus.

The GTX 970MX represents a similar upgrade over the existing GTX 970M, with the CUDA core count increased from 1280 to 1408, Texture Units up from 80 to 88, and 8 additional raster devices (56 vs. 48). Both the 970M and 970MX use 192-bit GDDR5 clocked at 5000 MHz, and both are available with the same 3 GB or 6 GB frame buffer options.

WCCFtech prepared a chart to demonstrate the differences between NVIDIA's mobile offerings:

Model            | GTX 980 (Laptop) | GTX 980MX    | GTX 980M     | GTX 970MX    | GTX 970M     | GTX 965M   | GTX 960M
Architecture     | Maxwell          | Maxwell      | Maxwell      | Maxwell      | Maxwell      | Maxwell    | Maxwell
GPU              | GM204            | GM204        | GM204        | GM204        | GM204        | GM204      | GM107
CUDA Cores       | 2048             | 1664         | 1536         | 1408         | 1280         | 1024       | 640
Texture Units    | 128              | 104          | 96           | 88           | 80           | 64         | 40
Raster Devices   | 64               | 64           | 64           | 56           | 48           | 32         | 16
Clock Speed      | 1218 MHz         | 1048 MHz     | 1038 MHz     | 941 MHz      | 924 MHz      | 950 MHz    | 1097 MHz
Memory Bus       | 256-bit          | 256-bit      | 256-bit      | 192-bit      | 192-bit      | 128-bit    | 128-bit
Frame Buffer     | 8 GB GDDR5       | 8/4 GB GDDR5 | 8/4 GB GDDR5 | 6/3 GB GDDR5 | 6/3 GB GDDR5 | 4 GB GDDR5 | 4 GB GDDR5
Memory Frequency | 7008 MHz         | 5000 MHz     | 5000 MHz     | 5000 MHz     | 5000 MHz     | 5000 MHz   | 5000 MHz
Memory Bandwidth | 224 GB/s         | 160 GB/s     | 160 GB/s     | 120 GB/s     | 120 GB/s     | 80 GB/s    | 80 GB/s
TDP              | ~150W            | 125W         | 125W         | 100W         | 100W         | 90W        | 75W
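The bandwidth figures in the chart follow directly from the memory clock and bus width. As a quick sanity check, here is a minimal sketch of that calculation (the formula is just effective data rate times bus width in bytes, nothing specific to WCCFtech's numbers):

```python
def gddr5_bandwidth_gbs(effective_clock_mhz, bus_width_bits):
    """Peak memory bandwidth in GB/s: effective data rate times bus width in bytes."""
    return effective_clock_mhz * 1e6 * (bus_width_bits / 8) / 1e9

print(gddr5_bandwidth_gbs(5000, 256))  # 160.0  -> GTX 980MX / 980M
print(gddr5_bandwidth_gbs(5000, 192))  # 120.0  -> GTX 970MX / 970M
print(gddr5_bandwidth_gbs(7008, 256))  # ~224.3 -> GTX 980 laptop version
```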

These new GPUs will reportedly be based on the same Maxwell GM204 core, and TDPs are apparently unchanged at 125W for the GTX 980MX, and 100W for the 970MX.

We will await any official announcement.

Source: WCCFtech

CES 2016: NVIDIA talks SHIELD Updates and VR-Ready Systems

Subject: Graphics Cards, Shows and Expos | January 5, 2016 - 09:39 PM |
Tagged: vr ready, VR, virtual reality, video, Oculus, nvidia, htc, geforce, CES 2016, CES

Other than the in-depth discussion on the Drive PX 2 and its push into autonomous driving, NVIDIA didn't have much news to report at CES. We stopped by the suite and got a few updates on SHIELD and the company's VR Ready program, which certifies systems that meet minimum recommended specifications for a solid VR experience.

For the SHIELD, NVIDIA is bringing Android 6.0 Marshmallow to the device, with new features like shared storage and the ability to customize the home screen of the Android TV interface. Nothing earth-shattering, and all of it is part of the 6.0 rollout.

The VR Ready program from NVIDIA will validate notebooks, systems, and graphics cards that have enough horsepower to meet the minimum performance levels for a good VR experience. At this point, the specs essentially match what Oculus has put forth: a GTX 970 or better on the desktop, and a GTX 980 (the full chip, not the 980M) on mobile.

Other than that, Ken and I took in some of the more recent VR demos, including Epic's Bullet Train on the final Oculus Rift and Google's Tilt Brush on the latest iteration of the HTC Vive. Both were incredibly impressive, though the Everest demo that simulates a portion of the mountain climb was the one that really made me feel like I was somewhere else.

Check out the video above for more impressions!

Coverage of CES 2016 is brought to you by Logitech!

Follow all of our coverage of the show at http://pcper.com/ces!

Source: NVIDIA

NVIDIA Updates GeForce Experience with Beta Features

Subject: Graphics Cards | December 16, 2015 - 08:12 AM |
Tagged: nvidia, geforce experience, geforce

A new version of GeForce Experience was published yesterday. It is classified as a beta, which I'm guessing means that NVIDIA wanted to release it before the holidays but didn't want to be fixing post-launch issues during the break: ship it as a beta, and users will simply roll back if something doesn't work. On the other hand, NVIDIA is suggesting that beta releases will be a recurring theme under their new "Early Access" program.

It has a few interesting features, though. First, it has a screenshot function that connects with Imgur. Steam's F12 function is fine for almost any title, but there are times when you don't want to register a game with Steam, so a second option is welcome. They also have an overlay to control your stream, rather than just an indicator icon.
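NVIDIA hasn't said how the Imgur integration works under the hood, but for the curious, here is a minimal sketch of what an anonymous upload to Imgur's public v3 API looks like (the client ID is a placeholder you would get by registering an application with Imgur; this is purely illustrative, not GeForce Experience's actual code):

```python
import base64
import requests

IMGUR_CLIENT_ID = "YOUR_CLIENT_ID"  # placeholder: obtained by registering an app with Imgur

def upload_screenshot(path):
    """Anonymously upload an image file to Imgur and return the public link."""
    with open(path, "rb") as f:
        image_b64 = base64.b64encode(f.read()).decode()
    response = requests.post(
        "https://api.imgur.com/3/image",
        headers={"Authorization": f"Client-ID {IMGUR_CLIENT_ID}"},
        data={"image": image_b64, "type": "base64"},
    )
    response.raise_for_status()
    return response.json()["data"]["link"]

# print(upload_screenshot("screenshot.png"))
```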

nvidia-2015-geforce-experience-streamstuff.png

They added the ability to choose the Twitch ingest server, which is the server the broadcaster connects to in order to deliver the stream to Twitch's back-end. I haven't used ShadowPlay for a while, but you previously had to use whatever server GeForce Experience chose; if it wasn't the best connection (e.g., across the continent), you basically had to deal with it. OBS and OBS Studio have these features too, of course. I'll be clear: NVIDIA is playing catch-up to open source software in that area.

The last feature to be mentioned might just be the most interesting, though. A while ago, we mentioned that NVIDIA wanted to enable online co-op through a GameStream-like service. They now have it, and it's called GameStream Co-op. The viewer can watch, take over your input, or register as a second gamepad. It requires a GeForce GTX 650 (or 660M) or higher.

GeForce Experience Beta is available now.

Source: NVIDIA

Report: NVIDIA To Cut GeForce GTX 900 Series GPU Prices

Subject: Graphics Cards | November 30, 2015 - 03:48 PM |
Tagged: rumor, report, price cut, nvidia, GTX 980, GTX 970, gtx 960, geforce

A report published by TechPowerUp suggests NVIDIA will soon be cutting prices across their existing GeForce lineup, with potential price changes reaching consumers in time for the holiday shopping season.

GeForce-GTX-690-image03.jpg

So what does this report suggest? The GTX 980 drops to $449, the GTX 970 goes to $299, and the GTX 960 goes to $179. These are pretty consistent with some of the sale or post-rebate prices we’ve seen of late, and such a move would certainly even things up somewhat between AMD and NVIDIA with regard to cost. Of course, we could see an answer from AMD in the form of a price reduction from their R9 300-series or Fury/Nano. We can only hope!

We’ve already seen prices come down on several GPUs during various Black Friday sales, but the potential for a permanent price cut makes for interesting speculation if nothing else. Not to disparage the source, but there is no substantive evidence pointing to a plan by NVIDIA to lower prices on some 900-series cards; still, it would make sense given the competition from AMD at various price points.

Here’s to lower prices going forward.

Source: TechPowerUp

NVIDIA Releases Driver 358.91 for Fallout 4, Star Wars Battlefront, Legacy of the Void

Subject: Graphics Cards | November 9, 2015 - 01:44 PM |
Tagged: nvidia, geforce, 358.91, fallout 4, Star Wars, battlefront, starcraft, legacy of the void

It's a huge month for PC gaming with the release of Bethesda's Fallout 4 and EA's Star Wars Battlefront likely to take up hours and hours of your (and my) time in the lead up to the holiday season. NVIDIA just passed over links to its latest "Game Ready" driver, version 358.91.

bethesda-2015-fallout4-official-2.jpg

Fallout 4 is going to be impressive graphically

Here's the blurb from NVIDIA directly:

Continuing to fulfill our commitment to GeForce gamers to have them Game Ready for the top Holiday titles, today we released a new Game Ready driver.  This Game Ready driver will get GeForce Gamers set-up for tomorrow’s release of Fallout 4, as well as Star Wars Battlefront, StarCraft II: Legacy of the Void. WHQLed and ready for the Fallout wasteland, driver version 358.91 will deliver the best experience for GeForce gamers in some of the holiday’s hottest titles.

Other than learning that NVIDIA considers "WHQLed" to be a verb now, this is good news for PC gamers looking to dive into the world of Fallout or take up arms against the Empire on the day of release. I honestly believe that these kinds of software updates and frequent driver improvements timed to major game releases are one of the biggest advantages that GeForce gamers have over Radeon users, though I hold out hope that the red team will get on the same cadence with one Raja Koduri in charge.

You can also find more information from NVIDIA about configuration with its own GPUs for Fallout 4 and for Star Wars Battlefront on GeForce.com.

Source: NVIDIA

NVIDIA Confirms Clock Speed, Power Increases at High Refresh Rates, Promises Fix

Subject: Graphics Cards | November 6, 2015 - 04:05 PM |
Tagged: ROG Swift, refresh rate, pg279q, nvidia, GTX 980 Ti, geforce, asus, 165hz, 144hz

Last month I wrote a story that detailed some odd behavior with NVIDIA's GeForce GTX graphics cards and high refresh rate monitors, in particular with the new ASUS ROG Swift PG279Q that has a rated 165Hz refresh rate. We found that when running this monitor at 144Hz or higher refresh rate, idle clock speeds and power consumption of the graphics card increased dramatically.

Those results were much more interesting than I expected! At 60Hz refresh rate, the monitor was drawing just 22.1 watts while the entire testing system was idling at 73.7 watts. (Note: the display was set to its post-calibration brightness of just 31.) Moving up to 100Hz and 120Hz saw very minor increases in power consumption from both the system and monitor.

powerdraw.png

But the jump to 144Hz is much more dramatic – idle system power jumps from 76 watts to almost 134 watts – an increase of 57 watts! Monitor power only increased by 1 watt at that transition though. At 165Hz we see another small increase, bringing the system power up to 137.8 watts.

When running the monitor at 60Hz, 100Hz and even 120Hz, the GPU clock speed sits comfortably at 135MHz. When we increase from 120Hz to 144Hz though, the GPU clock spikes to 885MHz and stays there, even at the Windows desktop. According to GPU-Z the GPU is running at approximately 30% of the maximum TDP.
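If you want to check whether your own card behaves this way, the graphics clock and board power are easy to watch from the command line. Here is a minimal sketch that polls nvidia-smi once per second (it assumes the driver's nvidia-smi utility is on your PATH; this is our own illustration, not part of the test methodology above):

```python
import subprocess
import time

# Poll the GPU's graphics clock and board power draw once per second.
# Leave the desktop idle and toggle the monitor's refresh rate to see the effect.
while True:
    out = subprocess.check_output([
        "nvidia-smi",
        "--query-gpu=clocks.gr,power.draw",
        "--format=csv,noheader",
    ])
    print(out.decode().strip())  # e.g. "135 MHz, 12.34 W" at idle
    time.sleep(1)
```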

We put NVIDIA on notice with the story and followed up with emails including more information from other users, as well as additional testing completed after the story was posted. The result: NVIDIA has confirmed the issue exists and has a fix incoming!

In an email we got from NVIDIA PR last night: 

We checked into the observation you highlighted with the newest 165Hz G-SYNC monitors.
 
Guess what? You were right! That new monitor (or you) exposed a bug in the way our GPU was managing clocks for GSYNC and very high refresh rates.
 
As a result of your findings, we are fixing the bug which will lower the operating point of our GPUs back to the same power level for other displays.
 
We’ll have this fixed in an upcoming driver.

This actually supports an oddity we found before: we noticed that the PG279Q at 144Hz refresh was pushing GPU clocks up pretty high while a monitor without G-Sync support at 144Hz did not. We'll see if this addresses the entire gamut of experiences that users have had (and have emailed me about) with high refresh rate displays and power consumption, but at the very least NVIDIA is aware of the problems and working to fix them.

I don't have confirmation of WHEN I'll be able to test that updated driver, but hopefully it will be soon, so we can confirm the fix works with the displays we have in-house. NVIDIA also hasn't confirmed the root cause of the problem - was it related to the clock domains as we had theorized? Maybe not, since this was a G-Sync specific display issue (based on the quote above). I'll try to weasel out the technical reasoning for the bug if I can and update the story later!