NVIDIA Promoting Game Ready Drivers with Giveaway

Subject: Graphics Cards | November 4, 2015 - 09:01 AM |
Tagged: nvidia, graphics drivers, geforce, game ready

In mid-October, NVIDIA announced that Game Ready drivers would only be available through GeForce Experience with a registered email address, a change we covered at the time. Users are able to opt out of NVIDIA's mailing list, though. The company said that registration would provide early access to new features, chances to win free hardware, and the ability to participate in the driver development process.

nvidia-geforce.png

Today's announcement follows up on the “win free hardware” part. The company will be giving away $100,000 worth of prizes, including graphics cards up to the GeForce GTX 980 Ti, game keys, and SHIELD Android TV boxes. To be eligible, users need to register with GeForce Experience and use it to download the latest Game Ready driver.

Speaking of Game Ready drivers, the main purpose of this blog post is to share the list of November and December games that are in the program. NVIDIA pledges to have optimized drivers for these titles on or before their respective release dates:

  • Assassin's Creed: Syndicate
  • Call of Duty Black Ops III
  • Civilization Online
  • Fallout 4
  • Just Cause 3
  • Monster Hunter Online
  • Overwatch
  • RollerCoaster Tycoon World
  • StarCraft II: Legacy of the Void
  • Star Wars Battlefront
  • Tom Clancy's Rainbow Six Siege
  • War Thunder

As has been the case recently, NVIDIA also plans to have every Game Ready driver certified through Microsoft's WHQL driver certification program.

Source: NVIDIA

Testing GPU Power Draw at Increased Refresh Rates using the ASUS PG279Q

Subject: Graphics Cards, Displays | October 24, 2015 - 04:16 PM |
Tagged: ROG Swift, refresh rate, pg279q, nvidia, GTX 980 Ti, geforce, asus, 165hz, 144hz

In the comments to our recent review of the ASUS ROG Swift PG279Q G-Sync monitor, a commenter by the name of Cyclops pointed me in the direction of an interesting quirk that I hadn’t considered before. According to reports, the higher refresh rates of some panels, including the 165Hz option available on this new monitor, can cause power draw to increase by as much as 100 watts on the system itself. While I did say in the review that the larger power brick ASUS provided with it (compared to last year’s PG278Q model) pointed toward higher power requirements for the display itself, I never thought to measure the system.

To set up a quick test, I brought the ASUS ROG Swift PG279Q back to its rightful home in front of our graphics test bed, connected an EVGA GeForce GTX 980 Ti (with GPU driver 358.50), and hooked both the PC and the monitor up to separate power monitoring devices. While sitting at a Windows 8.1 desktop, I cycled the monitor through its different refresh rate options and recorded the power draw from both meters after giving the system 60-90 seconds to idle out.

powerdraw.png

The results are much more interesting than I expected! At 60Hz refresh rate, the monitor was drawing just 22.1 watts while the entire testing system was idling at 73.7 watts. (Note: the display was set to its post-calibration brightness of just 31.) Moving up to 100Hz and 120Hz saw very minor increases in power consumption from both the system and monitor.

But the jump to 144Hz is much more dramatic – idle system power jumps from 76 watts to almost 134 watts – an increase of 57 watts! Monitor power only increased by 1 watt at that transition though. At 165Hz we see another small increase, bringing the system power up to 137.8 watts.

Interestingly, we did find that the system would repeatedly jump to as much as 200+ watts of idle power draw for 30 seconds at a time and then drop back down to the 135-140 watt range for a few minutes. It was repeatable and very measurable.

So, what the hell is going on? A look at GPU-Z clock speeds reveals the source of the power consumption increase.

powerdraw2.png

When running the monitor at 60Hz, 100Hz and even 120Hz, the GPU clock speed sits comfortably at 135MHz. When we increase from 120Hz to 144Hz though, the GPU clock spikes to 885MHz and stays there, even at the Windows desktop. According to GPU-Z the GPU is running at approximately 30% of the maximum TDP.
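
If you want to watch this behavior on your own card without GPU-Z, the driver's telemetry can be polled from the command line. Below is a minimal sketch of my own (not part of our test setup, just a convenience) that logs the graphics clock and board power through nvidia-smi while you flip refresh rates in Windows; note that the power readout returns N/A on some GeForce cards, and wall power still requires an external meter.

    # Poll GPU core clock and board power every few seconds via nvidia-smi,
    # which ships with NVIDIA's driver. Stop with Ctrl+C.
    import subprocess
    import time

    QUERY = ["nvidia-smi",
             "--query-gpu=clocks.gr,power.draw",
             "--format=csv,noheader"]

    while True:
        sample = subprocess.run(QUERY, capture_output=True, text=True)
        # e.g. "135 MHz, 18.44 W" at 120Hz, "885 MHz, ..." at 144Hz
        print(time.strftime("%H:%M:%S"), sample.stdout.strip())
        time.sleep(5)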

Though details are sparse, it seems pretty obvious what is going on here. The pixel clock and the GPU clock are connected through the same domain and are not asynchronous. The GPU needs to maintain a certain pixel clock in order to support the required bandwidth of a particular refresh rate, and based on our testing, the idle clock speed of 135MHz doesn’t give the pixel clock enough throughput to power anything more than a 120Hz refresh rate.
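
To put rough numbers on that reasoning: the required pixel clock scales linearly with refresh rate. Assuming standard CVT reduced-blanking intervals for a 2560x1440 panel (an assumption on my part; the monitor's actual timings are likely tighter at the top end), the math works out like this:

    # Back-of-the-envelope pixel clocks for 2560x1440, assuming a CVT-RB
    # (reduced blanking) total of roughly 2720x1481 pixels per frame.
    # Actual panel timings differ, but the linear scaling is what matters.
    H_TOTAL, V_TOTAL = 2720, 1481

    for hz in (60, 120, 144, 165):
        pclk_mhz = H_TOTAL * V_TOTAL * hz / 1e6
        print(f"{hz:3d} Hz -> ~{pclk_mhz:.0f} MHz pixel clock")
    # 60 Hz -> ~242 MHz, 120 Hz -> ~483 MHz,
    # 144 Hz -> ~580 MHz, 165 Hz -> ~665 MHz

Somewhere between the roughly 483 MHz needed at 120Hz and the roughly 580 MHz needed at 144Hz, the display pipeline fed by that 135MHz idle state apparently runs out of headroom, forcing the jump to a higher power state.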

refreshsetup.jpg

Pushing refresh rates of 144Hz and higher causes a surprising increase in power draw

The obvious question here though is why NVIDIA would need to go all the way up to 885MHz in order to support the jump from 120Hz to 144Hz refresh rates. It seems quite extreme and the increased power draw is significant, causing the fans on the EVGA GTX 980 Ti to spin up even while sitting idle at the Windows desktop. NVIDIA is aware of the complication, though it appears that a fix won’t really be in order until an architectural shift is made down the road. With the ability to redesign the clock domains available to them, NVIDIA could design the pixel and GPU clock to be completely asynchronous, increasing one without affecting the other. It’s not a simple process though, especially in a processor this complex. We have seen Intel and AMD correctly and effectively separate clocks in recent years on newer CPU designs.

What happens to a modern AMD GPU like the R9 Fury with a similar test? To find out we connected our same GPU test bed to the ASUS MG279Q, a FreeSync enabled monitor capable of 144 Hz refresh rates, and swapped the GTX 980 Ti for an ASUS R9 Fury STRIX.

powerdrawamd1.png

powerdrawamd2.png

The AMD Fury does not demonstrate the same phenomenon as the GTX 980 Ti when running at high refresh rates. The Fiji GPU runs at the same static 300MHz clock rate at 60Hz, 120Hz and 144Hz, and system power draw only inches up by 2 watts or so. I wasn't able to test a 165Hz refresh rate on the AMD setup, so it is possible the AMD graphics card would behave differently at that threshold. It's also true that the NVIDIA Maxwell GPU idles at less than half the clock rate of AMD's Fiji, and that may account for the difference in pixel clock behavior we are seeing. Still, the NVIDIA platform draws slightly more power at idle than the AMD platform, so advantage AMD here.

For today, know that if you choose to use a 144Hz or even a 165Hz refresh rate on your NVIDIA GeForce GPU, you are going to draw a bit more power and be less efficient than expected, even just sitting in Windows. I would bet that most gamers willing to buy high end display hardware capable of those speeds won't be overly concerned with 50-60 watts of additional power draw, but it's an interesting data point to track going forward as we compare AMD and NVIDIA hardware in the future.

NVIDIA Releases 358.50 WHQL Game Ready Drivers

Subject: Graphics Cards | October 7, 2015 - 01:45 PM |
Tagged: opengl es 3.2, nvidia, graphics drivers, geforce

The GeForce Game Ready 358.50 WHQL driver has been released so users can perform their updates before the Star Wars Battlefront beta goes live tomorrow (unless you already received a key). As with every “Game Ready” driver, NVIDIA ensures that the essential performance and stability tweaks are rolled in to this version, and tests it against the title. It is WHQL certified too, which is a recent priority for NVIDIA. Years ago, “Game Ready” drivers were often classified as Beta, but the company now intends to pass their work through Microsoft for a final sniff test.

ea-2015-battlefront.jpg

Another interesting addition to this driver is the inclusion of the OpenGL 2015 ARB extensions and OpenGL ES 3.2. Until now, using OpenGL ES 3.2 on the PC, to develop software against it for instance, required a separate driver that had been available since the API was announced at SIGGRAPH. It has now been rolled into the main, public driver. The mobile devs who use their production machines to play Battlefront rejoice, I guess. It might also be useful if developers, for instance at Mozilla or Google, want to create pre-release implementations of future WebGL specs.
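
If you're curious whether your installed driver actually exposes an ES 3.2 context on the desktop, any windowing library that can request one will tell you. Here's a minimal sketch using the GLFW bindings for Python (my choice for illustration; NVIDIA doesn't ship or endorse this):

    # Ask the driver for an OpenGL ES 3.2 context on the desktop
    # (pip install glfw). Failure means the driver refused the context.
    import glfw

    if not glfw.init():
        raise SystemExit("GLFW failed to initialize")

    glfw.window_hint(glfw.CLIENT_API, glfw.OPENGL_ES_API)
    glfw.window_hint(glfw.CONTEXT_VERSION_MAJOR, 3)
    glfw.window_hint(glfw.CONTEXT_VERSION_MINOR, 2)
    glfw.window_hint(glfw.VISIBLE, glfw.FALSE)  # no on-screen window needed

    window = glfw.create_window(64, 64, "es32-check", None, None)
    if not window:
        raise SystemExit("Driver refused an OpenGL ES 3.2 context")

    glfw.make_context_current(window)
    major = glfw.get_window_attrib(window, glfw.CONTEXT_VERSION_MAJOR)
    minor = glfw.get_window_attrib(window, glfw.CONTEXT_VERSION_MINOR)
    print(f"Got OpenGL ES {major}.{minor}")
    glfw.terminate()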

Source: NVIDIA

Microsoft Surface Book 2-in-1 with Skylake with NVIDIA Discrete GPU Announced

Subject: Mobile | October 6, 2015 - 02:38 PM |
Tagged: video, surface book, surface, Skylake, nvidia, microsoft, Intel, geforce

Along with the announcement of the new Surface Pro 4, Microsoft surprised many with the release of the new Surface Book 2-in-1 convertible laptop. Sharing much of the same DNA as the Surface tablet line, the Surface Book adopts a more traditional notebook design while still adding enough to the formula to produce a unique product.

book-3.jpg

The pivotal part of the design (no pun intended) is the new hinge, a "dynamic fulcrum" design that looks great and also (supposedly) will be incredibly strong. The screen / tablet attachment mechanism is called Muscle Wire and promises secure attachment as well as ease of release with a single button.

An interesting aspect of the fulcrum design is that, when closed, the Surface Book screen and keyboard do not actually touch near the hinge; instead there is a small gap in this area. I'm curious how this will play out in real-world usage - it creates a natural angle for using the screen in its tablet form, but it may also find itself "catching" coins, pens and other things between the two sections.

book-5.jpg

The 13.5-in screen has a 3000 x 2000 resolution (3:2 aspect ratio obviously) with a 267 PPI pixel density. Just like the Surface Pro 4, it has a 10-point touch capability and uses the exclusive PixelSense display technology for improved image quality.
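
That 267 PPI figure is simply the diagonal pixel count divided by the 13.5-inch diagonal, and a quick check confirms Microsoft's math:

    # Sanity check of the quoted 267 PPI for a 13.5-inch 3000x2000 panel.
    from math import hypot

    diagonal_px = hypot(3000, 2000)  # diagonal resolution in pixels
    print(f"{diagonal_px / 13.5:.1f} PPI")  # ~267.1 PPI, matching the spec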

While most of the hardware is included in the tablet portion of the device, the keyboard dock includes some surprises of its own. You get a set of two USB 3.0 ports, a full size SD card slot and a proprietary SurfaceConnect port for an add-on dock. But most interestingly, you'll find an optional discrete GPU from NVIDIA, an as-yet-unidentified GeForce GPU with 1GB (??) of memory. I have sent inquiries to Microsoft and NVIDIA for details on the GPU, but haven't heard back yet. We think it is a 30 watt GeForce GPU of some kind (judging by the power adapter differences), but I'm more interested in how the GPU changes both battery life and performance.

UPDATE: Just got official word from NVIDIA on the GPU, but unfortunately it doesn't tell us much.

The new GPU is a Maxwell based GPU with GDDR5 memory. It was designed to deliver the best performance in ultra-thin form factors such as the Surface Book keyboard dock. Given its unique implementation and design in the keyboard module, it cannot be compared to a traditional 900M series GPU. Contact Microsoft for performance information.

book02.jpg

Keyboard and touchpad performance looks to be impressive as well, with a full glass trackpad, backlit keyboard design and "class leading" key switch throw distance.

The Surface Book is powered by Intel Skylake processors, available in both Core i5 and Core i7 options, but does not offer Core m-based or Iris graphics options. Instead, the only integrated GPU offered is the Intel HD 520.

book-4.jpg

Microsoft promises "up to" 12 hours of battery life on the Surface Book, though that claim was made with the Core i5 / 256GB / 8GB configuration option; no discrete GPU included. 

book-1.jpg

Pricing on the Surface Book starts at $1499 but can reach as high as $2699 with the maximum performance and storage capacity options. 

Source: Microsoft
Subject: General Tech
Manufacturer: NVIDIA

Setup, Game Selection

Yesterday NVIDIA officially announced the new GeForce NOW streaming game service, the conclusion of the years-long beta and development process known as NVIDIA GRID. As I detailed in my story about the reveal yesterday, GeForce NOW is a $7.99/mo. subscription service that will offer on-demand, cloud-streamed games to NVIDIA SHIELD devices, including a library of 60 games for that $7.99/mo. fee in addition to 7 titles in the "purchase and play" category. NVIDIA claims several advantages that make GeForce NOW a step above other streaming gaming services such as PlayStation Now and OnLive, including faster load times, higher resolution and frame rate, combined local PC and streaming game support, and more.

geforcenow-8.jpg

I have been able to use and play with the GeForce NOW service on our SHIELD Android TV device in the office for the last few days and I thought I would quickly go over my initial thoughts and impressions up to this point.

Setup and Availability

If you have an NVIDIA SHIELD Android TV (or a SHIELD Tablet) then the setup and getting started process couldn’t be any simpler for new users. An OS update is pushed that changes the GRID application on your home screen to GeForce NOW and you can sign in using your existing Google account on your Android device, making payment and subscription simple to manage. Once inside the application you can easily browse through the included streaming games or look through the smaller list of purchasable games and buy them if you so choose.

SHIELD-Hub_20150930_180537.jpg

Playing a game is as simple as selecting a title from the grid list and hitting play.

Game Selection

Let’s talk about that game selection first. For $7.99/mo. you get access to 60 titles for unlimited streaming. I have included a full list below, originally posted in our story yesterday, for reference.

Continue reading my initial thoughts and an early review of GeForce NOW!!

The Fable of the uncontroversial benchmark

Subject: Graphics Cards | September 24, 2015 - 02:53 PM |
Tagged: radeon, nvidia, lionhead, geforce, fable legends, fable, dx12, benchmark, amd

By now you should have memorized Ryan's review of Fable's DirectX 12 performance on a variety of cards and hopefully tried out our new interactive IFU charts.  You can't always cover every card, as those who were brave enough to look at the CSV file Ryan provided might have come to realize.  That's why it is worth peeking at The Tech Report's review after reading through ours.  They have included an MSI R9 285 and XFX R9 390 as well as an MSI GTX 970, which may be cards you are interested in seeing.  They also spend some time looking at CPU scaling and the effect that has on AMD and NVIDIA's performance.  Check it out here.

mountains-shot.jpg

"Fable Legends is one of the first games to make use of DirectX 12, and it produces some truly sumptuous visuals. Here's a look at how Legends performs on the latest graphics cards."

Subject: Graphics Cards
Manufacturer: Lionhead Studios

Benchmark Overview

When Microsoft approached me a couple of weeks ago with the chance to take an early look at an upcoming performance benchmark built on a DX12 game pending release later this year, I was of course excited for the opportunity. Our adventure into the world of DirectX 12 and performance evaluation started with the 3DMark API Overhead Feature Test back in March and was followed by the release of the Ashes of the Singularity performance test in mid-August. Both of these tests pinpoint one particular aspect of the DX12 API - the ability to improve CPU throughput and efficiency with higher draw call counts, enabling higher frame rates on existing GPUs.

ScreenShot00004.jpg

This game and benchmark are beautiful...

Today we dive into the world of Fable Legends, an upcoming free-to-play title based on the world of Albion. It will be released on the Xbox One and for Windows 10 PCs, and it will require the use of DX12. Though the game isn't scheduled for release until Q4 of this year, Microsoft and Lionhead Studios allowed us early access to a specific performance test using the UE4 engine and the world of Fable Legends. UPDATE: It turns out that the game will have a fall-back DX11 mode that will be enabled if the game detects a GPU incapable of running DX12.

This benchmark focuses more on the GPU side of DirectX 12 - on improved rendering techniques and visual quality rather than on the CPU scaling aspects that made Ashes of the Singularity stand out from other graphics tests we have utilized. Fable Legends is more representative of what we expect to see with the release of AAA games using DX12. Let's dive into the test and our results!

Continue reading our look at the new Fable Legends DX12 Performance Test!!

Corsair and MSI Introduce Hydro GFX Liquid Cooled GeForce GTX 980 Ti

Subject: Graphics Cards | September 16, 2015 - 09:00 PM |
Tagged: nvidia, msi, liquid cooler, GTX 980 Ti, geforce, corsair, AIO

A GPU with an attached closed-loop liquid cooler is a little more mainstream these days, with AMD's Fury X a high-profile example, and now a partnership between Corsair and MSI is bringing a very powerful NVIDIA option to the market.

HydroGFX_01.png

The new product is called the Hydro GFX, with NVIDIA's GeForce GTX 980 Ti supplying the GPU horsepower. Of course, the advantage of a closed-loop cooler is higher (sustained) clocks and lower temps/noise, which in turn means much better performance. Corsair explains:

"Hydro GFX consists of a MSI GeForce GTX 980 Ti card with an integrated aluminum bracket cooled by a Corsair Hydro Series H55 liquid cooler.

Liquid cooling keeps the card’s hottest, most critical components - the GPU, memory, and power circuitry - 30% cooler than standard cards while running at higher clock speeds with no throttling, boosting the GPU clock 20% and graphics performance up to 15%.

The Hydro Series H55 micro-fin copper cooling block and 120mm radiator expels the heat from the PC reducing overall system temperature and noise. The result is faster, smoother frame rates at resolutions of 4K and beyond at whisper quiet levels."

The factory overclock on this 980 Ti is pretty substantial out of the box, with a 1190 MHz base clock (stock 1000 MHz) and a 1291 MHz boost clock (stock 1075 MHz). Memory is not overclocked (running at the default 7096 MHz), so there should still be some headroom for overclocking thanks to the air cooling for the RAM/VRM.
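
For the curious, those numbers line up with Corsair's "20% faster" marketing claim, give or take:

    # Check the advertised uplift against the listed clocks.
    base_stock, base_oc = 1000, 1190      # MHz, reference vs. Hydro GFX base
    boost_stock, boost_oc = 1075, 1291    # MHz, reference vs. Hydro GFX boost

    for label, stock, oc in [("base", base_stock, base_oc),
                             ("boost", boost_stock, boost_oc)]:
        print(f"{label}: +{(oc / stock - 1) * 100:.1f}%")
    # base: +19.0%, boost: +20.1%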

MSI-HYDRO-GFX-FRONT.png

A look at the box - and the Corsair branding

Specs from Corsair:

  • NVIDIA GeForce GTX 980 Ti GPU with Maxwell 2.0 microarchitecture
  • 1190/1291 MHz base/boost clock
  • Clocked 20% faster than standard GeForce GTX 980 Ti cards for up to a 15% performance boost.
  • Integrated liquid cooling technology keeps GPU, video RAM, and voltage regulator 30% cooler than standard cards
  • Corsair Hydro Series H55 liquid cooler with micro-fin copper block, 120mm radiator/fan
  • Memory: 6GB GDDR5, 7096 MHz, 384-bit interface
  • Outputs: 3x DisplayPort 1.2, HDMI 2.0, and Dual Link DVI
  • Power: 250 watts (600 watt PSU required)
  • Requirements: PCI Express 3.0 16x dual-width slot, 8+6-pin power connector, 600 watt PSU
  • Dimensions: 10.5 x 4.376 inches
  • Warranty: 3 years
  • MSRP: $739.99

As far as pricing and availability go, Corsair says the new card will debut in the U.S. in October with an MSRP of $739.99.

Source: Corsair

IFA 2015: Acer Aspire V Notebook Series Gets Skylake and Advanced Wi-Fi

Subject: Systems, Mobile | September 2, 2015 - 06:00 AM |
Tagged: V Nitro, Skylake, NVMe, nvidia, notebook, mu-mimo, laptop, IFA 2015, geforce, aspire V, acer

Acer's updated V Nitro notebook series has been announced, and the notebooks have received the newest Intel mobile processors along with fully updated connectivity, including some advanced wireless tech.

Aspire_V3-372_white_acercp_straight on.jpg

The Aspire V 13

"The refreshed Aspire V Nitro Series notebooks and Aspire V 13 support the latest USB 3.1 Type-C port, while 'Black Edition' Aspire V Nitro models support Thunderbolt 3, which brings Thunderbolt to USB Type-C at speeds up to 40Gbps. All models include Qualcomm VIVE 2x2 802.11ac Wi-Fi with Qualcomm MU | EFX MU-MIMO technology."

MU-MIMO devices are just starting to hit the market and the tech promises to eliminate bottlenecks when multiple devices are in use on the same network – with compatible adapters/routers, that is.

Aspire V15 Nitro VN7-592_08.jpg

The Aspire V 15 Nitro

What kind of hardware will be offered? Here’s a brief overview:

  • 6th Gen Intel Core processors
  • Up to 32GB DDR4 system memory
  • NVIDIA GeForce graphics
  • (SATA) SSD/SSHD/HDD storage options
  • Touchscreen option added for the 15-inch model

Additionally, the “Black Edition” models offer a 4K 100% Adobe RGB display option, NVIDIA GeForce GTX 960M up to 4GB, NVMe SSDs, and something called “AeroBlade” thermal exhaust, which Acer said has “the world’s thinnest metallic blades of just 0.1mm thin, which are stronger and quieter”.

Aspire V17 Nitro VN7-792_win10_02.jpg

The Aspire V 17 Nitro

Pricing will start at $599 for the V Nitro 13, $999 for the V Nitro 15, and $1099 for the V Nitro 17. All versions will be available in the U.S. in October.

Source: Acer

NVIDIA Releases 355.82 WHQL Drivers

Subject: Graphics Cards | August 31, 2015 - 07:19 PM |
Tagged: nvidia, graphics drivers, geforce, drivers

Unlike last week's 355.80 Hotfix, today's driver is fully certified by both NVIDIA and Microsoft (WHQL). According to users on GeForce Forums, this driver includes the hotfix changes, although I am still seeing a few users complain about memory issues under SLI. The general consensus seems to be that a number of bugs were fixed, and that driver quality is steadily increasing. This is also a “Game Ready” driver for Mad Max and Metal Gear Solid V: The Phantom Pain.

nvidia-2015-drivers-35582.png

NVIDIA's GeForce Game Ready 355.82 WHQL Mad Max and Metal Gear Solid V: The Phantom Pain drivers (inhale, exhale, inhale) are now available for download at their website. Note that Windows 10 drivers are separate from Windows 7 and Windows 8.x ones, so be sure to not take shortcuts when filling out the “select your driver” form. That, or just use GeForce Experience.

Source: NVIDIA