GDC: NVIDIA Announces GTX 1080 Price Drop to $499

Subject: Graphics Cards | February 28, 2017 - 10:55 PM |
Tagged: pascal, nvidia, GTX 1080, GDC

Update Feb 28 @ 10:03pm: It's official, NVIDIA has launched the $699 GTX 1080 Ti.

NVIDIA is hosting a "Gaming Celebration" live event during GDC 2017 to talk PC gaming and possibly launch new hardware (if rumors are true!). During the event, NVIDIA CEO Jen-Hsun Huang made a major announcement regarding the company's top-end GTX 1080 graphics card: a price drop to $499, effective immediately.

NVIDIA 499 GTX 1080.png

The NVIDIA GTX 1080 is a Pascal-based graphics card with 2560 CUDA cores paired with 8GB of GDDR5X memory. Graphics cards based on this GP104 GPU currently sell for around $580 to $700 (most hover around $650), with the "Founders Edition" carrying a $699 MSRP. The $499 price teased at the live stream represents a significant drop from what the cards are going for now. NVIDIA did not specify whether the new $499 MSRP is the new Founders Edition price or an average that includes partner cards as well, but even if it only applies to the reference cards, partners would have to adjust their prices downward accordingly to compete.

I suspect that NVIDIA is making such a bold move both to make room in its lineup for a new product (the long-rumored 1080 Ti, perhaps?) and as a pre-emptive strike against AMD and its Radeon RX Vega products. The move may also be good news for GTX 1070 pricing, as those cards could see cuts of their own to make room for cheaper GTX 1080 partner cards that come in below the $499 price point.

If you have been considering buying a new graphics card, NVIDIA has sweetened the pot a bit, especially if you had already been eyeing a GTX 1080. (Note that while the price drop is said to be effective immediately, at the time of writing Amazon was still showing typical prices for the cards. Enthusiasts may have to wait a few hours or days for retailers to catch up and update their sites.)

This makes me a bit more excited to see what AMD will have to offer with Vega as well as the likelihood of a GTX 1080 Ti launch happening sooner rather than later!

Source: NVIDIA

Futuremark at GDC and MWC

Subject: General Tech, Graphics Cards | February 27, 2017 - 03:39 PM |
Tagged: MWC, GDC, VRMark, Servermark, OptoFidelity, cyan room, benchmark

Futuremark are showing off new benchmarks at GDC and MWC, both of which are happening this week.  We will have quite a bit of coverage as we try to keep up with the simultaneous news releases and presentations.

vrmark.jpg

First up is a new DX12 benchmark in their recently released VRMark suite, the Cyan Room, which sits between the existing two tests.  The Orange Room tests whether your system is capable of providing an acceptable VR experience or falls somewhat short of the minimum requirements, while the Blue Room shows off what a system that exceeds the recommended specs can manage.  The Cyan Room is for those who know their system can handle most VR and want to test their settings.  If you don't have the suite, Humble Bundle has a great deal on it and several other tools, if you act quickly.

unnamed.jpg

Next up is a new suite to test the performance and capabilities of Google Daydream, Google Cardboard, and Samsung Gear VR.  There is more than just performance to test when you are using your phone to view VR content, such as avoiding setting your eyeholes on fire.  The tests will help you determine how long your device can run VR content before overheating becomes an issue and interferes with performance, as well as estimate your battery life.

latency.jpg

VR latency testing is next on the list of announcements, and it is very important when it comes to VR, as high or unstable latency is the reason some users need to add a bucket to their list of VR essentials.  Futuremark have partnered with OptoFidelity to produce the VR Multimeter, a hardware-based HMD test. This allows you, and hopefully soon PCPer as well, to measure motion-to-photon latency, display persistence, and frame jitter, as well as audio-to-video synchronization and motion-to-audio latency, all of which could lead to a bad time.

servermark.jpg

Last up is the brand new Servermark, which tests the performance you can expect from virtual servers, media servers, and other common server roles.  The VDI test lets you determine whether a virtual machine has been provisioned at a level commensurate with its assigned task, so you can adjust it as required.  The Media Transcode portion determines the maximum number of concurrent streams, and the maximum quality of those streams, that your server can handle; very handy for those hosting media for an audience.

Expect to hear more as we see the new benchmarks in action.

Source: Futuremark

Faster than a speeding Gigabyte, the Aorus GTX 1080 XE

Subject: Graphics Cards | February 20, 2017 - 02:54 PM |
Tagged: nvidia, gtx 1080 Xtreme Edition, GTX 1080, gigabyte, aorus

Gigabyte created their Aorus line of products to attract enthusiasts away from some of the competition's sub-brands, such as ASUS ROG.  This card is broadly similar to the Gigabyte Xtreme Edition released last year, but there are some differences, such as the large copper heatsink attached to the bottom of the GPU.  The stated clockspeeds are the same as last year's model, and it also sports the two HDMI connections on the front of the card to connect to Gigabyte's VR Extended Front panel.  The Tech Report manually overclocked the card and saw the Aorus reach the highest frequencies they have seen from a GP104 chip, albeit by a small margin.  Check out the full review right here.

back34.jpg

"Aorus is expanding into graphics cards today with the GeForce GTX 1080 Xtreme Edition 8G, a card that builds on the strong bones of Gigabyte's Editor's Choice-winning GTX 1080 Xtreme Gaming. We dig in to see whether Aorus' take on a GTX 1080 is good enough for a repeat."


NVIDIA Releases GeForce 378.72 Hotfix (Bonus: a Discussion)

Subject: Graphics Cards | February 17, 2017 - 07:42 AM |
Tagged: nvidia, graphics drivers

Just a couple of days after publishing 378.66, NVIDIA released GeForce 378.72 Hotfix drivers. These fix a bug when encoding video with Steam’s In-Home Streaming, along with PhysX not being enabled on the GPU under certain conditions. Normally, hotfix drivers solve large-enough issues that were introduced by the previous release. This time, as far as I can tell, is a little different: these fixes seem to have been intended for 378.66 but, for one reason or another, couldn’t be integrated and tested in time for that driver to be ready for the game launches.

nvidia-2015-bandaid.png

This is an interesting effect of the Game Ready program. There is value in having a graphics driver available on the same day as (or ahead of) a major game release, so that people can enjoy the title as soon as it is available. There is also value in shipping as many fixes as the vendor can provide. These goals oppose each other to some extent.

From a user standpoint, driver updates are cumulative, so it is possible to skip a driver or two if you are not affected by any given issue. AMD has adopted a similar structure, sometimes releasing three or four drivers in a month with only one of them being WHQL-certified. For these reasons, I tend to lean on the side of “release ‘em as you got ‘em”. Still, I can see people feeling a little uneasy about a driver being released incomplete to hit a due date.

But, again, that due date has value.

It’s interesting. I’m personally glad that AMD and NVIDIA are on a rapid-release schedule, but I can see where complaints could arise. What’s your opinion?

Source: NVIDIA
Manufacturer: PC Perspective

Living Long and Prospering

The open fork of AMD’s Mantle, the Vulkan API, was released exactly a year ago with, as we reported, a hard launch. That meant public (though not main-branch) drivers for developers, a few public SDKs, a proof-of-concept patch for The Talos Principle, and, of course, the ratified specification. This set the API up to find success right out of the gate, and we can now look back over the year since.

khronos-2017-vulkan-alt-logo.png

Thor's hammer, or a tempest in a teapot?

The elephant in the room is DOOM. The game successfully integrated the API and uses many of its more interesting features, like asynchronous compute. Because the API is designed around a sort of “record a command, drop it on a list” paradigm, the driver is able to schedule commands based on priority and available resources. AMD’s products got a significant performance boost relative to OpenGL, catapulting the Fury X GPU up to the enthusiast level that its theoretical performance suggested.
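
For a rough feel of that paradigm, here is a minimal sketch using the core Vulkan C API: commands are recorded into a buffer, then dropped onto a queue for the driver to schedule. All setup (instance, device, command pool, pipeline binding) is omitted, and the handles are assumed to already exist.

```c
#include <vulkan/vulkan.h>

/* Record a small compute job and hand it to the driver. */
void submit_work(VkQueue queue, VkCommandBuffer cmd)
{
    VkCommandBufferBeginInfo begin = {
        .sType = VK_STRUCTURE_TYPE_COMMAND_BUFFER_BEGIN_INFO,
    };
    vkBeginCommandBuffer(cmd, &begin);  /* start recording */
    /* (a compute pipeline would be bound here with vkCmdBindPipeline) */
    vkCmdDispatch(cmd, 64, 1, 1);       /* e.g. an async compute job */
    vkEndCommandBuffer(cmd);            /* the command list is now just data */

    /* The driver is free to schedule this submission against others
       based on queue priority and available hardware resources. */
    VkSubmitInfo submit = {
        .sType              = VK_STRUCTURE_TYPE_SUBMIT_INFO,
        .commandBufferCount = 1,
        .pCommandBuffers    = &cmd,
    };
    vkQueueSubmit(queue, 1, &submit, VK_NULL_HANDLE);
}
```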

Mobile developers have been picking up the API, too. Google, which is known for banishing OpenCL from its Nexus line and challenging OpenGL ES with its Android Extension Pack (later integrated into OpenGL ES with version 3.2), has strongly backed Vulkan. The API was integrated as a core feature of Android 7.0.

On the engine and middleware side of things, Vulkan is “ready for shipping games” as of Unreal Engine 4.14. It is also included in the Unity 5.6 beta, which is expected to see its full release in March. Frameworks for emulators are integrating Vulkan as well, often just to say they did, but sometimes to emulate the quirks of these systems’ offbeat graphics co-processors. Many other engines, from Source 2 to Torque 3D, have also announced or added Vulkan support.

Finally, for the API itself, The Khronos Group announced (pg 22 from SIGGRAPH 2016) the areas that it is actively working on. The top feature is “better” multi-GPU support. While Vulkan, like OpenCL, allows developers to enumerate all graphics devices and target them individually with work, it lacks certain mechanisms, like the ability to directly ingest output from one GPU into another. No timeline for this has been announced.
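
To illustrate the part Vulkan can already do, the sketch below enumerates every physical GPU; each could then be given its own device and queues and assigned work. What is missing is a built-in way to feed one GPU's output directly into another. Error handling is trimmed for brevity.

```c
#include <stdio.h>
#include <vulkan/vulkan.h>

void list_gpus(VkInstance instance)
{
    uint32_t count = 0;
    vkEnumeratePhysicalDevices(instance, &count, NULL);   /* query count */

    VkPhysicalDevice gpus[8];
    if (count > 8) count = 8;
    vkEnumeratePhysicalDevices(instance, &count, gpus);   /* fetch handles */

    for (uint32_t i = 0; i < count; i++) {
        VkPhysicalDeviceProperties props;
        vkGetPhysicalDeviceProperties(gpus[i], &props);
        /* Each of these can get its own VkDevice and queues; moving
           results between them is still up to the application. */
        printf("GPU %u: %s\n", i, props.deviceName);
    }
}
```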

MSI's wee AERO ITX family of NVIDIA graphics cards

Subject: Graphics Cards | February 16, 2017 - 03:35 PM |
Tagged: msi, AERO ITX, gtx 1070, gtx 1060, gtx 1050, GTX 1050 Ti, SFF, itx

MSI have just released their new series of ITX-compatible GPUs, covering NVIDIA's latest cards from the GTX 1050 through to the GTX 1070; the GTX 1080 is not available in this form factor.  The GTX 1070 and 1060 are available in both factory-overclocked and standard versions.

ITX series.png

All models share a similar design, with a single TORX fan with 8mm Super Pipes and the Zero Frozr feature, which stops the fan entirely for silent operation when temperatures are below 60°C.  They are all compatible with the Afterburner overclocking utility, including recording via Predator and wireless control from your phone.
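
The fan-stop behavior is conceptually simple; here is a hypothetical sketch of the idea, with a little hysteresis so the fan doesn't flutter around the threshold (the 60°C figure is MSI's; the restart point and curve numbers are invented for illustration):

```c
#include <stdbool.h>

#define FAN_STOP_C  60  /* MSI's stated threshold: fan may stop below this */
#define FAN_START_C 65  /* assumed restart point, above the stop threshold */

static bool fan_on = true;

/* Called periodically with the current GPU temperature; returns duty %. */
int fan_duty_for_temp(int temp_c)
{
    if (fan_on && temp_c < FAN_STOP_C)
        fan_on = false;                 /* passive, silent operation */
    else if (!fan_on && temp_c >= FAN_START_C)
        fan_on = true;

    if (!fan_on)
        return 0;

    /* assumed linear curve: 30% duty at 60C rising to 100% at 90C */
    int duty = 30 + (temp_c - FAN_STOP_C) * 70 / 30;
    if (duty < 30)  duty = 30;
    if (duty > 100) duty = 100;
    return duty;
}
```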

The overclocked cards run slightly above reference, from the GTX 1070 at a 1721MHz boost and 1531MHz base clock with its GDDR5 at 8GHz, down to the GTX 1050 at a 1518MHz boost and 1404MHz base clock with its GDDR5 at 7GHz.  The models which do not bear the OC moniker run at NVIDIA's reference clocks, even if they are not quite fully grown.

Source: MSI

NVIDIA Releases GeForce 378.66 Drivers with New Features

Subject: Graphics Cards | February 14, 2017 - 09:29 PM |
Tagged: opencl 2.0, opencl, nvidia, graphics drivers

While the headline of the GeForce 378.66 graphics driver release is support for For Honor, Halo Wars 2, and Sniper Elite 4, NVIDIA has snuck something major into the 378 branch: OpenCL 2.0 is now available for evaluation. (I double-checked 378.49 release notes and confirmed that this is new to 378.66.)

nvidia-geforce.png

OpenCL 2.0 support is not complete yet, but at least NVIDIA is now clearly intending to roll it out to end users. Among other benefits, OpenCL 2.0 allows kernels (think shaders) to enqueue work onto the GPU without the host intervening. This saves one or more round-trips to the CPU, especially in workloads where you don’t know which kernel will be required until you see the results of the previous run, like recursive sorting algorithms.
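
As a sketch of what that looks like in OpenCL C 2.0 (the kernel, helper, and convergence test are made up for illustration): a kernel can enqueue its own follow-up pass on the device's default queue instead of returning to the host.

```c
/* OpenCL C 2.0: device-side enqueue via a block (the ^{ ... } syntax). */

/* One pass of some iterative computation on element gid. */
void one_pass(global float *data, size_t gid)
{
    data[gid] = data[gid] * 0.5f;   /* placeholder work */
}

kernel void reduce_pass(global float *data, global int *remaining)
{
    size_t gid = get_global_id(0);
    one_pass(data, gid);

    /* One work-item decides whether another pass is needed and enqueues
       it straight from the GPU, with no round-trip to the CPU. */
    if (gid == 0 && *remaining > 1) {
        queue_t q = get_default_queue();
        enqueue_kernel(q, CLK_ENQUEUE_FLAGS_WAIT_KERNEL,
                       ndrange_1D(*remaining),
                       ^{ one_pass(data, get_global_id(0)); });
    }
}
```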

So yeah, that’s good, although you usually see big changes like this at the start of a version branch.

Another major addition is Video SDK 8.0. This version allows 10- and 12-bit decoding of VP9 and HEVC video. So... yeah. Applications that want to accelerate video encoding or decoding can now hook up to NVIDIA GPUs for more codecs and features.
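
Applications can probe for the new bit depths before building a decoder; below is a hedged sketch using the CUVID decode-caps query (assuming a CUDA context is already current on the device, and that the field names match the nvcuvid header):

```c
#include <stdio.h>
#include "nvcuvid.h"

/* Ask the driver whether this GPU can decode 10-bit HEVC. */
void check_hevc_10bit(void)
{
    CUVIDDECODECAPS caps = {0};
    caps.eCodecType      = cudaVideoCodec_HEVC;
    caps.eChromaFormat   = cudaVideoChromaFormat_420;
    caps.nBitDepthMinus8 = 2;   /* 8 + 2 = 10-bit */

    if (cuvidGetDecoderCaps(&caps) == CUDA_SUCCESS && caps.bIsSupported)
        printf("10-bit HEVC decode supported, up to %ux%u\n",
               caps.nMaxWidth, caps.nMaxHeight);
    else
        printf("10-bit HEVC decode not supported here\n");
}
```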

NVIDIA’s GeForce 378.66 drivers are available now.

Source: NVIDIA

AMD Releases Radeon Software Crimson ReLive 17.2.1

Subject: Graphics Cards | February 14, 2017 - 05:57 PM |
Tagged: amd, graphics drivers

Just in time for For Honor and Sniper Elite 4, AMD has released a new set of graphics drivers, Radeon Software Crimson ReLive 17.2.1, that targets these games. The performance improvements they quote are in the 4-5% range compared to the previous driver on an RX 480, which would be equivalent to saving roughly a whole millisecond per frame at 60 FPS. (This is just for mathematical reference; I don’t know what performance users should expect with an RX 480.)
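
For reference, the arithmetic behind that figure, taking the 5% end of the range:

```latex
t_{frame} = \frac{1000\ \text{ms}}{60} \approx 16.7\ \text{ms}, \qquad
0.05 \times 16.7\ \text{ms} \approx 0.8\ \text{ms}
```

So a 5% gain at 60 FPS shaves a bit under a millisecond off each frame, which rounds to the figure above.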

amd-2016-crimson-relive-logo.png

Beyond driver overhead improvements, you will now be able to utilize multiple GPUs in CrossFire (for DirectX 11) on both titles.

Several issues have also been fixed in this version. If you have a FreeSync monitor and some games fail to activate variable refresh mode, then this driver might solve the problem for you. Scrubbing through some videos (DXVA H.264) should no longer cause visible corruption. A couple of applications, like GRID and DayZ, should no longer crash under certain situations. You get the idea.

If you have an AMD GPU on Windows, pick up these drivers from their support page.

Source: AMD
Manufacturer: EVGA

The new EVGA GTX 1080 FTW2 with iCX Technology

Back in November of 2016, EVGA had a problem on its hands. A batch of the company's GTX 10-series graphics cards using the new ACX 3.0 cooler left the warehouse missing thermal pads required to keep the power management hardware within reasonable temperature margins. To its credit, the company took the oversight seriously and offered consumers a set of solutions to choose from: an RMA, a new VBIOS to increase fan speeds, or thermal pads to install on your hardware manually. Still, as is the case with any kind of product quality lapse like that, there were (and are) lingering questions about EVGA’s ability to deliver reliable products with features and new options that don’t compromise the basics.

Internally, the drive to correct these lapses was…strong. From the very top of the food chain on down, it was hammered home that something like this simply couldn’t occur again and, even more so, that EVGA was to develop and showcase a new feature set and product lineup demonstrating its ability to innovate. Thus was born, and accelerated, the EVGA iCX Technology infrastructure. While it had been in the pipeline for some time already, it was moved up to counter any negative bias that might have formed against EVGA’s graphics cards over the last several months. The goal was simple: prove that EVGA is the leader in graphics card design and that it has learned from previous mistakes.

EVGA iCX Technology

Previous issues aside, iCX Technology is built around one simple question: is one GPU temperature sensor enough? On nearly all of today’s graphics cards, cooling is based on the temperature of the GPU silicon itself, as measured by NVIDIA’s integrated sensor (the case for all of EVGA’s cards). This is how fan curves are built, how GPU clock speeds are handled with GPU Boost, how noise profiles are created, and more. But as process technology has improved, and GPU design has tilted toward power efficiency, the GPU itself is often no longer the thermally limiting factor.

slides05.jpg

As it turns out, converting 12V (from the power supply) to ~1V (necessary for the GPU) is a simple process that creates a lot of excess heat. The thermal images above clearly demonstrate this, and EVGA isn’t the only card vendor to take notice. EVGA’s product issue from last year was related to exactly that: the fans were only spinning fast enough to keep the GPU cool, and did not take into account the temperature of the memory or power delivery.
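
As a back-of-the-envelope illustration (the wattage and efficiency are assumed numbers, not EVGA’s): a VRM running at a respectable 90% efficiency while feeding a 180 W GPU dissipates

```latex
P_{loss} = P_{GPU}\left(\frac{1}{\eta} - 1\right)
         = 180\ \text{W} \times \left(\frac{1}{0.9} - 1\right)
         = 20\ \text{W}
```

and that heat is concentrated in a handful of small MOSFETs and inductors rather than spread across a large die, which is why power delivery can run far hotter than the GPU itself.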

EVGA’s fix is to ratchet up the number of sensors on the card’s PCB and wrap them with intelligence in the form of MCUs, updated Precision XOC software, and user-viewable LEDs on the card itself.

slides10.jpg

EVGA graphics cards with iCX Technology include nine thermal sensors on the board, independent of the GPU temperature sensor integrated directly by NVIDIA: three sensors for memory, five for power delivery, and an additional sensor for the GPU. Some, including the secondary GPU sensor, are located on the back of the PCB to avoid conflicts with trace routing between critical components.
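
Conceptually, the extra sensors let the cooler answer "which domain is closest to its limit?" rather than just "how hot is the GPU?". Here is a hypothetical sketch of that logic (sensor counts from the article; the limits and readings are invented for illustration, not EVGA's firmware):

```c
#include <stddef.h>

typedef struct {
    int temp_c;   /* current reading  */
    int limit_c;  /* domain's own max */
} sensor_t;

/* Duty suggestion for one domain: how close it is to its limit, 0-100%. */
static int domain_duty(const sensor_t *s, size_t n)
{
    int worst = 0;
    for (size_t i = 0; i < n; i++) {
        int duty = 100 * s[i].temp_c / s[i].limit_c;
        if (duty > worst)
            worst = duty;
    }
    return worst > 100 ? 100 : worst;
}

/* Fan follows the hottest domain relative to its own limit, so VRM or
   memory heat can ramp the fans even when the GPU die reads cool. */
int icx_style_fan_duty(const sensor_t gpu[2],   /* GPU + secondary sensor */
                       const sensor_t mem[3],   /* three memory sensors   */
                       const sensor_t vrm[5])   /* five power sensors     */
{
    int duty = domain_duty(gpu, 2);
    int d;
    if ((d = domain_duty(mem, 3)) > duty) duty = d;
    if ((d = domain_duty(vrm, 5)) > duty) duty = d;
    return duty;
}
```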

Continue reading about EVGA iCX Technology!

New graphics drivers? Fine, back to benchmarking.

Subject: Graphics Cards | February 9, 2017 - 02:46 PM |
Tagged: amd, nvidia

New graphics drivers are a boon to everyone who isn't a hardware reviewer, especially one who has just wrapped up benchmarking a new card the same day one is released.  To see what changes AMD and NVIDIA have implemented over their last few releases, [H]ard|OCP tested a slew of recent drivers from both companies.  The performance of AMD's past releases, up to and including the AMD Crimson ReLive Edition 17.1.1 Beta, can be found here.  For NVIDIA users, recent drivers up to the 378.57 Beta Hotfix are covered right here.  The tests show both companies generally improving performance with their drivers, though the changes are so small you are unlikely to notice a difference.

0chained-to-office-man-desk-stick-figure-vector-43659486-958x1024.jpg

"We take the AMD Radeon R9 Fury X and AMD Radeon RX 480 for a ride in 11 games using drivers from the time of each video card’s launch date, to the latest AMD Radeon Software Crimson ReLive Edition 17.1.1 Beta driver. We will see how performance in old and newer games has changed over the course of 2015-2017 with new drivers. "


Source: [H]ard|OCP