GeForce Driver Updates Contain Security Fixes

Subject: Graphics Cards | February 28, 2019 - 11:25 PM |
Tagged: nvidia, graphics drivers, security

Normally, when we discuss graphics drivers, there is a subset of users who prefer to stay on old versions. Some have older hardware and believe that newer drivers will bring them limited benefit; others hit a bug in a certain version and refuse to update until it is patched.

In this case – you probably want to update regardless.

nvidia-2015-bandaid.png

NVIDIA has found eight security vulnerabilities in their drivers, which have been corrected in their latest versions. One of them also affects Linux... more on that later.

On Windows, there are five supported branches:

  • Users of R418 for GeForce, Quadro, and NVS should install 419.17.
  • Users of R418 for Tesla should install 418.96.
  • Users of R400 for Quadro and NVS should install 412.29.
  • Users of R400 for Tesla should install 412.29.
  • Users of R390 for Quadro and NVS should install 392.37.

Basically, you should install 419.17 unless you are using professional hardware.

One issue is being likened to Meltdown and Spectre, although it is not quite the same. In those cases, the exploit took advantage of hardware optimizations to leak system memory. In the case of CVE-2018-6260, however, the attack uses NVIDIA’s performance counters to potentially leak graphics memory. The difference is that GPU performance counters are a developer tool, used by applications like NVIDIA Nsight to provide diagnostics. Further, beyond targeting a developer tool that can be disabled, this attack also requires local access to the device.

Linux users are also vulnerable to this attack (but not the other seven):

  • Users of R418 for GeForce, Quadro, and NVS should install 418.43.
  • Users of R418 for Tesla should install 418.39.
  • Users of R400 for GeForce, Quadro, NVS, and Tesla should install 410.104.
  • Users of R390 for GeForce, Quadro, NVS, and Tesla should install 390.116.
  • Users of R384 for Tesla should install 384.183.

Whether on Windows or Linux, after installing the update a hidden option lets you restrict GPU performance counters so they cannot be used unless admin credentials are provided. I don’t know why it’s set to the insecure variant by default… but on Windows the setting can be toggled in the NVIDIA Control Panel: go to Desktop > Enable Developer Settings, then under Developer > Manage GPU Performance Counters select “Restrict access to the GPU counters to admin users only.” See the driver release notes (especially the "Driver Security" section) for more info.
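On Linux there is no Control Panel, but the same restriction is exposed as a kernel module option. Below is a minimal sketch that checks whether profiling is already restricted to admin users; note that the parameter names used here (RmProfilingAdminOnly in /proc/driver/nvidia/params, and the NVreg_RestrictProfilingToAdminUsers modprobe option) are my reading of NVIDIA's documentation, so verify them against the release notes for your driver branch.

```python
# Minimal sketch: check whether the NVIDIA driver restricts GPU performance
# counters (profiling) to admin users on Linux. The parameter name
# "RmProfilingAdminOnly" is an assumption based on NVIDIA's driver docs;
# verify it against your driver's release notes.
from pathlib import Path

PARAMS = Path("/proc/driver/nvidia/params")

def profiling_admin_only():
    """Return True/False if the setting is found, or None if unknown."""
    if not PARAMS.exists():
        return None  # driver not loaded, or a different driver layout
    for line in PARAMS.read_text().splitlines():
        key, _, value = line.partition(":")
        if key.strip() == "RmProfilingAdminOnly":
            return value.strip() == "1"
    return None

if __name__ == "__main__":
    state = profiling_admin_only()
    if state is None:
        print("Could not determine the setting; check the driver release notes.")
    elif state:
        print("GPU performance counters are restricted to admin users.")
    else:
        print("Counters are open to all users. Consider setting the module")
        print("option NVreg_RestrictProfilingToAdminUsers=1 (assumed name)")
        print("in a modprobe .conf file and reloading the driver.")
```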

nvidia-2019-disableperfcounters.png

The main thing to fix is the other seven, however. That just requires the driver update. You should have received a notification from GeForce Experience if you use it; otherwise, check out NVIDIA’s website.

Source: NVIDIA

EWC 2019: Vulkan Safety Critical (SC) Working Group Created

Subject: Graphics Cards | February 26, 2019 - 12:41 AM |
Tagged: Khronos, Khronos Group, vulkan, vulkan sc, opengl, opengl sc

The Khronos Group, the industry body that maintains OpenGL, OpenCL, EGL, glTF, Vulkan, OpenXR, and several other standards, has announced the Vulkan Safety Critical (SC) Working Group at Embedded World Conference 2019. The goal is to create an API that leverages Vulkan’s graphics and compute capabilities in a way that allows implementations to be safe and secure enough for the strictest of industries, such as automotive, avionics, medical, and energy.

khronos-2017-vulkan-alt-logo.png

It's a safety hammer, I promise. (No I don't.)

The primary goal is graphics and compute, although the working group will also consider exposing other hardware capabilities, such as video encode and decode. These industries currently have access to graphics through OpenGL SC, although the latest release is still significantly behind what a GPU can do. To put it into perspective – the latest OpenGL SC 2.0 (which was released in 2016) has less functionality than the original release of WebGL back in 2011.

While OpenGL SC 2.0 allows programmable vertex and fragment (pixel) shaders, it falls short in many areas. Most importantly, OpenGL SC 2.0 does not allow compute shaders; Vulkan SC is aiming to promote the GPU into a coprocessor for each of these important industries.

There is not much else to report on at this point – the working group has been formed. A bunch of industry members have voiced their excitement about the new API’s potential, such as Codeplay, Arm, and NVIDIA. The obvious example application would be self-driving cars, although I’m personally interested in the medical industry. Is there any sort of instrument that could do significantly more if it had access to a parallel compute device?

If you are in a safety-critical enterprise, then look into joining the Khronos Group.

AMD's Radeon Software Adrenalin 2019 Edition 19.2.3 mobilizes

Subject: Graphics Cards, Processors | February 25, 2019 - 07:19 PM |
Tagged: Adrenalin Edition, adrenaline 19.2.3, amd, ryzen, Vega

AMD's regular driver updates have a new trick up their sleeves: they now include drivers for AMD Ryzen APUs with a Vega GPU inside.  Today's 19.2.3 release is the first to do so, and you can expect future releases to follow suit.  This is a handy integration for AMD users: even if you have a discrete GPU installed, you can be sure that your APU drivers are also up to date in case you need them.  For many users this may mean your hybrid APU + GPU combination will offer better performance than you have seen recently, with no extra effort required on your part.

Slide1.JPG

Along with support for Ryzen APUs, you will also see these changes:

Support For:

  • AMD Ryzen Mobile Processors with Radeon Vega Graphics
  • Up to 10% average performance gains with AMD Radeon Software Adrenalin 2019 Edition 19.2.3 vs. 17.40 launch drivers for AMD Ryzen Mobile Processors with Radeon Vega Graphics.
  • Up to 17% average performance gains in eSports titles with AMD Radeon Software Adrenalin 2019 Edition 19.2.3 vs. 17.40 launch drivers for AMD Ryzen Mobile Processors with Radeon Vega Graphics.
  • Dirt Rally 2 - Up to 3% performance gains with AMD Radeon Software Adrenalin 2019 Edition 19.2.3 on a Radeon RX Vega 64.

Fixed Issues:

  • Battlefield V players may experience character outlines stuck on screen after being revived.
  • Fan speeds may remain elevated for longer periods than expected when using Tuning Control Auto Overclock or manual fan curve in Radeon WattMan on AMD Radeon VII.
  • ReLive wireless VR may experience an application crash or hang during extended periods of play.
  • Zero RPM will correctly disable in Radeon WattMan on available system configurations when manual fan curve is enabled.
  • A loss of video may be intermittently experienced when launching a fullscreen player application with Radeon FreeSync enabled.

Known Issues:

  • Mouse lag or system slowdown is observed for extended periods of time with two or more displays connected and one display switched off.
  • Changes made in Radeon WattMan settings via Radeon Overlay may sometimes not save or take effect once Radeon Overlay is closed.
  • Some Mobile or Hybrid Graphics system configurations may intermittently experience green flicker when moving the mouse over YouTube videos in Chrome web browser.
    • A workaround, if this occurs, is to disable hardware acceleration.
  • Radeon WattMan settings changes may intermittently not apply on AMD Radeon VII.
  • Performance metrics overlay and Radeon WattMan gauges may experience inaccurate fluctuating readings on AMD Radeon VII.

 

Source: AMD

Every NVIDIA GeForce GTX 1660 Ti on Amazon (So Far)

Subject: Graphics Cards | February 23, 2019 - 03:58 PM |
Tagged: zotac, video card, turing, nvidia, msi, gtx 1660 ti, graphics, gpu, gigabyte, geforce, gaming, evga, asus, amazon

NVIDIA partners launched their new GeForce GTX 1660 Ti graphics cards yesterday, and we checked out a pair of them in our review, finding these new TU116-based cards to offer excellent performance (and overclocking headroom) for the price. Looking over Amazon listings today, here is everything available so far, separated by board partner. We've added the Boost Clock speeds for your reference to show how these cards are clocked compared to the reference (1770 MHz), and purchases made through any of these Amazon affiliate links help us out with a small commission.

1660_Ti_Cards_Stack.jpg

In any case, this list at least demonstrates the current retail picture of NVIDIA's new mainstream Turing GPU on Amazon, so without further preamble here are all currently available cards in alphabetical order by brand:

ASUS

ASUS Phoenix GeForce GTX 1660 Ti OC

ASUS Dual GeForce GTX 1660 Ti OC

ASUS Strix Gaming GTX 1660 Ti OC

EVGA

EVGA GeForce GTX 1660 Ti XC Black Gaming

EVGA GeForce GTX 1660 Ti XC Gaming

GIGABYTE

GIGABYTE GeForce GTX 1660 Ti OC 6G

GIGABYTE GeForce GTX 1660 Ti Windforce OC 6G

MSI

MSI GTX 1660 Ti VENTUS XS 6G OC

MSI GTX 1660 Ti ARMOR 6G OC

MSI GTX 1660 Ti GAMING X 6G

ZOTAC

ZOTAC Gaming GeForce GTX 1660 Ti

Already we are seeing many cards offering factory overclocks, ranging from a small 30 MHz bump at $279.99 from GIGABYTE (GTX 1660 Ti OC 6G, 1800 MHz Boost Clock) to 100 MHz+ from the MSI GTX 1660 Ti GAMING X 6G (1875 MHz Boost Clock) we reviewed at $309.99.
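For scale, here is a quick back-of-the-envelope sketch of how those two factory Boost Clocks compare to the 1770 MHz reference; the figures are taken straight from the listings above.

```python
# Back-of-the-envelope: factory Boost Clocks vs. the GTX 1660 Ti reference.
REFERENCE_BOOST_MHZ = 1770

cards = {
    "GIGABYTE GTX 1660 Ti OC 6G": 1800,
    "MSI GTX 1660 Ti GAMING X 6G": 1875,
}

for name, boost_mhz in cards.items():
    bump = boost_mhz - REFERENCE_BOOST_MHZ
    pct = 100 * bump / REFERENCE_BOOST_MHZ
    print(f"{name}: +{bump} MHz over reference ({pct:.1f}% higher Boost Clock)")
# GIGABYTE: +30 MHz (1.7%); MSI: +105 MHz (5.9%)
```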

We will update the list as additional cards become available on Amazon.

Source: Amazon.com

Forget the GTX 1660 Ti, let's start speculating about the other GTX 1660

Subject: Graphics Cards | February 22, 2019 - 01:54 PM |
Tagged: video card, turing, tu116, rtx, ray tracing, nvidia, msi, gtx 1660 ti, gtx, graphics, gpu, geforce, gaming, asus, DLSS, palit

Today is the day that the GTX 1660 Ti moves from rumour to fact, as the NDA is finally over and we can share our results! Sebastian's testing compared the overclocked, slightly-above-base-price MSI GTX 1660 Ti GAMING X against the interestingly shaped EVGA GTX 1660 Ti XC Black.  Performance-wise, the rumours were fairly accurate: the card offers performance comparable to the GTX 1070 Ti, and at a ~$280 price point it is certainly less expensive, though it still shows evidence of the upwards trend in GPU prices.

If you are interested in other models, take a peek at The Guru of 3D, who reviewed not one or two, but four different 1660 Tis.  From the tiny little Palit StormX model pictured below, through MSI's dual-fan VENTUS XS and GAMING X, to the full-sized ASUS ROG STRIX with three fans, you have a fair number of charts to go through!

img_7747.jpg

"We have four new reviews to present today. NVIDIA is launching the 279 USD GeForce GTX 1660 Ti. We've talked about it a lot, it is the more affordable offering, Turing GPU based, yet stripped from RT and tensor functionality."

Here are some more Graphics Card articles from around the web:


Source: Guru of 3D
Manufacturer: NVIDIA

The TU116 GPU and First Look at Cards from MSI and EVGA

NVIDIA is introducing the GTX 1660 Ti today, a card built from the ground up to take advantage of the new Turing architecture, but without real-time ray tracing capabilities. It seems like the logical next step for NVIDIA: gamers eager for a current-generation replacement to the popular GTX 1060, who may have been disappointed that the RTX 2060 launched priced $100 above the 1060 6GB, now have something a lot closer to a true replacement in the GTX 1660 Ti.

There is more to the story, of course, and we are still talking about a “Ti” part and not a vanilla GTX 1660, which presumably will be coming at some point down the road; but this new card should make an immediate impact. Is it fair to say that the GTX 1660 Ti is the true successor to the GTX 1060 that we might have assumed the RTX 2060 to be? Perhaps. And is the $279 price tag a good value? We will endeavor to find out here.

1660_Ti_Boxes.jpg

RTX: Off

It has been a rocky start for RTX, and while some might say that releasing GTX cards after the fact represents back-pedaling from NVIDIA, consider the possibility that the 2019 roadmap always had space for new GTX cards. Real-time ray tracing does not make sense below a certain performance threshold, and it was pretty clear with the launch of the RTX 2060 that DLSS was the only legitimate option for ray tracing at acceptable frame rates. DLSS itself has been maligned of late based on questions about visual quality, which NVIDIA has now addressed in a recent blog post. There is clearly a lot invested in DLSS, and regardless of your stance on the technology NVIDIA is going to continue working on it and releasing updates to improve performance and visual quality in games.

As its “GTX” designation denotes, the GeForce GTX 1660 Ti does not include the RT and Tensor Cores that are found in GeForce RTX graphics cards. NVIDIA explains the omission this way: “In order to deliver the Turing architecture to the sub-$300 graphics segment, we must be very thoughtful about the types and numbers of cores we use in the GPU: adding dedicated cores to accelerate Ray Tracing and AI doesn’t make sense unless you can first achieve a certain level of rendering performance. As a result, we chose to focus the GTX 1660 Ti’s cores exclusively on graphics rendering in order to achieve the best balance of performance, power, and cost.”

If the RTX 2060 is the real-time ray tracing threshold, then it's pretty obvious that any card that NVIDIA released this year below that performance (and price) level would not carry RTX branding. And here we are with the next card, still based on the latest Turing architecture but with an all-new GPU that has no ray tracing support in hardware. There is nothing fused off here or disabled in software with TU116, and the considerable reduction in die size from the TU106 reflects this.

Continue reading our review of the NVIDIA GeForce GTX 1660 Ti graphics card!

Keep your eyes peeled for AMD’s Radeon RX Vega 56 for $279

Subject: Graphics Cards | February 21, 2019 - 07:36 PM |
Tagged: msi, RX Vega 56 Air Boost 8G OC, RX Vega 56, amd

The news is a bit late as NewEgg is currently out of stock, but it is worth keeping your eyes peeled for the Vega 56; a mid-range card at a mid-range price is somewhat rare at the moment.

mianv.jpg

MSI are selling their Vega 56 Air Boost 8G OC card for $279 USD, though keep away from the Canadian site, as the price drop has yet to spread northwards.  While the card is not available at the time of this posting, you should pay attention: not only is it likely to come back in the not-too-distant future, but this may also prompt a drop in price for other cards.

This particular model sports a Core Clock of 1181 MHz that can hit 1520 MHz on Boost, and the 8GB of HBM2 runs at 1600 MHz on a 2048-bit interface, giving it an impressive amount of bandwidth.  It is admittedly not a new card; you can see how it was received when it initially launched right here.
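To put that “impressive amount of bandwidth” into a number, here is the standard calculation, assuming the quoted 1600 MHz is the effective (per-pin) data rate of the HBM2, which lines up with the card's official ~410 GB/s figure.

```python
# Peak memory bandwidth = (bus width in bits / 8) * effective data rate in Gbps
BUS_WIDTH_BITS = 2048       # Vega 56's HBM2 interface
EFFECTIVE_RATE_GBPS = 1.6   # assuming the quoted 1600 MHz is the effective rate

bandwidth_gb_s = BUS_WIDTH_BITS / 8 * EFFECTIVE_RATE_GBPS
print(f"Peak memory bandwidth: {bandwidth_gb_s:.1f} GB/s")  # ~409.6 GB/s
```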

 

Source: AMD

NVIDIA Launches GeForce MX230 and MX250 Laptop GPUs

Subject: Graphics Cards, Systems | February 21, 2019 - 03:04 PM |
Tagged: pascal, nvidia, mx250, mx230, mx, gp108, geforce mx

Two new laptop GPUs launched in NVIDIA’s low-end MX line. This classification of products is designed to slide above the GPUs found on typical laptop CPUs by a wide enough margin to justify an extra chip, but not enough to be endorsed as part of their gaming line.

nvidia-2019-mx250.png

As such, pretty much the only performance number that NVIDIA provides is an “up-to” factor relative to Intel’s UHD 620 iGPU as seen on the Core i5-8265U. For reference, the iGPU on this specific CPU has 192 shader units running at up to 1.1 GHz. Technically, there exist some variants with boost clocks up to 1.15 GHz, but that extra 4.5% shouldn’t matter too much for this comparison.

Versus this part, the MX250 is rated as up to 3.5x faster; the MX230 is rated at up to 2.6x faster.

One thing that I should note is that the last generation’s MX150 is listed as up to 4x the Intel UHD 620, although they don’t state which specific CPU’s UHD 620.

This leads to a few possibilities:

  1. The MX250 has a minor performance regression versus the MX150 in the “up to” test(s)
  2. The UHD 620 had significant driver optimizations in at least the “up to” test(s)
  3. The UHD 620 that they tested back then is significantly slower than the i5-8265U
  4. They rounded differently then vs now
  5. They couldn’t include the previous “up to” test for some reason

Unfortunately, because NVIDIA is not releasing any specifics, we can only list possibilities and maybe speculate if one seems exceedingly likely. (To me, none of the first four stands out head-and-shoulders above the other three.)
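To make possibility 1 concrete, here is the arithmetic implied by NVIDIA's own “up to” ratings, under the (possibly wrong) assumption that the MX150 and MX250 were measured against the same baseline; this is purely illustrative, since we don't know the baselines match.

```python
# Illustrative only: compare NVIDIA's "up to" ratings, assuming (perhaps
# incorrectly) that the MX150 and MX250 were measured against the same baseline.
MX150_RATING = 4.0   # "up to 4x" an unspecified UHD 620
MX250_RATING = 3.5   # "up to 3.5x" the UHD 620 in the Core i5-8265U
MX230_RATING = 2.6   # "up to 2.6x" the same part

print(f"MX250 vs MX150 (same baseline assumed): {MX250_RATING / MX150_RATING:.2f}x")  # 0.88x
print(f"MX250 vs MX230: {MX250_RATING / MX230_RATING:.2f}x")                          # 1.35x

# The spread among UHD 620 boost clocks is small by comparison:
print(f"1.15 GHz vs 1.10 GHz iGPU clock: {100 * (1.15 / 1.10 - 1):.1f}% difference")   # 4.5%
```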

Like the MX150 that came before it, both the MX230 and MX250 will use GDDR5 memory. The MX130 could be paired with either GDDR5 or DDR3.

Anandtech speculates that it is based on the GP108, which is a safe assumption. NVIDIA confirmed that the new parts are using the Pascal architecture, and the GP108 is the Pascal chip in that performance range. Anandtech also claims that the MX230 and MX250 are fabricated under Samsung 14nm, while the “typical” MX150 is TSMC 16nm. The Wikipedia list of NVIDIA graphics, however, claims that the MX150 is fabricated at 14nm. While both could be right, a die shrink would make a bit of sense to squeeze out a few more chips from a wafer (if yields are relatively equal). If that’s the case, and they changed manufacturers, then there might be a slight revision change to the GP108; these changes happen frequently, and their effects should be invisible to the end user… but sometimes they make a difference.

It’ll be interesting to see benchmarks when they hit the market.

Source: NVIDIA

I am the BIOS that flashes in the RTX ... Let's get dangerous!

Subject: Graphics Cards | February 19, 2019 - 06:21 PM |
Tagged: danger, rtx, bios, flash, nvidia, risky business

So you like living dangerously and are willing to bet $1000 or more on something that might make your new NVIDIA GPU a bit faster, or transform it into a brick?  Then does Overclockers Club have a scoop for you!  There exists a tool called NVFlash, with or without the added ID Mismatch modification, which allows you to flash your card's BIOS to another manufacturer's design; this can increase your card's power envelope and offer better performance ...

or kill it dead ...

or introduce artifacting, random crashes, or all sorts of other mischief.

On the other hand, if all goes well you can turn your plain old RTX card into an overclocked model of the same type and see higher performance overall.  Take a look at OCC's article and read it fully before deciding if this is a risk you might be willing to take.

6_thumb.jpg

"WARNING! Flash the BIOS at your own risk. Flashing a video card BIOS to a different model and/or series WILL void your warranty. This process can also cause other permanent issues like video artifacts and premature hardware failure!"

Here are some more Graphics Card articles from around the web:


NVIDIA Releases 418.99 Hotfix for Windows 7 / 8.1 Crashes

Subject: Graphics Cards | February 18, 2019 - 12:07 PM |
Tagged: nvidia, graphics drivers, geforce

Apparently the latest WHQL driver, 418.81, can cause random application crashes and TDR (“Timeout Detection and Recovery”) issues on Windows 7 and 8.1. NVIDIA has followed up with a hotfix driver, 418.99, that addresses the issue.

nvidia-2015-bandaid.png

Hotfix drivers do not undergo full testing, so they should not be installed unless you are concerned about the specific issues they fix. In this case, because the bug does not affect Windows 10, a Windows 10 driver is not even provided.

In case you’re wondering what “Timeout Detection and Recovery” is: Windows monitors the graphics driver to make sure that work is being completed quickly (unless the GPU is not driving a monitor; Windows doesn’t care how long a GPU crunches on compute tasks if it is not being used for graphics). If the driver hangs for a significant amount of time, Windows resets it, just in case it was stuck in, for example, an infinite loop caused by a bad shader or compute task. Without TDR, the only way to get out of that situation would be to cut power to the system.
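For the curious, TDR behavior is configurable through documented registry values (TdrLevel, TdrDelay, TdrDdiDelay) under HKLM\SYSTEM\CurrentControlSet\Control\GraphicsDrivers, with a default timeout of roughly two seconds. Below is a minimal read-only sketch that reports whether your system overrides those defaults; on most machines the values are simply absent.

```python
# Read-only sketch: report TDR-related registry overrides on Windows.
# Absent values mean the OS defaults (roughly a two-second timeout) apply.
import winreg

KEY_PATH = r"SYSTEM\CurrentControlSet\Control\GraphicsDrivers"

def read_tdr_value(name):
    try:
        with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY_PATH) as key:
            value, _reg_type = winreg.QueryValueEx(key, name)
            return value
    except FileNotFoundError:
        return None  # key or value not present, so Windows defaults apply

for name in ("TdrLevel", "TdrDelay", "TdrDdiDelay"):
    value = read_tdr_value(name)
    print(f"{name}: {'default (not set)' if value is None else value}")
```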