NVIDIA Releases Vulkan Developer 376.80 Beta Drivers

Subject: Graphics Cards | February 3, 2017 - 05:58 PM |
Tagged: nvidia, graphics drivers, vulkan

On February 1st, NVIDIA released a new developer beta driver, which fixes a couple of issues with their Vulkan API implementation. Unlike what some sites have been reporting, you should not download it to play games that use the Vulkan API, like DOOM. In short, it is designed for developers, not end-users. The goal is to provide correct results when software interacts with the driver, not the best gaming performance or anything like that.

khronos-2016-vulkanlogo2.png

In a little more detail, it looks like 376.80 implements the Vulkan 1.0.39.1 SDK. This update addresses two issues with accessing devices and extensions, under certain conditions, when using the 1.0.39.0 SDK. 1.0.39.0 was released on January 23rd, and thus it will not even be a part of current video games. Even worse, it, like most graphics drivers for software developers, is based on the older GeForce 376 branch, so it won’t even have NVIDIA’s most recent fixes and optimizations. NVIDIA does this so they can add or change the features that Vulkan developers require without needing to roll in patches every time they make a "Game Ready" optimization or something. There is no reason to use this driver unless you are developing Vulkan applications and you want to try out the new extensions. It will eventually make it to end users... when it's time.

If you wish to develop software using Vulkan’s bleeding-edge features, then check out NVIDIA’s developer portal to pick up the latest drivers. Basically everyone else should use 378.49 or its 378.57 hotfix.

Source: NVIDIA

Radeon Software Crimson ReLive 17.1.2 Drivers Released

Subject: Graphics Cards | February 2, 2017 - 07:02 AM |
Tagged: graphics drivers, amd

A few days ago, AMD released their second graphics driver of January 2017: Radeon Software Crimson ReLive 17.1.2. The main goal of these drivers is to support the early access of Conan Exiles as well as tomorrow’s closed beta for Tom Clancy’s Ghost Recon Wildlands. The optimizations that AMD has been working on prior to release, for either game, are rolled into this version.

amd-2016-crimson-relive-logo.png

Beyond game-specific optimizations, a handful of bugs are also fixed, ranging from crashes to rendering artifacts. There was also an issue with configuring WattMan on a system with multiple monitors, where the memory clock would drop or bounce around. The driver also has a bunch of known issues, including a couple of hangs and crashes under certain situations.

Radeon Software Crimson ReLive Edition 17.1.2 is available at AMD’s website.

Source: AMD

NVIDIA Releases GeForce 378.57 Hotfix Drivers

Subject: Graphics Cards | February 2, 2017 - 07:01 AM |
Tagged: nvidia, graphics drivers

If you were having issues with Minecraft on NVIDIA’s recent 378.49 drivers, then you probably want to try out their latest hotfix. This version, numbered 378.57, will not be pushed out through GeForce Experience, so you will need to grab it from NVIDIA’s customer support page.

nvidia-2015-bandaid.png

Beyond Minecraft, this also fixes an issue with “debug mode”. For some Pascal-based graphics cards, the option in NVIDIA Control Panel > Help > Debug Mode might be on by default. This option will reduce factory-overclocked GPUs down to NVIDIA’s reference speeds, which is useful to eliminate stability issues in testing, but pointlessly slow if you’re already stable. I mean, you bought the factory overclock, right? I’m guessing someone at NVIDIA used it to test 378.49 during its development, fixed an issue, and accidentally committed the config file with the rest of the fix. Either way, someone caught it, and it’s now fixed, even though you should be able to just untick it if you have a factory-overclocked GPU.

Source: NVIDIA

Win our RX 460 Budget Gaming System!!

Subject: Graphics Cards | January 31, 2017 - 11:18 AM |
Tagged: rx 460, radeon, giveaway, contest, buildapc, amd

As part of our partnership with AMD to take a look at the Radeon RX 460 as a budget gaming graphics solution, we are giving away the computer we built for our testing. If you missed our previous stories, shame on you. Check them out here:

Check out the embedded block below to see how you can win our system. It is a global giveaway, so feel free to enter no matter where you live! Thanks again to AMD for providing the hardware for this build!

Radeon RX 460 Budget System Giveaway (sponsored by AMD)

Source: AMD

DirectX Intermediate Language Announced... via GitHub

Subject: Graphics Cards | January 27, 2017 - 09:19 PM |
Tagged: microsoft, DirectX, llvm, dxil, spir-v, vulkan

Over the holidays, Microsoft published the DirectX Shader Compiler on GitHub. The interesting part is that it compiles HLSL into DirectX Intermediate Language (DXIL) bytecode, which can be ingested by GPU drivers and executed on graphics devices. The reason why this is interesting is that DXIL is based on LLVM, which might start to sound familiar if you have been following along with The Khronos Group and their announcements regarding Vulkan, OpenCL, and SPIR-V.

As it turns out, they were on to something, and Microsoft is working on a DirectX analogue of it.

microsoft-2015-directx12-logo.jpg

The main advantage of LLVM-based bytecode is that you can eventually support multiple languages (and the libraries of code developed in them). When SPIR-V was announced with Vulkan, the first thing that came to my mind was compiling to it from HLSL, which would be useful for existing engines, as they are typically written in HLSL and transpiled to the target platform when used outside of DirectX (like GLSL for OpenGL). So, in Microsoft’s case, it would make sense that they start there (since they own the thing), but I doubt that is the end goal. The most seductive outcome for game engine developers would be single-source C++, but there are a lot of steps between here and there.

Another advantage, albeit to a lesser extent, is that you might be able to benefit from performance optimizations, both on the LLVM / language side as well as on the driver’s side.

According to their readme, the minimum support will be HLSL Shader Model 6. This is the most recent shading model, and it introduces some interesting instructions, particularly for GPGPU applications, that allow multiple GPU threads to interact, like balloting (illustrated in the sketch below). Ironically, while DirectCompute and C++ AMP don’t seem to be too popular, this would nudge DirectX 12 toward being a somewhat competent GPU compute API.
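
To make the "balloting" idea concrete, here is a tiny Python sketch of what a ballot-style wave operation does conceptually: every lane in a wave (a 32-lane width is assumed here) evaluates a predicate, and every lane receives a bitmask of all the results. This only models the semantics on the CPU for illustration; it is not HLSL and not the shader compiler's actual API.

```python
# Conceptual illustration of a ballot-style wave operation: each lane in a
# wave votes on a predicate and every lane sees the packed result. Real
# Shader Model 6 code would use a wave intrinsic on the GPU; this just
# models the idea on the CPU. The 32-lane wave width is an assumption.
WAVE_SIZE = 32

def wave_active_ballot(predicates):
    """Pack one bit per lane into a mask that every lane can read."""
    assert len(predicates) == WAVE_SIZE
    mask = 0
    for lane, vote in enumerate(predicates):
        if vote:
            mask |= 1 << lane
    return mask

# Example: even-numbered lanes vote "true".
votes = [lane % 2 == 0 for lane in range(WAVE_SIZE)]
mask = wave_active_ballot(votes)
print(f"ballot mask: {mask:#010x}, lanes voting true: {bin(mask).count('1')}")
```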

DXIL support is limited to Windows 10 Build 15007 and later, so you will need to either switch one (or more) workstation(s) to Insider, or wait until it launches with the Creators Update (unless something surprising holds it back).

NVIDIA Releases GeForce 378.49 Drivers

Subject: Graphics Cards | January 26, 2017 - 09:38 PM |
Tagged: nvidia, graphics drivers

Update: There are multiple issues being raised in our comments, including a Steam post by Sam Lantinga (Valve) about this driver breaking In-Home Streaming. Other complaints include certain applications crashing and hardware acceleration issues.

Original Post Below

Now that the holidays are over, we’re ready for the late-winter rush of “AAA” video games. Three of them, Resident Evil VII, the early access of Conan Exiles, and the closed beta of For Honor, are targeted by NVIDIA’s GeForce 378.49 Game Ready drivers. Unless we get a non-Game Ready driver in the interim, I am guessing that this will cover us until mid-February, before the full release of For Honor, alongside Sniper Elite 4 and followed by Halo Wars 2 the following week.

nvidia-geforce.png

Beyond game-specific updates, the 378 branch of drivers includes a bunch of SLI profiles, including one for Battlefield 1. It also paves the way for GTX 1050- and GTX 1050 Ti-based notebooks; this is their launch driver for whenever OEMs begin shipping the laptops they announced at CES.

This release also contains a bunch of bug fixes (pdf), including a reboot bug with Wargame: Red Dragon and a TDR (driver time-out) with the Windows 10 Anniversary Update. I haven’t experienced any of these, but it’s good to see them fixed regardless.

You can pick up the new drivers from their website if, you know, GeForce Experience hasn’t already notified you.

Source: NVIDIA

The Gigabyte GTX 1060 G6, a decent card with a bit of work

Subject: Graphics Cards | January 25, 2017 - 03:37 PM |
Tagged: windforce, factory overclocked, GTX 1060 G1 GAMING 6G, GeForce GTX 1060, gigabyte

In their testing, [H]ard|OCP proved that the Windforce cooler is not the limiting factor when overclocking Gigabyte's GTX 1060 G1 Gaming 6G; even at their top overclock of 2.1GHz on the GPU and 9.4GHz on the memory, the temperature never reached 60C.  They did have some obstacles reaching those speeds: the card's onboard Gaming mode offered an anemic boost, and in order to start manually overclocking this card you will need to install the XTREME ENGINE VGA Utility.  Once you have that, you can increase the voltage and clocks to find the limits of the card you have, which should offer a noticeable improvement over its performance straight out of the box.

1484519953uTUJTeH5LD_1_1.png

"We’ve got the brand new GIGABYTE GeForce GTX 1060 G1 GAMING 6G video card to put through the paces and find out how well it performs in games and overclocks. We will compare its highest overclock with one of the best overclocks we’ve achieved on AMD Radeon RX 480 to put it to the test. How will it stand up? Let’s find out."

Here are some more Graphics Card articles from around the web:

Graphics Cards

Source: [H]ard|OCP

Sapphire Releases AMD Radeon RX460 with 1024 Shaders

Subject: Graphics Cards | January 18, 2017 - 08:43 PM |
Tagged: video, unlock, shaders, shader cores, sapphire, radeon, Polaris, graphics, gpu, gaming, card, bios, amd, 1024

As reported by WCCFtech, AMD partner Sapphire has a new 1024 stream processor version of the RX460 listed on their site (Chinese language). This product reveal of course comes after it became known that RX460 graphics cards had the potential to have their stream processor count unlocked from 896 to 1024 via a BIOS update.

Sapphire_201712151752.jpg

Sapphire RX460 1024SP 4G D5 Ultra Platinum OC (image credit: Sapphire)

The Sapphire RX460 1024SP edition offers a full Polaris 11 core operating at 1250 MHz, and it otherwise matches the specifications of a stock RX460 graphics card. Whether this product will be available outside of China is unknown, as is the potential pricing model should it be available in the USA. A 4GB Radeon RX460 retails for $99, while the current step-up option is the RX470, which doubles up on this 1024SP RX460's shader count with 2048, with a price increase of about 70% ($169).

Radeon_Chart.PNG

AMD Polaris GCN 4.0 GPU lineup (Credit WCCFtech)

As you may note from the chart above, there is also an RX470D option between these cards that features 1792 shaders, though this option is also China-only.

Source: WCCFtech

Gigabyte Shows Off Half Height GTX 1050 and GTX 1050 Ti Graphics Cards

Subject: Graphics Cards | January 17, 2017 - 10:31 PM |
Tagged: SFF, pascal, low profile, GTX 1050 Ti, gtx 1050, gigabyte

Without much fanfare, Gigabyte recently launched two new low profile half height graphics cards packing factory overclocked GTX 1050 and GTX 1050 Ti GPUs. The new cards measure 6.6” x 2.7” x 1.5” (167mm long) and are cooled by a small shrouded single fan cooler.
 
Gigabyte GTX 1050 OC Low Profile 2G.png
 
Around back, both the Gigabyte GTX 1050 OC Low Profile 2G and GTX 1050 Ti OC Low Profile 4G offer four display outputs in the form of two HDMI 2.0b, one DisplayPort 1.4, and one dual-link DVI-D. It appears that Gigabyte is using the same cooler for both cards. There is not much information on this cooler, but it utilizes an aluminum heatsink and what looks like a ~50mm fan. Note that while the cards are half-height, they use a dual slot design, which may limit the cases they can be used in.
 
The GTX 1050 OC Low Profile 2G features 640 Pascal-based CUDA cores clocked at 1366 MHz base and 1468 MHz boost out of the box (1392 MHz base and 1506 MHz boost in OC Mode using Gigabyte’s software) and 2GB of GDDR5 memory at 7008 MHz (7GT/s). For comparison, the GTX 1050 reference clock speeds are 1354 MHz base and 1455 MHz boost.
 
Meanwhile, the GTX 1050 Ti OC Low Profile 4G has 768 cores clocked at 1303 MHz base and 1417 MHz boost by default and 1328 MHz base and 1442 MHz boost in OC Mode. The GPU is paired with 4GB of GDDR5 memory at 7GT/s. NVIDIA’s reference GPU clocks are 1290 MHz base and 1392 MHz boost.
 
The pint-sized graphics cards would certainly allow for gaming on your SFF home theater or other desktop PC as well as being an easy upgrade to make a tiny OEM PC gaming capable (think those thin towers HP, Lenovo, and Dell like to use). 
 
Of course, Gigabyte is not yet talking pricing and availability has only been narrowed down to a general Q1 2017 time frame. I would expect the cards to hit retailers within a month or so and be somewhere around $135 for their half height GTX 1050 OC LP 2G and approximately $155 for the faster GTX 1050 Ti variant. That is to say that the low profile cards should be available at a slight premium over the company's larger GTX 1050 and GTX 1050 Ti graphics cards.

Source: Gigabyte

CES 2017: Gigabyte Shows Off First Aorus Branded Graphics Card

Subject: Graphics Cards | January 10, 2017 - 10:11 PM |
Tagged: CES, CES 2017, aorus, gigabyte, xtreme gaming, GTX 1080, pascal

One interesting development from Gigabyte at this year’s CES was the expansion of its Aorus branding and the transition from Xtreme Gaming. Initially used on its RGB LED equipped motherboards, the company is rolling out the brand to its other higher end products including laptops and graphics cards. While it appears that Xtreme Gaming is not going away entirely, Aorus is taking the spotlight with the introduction of the first Aorus branded graphics card: the GTX 1080.

Aorus GTX 1080 CES 2017 Pauls Hardware.png

Paul's Hardware got hands on with the new card (video) at the Gigabyte CES booth.

Featuring a triple 100mm fan cooler similar to that of the GTX 1080 Xtreme Gaming 8G, the Aorus GTX 1080 comes with X-patterned LED lighting as well as a backlit Aorus logo on the side and a backlit eagle on the backplate. The cooler is composed of three 100mm double stacked fans (the center fan is recessed and spins in the opposite direction of the side fans) over a shrouded angled aluminum fin stack that connects to the GPU over five large copper heatpipes.

The graphics card is powered by two 8-pin PCI-E power connectors.

In an interesting twist, the card has two HDMI ports on the back of the card that are intended to be used to hook up front panel HDMI outputs for things like VR headsets. Another differentiator between the upcoming card and the Xtreme Gaming 8G is the backplate, which has a large copper plate secured over the underside of the GPU. Several sites are reporting that this area can be used for watercooling, but I am skeptical of this: if you are going to go out and buy a waterblock for your graphics card, you might as well buy a block to put on top of the GPU, not on the area of the PCB opposite the GPU! As is, the copper plate on the backplate certainly won’t hurt cooling, and it looks cool, but that’s all I suspect it is.

Think Computers also checked out the Aorus graphics card. (video above)

Naturally, Gigabyte is not talking clock speeds on this new card, but I expect it to hit at least the same clocks as its Xtreme Gaming 8G predecessor, which was clocked at 1759 MHz base and 1848 MHz boost out of the box and 1784 MHz base and 1936 MHz boost in OC Mode. Gigabyte also overclocked the memory on that card up to 10400 MHz in OC Mode.

Gigabyte also had new SLI HB bridges on display bearing the Aorus logo to match the Aorus GPU. The company also had Xtreme Gaming SLI HB bridges on display, though, which further suggests that they are not completely retiring that branding (at least not yet).

Pricing has not been announced, but the card will be available in February.

Gigabyte has yet to release official photos of the card or a product page, but it should show up on their website shortly. In the meantime, Paul's Hardware and Think Computers shot some video of the card on the show floor, which I have linked above if you are interested in the card. Looking on Amazon, the Xtreme Gaming 1080 8GB is approximately $690 before rebate, so I would guess that the Aorus card will come out at a slight premium over that, if only for the fact that it is a newer release, has a more expensive backplate, and adds RGB LED backlighting.

What are your thoughts on the move to everything-Aorus? 

Coverage of CES 2017 is brought to you by NVIDIA!

PC Perspective's CES 2017 coverage is sponsored by NVIDIA.

Follow all of our coverage of the show at https://pcper.com/ces!

A new Radeon Software Crimson Edition for the New Year

Subject: Graphics Cards | January 9, 2017 - 02:32 PM |
Tagged: Crimson Edition 16.12.2

Radeon Software Crimson Edition 16.12.2 is now live and WHQL certified, ready for you to grab here or through the version you already have installed, which supports the recommended clean installation option. 

index.jpg

This particular update addresses FreeSync issues with borderless fullscreen applications as well as compatibility issues with Battlefield 1 and DOTA 2.  There are also numerous optimizations and fixes for issues with Radeon ReLive, which you can read about in more detail under the Release Notes tab.

 

Source: AMD

(Leak) AMD Vega 10 and Vega 20 Information Leaked

Subject: Graphics Cards | January 8, 2017 - 03:53 AM |
Tagged: vega 11, vega 10, navi, gpu, amd

During CES, AMD showed off demo machines running Ryzen CPUs and Vega graphics cards, as well as gave the world a bit of information on the underlying architecture of Vega in an architectural preview that you can read about (or watch) here. AMD's Vega GPU is coming, and it is poised to compete with NVIDIA on the high end (an area that has been left to NVIDIA for a while now) in a big way.

Thanks to Videocardz, we have a bit more info on the products that we might see this year and what we can expect to see in the future. Specifically, the slides suggest that Vega 10 – the first GPUs to be based on the company's new architecture – may be available by the (end of the) first half of 2017. Following that, a dual GPU Vega 10 product is slated for release in Q3 or Q4 of 2017, and a refreshed GPU based on a smaller process node with more HBM2 memory, called Vega 20, is due in the second half of 2018. The leaked slides also suggest that Navi (Vega's successor) might launch as soon as 2019 and will come in two variants called Navi 10 and Navi 11 (with Navi 11 being the smaller / less powerful GPU).

AMD Vega Leaked Info.jpg

The 14nm Vega 10 GPU allegedly offers up 64 NCUs and as much as 12 TFLOPS of single precision and 750 GFLOPS of double precision compute performance. Half precision performance is twice that of FP32, at 24 TFLOPS (which would be good for things like machine learning). The NCUs allegedly run FP16 at 2x and DPFP at 1/16. If each NCU has 64 shaders like Polaris 10 and other GCN GPUs, then we are looking at a top-end Vega 10 chip having 4096 shaders, which rivals that of Fiji. Further, Vega 10 supposedly has a TDP of up to 225 watts.
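
As a rough sanity check on those leaked figures, a bit of arithmetic lines up nicely, assuming 64 shaders per NCU, two FLOPs per shader per clock (from fused multiply-add), and the FP16 at 2x / FP64 at 1/16 rates described above. It also implies a clock somewhere around 1.46 GHz:

```python
# Back-of-the-envelope check of the leaked Vega 10 figures. Assumptions:
# 64 shaders per NCU, 2 FLOPs per shader per clock (FMA), FP16 at 2x and
# FP64 at 1/16 the FP32 rate, as the leak describes.
ncus = 64
shaders = ncus * 64                  # 4096, matching Fiji's shader count
fp32_tflops = 12.0                   # leaked single-precision figure

implied_clock_mhz = fp32_tflops * 1e12 / (shaders * 2) / 1e6
print(f"implied clock: ~{implied_clock_mhz:.0f} MHz")       # ~1465 MHz

print(f"FP16: {fp32_tflops * 2:.0f} TFLOPS")                # 24 TFLOPS
print(f"FP64: {fp32_tflops / 16 * 1000:.0f} GFLOPS")        # 750 GFLOPS
```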

For comparison, the 28nm 8.9 billion transistor Fiji-based R9 Fury X ran at 1050 MHz with a TDP of 275 watts and had a rated peak compute of 8.6 TFLOPS. While we do not know clock speeds of Vega 10, the numbers suggest that AMD has been able to clock the GPU much higher than Fiji while still using less power (and thus putting out less heat). This is possible with the move to the smaller process node, though I do wonder what yields will be like at first for the top end (and highest clocked) versions.

Vega 10 will be paired with two stacks of HBM2 memory on package which will offer 16GB of memory with memory bandwidth of 512 GB/s. The increase in memory bandwidth is thanks to the move to HBM2 from HBM (Fiji needed four HBM dies to hit 512 GB/s and had only 4GB).
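
The bandwidth math also works out if you assume the commonly quoted HBM figures of a 1024-bit interface per stack, at 2 Gbps per pin for HBM2 versus 1 Gbps per pin for the first-generation HBM on Fiji:

```python
# Rough HBM bandwidth math. Assumes the commonly quoted 1024-bit interface
# per stack, at 2.0 Gbps per pin for HBM2 and 1.0 Gbps per pin for HBM1.
def stack_bandwidth_gbs(gbps_per_pin, bus_width_bits=1024):
    return gbps_per_pin * bus_width_bits / 8

vega10_gbs = 2 * stack_bandwidth_gbs(2.0)   # 2 stacks of HBM2 -> 512 GB/s
fiji_gbs   = 4 * stack_bandwidth_gbs(1.0)   # 4 stacks of HBM1 -> 512 GB/s
print(vega10_gbs, fiji_gbs)
```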

The slide also hints at a "Vega 10 x2" in the second half of the year, which is presumably a dual GPU product. The slide states that Vega 10 x2 will have four stacks of HBM2 (1TB/s), though it is not clear if they are simply adding the two stacks per GPU to claim the 1TB/s number or if both GPUs will have four stacks (this is unlikely, though, as there does not appear to be room on the package for two more stacks each, and I am not sure if they could make the package big enough to make room for them either). Even if we assume that they really mean 2x 512 GB/s per GPU (and maybe they can get more out of that in specific workloads across both) for memory bandwidth, the doubling of cores and at least potential compute performance will be big. This is going to be a big number-crunching and machine-learning card, as well as a gaming card, of course. Clock speeds will likely have to be much lower compared to the single GPU Vega 10 (especially with the stated TDP of 300W), and workloads won't scale perfectly, so potential compute performance will not be quite 2x, but it should still be a decent per-card boost.

AMD-Vega-GPU.jpg

Raja Koduri holds up a Vega GPU at CES 2017 via eTeknix

Moving into the second half of 2018, the leaked slides suggest that a Vega 20 GPU will be released based on a 7nm process node with 64 CUs and paired with four stacks of HBM2 for 16 GB or 32 GB of memory with 1TB/s of bandwidth. Interestingly, the shaders will be set up such that the GPU can still do half precision calculations at twice the single precision rate, but will not take nearly the hit on double precision as Vega 10, running at 1/2 the single precision rate rather than 1/16. The GPU(s) will use between 150W and 300W of power, and it seems these are set to be the real professional and workstation workhorses. A Vega-10-class GPU with 1/2-rate DPFP compute would hit 6 TFLOPS, which is not bad (and Vega 20 would hopefully do better than this due to faster clocks and architecture improvements).

Beyond that, the slides mention Navi's existence and that it will come in Navi 10 and Navi 11 but no other details were shared which makes sense as it is still far off.

You can see the leaked slides here. In all, it is an interesting look at potential Vega 10 and beyond GPUs, but definitely keep in mind that this is leaked information, and that it allegedly came from an internal presentation that likely showed the graphics processors in their best possible/expected light. It does add a bit more fuel to the fire of excitement for Vega, though, and I hope that AMD pulls it off, as my unlocked 6950 is no longer supported and it is only a matter of time before new games perform poorly or not at all!

Also read: 

Source: eTeknix.com

CES 2017: AMD Vega Running DOOM at 4K

Subject: Graphics Cards | January 6, 2017 - 11:27 PM |
Tagged: Vega, doom, amd

One of the demos that AMD had at CES was their new Vega architecture running DOOM with Vulkan on Ultra settings at 4K resolution. With this configuration, the pre-release card was coasting along at the high 60s / low 70s frames per second. Compared to PC Gamer’s benchmarks of the Vulkan patch (ours were focused on 1080p), this puts Vega somewhat ahead of the GTX 1080, which averages the low 60s.

Some of the comments note that, during one of the melee kills, the frame rate stutters a bit, dropping down to about 37 FPS. That’s true, and I included a screenshot of it below, but momentary dips sometimes just happen. It could even be a bug in pre-release drivers for a brand new GPU architecture, after all.

amd-2017-ces-vegadip.png

Yes, the frame rate dipped in the video, but stutters happen. No big deal.

As always, this is a single, vendor-controlled data point. There will be other benchmarks, and NVIDIA has both GP102 and Volta to consider. The GTX 1080 is only ~314 mm2, so there’s a lot more room for enthusiast GPUs to expand on 14nm, but this test suggests Vega will at least surpass it. (When a process node is fully mature, you will typically see low-yield chips up to around 600mm2.)

Coverage of CES 2017 is brought to you by NVIDIA!

PC Perspective's CES 2017 coverage is sponsored by NVIDIA.

Follow all of our coverage of the show at http://pcper.com/ces!

Keep your eyes peeled for FreeSync 2

Subject: Graphics Cards, Displays, Shows and Expos | January 6, 2017 - 03:19 PM |
Tagged: freesync 2, amd

So far we have yet to see a FreeSync 2 capable monitor on the floor at CES, but we do know about the technology.  We have seen Ryan's overview of what we know of the new technology and its benefits, and recently The Tech Report also posted their thoughts on it.  For instance, did you know that there are 121 FreeSync displays of varying quality from 20 display partners, compared to NVIDIA's eight partners and 18 G-SYNC displays?  The Tech Report is also on the hunt for a FreeSync 2 display at CES; we will let you know once the hunt is successful.

modeswitch.jpg

"AMD has pulled back the curtain on FreeSync 2, the new version of the FreeSync variable refresh rate technology."

Here are some more Graphics Card articles from around the web:

Graphics Cards

CES 2017: MSI GUS Thunderbolt 3 External Graphics Upgrade System

Subject: Graphics Cards | January 5, 2017 - 11:50 AM |
Tagged: video card, thunderbolt 3, msi, gus, graphics, external gpu, enclosure, CES 2017, CES

You would need to go all the way back to CES 2012 to see our coverage of the GUS II external graphics enclosure, and now MSI has a new G.U.S. (Graphics Upgrade System) GPU enclosure to show, this time using Thunderbolt 3.

MSI_GUS.jpg

In addition to 40 Gbps Thunderbolt 3 connectivity, the G.U.S. includes a built-in 500W power supply with 80 Plus Gold certification, as well as USB 3.0 Type-C and Type-A ports including a quick-charge port on the front of the unit.

Ryan had a look at the G.U.S. (running an NVIDIA GeForce GTX 1080, no less) at MSI's booth:

Specifications from MSI:

  • 1x Thunderbolt 3 (40 Gbps) port to connect to host PCs
  • 2x USB 3.0 Type-A (rear)
  • 1x USB 3.0 Type-C (rear)
  • 1x USB 3.0 Type-A w/QC (front)
  • 80 Plus Gold 500W internal PSU

We do not have specifics on pricing or availability for the G.U.S. just yet.

Coverage of CES 2017 is brought to you by NVIDIA!

PC Perspective's CES 2017 coverage is sponsored by NVIDIA.

Follow all of our coverage of the show at https://pcper.com/ces!

Source: MSI

The GTX 1050 and 1050 Ti are going on a road trip

Subject: Graphics Cards, Mobile | January 3, 2017 - 07:34 PM |
Tagged: nvidia, GTX 1050 Ti, gtx 1050

Keep an eye out for reviews of new gaming laptops containing mobile versions of the GTX 1050 and 1050 Ti.  These laptops should also be on sale soon, with a quote from NVIDIA suggesting prices will start at around $700 for a base model.

1050m.PNG

The price reflects the power of the GPU; you are not going to match a $2000 machine with a GTX 1080 in it, but then again there are many gamers who do not need such a powerful card.  If your gaming machine is a current generation laptop with integrated graphics, this will be a huge improvement, and even a laptop with a discrete mid-range GPU from a previous generation is going to lag behind these new models.  Of course, waiting to see what laptops come out of AMD's new Ryzen platform may be worthwhile, but for those hoping to upgrade soon, laptops with these cards installed are going to be worth looking at.

1080p gaming at 60fps will not be a problem, and for strategy games and online multiplayer entertainment such as LOL you should even be able to pump up the graphics settings.  The cards will support the various GameWorks enhancements as well as other features such as Ansel.

Dell_Inspiron_15_7000_Gaming_Image_2.0.jpeg

The above example comes from The Verge, who spotted a 14" Dell laptop with the GTX 1050 already on sale at $800.  If you want the best choice, you should look to the 15.6" model, which offers the choice of a GTX 1050 or 1050 Ti, an i5-7300HQ or i7-7700HQ, and a 512GB PCIe SSD.

Coverage of CES 2017 is brought to you by NVIDIA!

PC Perspective's CES 2017 coverage is sponsored by NVIDIA.

Follow all of our coverage of the show at http://pcper.com/ces!

Source: NVIDIA

AMD FreeSync 2 Brings Latency, LFC and Color Space Requirements

Subject: Graphics Cards, Displays | January 3, 2017 - 09:00 AM |
Tagged: srgb, lfc, hdr10, hdr, freesync 2, freesync, dolby vision, color space, amd

Since the initial FreeSync launch in March of 2015, AMD has quickly expanded the role and impact that the display technology has had on the market. Technologically, AMD added low frame rate compensation (LFC) to mimic the experience of G-Sync displays, effectively removing the bottom limit of the variable refresh rate. LFC is an optional feature that requires a large enough gap between the display's minimum and maximum refresh rates to be enabled, but the monitors that do integrate it work well. Last year AMD brought FreeSync to HDMI connections too by overlaying the standard as an extension. This helped to expand the quantity and lower the price of available FreeSync options. Most recently, AMD announced that borderless windowed mode was being added as well, another feature-match to what NVIDIA can do with G-Sync.
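
To see why LFC needs that gap, here is a minimal sketch of the frame-repetition idea (my own illustration, not AMD's actual algorithm): when the game's frame rate falls below the panel's minimum refresh rate, each frame is shown an integer number of times so the effective refresh rate lands back inside the supported range.

```python
# Minimal sketch of the low frame rate compensation (LFC) idea: repeat each
# frame enough times that the effective refresh rate falls back inside the
# panel's variable refresh range. AMD's real heuristic is not public; this
# only shows why the maximum refresh must sit comfortably above the minimum.
def lfc_refresh(frame_rate_hz, panel_min_hz, panel_max_hz):
    if frame_rate_hz >= panel_min_hz:
        return frame_rate_hz, 1          # within range, no compensation needed
    multiplier = 2
    while frame_rate_hz * multiplier < panel_min_hz:
        multiplier += 1
    refresh = frame_rate_hz * multiplier
    if refresh > panel_max_hz:
        raise ValueError("refresh range too narrow for LFC at this frame rate")
    return refresh, multiplier

# A 48-144 Hz panel showing a 30 FPS game: each frame is displayed twice at 60 Hz.
print(lfc_refresh(30, 48, 144))          # (60, 2)
```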

slides-3.jpg

The biggest feather in the cap for AMD FreeSync is the sheer quantity of displays that exist on the market that support it. As of our briefing in early December, AMD claimed 121 design wins for FreeSync to just 18 for NVIDIA G-Sync. I am not often in the camp of quantity over quality, but the numbers are impressive. The pervasiveness of FreeSync monitors means that at least some of them are going to be very high quality integrations and that prices are going to be lower compared to the green team’s selection.

slides-14.jpg

Today AMD is announcing FreeSync 2, a new, concurrently running program that adds some new qualifications for displays covering latency, color space, and LFC. This new program will be much more hands-on from AMD, requiring per-product validation and certification, and this will likely come at a cost. (To be clear, AMD hasn’t confirmed to me yet whether that is the case.)

Let’s start with the easy stuff first: latency and LFC. FreeSync 2 will require monitors to support LFC and thus to have no effective bottom limit to their variable refresh rate. AMD will also impose a maximum allowable latency for FS2, on the order of “a few milliseconds” from frame buffer flip to photon. This can be easily measured with some high-speed camera work by both AMD and external parties (like us).

These are fantastic additions to the FreeSync 2 standard and should drastically increase the quality of panels and product.

slides-17.jpg

The bigger change to FreeSync 2 is on the color space. FS2 will require a doubling of the perceivable brightness and doubling of the viewable color volume based on the sRGB standards. This means that any monitor that has the FreeSync 2 brand will have a significantly larger color space and ~400 nits brightness. Current HDR standards exceed these FreeSync 2 requirements, but there is nothing preventing monitor vendors from exceeding these levels; they simply set a baseline that users should expect going forward.

slides-16.jpg

In addition to requiring the panel to support a wider color gamut, FS2 will also enable user experience improvements. First, each FS2 monitor must communicate its color space and brightness ranges to the AMD driver through a communication path similar to the one used today for variable refresh rate information. By having access to this data, AMD can enable automatic mode switches from SDR to HDR/wide color gamut based on the application. Windows can remain in a basic SDR color space, but games or video applications that support HDR modes can enter that mode without user intervention.

slides-15.jpg

Color space mapping can take time in low power consumption monitors, adding potential latency. For movies that might not be an issue, but for enthusiast gamers it definitely is. The solution is to do all the tone mapping BEFORE the image data is sent to the monitor itself. But with varying monitors, varying color space limits and varying integrations of HDR standards, and no operating system level integration for tone mapping, it’s a difficult task.

The solution is for games to map directly to the color space of the display. AMD will foster this through FreeSync 2 – a game that integrates support for FS2 will be able to get data from the AMD driver stack about the maximum color space of the attached display. The engine can then do its tone mapping to that color space directly, rather than some intermediate state, saving on latency and improving the gaming experience. AMD can then automatically switch the monitor to its largest color space, as well as its maximum brightness. This does require the game engine or game developer to directly integrate support for this feature though – it will not be a catch-all solution for AMD Radeon users.
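
As a rough illustration of what that buys you, here is a small sketch: the engine asks the driver for the panel's peak brightness and maps scene luminance straight to it, rather than mapping to an intermediate standard and letting the monitor remap it again. The query function and the simple tone curve below are stand-ins I made up for illustration, not AMD's actual API.

```python
# Illustrative sketch only: tone mapping scene luminance directly to the
# attached display's peak brightness, the way FreeSync 2 is meant to let a
# game engine do once the driver exposes the panel's capabilities. The
# query function and the Reinhard-style curve are stand-ins, not a real API.
def query_display_peak_nits():
    # Hypothetical stand-in; a real FS2 title would get this from the driver.
    return 400.0

def tone_map(scene_nits, display_peak_nits):
    # Simple Reinhard-style compression scaled to the panel's actual peak,
    # done once in the engine so the monitor has no remapping left to do.
    x = scene_nits / display_peak_nits
    return display_peak_nits * x / (1.0 + x)

peak = query_display_peak_nits()
for nits in (100.0, 400.0, 1000.0, 4000.0):
    print(f"scene {nits:6.0f} nits -> panel {tone_map(nits, peak):5.1f} nits")
```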

This combination of latency, LFC, and color space additions makes FreeSync 2 an incredibly interesting standard. Pushing specific standards and requirements on hardware vendors is not something AMD has had the gall to do in the past, and honestly the company has publicly been very against it. But to guarantee the experience for Radeon gamers, AMD and the Radeon Technologies Group appear to be willing to make some changes.

slides-18.jpg

NVIDIA has yet to make any noise about HDR or color space requirements for future monitors, and while the FreeSync 2 standards shown here don’t quite guarantee HDR10/Dolby Vision quality displays, they do force vendors to pay more attention to what they are building and create higher quality products for the gaming market.

All GPUs that support FreeSync will support FreeSync 2 and both programs will co-exist. FS2 is currently going to be built on DisplayPort and could find its way into another standard extension (as Adaptive Sync was). Displays are set to be available in the first half of this year.

Coverage of CES 2017 is brought to you by NVIDIA!

PC Perspective's CES 2017 coverage is sponsored by NVIDIA.

Follow all of our coverage of the show at http://pcper.com/ces!

Source: AMD

AMD's Drumming Up Excitement for Vega

Subject: Graphics Cards | January 2, 2017 - 02:56 PM |
Tagged: Vega, amd

Just ahead of CES, AMD has published a teaser page with, currently, a single YouTube video and a countdown widget. In the video, a young man is walking down the street while tapping on a drum and passing by Red Team propaganda posters. It also contains subtle references to Vega on walls and things, in case the explicit references, including the site’s URL, weren’t explicit enough.

amd-2017-ces-vega.jpg

How subtle, AMD.

Speaking of references to Vega, the countdown widget claims to lead up to the architecture preview. We were expecting AMD to launch their high-end GPU line at CES, and this is the first (official) day of the show. Until it happens, I don’t really know whether it will be a more technical look, or if they will be focusing on the use cases.

The countdown ends at 9am (EST) on January 5th.

Source: AMD

NVIDIA Releases GeForce 376.48 Hotfix Drivers

Subject: Graphics Cards | December 22, 2016 - 07:01 AM |
Tagged: nvidia, graphics drivers

The latest hotfix drivers from NVIDIA, 376.48, address five issues, some of which were long-standing complaints. The headlining fix is apparently a workaround for an issue in Folding@Home, until the application patches the root issue on its end. Prior to this, users needed to stick with 373.06 in order to successfully complete a Folding@Home run, avoiding all drivers since mid-October.

nvidia-2015-bandaid.png

Other fixes include rendering artifacts in Just Cause 3, flickering and crashes in Battlefield 1, and rendering issues in Wargame: Red Dragon. These drivers, like all hotfix drivers, will not be pushed by GeForce Experience. You will need to download them from NVIDIA’s support page.

Source: NVIDIA

AMD Releases Radeon Software Crimson ReLive 16.12.2

Subject: Graphics Cards | December 20, 2016 - 02:41 PM |
Tagged: graphics drivers, amd

Last week, AMD came down to the PC Perspective offices to show off their new graphics drivers, which introduced optional game capture software (and it doesn’t require a login to operate). This week, they are publishing a new version of it, Radeon Software Crimson ReLive Edition 16.12.2, which fixes a huge list of issues.

amd-2016-crimson-relive-logo.png

While most of these problems were minor, the headlining fix addresses what could have been an annoying issue for FreeSync users (until this update fixed it, of course). It turns out that, when using FreeSync with a borderless fullscreen application while another monitor had an active window, such as a video on YouTube, the user would experience performance issues in the FreeSync application (unless all of those other windows were minimized). This sounds like a lot of steps, but you can imagine how many people have a YouTube or Twitch stream running while playing a semi-casual game. Those types of games also lend themselves well to being run in borderless window mode, so you can easily alt-tab to the other monitors, exacerbating the issue. Regardless, it’s fixed now.

Other fixed issues involve mouse pointer corruption with an RX 480 and multi-GPU issues in Battlefield 1. You can download them at AMD's website.

Source: AMD