Subject: Graphics Cards | October 22, 2016 - 06:49 PM | Scott Michaud
Tagged: nvidia, graphics drivers
Before this driver was released, NVIDIA employees were claiming that it was difficult to get their drivers through Microsoft's WHQL certification. It is a busy time of year, with the holiday gaming and hardware rush in full swing, so there was likely a backlog before Microsoft could return the signed graphics driver. It also seems like the GeForce 375.57 drivers could have used a little more time in NVIDIA's QA department.
At the GeForce Forums, users are complaining about a variety of issues. Ironically, many of them claim that Battlefield 1, one of the games this driver targets, is crashing and otherwise being buggy. I haven't installed the game yet, so I cannot contribute my own experience one way or the other. I have seen some issues myself, though. For instance, I can confirm that tiles in the Windows 10 Start Menu lock up the entire panel if you attempt to move them. NVIDIA acknowledges a handful of issues with Windows 10 on their forums, and they plan a hotfix driver soon (which I'm guessing cannot be applied on PCs running Anniversary Edition clean installs with secure boot enabled, because of Microsoft's kernel-mode driver changes; thankfully, I'm guessing that applies to very few people).
One issue that seems localized to me, though, is StarCraft II. Since I installed the driver (and, granted, I installed several things that night, like the CUDA SDK) it fails to launch about three-quarters of the time. It could be unrelated, but it should give you an idea of how broad the issues seem to be. Other users are complaining about GIFV corruption, for instance.
Best to roll back and wait for the next WHQL driver (unless hotfix users give glowing praise).
Subject: Graphics Cards | October 21, 2016 - 05:20 PM | Jeremy Hellstrom
Tagged: nvidia, geforce 375.57
NVIDIA was beaten to the punch by AMD this particular cycle; today marks the release of the GeForce 375.57 driver, with new profiles for BF1, Civ VI, and Titanfall 2, as well as VR support updates for two pre-release VR titles. If you haven't signed up for GeForce Experience, you can still grab the drivers here.
Game Ready Drivers provide the best possible gaming experience for all major new releases, including Virtual Reality games. Prior to a new title launching, our driver team is working up until the last minute to ensure every performance tweak and bug fix is included for the best gameplay on day-1.
Provides the optimal experience for Battlefield 1, Civilization VI, and Titanfall 2
Game Ready VR
Provides the optimal VR experience for Eagle Flight and Serious Sam VR: The Last Hope
Subject: Graphics Cards | October 20, 2016 - 02:14 PM | Jeremy Hellstrom
Tagged: amd, driver, Crimson Edition 16.10.2
AMD is expecting their new driver to arrive any moment now, in time for several game launches as well as updates for some existing early access titles. You can keep an eye out for the update on their driver page, or wait for your installed driver to prompt you to upgrade. Here is a quick list of the new features and bug fixes to expect.
Radeon Software Crimson Edition 16.10.2 Highlights

Support for:
- Battlefield 1™
- Sid Meier’s Civilization VI
- Titanfall 2
- Serious Sam VR Early Access
- Eagle Flight VR

New AMD CrossFire profile added for DirectX® 11:
- Sid Meier’s Civilization® VI

Fixed issues:
- Fan speed may sometimes remain elevated on select Radeon RX 400 series graphics products even when an application has been exited.
- Eyefinity group settings may not be retained after driver upgrade when using AMD CrossFire configurations.
- Gears of War 4 may experience an application hang when using select high resolution and quality configurations in some specific game maps.
- DirectX® 12 content may be unable to launch on some older CPUs that do not support the popcnt instruction.
- Battlefield 1™ AMD CrossFire profile updates for game launch.
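The popcnt note is worth a quick aside: POPCNT is an x86 instruction (introduced around Intel's Nehalem and AMD's Barcelona generations) that counts the set bits in a word in a single operation, and code compiled to use it unconditionally will fault on CPUs that lack it. A minimal Python sketch of the operation itself, purely for illustration:

```python
def popcount(x: int) -> int:
    """Count the number of set bits in x, the same operation the
    hardware POPCNT instruction performs in a single step."""
    count = 0
    while x:
        x &= x - 1  # clear the lowest set bit (Kernighan's trick)
        count += 1
    return count

# On CPUs without POPCNT, software that emits the instruction
# unguarded will fail, which is what the fixed issue describes.
print(popcount(0b101101))
```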
Subject: Graphics Cards | October 19, 2016 - 08:08 PM | Scott Michaud
Tagged: amd, nvidia, gtx 1060, rx 480, dx12, dx11, battlefield 1
Battlefield 1 is just a few days from launching. In fact, owners of the Deluxe Edition had the game unlock yesterday. It's interesting that multiple publishers are using the release date itself as a special edition bonus these days, including Microsoft's recent Windows Store releases. I'm not going to say interesting in a good or bad way, though; I'll leave that up to the reader to decide.
Anywho, DigitalFoundry is doing their benchmarking thing, and they wanted to see what GPU could provide a solid 60FPS when everything is maxed out (at 1080p). They start off with a DX12-to-DX12 comparison between the GTX 1060 and the RX 480. This is a relatively fair comparison, because the 3GB GTX 1060 and the 4GB RX 480 both come in at about $200, while upgrading to 6GB for the 1060 or 8GB for the 480 bumps each respective SKU up to the ~$250 price point. In this test, NVIDIA has a few dips slightly below 60 FPS in complex scenes, while AMD stays above that beloved threshold.
They also compare the two cards in DX11 and DX12 modes, with both using a Skylake-based Core i5 CPU. In this test, AMD's card saw a nice increase in frame rate when switching to DirectX 12, while NVIDIA had a performance regression in the new API. This raises two questions, one potentially pro-NVIDIA and the other pro-AMD. First, would the original test, if NVIDIA's card were allowed to use DirectX 11, show the GTX 1060 as more competitive against the DX12-running RX 480? This brings me to the second question: what would the user see? A major draw of Mantle-based graphics APIs is that the application has more control over traditionally driver-level tasks. Would 60 FPS in DX12 be smoother than 60 FPS in DX11?
I don't know. It's something we'll need to test.
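As a rough illustration of why that question needs frame-time analysis rather than an FPS counter, here is a toy comparison (the numbers are made up for illustration): two captures with essentially the same average FPS can have very different 99th-percentile frame times, which is where stutter lives.

```python
import statistics

def frame_stats(frame_times_ms):
    """Summarize a capture: average FPS plus 99th-percentile frame
    time, which exposes stutter that the FPS average hides."""
    avg_fps = 1000.0 / statistics.mean(frame_times_ms)
    p99 = sorted(frame_times_ms)[int(len(frame_times_ms) * 0.99) - 1]
    return avg_fps, p99

# Two hypothetical 100-frame captures, both averaging ~60 FPS:
steady  = [16.7] * 100                # evenly paced frames
stutter = [12.0] * 90 + [59.0] * 10   # same average, big spikes

for name, run in (("steady", steady), ("stutter", stutter)):
    fps, p99 = frame_stats(run)
    print(f"{name}: {fps:.0f} FPS avg, {p99:.1f} ms 99th-percentile frame time")
```

Both runs report roughly 60 FPS, but the second would feel noticeably worse, which is exactly the distinction frame-time testing captures.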
Subject: Graphics Cards | October 14, 2016 - 01:27 PM | Jeremy Hellstrom
Tagged: msi, gtx 1070, GTX 1070 GAMING X 8G, factory overclocked
The most noticeable feature of this GTX 1070 from MSI is that it has an additional 6-pin power connector intended to ensure smooth power delivery. The most confusing part is the branding: a GAMING X is better than a GAMING Z, which is better than a GAMING, which is better than a non-GAMING 1070. The factory overclock pushes the boost clock to 1771MHz, and [H]ard|OCP also tested the best overclock they could manage: a base clock of 1692MHz and a boost clock of 1882MHz. Check out the effect that had on gameplay in their full review.
"We have MSI’s new GeForce GTX 1070 GAMING X 8G video card to evaluate today. We will push this GPU as high as we can, and see how the overclock compares to the default factory overclock, and a Founders Edition NVIDIA GeForce GTX 1070. This video card is a fully custom retail video card with the Twin Frozr VI cooling system. "
Here are some more Graphics Card articles from around the web:
Subject: Graphics Cards | October 8, 2016 - 07:01 AM | Scott Michaud
Tagged: nvidia, graphics drivers, geforce
On Thursday, NVIDIA released their latest graphics drivers to align with Gears of War 4, Mafia 3, and Shadow Warrior 2. The drivers were published before each of these games launched, which allows gamers to optimize their PCs ahead of time. Graphics vendors work with many big-budget studios during their development cycles, and any tweaks that they found over the months and years will be targeted to this release, as usual.
Beyond tweaking for these games, NVIDIA has also announced a couple of fixes. If you were experiencing issues in Overwatch, then these new drivers fix how decals are drawn. The major fix claims to reduce inconsistent performance in multiple VR titles, which is very useful for these applications.
You can get these drivers from their website, or just install them from GeForce Experience.
Subject: Graphics Cards | October 6, 2016 - 03:17 PM | Tim Verry
Tagged: windforce, pascal, nvidia, GTX 1080, gigabyte
Gigabyte is launching a new graphics card with a blower-style cooler that it is calling the GTX 1080 TT. The card, which is likely based on the NVIDIA reference PCB, uses a single lateral-blower-style “WindForce Turbo Fan”. The orange-and-black shrouded fan takes design cues from the company’s higher-end Xtreme Gaming cards, and it has a very Mass Effect / Halo Forerunners vibe to it.
The GV-N1080TTOC-8GD is powered by a single 8-pin PCI-E power connector and has a 180W TDP. Despite not using more than one external power connector, the card still has a bit of overclocking headroom (a total of 225W from the PCI-E spec, though overdrawing the 8-pin has been done before when the BIOS doesn't lock the card out of doing so, heh). External video outputs include one DVI, one HDMI, and three DisplayPorts. I wish the DVI port had been cut so that the blower cooler could have a much larger vent to exhaust air out of the case, but it is what it is.
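For reference, that 225W figure is just the in-spec sum of the slot's power budget and the 8-pin connector's; a quick sanity check against the card's rated TDP:

```python
# PCI-E power budget for a card with one 8-pin connector (spec limits).
SLOT_W = 75        # PCI Express x16 slot
EIGHT_PIN_W = 150  # one 8-pin PEG connector

budget = SLOT_W + EIGHT_PIN_W          # total in-spec draw
headroom = budget - 180                # vs. the GV-N1080TTOC-8GD's 180W TDP
print(f"{budget} W budget, {headroom} W of in-spec headroom")
```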
Out of the box the Gigabyte GTX 1080 TT runs the Pascal-based 2560 CUDA core GPU at 1632 MHz base and 1772 MHz boost. In OC Mode the GPU runs at 1657 MHz base and 1797 MHz boost. The 8 GB of GDDR5X memory is left untouched at the stock 10 GHz in either case. For comparison, reference clock speeds are 1607 MHz base and 1733 MHz boost. As far as factory overclocks go, these are not bad (they are usually at least this conservative).
The heatsink uses three direct-contact 6mm copper heat pipes for the GPU, plus aluminum plates on the VRM and memory chips that transfer heat to aluminum fin channels; the blower fan at the back of the card pushes case air through those channels and out of the case. It may be possible to push the card beyond the OC Mode clocks, though it is not clear how stable boost clocks will be under load (or how loud the fan will be). We will have to wait for reviews on that. If you have a cramped case, this may be a decent GTX 1080 option that is cheaper than the Founders Edition design.
There is no word on pricing or an exact release date yet, but I would estimate it at around $640 at launch.
Subject: Graphics Cards | October 5, 2016 - 09:01 PM | Scott Michaud
Tagged: amd, frame pacing, DirectX 12
When I first read this post, it was on the same day that AMD released their Radeon Software Crimson Edition 16.10.1 drivers, although it was apparently posted the day prior. As a result, I thought their reference to 16.9.1 was a typo, but apparently it wasn't. These changes have been in the driver for at least a month, internally, but it's unclear how much of it was enabled before today. (The Scott Wasson video suggests 16.10.1.) It would have been nice to see this in the release notes as a new feature, but at least they made up for it with a blog post and a video.
If you don't recognize him, Scott Wasson used to run The Tech Report, and he shared notes with Ryan while we were developing our Frame Rating testing methodology. He was focused on benchmarking GPUs by frame time, rather than frame rate, because the number of frames the user sees matters less than how smoothly they are presented. Our sites diverged on implementation, though: The Tech Report focused on software, while Ryan determined that capturing and analyzing output frames, intercepted between the GPU and the monitor, would tell a more complete story. Regardless, Scott Wasson left his site to work for AMD last year, with the intent to lead User Experience.
We're now seeing AMD announce frame pacing for DirectX 12 Multi-GPU.
This feature is particularly interesting because, depending on the multi-adapter mode, a lot of that control should be in the hands of the game developers. It seems like the three titles they announced, 3DMark: Time Spy, Rise of the Tomb Raider, and Total War: Warhammer, would be using implicit linked multi-adapter, which basically maps to CrossFire. I'd be interested to see whether they can affect this in explicit mode via driver updates as well, but we'll need to wait and see (and there aren't many explicit-mode titles anyway; basically just Ashes of the Singularity for now).
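For readers unfamiliar with the problem frame pacing solves here, a toy model helps (this is my sketch, not AMD's actual algorithm): in alternate-frame rendering, two GPUs can finish frames in bursts and present them back-to-back, producing micro-stutter; a pacing layer delays presents so they land at an even interval.

```python
def pace_presents(raw_present_ms, interval_ms):
    """Toy model of AFR frame pacing: hold each frame until at least
    `interval_ms` after the previously displayed frame, so frames that
    finish in bursts are not presented back-to-back."""
    paced = []
    last = None
    for t in raw_present_ms:
        t = t if last is None else max(t, last + interval_ms)
        paced.append(t)
        last = t
    return paced

# Two GPUs alternating frames, finishing in pairs only 2 ms apart:
raw = [0.0, 2.0, 33.3, 35.3, 66.7, 68.7]
print(pace_presents(raw, 16.7))  # presents now land ~16.7 ms apart
```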
If you're interested to see how multi-GPU load-balancing works, we published an animation a little over a month ago that explains three different algorithms, and how explicit APIs differ from OpenGL and DirectX 11. It is also embedded above.
Subject: Graphics Cards | October 5, 2016 - 08:37 PM | Scott Michaud
Tagged: graphics drivers, amd
Earlier today, AMD released their Radeon Software Crimson Edition 16.10.1 drivers. These continue AMD's trend of releasing drivers alongside major titles, which, this time, are Mafia III (October 7th) and Gears of War 4 (October 11th). Both of these titles are multiple days out, apart from a handful of insiders with advance copies, which is nice for gamers: they can optimize their machines ahead of time, on their own schedule, before launch.
The driver also includes several interesting fixes. First, a handful of games, such as Overwatch, Battlefield 1, and Paragon, should no longer flicker in CrossFire mode. Also, performance issues in The Crew should be fixed with this release.
You can download AMD Radeon Software Crimson Edition 16.10.1 from their website.
Subject: Graphics Cards | October 2, 2016 - 12:12 PM | Sebastian Peak
Tagged: rumor, report, pascal, nvidia, GTX 1050 Ti, graphics card, gpu, GP107, geforce
A report published by VideoCardz.com (via Baidu) contains pictures of an alleged NVIDIA GeForce GTX 1050 Ti graphics card, which is apparently based on a new Pascal GP107 GPU.
Image credit: VideoCardz
The card shown is also equipped with 4GB of GDDR5 memory, and contains a 6-pin power connector - though such a power requirement might be specific to this particular version of the upcoming GPU.
Image credit: VideoCardz
Specifications for the GTX 1050 Ti were previously reported by VideoCardz, with a reported GPU-Z screenshot. The card will apparently feature 768 CUDA cores and a 128-bit memory bus, with clock speeds (for this particular sample) of 1291 MHz base, 1392 MHz boost (with some room to overclock, from this screenshot).
Image credit: VideoCardz
An official announcement for the new GPU has not been made by NVIDIA, though if these PCB photos are real it probably won't be far off.
Subject: Graphics Cards | September 27, 2016 - 10:04 PM | Tim Verry
Tagged: water cooling, pascal, hybrid cooler, gtx 1070, GP104, evga
EVGA is preparing to launch the GTX 1070 FTW Hybrid which is a water cooled card that pairs NVIDIA's GTX 1070 GPU with EVGA's Hybrid cooler and custom FTW PCB. The factory overclocked graphics card is currently up for pre-order for $500 on EVGA's website.
The GTX 1070 FTW Hybrid uses EVGA's custom PCB that features two 8-pin power connectors that drive a 10+2 power phase and dual BIOS chips. The Hybrid cooler includes a shrouded 100mm axial fan and a water block that directly touches both the GPU and the memory chips. The water block connects to an external 120mm radiator and a single fan that can be swapped out and/or powered by a motherboard using a standard four pin connector. Additionally, the cooler has a metal back plate and RGB LED back-lit EVGA logos on the side and windows on the front. Display outputs include one DVI, one HDMI, and three DisplayPort connectors.
As far as specifications go, EVGA did not get too crazy with the factory overclock, but users should be able to push it quite far on their own, assuming they get a decent chip from the silicon lottery. The GP104 GPU has 1920 CUDA cores clocked at 1607 MHz base and 1797 MHz boost, while the 8 GB of memory is left at the stock 8,000 MHz. For comparison, reference clock speeds are 1506 MHz base and 1683 MHz boost.
Interestingly, EVGA rates the GTX 1070 FTW Hybrid at 215 watts versus the reference card's 150 watts. It is also the same TDP rating as the GTX 1080 FTW Hybrid card.
The table below outlines the specifications of EVGA's water cooled card compared to the GTX 1070 reference GPU and the GTX 1080 FTW Hybrid.
| | GTX 1070 | GTX 1070 FTW Hybrid | GTX 1080 FTW Hybrid |
|---|---|---|---|
| Rated Clock | 1506 MHz | 1607 MHz | 1721 MHz |
| Boost Clock | 1683 MHz | 1797 MHz | 1860 MHz |
| Memory Clock | 8000 MHz | 8000 MHz | 10000 MHz |
| TDP | 150 watts | 215 watts | 215 watts |
| MSRP (current) | $379 ($449 FE) | $500 | $730 |
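From those numbers, the factory overclock on the 1070 FTW Hybrid works out to a bit under 7% on both clocks; a quick calculation using the reference clocks listed above:

```python
REF_BASE, REF_BOOST = 1506, 1683  # GTX 1070 reference clocks (MHz)

cards = {
    "GTX 1070 FTW Hybrid": (1607, 1797),
}

# Percent increase of each factory clock over the reference card.
for name, (base, boost) in cards.items():
    print(f"{name}: base +{(base / REF_BASE - 1) * 100:.1f}%, "
          f"boost +{(boost / REF_BOOST - 1) * 100:.1f}%")
```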
According to EVGA, the Hybrid cooler brings GPU and memory temperatures down to 45°C and 57°C respectively, compared to reference temperatures of 80°C and 85°C. Keeping in mind that these are EVGA's own numbers (you can see our Founders Edition temperature results here), the Hybrid cooler seems well suited to keeping Pascal GPUs in check even when overclocked. In reviews of the GTX 1080 FTW Hybrid, reviewers found that the cooler allowed stable 2GHz+ GPU clock speeds, letting the card hit its maximum boost clock and stay there under load. Hopefully the GTX 1070 version will have similar results. I am interested to see whether the memory chips they are using will be capable of hitting at least the 10 GHz of the 1080 cards, if not more, since they are being cooled by the water loop.
You can find more information on the factory overclocked water cooled graphics card on EVGA's website. The card is available for pre-order at $500 with a 3 year warranty.
Pricing does seem a bit high at first glance, but looking around at other custom GTX 1070 cards, it is only at about a $50 premium which is not too bad in my opinion. I will wait to see actual reviews before I believe it, but if I had to guess the upcoming card should have a lot of headroom for overclocking and I'm interested to see how far people are able to push it!
Subject: Graphics Cards | September 27, 2016 - 01:57 PM | Jeremy Hellstrom
Tagged: VR, trickster vr, amd, nvidia, htc vive
[H]ard|OCP continues their look into the performance of VR games on NVIDIA's Titan X, GTX 1080, 1070, 1060, and 970, as well as AMD's Fury X and RX 480. This particular title allowed AMD to shine: they saw the RX 480 come within a hair of matching the GTX 1060, a first in this series, showing that AMD can be a contender in the VR market. Pop by to see their review in full.
"Arm yourself with a bow and arrows, a magic sword that flies, or if you prefer, a handful of throwing darts. Then get ready to take on the procedurally generated fantasy world full of cartoonish Orcs, and more Orcs, and some other Orcs. Headshots count as well as chaining your shots so aim is critical. Did I mention the Orcs?"
Here are some more Graphics Card articles from around the web:
Subject: Graphics Cards | September 21, 2016 - 05:54 PM | Scott Michaud
Tagged: nvidia, graphics drivers
With Forza Horizon 3 coming out for Ultimate Edition SKU users in a little over a day, NVIDIA has released their new Game Ready drivers. GeForce 372.90 drivers roll in all of NVIDIA's fixes for the game that have been discovered during its development.
Thankfully, unlike the slippage I've witnessed from them recently in this regard, the release notes for 372.90 are quite verbose (PDF). For instance, and this probably affects a few of our readers, NVIDIA has finally fixed the issue with the HTC Vive over DisplayPort. Their description sounds like it wasn't failing to connect, as users believed, but rather just failing to light up the display. Of course, from a user's standpoint, a black screen is a black screen, but it's interesting to see an honest admission of what exactly a given error was.
So, TL;DR: HTC Vive users should be able to use it over DisplayPort with Pascal again.
Also, they announced that the driver contains security updates. They don't elaborate on what specifically was fixed, which makes sense given that it will take a while for all users to update, but it sounds like NVIDIA was in bug-fixing mode with this driver, which I appreciate.
You can get GeForce 372.90 from GeForce Experience and their website.
Subject: Graphics Cards | September 21, 2016 - 05:39 PM | Scott Michaud
Tagged: amd, radeon, graphics drivers, crimson
Continuing with AMD's attempts, especially since the start of the Crimson Edition line, to release a driver alongside big game releases, the graphics vendor has published Radeon Software Crimson Edition 16.9.2. This one aligns with the Ultimate Edition SKU of Forza Horizon 3 from Microsoft Studios, which unlocks in a little over a day. Standard and Deluxe Edition users will need to wait until Tuesday, the 27th. As always, it rolls in all of the tweaks and fixes that AMD has found prior to the game's general release.
Also, AMD has fixed several issues, according to their pleasantly verbose release notes. Crimson Edition 16.9.2 should resolve crashes that occur in Multi-GPU mode with Ashes of the Singularity in DirectX 12. It should also fix things like mouse pointer corruption on RX 400 series graphics.
You can pick it up from AMD's website, for Windows 7, 8.1, and 10, both 32- and 64-bit versions.
Subject: Graphics Cards | September 20, 2016 - 03:58 PM | Scott Michaud
Tagged: microsoft, xbox, xbox one, pc gaming, nvidia, GTX 1080, gtx 1070
NVIDIA has just announced that specially marked, 10-series GPUs will be eligible for a Gears of War 4 download code. This bundle applies to GeForce GTX 1080 and GeForce GTX 1070 desktop GPUs, as well as laptops which integrate either of those two GPUs. As always, if you plan on purchasing a GPU due to this bundle, make sure that the product page for your retailer mentions the bundle.
Also, through the Xbox Play Anywhere initiative, NVIDIA claims that this code can be used to play the game on Xbox One as well. Xbox Play Anywhere allows users to purchase a game on either of Microsoft's software stores, Xbox Store or Windows Store, and it will automatically count as a purchase for the cross-platform equivalent. It also has implications for cloud saves, but that's a story for another day.
The bundle begins today, September 20th. Gears of War 4 launches on October 11th.
Subject: Graphics Cards | September 20, 2016 - 03:35 PM | Jeremy Hellstrom
Tagged: gigabyte, GTX 1080, GTX 1080 Xtreme Gaming Premium, factory overclocked, GIGABYTE Xtreme Engine, vr link
Gigabyte's GeForce GTX 1080 Xtreme Gaming comes with a nice overclock right out of the box: 1759MHz base, 1898MHz boost, and a small bump to the VRAM frequency, to 10.2GHz. At the push of a button you can add an extra 25MHz to the GPU's clocks, assuming you install the bundled GIGABYTE Xtreme Engine utility, which also allows you to manually tweak your settings. The Premium Pack part of the official name indicates that Gigabyte's Xtreme VR Link header panel is included with the card; you can install it in the front of your case to provide easy access to two HDMI connectors and two USB 3.0 ports for a VR headset.
Pop on over to [H]ard|OCP to see how much more they could get out of the card as well as the effect it had on gameplay.
"GIGABYTE’s GeForce GTX 1080 Xtreme Gaming Premium Pack is one premium package of goodness. Not only have we got one of the fastest GeForce GTX 1080 video cards, but GIGABYTE has thrown in the kitchen sink in this Premium Package with enthusiast oriented gaming as the focus."
Here are some more Graphics Card articles from around the web:
Subject: General Tech, Graphics Cards | September 15, 2016 - 02:51 PM | Jeremy Hellstrom
Tagged: RGB, msi, GTX 1080, EKWB, factory overclocked
MSI has just turned 30, and to help you join in the festivities they've released a custom GTX 1080 for purchase. It uses an EK Predator liquid cooling unit: the card is fully covered by a waterblock, and a radiator and fan come pre-attached. The card even ships in a wooden box as a keepsake.
The card is still two slots high, and the GPU is overclocked somewhat, with a boost clock of 1860 MHz. In addition to the 30th Anniversary and MSI logos on the card, there are of course RGB lights, offering 16.8 million colours controlled by the MSI Gaming App.
Subject: General Tech, Graphics Cards | September 14, 2016 - 09:55 PM | Tim Verry
Tagged: rtg, radeon technologies group, Polaris, crimson, amd radeon, amd
It has now been a year since the formation of AMD’s Radeon Technologies Group, and the graphics division has proven itself rather successful. Looking back, it has enjoyed several wins: new products, advancements in driver support that helped reclaim market share from NVIDIA, and new initiatives advancing VR, HDR, and open source visual effects.
Specifically, the Radeon Technologies Group, led by Raja Koduri, has managed to launch its new "Polaris" graphics architecture based on a 14nm FinFET process with the RX 400 series for consumers and the Radeon Pro Duo, Radeon Pro WX series, and Radeon Pro SSG (Solid State Graphics) for professionals. The company also hit a milestone in FreeSync monitor design wins, with a total of 101 displays launched to date.
Along with actual hardware, the graphics division has shaken up branding by rolling out new driver software under the Radeon Crimson Edition brand (with 21 driver releases since launch) and dropping FirePro in favor of carrying the Radeon name over to a new Radeon Pro brand for its professional series of graphics cards. Driver support has also been enhanced on Linux, and the AMDGPU-Pro driver works with the RX 400 series.
Further, the Radeon Technologies Group launched its GPUOpen initiative back in December to foster the creation and advancement of free and open source visual effects and productivity code that developers are free to download, modify, and share.
Speaking of market share, AMD has managed to claw back discrete GPU share, from a lowly 18% in Q2 2015 to nearly 30% last quarter (Q2'16). That is a very respectable jump in just a year's time, especially against NVIDIA's successful Pascal launches, helped both by the price/performance of the RX 400 series and a much-needed focus on improving driver quality and release timeliness.
Where does this leave AMD and its RTG? Honestly, the graphics division is in a much better place than it was last year, and it is in a decent position to survive and make a difference. There are still many unknowns, and just as AMD's processor division is dependent on a successful Zen release, the graphics division will need Vega to be a hit in order for AMD to get wins on the high end and compete with NVIDIA on the flagship and performance fronts. They will further need Vega to update their professional series of cards, many of which still use the company's Fiji architecture, which is not as power efficient as Pascal or the future Volta (the competition).
With that said, the team has had solid wins since its formation and is gearing up for the future. According to the announcement, the Radeon Technologies Group will focus on pushing virtual reality (VR) and HDR (high dynamic range) in gaming by working with developers, improving drivers, adding to their GPUOpen software collection, and launching new products.
From the press release:
"We’re passionate about perfecting pixels and delivering an unrivaled gaming experience for our community, and uncompromising power and creative freedom for developers and content creators. And if you think our first year was exciting, wait until you see what RTG has lined up for the future."
In the near future, Raja Koduri told VentureBeat to expect VR backpacks on show at CES in January and to look out for mobile Polaris graphics cards. Also, Radeon Crimson Edition may incorporate features from recently acquired startup HiAlgo, which developed software to dynamically monitor gameplay and adjust the resolution to maintain maximum frame rates and prevent overheating during long game sessions. One of their techniques, called HiAlgo Switch, lets gamers switch from full to half resolution (and back again) at the press of a hotkey, keeping FPS high when they anticipate entering a demanding area that would normally result in low frame rates. While these techniques are not very important for desktop gaming (especially the CPU/GPU limiter to prevent overheating), they would come in handy for mobile gamers using laptops with discrete cards, or especially APUs.
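As a toy model of the HiAlgo Switch idea (my sketch, not HiAlgo's actual code, and automated rather than hotkey-driven), the logic amounts to a hysteresis toggle between full and half resolution driven by frame time:

```python
def choose_scale(frame_ms, scale, budget_ms=16.7, margin=0.75):
    """Toy HiAlgo-Switch-style resolution toggle: render at half
    resolution while frame times blow the budget, and return to full
    resolution only once there is comfortable headroom again."""
    if frame_ms > budget_ms:
        return 0.5                      # over budget: halve resolution
    if scale < 1.0 and frame_ms < budget_ms * margin:
        return 1.0                      # plenty of headroom: restore
    return scale                        # otherwise keep current scale

scale = 1.0
for ms in [14.0, 22.0, 18.0, 10.0, 12.0]:
    scale = choose_scale(ms, scale)
    print(f"{ms:5.1f} ms -> render at {scale:.0%} resolution")
```

The `margin` term is the important design choice: without it the scale would flap between full and half resolution every time frame times hovered near the budget.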
I am looking forward to seeing where Raja and the RTG team go from here and what they have in store for AMD graphics.
Subject: Graphics Cards | September 9, 2016 - 03:59 AM | Scott Michaud
Tagged: nvidia, graphics drivers, linux
Unfortunately, I don't tend to notice when Linux drivers get released; it's something I want to report on more frequently. Luckily, this time I heard about NVIDIA's 370.28 graphics drivers while they were still fresh. This release opens up overclocking (and underclocking) for GeForce 10-series GPUs, although NVIDIA (of course) mentions that this is “at the user's own risk”. It also fixes a bunch of Vulkan bugs.
Many of these fixes were also in the previous beta-class drivers, 370.23, which, like 370.28, include experimental support for PRIME Synchronization. PRIME handles choosing which GPU drives a given display, which may be different from the GPU rendering the image. I'm not too familiar with the system, and I've heard some jokes from the Linux community over the last couple of years about its almost vaporware-like status, but I don't have any personal experience with it.
Subject: Graphics Cards | September 7, 2016 - 08:02 PM | Scott Michaud
Tagged: dirty pool, nvidia, geforce experience, geforce
Update (September 7th @ 9:34pm): It's been pointed out in our comments that the new GeForce Experience cannot be used without logging in. It supports NVIDIA, Google, and Facebook accounts.
It's been in beta for a while, but NVIDIA has just officially launched their new GeForce Experience application. If you were in the beta, be sure to check for updates to get onto the release version; the version number is shown in the settings panel. Also, there's an “allow experimental features” checkbox right under the version number in that same panel. It defaults to on for me, so you might want to take a look if you use GeForce Experience for anything professional (ex: Twitch streaming).
Anywho, the new version runs a lot better for me than the previous one. I used to have quite long load times with version 2, often literally minutes. With version 3, it often pops up in less than a second, or maybe a couple of seconds at worst.
Obviously, if you don't use GeForce Experience, then you don't really need to update. WHQL drivers can still be downloaded from their website (although installing drivers through GeForce Experience 3.0 has been fairly bug-free for me) and most of its other features can be obtained with other applications, like OBS Studio. That said, it's free and pretty good, so it's worth giving it a try.