The 1080 roundup, Pascal in all its glory

Subject: Graphics Cards | May 17, 2016 - 06:22 PM |
Tagged: nvidia, pascal, video, GTX 1080, gtx, GP104, geforce, founders edition

Yes, that's right: if you felt Ryan and Al somehow missed something in our review of the new GTX 1080, or you felt the obvious pro-Matrox bias was showing, here are the other reviews you can pick and choose from.  Start off with [H]ard|OCP, who also tested Ashes of the Singularity and Doom as well as the old favourite, Battlefield 4.  Doom really showed itself off as a next-generation game, its Nightmare mode scoffing at any GPU with less than 5GB of VRAM available and pushing the single 1080 hard.  Read on to see how the competition stacked up ... or wait for the 1440 to come out some time in the future.

1463427458xepmrLV68z_1_8_l.jpg

"NVIDIA's next generation video card is here, the GeForce GTX 1080 Founders Edition video card based on the new Pascal architecture will be explored. We will compare it against the GeForce GTX 980 Ti and Radeon R9 Fury X in many games to find out what it is capable of."


Source: [H]ard|OCP

Pairing up the R9 380X

Subject: Graphics Cards | May 16, 2016 - 03:52 PM |
Tagged: amd, r9 380x, crossfire

A pair of R9 380X's will cost you around $500, a bit more than $100 less than a single GTX 980 Ti and on par with or a little less expensive than a straight GTX 980.  You have likely seen these cards compared before, but how often have you seen them pitted against a pair of GTX 960's, which cost a little bit less than two 380X cards?  [H]ard|OCP decided it was worth investigating, perhaps for those who currently have a single one of these cards and are considering a second if the price is right.  The results are very tight; overall the two setups performed very similarly, with some games favouring AMD and others NVIDIA. Check out the full review here.

1462161482JkHsFf1A5H_1_1.jpg

"We are evaluating two Radeon R9 380X video cards in CrossFire against two GeForce GTX 960 video cards in a SLI arrangement. We will overclock each setup to its highest, to experience the full gaming benefit each configuration has to offer. Additionally we will compare a Radeon R9 380 CrossFire setup to help determine the best value."


Source: [H]ard|OCP

PCPer Live! GeForce GTX 1080 Live Stream with Tom Petersen (Now with free cards!)

Subject: General Tech, Graphics Cards | May 16, 2016 - 03:19 PM |
Tagged: video, tom petersen, pascal, nvidia, live, GTX 1080, gtx, GP104, geforce

Our review of the GeForce GTX 1080 is LIVE NOW, so be sure you check that out before today's live stream!!

Get yourself ready, it’s time for another GeForce GTX live stream hosted by PC Perspective’s Ryan Shrout and NVIDIA’s Tom Petersen. The general details about consumer Pascal and the GeForce GTX 1080 graphics card are already official, and based on the traffic to our stories and the response on Twitter and YouTube, there is more than a little pent-up excitement.

gtx10801.jpg

Tom Petersen, well known in our community, will be on hand to talk about the new graphics card and answer questions about technologies in the GeForce family, including Pascal, SLI, VR, Simultaneous Multi-Projection, and more. We have done quite a few awesome live streams with Tom in the past; check them out if you haven't already.

pcperlive.png

NVIDIA GeForce GTX 1080 Live Stream

10am PT / 1pm ET - May 17th

PC Perspective Live! Page

Need a reminder? Join our live mailing list!

The event will take place Tuesday, May 17th at 1pm ET / 10am PT at http://www.pcper.com/live. There you’ll be able to catch the live video stream as well as use our chat room to interact with the audience, asking questions for me and Tom to answer live. 

Tom has a history of being both informative and entertaining and these live streaming events are always full of fun and technical information that you can get literally nowhere else. Previous streams have produced news as well – including statements on support for Adaptive Sync, release dates for displays and first-ever demos of triple display G-Sync functionality. You never know what’s going to happen or what will be said!

UPDATE! UPDATE! UPDATE! This just in fellow gamers: Tom is going to be providing two GeForce GTX 1080 graphics cards to give away during the live stream! We won't be able to ship them until availability hits at the end of May, but two lucky viewers of the live stream will be able to get their paws on the fastest graphics card we have ever tested!! Make sure you are scheduled to be here on May 17th at 10am PT / 1pm ET!!

DSC00218.jpg

Don't you want to win me??!?

If you have questions, please leave them in the comments below and we'll look through them just before the start of the live stream. Of course you'll be able to tweet us questions @pcper and we'll be keeping an eye on the IRC chat as well for more inquiries. What do you want to know and hear from Tom or me?

So join us! Set your calendar for this coming Tuesday at 1pm ET / 10am PT and be here at PC Perspective to catch it. If you are a forgetful type of person, sign up for the PC Perspective Live mailing list that we use exclusively to notify users of upcoming live streaming events including these types of specials and our regular live podcast. I promise, no spam will be had!

AMD Releases Radeon Software Crimson Edition 16.5.2 Beta

Subject: Graphics Cards | May 11, 2016 - 11:53 PM |
Tagged: amd, crimson, graphics drivers

For the second time this month, hence the version number, AMD has released a driver to coincide with a major game release. This one is for DOOM, which will be available on Friday. Like the previous driver, which was aligned with Forza, it has not been WHQL-certified. That's okay, though. NVIDIA's Game Ready drivers didn't strive for WHQL certification until just recently, and, even then, WHQL certification doesn't mean what it used to.

amd-2015-crimson-logo.png

But yeah, apart from game-specific optimizations for DOOM, 16.5.2 has a few extra reasons to be used. If you play Battleborn, which launched on May 3rd, then AMD has added a new CrossFire profile for that game. They have also fixed at least eleven issues (plus however many undocumented ones). It comes with ten known issues, but none of them seem particularly troubling; most are CrossFire-related.

You can pick up the driver at AMD's website.

Source: AMD

NVIDIA Limits GTX 1080 SLI to Two Cards

Subject: Graphics Cards | May 11, 2016 - 10:57 PM |
Tagged: sli, nvidia, GTX 1080, GeForce GTX 1080

Update (May 12th, 1:45am): Okay, so the post, which was originally from Chris Bencivenga, Support Manager at EVGA, has been deleted. A screenshot of it is attached below. Note that Jacob Freeman later posted that "More info about SLI support will be coming soon, please stay tuned." I guess this means take the news with a grain of salt until an official word can be released.

evga-2016-gtx1080sli.png

Original Post Below

According to EVGA, NVIDIA will not support three- and four-way SLI on the GeForce GTX 1080. They state that, even if you use the old, multi-way connectors, it will still be limited to two-way. The new SLI connector (called SLI HB) will provide better performance “than 2-way SLI did in the past on previous series”. This suggests that the old SLI connectors can be used with the GTX 1080, although with less performance and only for two cards.

nvidia-2016-dreamhack-newsli.png

This is the only hard information that we have on this change, but I will elaborate a bit based on what I know about graphics APIs. Basically, SLI (and CrossFire) simplify the multi-GPU load-balancing problem to the point that the driver can handle it without the game's involvement. In DirectX 11 and earlier, the game has no way to manage multiple GPUs itself. That does not apply to DirectX 12 and Vulkan, however. In those APIs, you are able to explicitly load-balance by querying all graphics devices (including APUs) and splitting the commands yourself.

Even though a few DirectX 12 games exist, it's still unclear how SLI and CrossFire will be utilized in the context of DirectX 12 and Vulkan. DirectX 12 has a tier of multi-GPU support called “implicit multi-adapter,” which allows the driver to load-balance. How will this decision affect those APIs? Could inter-card bandwidth even be offloaded via SLI HB in DirectX 12 and Vulkan at all? Not sure yet (but you would think that they would at least add a Vulkan extension). You should be able to use three GTX 1080s in titles that manually load-balance across three or more mismatched GPUs, but only in those games.

If a title relies upon SLI, which covers everything DirectX 11 and earlier, then you cannot. You definitely cannot.
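
To make that concrete, here is a minimal sketch of the adapter enumeration that explicit multi-adapter starts from, using standard DXGI/D3D12 calls. How work is then divided between the devices is entirely up to the engine; the structure below is illustrative, not taken from any shipping game.

```cpp
// Sketch: enumerate every graphics adapter (discrete GPUs and APUs alike)
// so an engine can create one D3D12 device per adapter and split command
// lists across them explicitly. Link against dxgi.lib and d3d12.lib.
#include <dxgi1_4.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
        return 1;

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
    {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
            continue; // skip WARP and other software rasterizers

        // Each hardware adapter can back its own D3D12 device; the game
        // decides how frames or screen regions are divided between them.
        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device))))
            wprintf(L"Usable adapter %u: %s (%llu MB dedicated VRAM)\n",
                    i, desc.Description,
                    static_cast<unsigned long long>(desc.DedicatedVideoMemory >> 20));
    }
    return 0;
}
```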

Source: EVGA

Here Come the Maxwell Rebates

Subject: Graphics Cards | May 10, 2016 - 07:50 PM |
Tagged: nvidia, maxwell, GTX 980 Ti, GTX 970, GTX 1080, geforce

The GTX 1080 announcement is starting to ripple into retailers, leading to price cuts on previous-generation, Maxwell-based SKUs. If you were interested in the GTX 1080, or an AMD graphics card of course, then you probably want to keep waiting. That said, you can take advantage of the discounts to get a VR-ready GPU, or to pick up a cheap SLI buddy if you already have a Maxwell card.

evga-2016-980ti-new.jpg

This tip comes from a NeoGAF thread. Microcenter has several cards on sale, but EVGA seems to have the biggest price cuts. This 980 Ti has dropped from $750 USD down to $499.99 (or $474.99 if you'll promise yourself to do that mail-in rebate). That's a whole third of its price slashed, and it puts the card about a hundred dollars under the GTX 1080. Granted, it will also be slower than the GTX 1080, with 2GB less video RAM, but the $100 savings might be worth it for you.

They highlight two other EVGA cards as well. Both deals are slight variations on the GTX 970 line, and they are available for $250 and $255 ($225 and $230 after mail-in rebate).

Source: NeoGAF

Video Perspective: NVIDIA GeForce GTX 1080 Preview

Subject: Graphics Cards | May 10, 2016 - 07:29 PM |
Tagged: video, pascal, nvidia, GTX 1080, gtx 1070, geforce

After the live streamed event announcing the GeForce GTX 1080 and GTX 1070, Allyn and I spent a few minutes this afternoon going over the information as it was provided, discussing our excitement about the product and coming to grips with what in the world a "Founder's Edition" even is.

gtx1080small.jpg

If you haven't yet done so, check out Scott's summary post on the GTX 1080 and GTX 1070 specs right here.

Galax GeForce GTX 1080 Pictured with Custom Cooler

Subject: Graphics Cards | May 10, 2016 - 04:06 PM |
Tagged: video card, reference cooler, pascal, nvidia, GTX 1080, graphics card, GeForce GTX 1080, Founder's Edition

The first non-reference GTX 1080 has been revealed courtesy of Galax, and the images (via VideoCardz.com) show a card that looks a lot different from the Founders Edition.

GALAX-GeForce-GTX-1080-box.jpg

Galax GTX 1080 (Image Credit: VideoCardz)

The Galax is the first custom implementation of the GTX 1080 we've seen, and as such the first example of a $599 variant of the card. The Founders Edition carries a $100 premium (and offers that really nice industrial design), but ultimately it's about performance, and the Galax card will presumably offer completely stock specifications.

GALAX-GeForce-GTX-1080-front.jpg

(Image Credit: VideoCardz)

Expect to see a deluge of aftermarket cooling from EVGA, ASUS, MSI, and others soon enough - most of which will presumably be using a dual or triple-fan cooler, and not a simple blower like this.

Source: VideoCardz

Microsoft updates Windows 10 UWP to support unlocked frame rates and G-Sync/FreeSync

Subject: Graphics Cards | May 10, 2016 - 12:11 PM |
Tagged: windows 10, windows, vrr, variable refresh rate, uwp, microsoft, g-sync, freesync

Back in March, Microsoft's Phil Spencer addressed some of the concerns over the Universal Windows Platform and PC gaming during his keynote address at the Build Conference. He noted that MS would "plan to open up VSync off, FreeSync, and G-Sync in May" and that the company would "allow modding and overlays in UWP applications" sometime further into the future. Well, it appears that Microsoft is on point with the May UWP update.

According to the MS DirectX Developer Blog, a Windows 10 update being pushed out today will enable UWP to support unlocked frame rates and variable refresh rate monitors in both G-Sync and FreeSync varieties. 

windows_8_logo-redux2.png

As a direct response to your feedback, we’re excited to announce the release today of new updates to Windows 10 that make gaming even better for game developers and gamers.

Later today, Windows 10 will be updated with two key new features:

Support for AMD’s FreesyncTM and NVIDIA’s G-SYNC™ in Universal Windows Platform games and apps

Unlocked frame rate for Universal Windows Platform (UWP) games and apps

Once applications take advantage of these new features, you will be able to play your UWP games with unlocked frame rates. We expect Gears of War: UE and Forza Motorsport 6: Apex to lead the way by adding this support in the very near future.

This OS update will be gradually rolled out to all machines, but you can download it directly here.

These updates to UWP join the already great support for unlocked frame rate and AMD and NVIDIA’s technologies in Windows 10 for classic Windows (Win32) apps.

Please keep the feedback coming!

Today's update won't automatically enable these features in UWP games like Gears of War or Quantum Break; each game will still need to be updated individually by its developer. MS states that Gears of War and Forza will be the first to see these changes, but there is no mention of Quantum Break here, which is a game that could DEFINITELY benefit from the love of variable refresh rate monitors. 

Microsoft describes an unlocked frame rate thus:

Vsync refers to the ability of an application to synchronize game rendering frames with the refresh rate of the monitor. When you use a game menu to “Disable vsync”, you instruct applications to render frames out of sync with the monitor refresh. Being able to render out of sync with the monitor refresh allows the game to render as fast as the graphics card is capable (unlocked frame rate), but this also means that “tearing” will occur. Tearing occurs when part of two different frames are on the screen at the same time.

I should note that these changes do not indicate that Microsoft is going to allow UWP games to go into an exclusive full screen mode - it still believes the disadvantages of that configuration outweigh the advantages. MS wants its overlays and a user's ability to easily Alt-Tab around Windows 10 to remain. Even though MS mentions screen tearing, I don't think that non-exclusive full screen applications will exhibit tearing.

gears.jpg

Gears of War on Windows 10 is a game that could definitely use an uncapped render rate and VRR support.

Instead, what is likely occurring, as we saw with the second iteration of the Ashes of the Singularity benchmark, is that the game will have an uncapped render rate internally, but frames rendered OVER 60 FPS (or the refresh rate of the display) will not be shown. This will improve perceived latency, as the game will be able to present the most up-to-date frame (with the most up-to-date input data) when the monitor is ready for a new refresh. 

UPDATE 5/10/16 @ 4:31pm: Microsoft just got back to me and said that my above statement wasn't correct. Screen tearing will be able to occur in UWP games on Windows 10 after they integrate support for today's patch. Interesting!!

For G-Sync and FreeSync users, the ability to draw to the screen at any render rate will offer even further advantages: uncapped frame rates and no tearing, but also no "dropped" frames caused by running at off-ratios of a standard monitor's refresh rate.
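
The UWP side of this is handled by the platform, but for a sense of the underlying mechanics, here is a hedged sketch of how a classic Win32/DXGI application opts into unthrottled presents. The IDXGIFactory5 tearing query is how current DXGI headers expose the capability (my assumption, not something Microsoft's post spells out), and a real application would also need to create its swap chain with the matching DXGI_SWAP_CHAIN_FLAG_ALLOW_TEARING flag.

```cpp
// Sketch: query for tearing (i.e. unlocked frame rate / VRR) support and
// present without waiting for vblank. Swap chain creation and the rest of
// the render loop are omitted for brevity.
#include <dxgi1_5.h>

bool TearingSupported(IDXGIFactory5* factory)
{
    BOOL allowTearing = FALSE;
    // Reports TRUE on systems where unthrottled presents (and therefore
    // variable refresh rate displays) can be used in windowed modes.
    if (FAILED(factory->CheckFeatureSupport(DXGI_FEATURE_PRESENT_ALLOW_TEARING,
                                            &allowTearing, sizeof(allowTearing))))
        return false;
    return allowTearing == TRUE;
}

void PresentFrame(IDXGISwapChain* swapChain, bool tearingSupported)
{
    // Sync interval 0 means "do not wait for vblank". The tearing flag lets
    // the frame go out immediately; on a G-Sync/FreeSync monitor that
    // triggers a fresh refresh instead of a torn image.
    const UINT flags = tearingSupported ? DXGI_PRESENT_ALLOW_TEARING : 0;
    swapChain->Present(0, flags);
}
```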

I'm glad to see Microsoft taking these steps at a brisk pace after the feedback from the PC community early in the year. As for UWP's continued evolution, the blog post does tease that we should "expect to see some exciting developments on multiple GPUs in DirectX 12 in the near future."

Source: MSDN

EKWB Releases AMD Radeon Pro Duo Full-Cover Water Block

Subject: Graphics Cards, Cases and Cooling | May 10, 2016 - 08:55 AM |
Tagged: water cooling, radeon pro duo, radeon, pro duo, liquid cooling, graphics cards, gpu cooler, gpu, EKWB, amd

While AMD's latest dual-GPU powerhouse comes with a rather beefy-looking liquid cooling system out of the box, the team at EK Water Blocks have nonetheless created their own full-cover block for the Pro Duo, which is now available in a pair of versions.

EKFC-Radeon-Pro-Duo_NP_fill_1600.jpg

"Radeon™ has done it again by creating the fastest gaming card in the world. Improving over the Radeon™ R9 295 X2, the Radeon Pro Duo card is faster and uses the 3rd generation GCN architecture featuring asynchronous shaders enables the latest DirectX™ 12 and Vulkan™ titles to deliver amazing 4K and VR gaming experiences. And now EK Water Blocks made sure, the owners can get the best possible liquid cooling solution for the card as well!"

EKFC-Radeon-Pro-Duo_pair.png

Nickel version (top), Acetal+Nickel version (bottom)

The blocks include a single-slot I/O bracket, which will allow the Pro Duo to fit in many more systems (and allow even more of them to be installed per motherboard!).

EKFC-Radeon-Pro-Duo_NP_input_1600-1500x999.jpg

"EK-FC Radeon Pro Duo water block features EK unique central inlet split-flow cooling engine with a micro fin design for best possible cooling performance of both GPU cores. The block design also allows flawless operation with reversed water flow without adversely affecting the cooling performance. Moreover, such design offers great hydraulic performance, allowing this product to be used in liquid cooling systems using weaker water pumps.

The base is made of nickel-plated electrolytic copper while the top is made of quality POM Acetal or acrylic (depending on the variant). Screw-in brass standoffs are pre-installed and allow for safe installation procedure."

Suggested pricing is set at 155.95€ for the blocks (approx. $177 US), and they are "readily available for purchase through EK Webshop and Partner Reseller Network".

Source: EKWB

What Are NVIDIA GeForce GTX 1080 Founders Edition Cards?

Subject: Graphics Cards | May 9, 2016 - 05:00 PM |
Tagged: pascal, nvidia, GTX 1080, geforce

During the GeForce GTX 1080 launch event, NVIDIA announced two prices for the card. The new GPU has an MSRP of $599 USD, while a Founders Edition will be available for $699 USD. They did not really elaborate on the difference at the keynote, but they apparently clarified the product structure for the attending press.

nvidia-2016-dreamhack-1080-specs.png

According to GamersNexus, the “Founders Edition” is NVIDIA's new branding for their reference design, which has been updated with the GeForce GTX 1080. That is it. Normally, a reference design is pretty much bottom-tier in a product stack. Apart from AMD's water-cooling experiments, reference designs are relatively simple, single-fan blower coolers. NVIDIA's reference coolers, though, at least on their top three or so models of any given generation, are pretty good. They are fairly quiet, effective, and aesthetically pleasing. When searching for a specific GPU online, you will often see a half-dozen entries based on this reference design from various AIB partners, and another half-dozen offerings from those same companies that are very different. MSI does their Twin Frozr thing, while ASUS has their DirectCU and Poseidon coolers.

If you want the $599 model, then, counter to what we've been conditioned to expect, you will not be buying NVIDIA's reference cooler. Those cards will come from AIB partners, which means that NVIDIA is (at least somewhat) allowing them to define the baseline product this time around. They expect reference cards to be intrinsically valuable, not just purchased because they land at the top of a “sort by lowest price” search.

This is interesting for a number of reasons. It wasn't too long ago that NVIDIA finally allowed AIB vendors to customize Titan-level graphics cards. Before that, NVIDIA's reference cooler was the only option. When they released control to their partners, we started to see water-cooled Titan Xs. There are two ways to look at it: either NVIDIA is relaxing their policy of controlling the user experience, or they want their own brand to stand for more than the cheapest version of their part. Granted, the GTX 1080 is supposed to be their high-end, but still mainstream, offering.

It's just interesting to see this decision and rationalize it both as a release of control over user experience, and, simultaneously, as an increase of it.

Source: GamersNexus

AMD Releases Radeon Software Crimson Edition 16.5.1 Beta

Subject: Graphics Cards | May 9, 2016 - 02:05 PM |
Tagged: amd, graphics drivers, crimson

This is good to see. AMD has released Radeon Software Crimson Edition 16.5.1 to align with Forza Motorsport 6: Apex. The drivers are classified as Beta, and so is the game, coincidentally, which means 16.5.1 is not WHQL-certified. That doesn't have the weight that it used to, though. Its only listed feature is performance improvements with that title, especially for the R9 Fury X graphics card. Game-specific optimizations near launch appear to be getting consistent, and that was an area that AMD really needed to improve upon, historically.

amd-2015-crimson-logo.png

There are a handful of known issues, but they don't seem particularly concerning. The AMD Gaming Evolved overlay may crash in some titles, and The Witcher 3 may flicker in Crossfire, both of which could be annoying if they affect a game that you have been focusing on, but that's about it. There might be other issues (and improvements) that are not listed in the notes, but that's all I have to work on at the moment.

If you're interested in Forza 6: Apex, check out AMD's download page.

Source: AMD

NVIDIA GeForce GTX 1080 and GTX 1070 Announced

Subject: Graphics Cards | May 6, 2016 - 10:38 PM |
Tagged: pascal, nvidia, GTX 1080, gtx 1070, GP104, geforce

So NVIDIA has announced their next generation of graphics processors, based on the Pascal architecture. They introduced it as “a new king,” because they claim that it is faster than the Titan X, even at lower power. It will be available “around the world” on May 27th for $599 USD (MSRP). The GTX 1070 was also announced, with slightly reduced specifications, and it will be available on June 10th for $379 USD (MSRP).

nvidia-2016-dreamhack-1080-photo.png

Pascal is built on TSMC's 16nm process, which gives NVIDIA a lot of headroom. The GTX 1080 has fewer shaders than the Titan X, but a significantly higher clock rate. It also uses GDDR5X, which is an incremental improvement over GDDR5. We knew it wasn't going to use HBM2, like Big Pascal does, but it's interesting that they did not stick with old, reliable GDDR5.

nvidia-2016-dreamhack-1080-specs.png

nvidia-2016-dreamhack-1070-specs.png

The full specifications of the GTX 1080 are as follows:

  • 2560 CUDA Cores
  • 1607 MHz Base Clock (8.2 TFLOPs)
  • 1733 MHz Boost Clock (8.9 TFLOPs)
  • 8GB GDDR5X Memory at 320 GB/s (256-bit)
  • 180W Listed Power (Update: uses 1x 8-pin power)
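
Those TFLOP and bandwidth figures are not magic numbers; they fall straight out of the rest of the spec sheet. Each CUDA core retires one fused multiply-add (two floating-point ops) per clock, and the quoted bandwidth implies a 10 Gb/s effective pin rate for the GDDR5X, which is my inference rather than an NVIDIA-published figure. A quick sanity check:

```cpp
// Sanity-checking NVIDIA's headline numbers from the listed specs.
#include <cstdio>

int main()
{
    const double cores = 2560.0;                       // CUDA cores
    const double base_ghz = 1.607, boost_ghz = 1.733;  // listed clocks

    // FLOPS = cores x 2 ops (one FMA) x clock
    printf("Base:  %.1f TFLOPS\n", cores * 2 * base_ghz  / 1000.0); // 8.2
    printf("Boost: %.1f TFLOPS\n", cores * 2 * boost_ghz / 1000.0); // 8.9

    // Bandwidth = bus width in bytes x effective data rate per pin
    const double bus_bits = 256.0, gbps_per_pin = 10.0; // assumed GDDR5X rate
    printf("Bandwidth: %.0f GB/s\n", bus_bits / 8.0 * gbps_per_pin); // 320
    return 0;
}
```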

We do not currently have the specifications of the GTX 1070, apart from it being 6.5 TFLOPs.

nvidia-2016-dreamhack-1080-stockphoto.png

It also looks like it has five display outputs: 3x DisplayPort 1.2, which are “ready” for 1.3 and 1.4, 1x HDMI 2.0b, and 1x DL-DVI. They do not explicitly state that all three DisplayPorts will run on the same standard, even though that seems likely. They also do not state whether all five outputs can be used simultaneously, but I hope that they can be.

nvidia-2016-dreamhack-newsli.png

They also have a new SLI bridge, called the SLI HB Bridge, that is supposed to have double the bandwidth of Maxwell's. I'm not sure what that will mean for multi-GPU systems, but it will probably be something we'll find out about soon.

Source: NVIDIA

NVIDIA GeForce "GTX 1080" Benchmark Leaked

Subject: Graphics Cards | May 5, 2016 - 02:38 PM |
Tagged: nvidia, pascal, geforce

We're expecting a major announcement tomorrow... at some point. NVIDIA created a teaser website, called “Order of 10,” that is counting down to 1PM EDT. On the same day, at 9PM EDT, they will have a live stream on their Twitch channel. This wasn't planned as far in advance as their Game24 event, which turned out to be a GTX 970 and GTX 980 launch party, but it wouldn't surprise me if it ended up being a similar format. I don't know for sure whether one or both events will be about the new mainstream Pascal, but it would be surprising if Friday ends (for North America) without a GPU launch of some sort.

nvidia-geforce.png

VideoCardz got a hold of 3DMark Fire Strike Extreme benchmarks, though. The card is registered as an 8GB part with a GPU clock of 1860 MHz. While a synthetic benchmark, let alone a single run of anything, isn't necessarily representative of overall performance, it scores slightly higher than a reasonably overclocked GTX 980 Ti (and way above a stock one). Specifically, this card yields a graphics score of 10102 on Fire Strike Extreme 1.1, while the 980 Ti achieved 7781 for us without an overclock.

We expected a substantial bump in clock rate, especially after GP100 was announced at GTC. That “full” Pascal chip was listed at a 1328 MHz clock, with a 1480 MHz boost, and enterprise GPUs are often underclocked compared to consumer parts, stock to stock. As stated a few times, overclocking could add a huge gap, too. The GTX 980 Ti was able to go from 1190 MHz to 1465 MHz. On the other hand, consumer Pascal's recorded 1860 MHz could itself be an overclock. We won't know until NVIDIA makes an official release. If it's not, maybe we could see these new parts break 2 GHz in general use?

Source: VideoCardz

This is your AMD APU. This is your AMD APU on DX12; any questions?

Subject: Graphics Cards | April 29, 2016 - 07:09 PM |
Tagged: amd, dx12, async shaders

Earlier in the month, [H]ard|OCP investigated the performance scaling that Intel processors display in DX12; now they have finished their tests on AMD processors.  These tests include async compute information, so be warned before venturing forth into the comments.  [H] tested an FX-8370 at 2GHz and 4.3GHz to see what effect clock speed had on the games; the 3GHz tests did not add any value and were dropped in favour of those two frequencies.  There are some rather interesting results and discussion, so drop by for the details.

1460040024Wl2AHtuhmm_5_1.gif

"One thing that has been on our minds about the new DX12 API is its ability to distribute workloads better on the CPU side. Now that we finally have a couple of new DX12 games that have been released to test, we spend a bit of time getting to bottom of what DX12 might be able to do for you. And a couple sentences on Async Compute."


Source: [H]ard|OCP

AMD Radeon Crimson Edition 16.4.2 hits the streets

Subject: Graphics Cards | April 25, 2016 - 04:41 PM |
Tagged: graphics driver, crimson, amd

AMD's new Crimson driver has just been released with new features, including official support for the new Radeon Pro Duo as well as both the Oculus Rift and HTC Vive VR headsets.  It also adds enhanced support for AMD's XConnect technology for external GPUs connected via a Thunderbolt 3 interface.  CrossFire profile updates include Hitman, Elite Dangerous, and Need for Speed, and AMD has also resolved the ongoing issue with the internal update procedure not seeing the newest drivers.  If you are having issues with games crashing to the desktop on launch, you will still need to disable the AMD Gaming Evolved overlay, unfortunately.

Get 'em right here!

radeon-crimson.jpg

"The latest version of Radeon Software Crimson Edition is here with 16.4.2. With this version, AMD delivers many quality improvements, updated/introduced new CrossFire profiles and delivered full support for AMD’s XConnect technology (including plug’n’play simplicity for Thunderbolt 3 eGFX enclosures configured with Radeon R9 Fury, Nano or 300 Series GPUs.)  Best of all, our DirectX 12 leadership continues to be strong, as shown by the performance numbers below."

Capture.JPG

Source: AMD

Report: NVIDIA GP104 Die Pictured; GTX 1080 Does Not Use HBM

Subject: Graphics Cards | April 22, 2016 - 10:16 AM |
Tagged: rumor, report, pascal, nvidia, leak, graphics card, gpu, gddr5x, GDDR5

According to a report from VideoCardz (via Overclock.net/ChipHell), high-quality images of the upcoming GP104 die have leaked; this variant is expected to power the GeForce GTX 1070 graphics card.

NVIDIA-GP104-GPU.jpg

Image credit: VideoCardz.com

"This GP104-200 variant is supposedly planned for GeForce GTX 1070. Although it is a cut-down version of GP104-400, both GPUs will look exactly the same. The only difference being modified GPU configuration. The high quality picture is perfect material for comparison."

A couple of interesting things emerge from this die shot: the relatively small size of the GPU (estimated at 333 mm²), and the assumption that this card will use conventional GDDR5 memory, based on a previously leaked photo of the die on a PCB.

NVIDIA-Pascal-GP104-200-on-PCB.jpg

Alleged photo of GP104 using GDDR5 memory (Image credit: VideoCardz via ChipHell)

"Leaker also says that GTX 1080 will feature GDDR5X memory, while GTX 1070 will stick to GDDR5 standard, both using 256-bit memory bus. Cards based on GP104 GPU are to be equipped with three DisplayPorts, HDMI and DVI."

While this is no doubt disappointing to those anticipating HBM with the upcoming Pascal consumer GPUs, the move isn't all that surprising considering the consistent rumors that GTX 1080 would use GDDR5X.

Is the lack of HBM (or HBM2) enough to make you skip this generation of GeForce GPU? This author points out that AMD's Fury X - the first GPU to use HBM - was still unable to beat a GTX 980 Ti in many tests, even though the 980 Ti uses conventional GDDR5. Memory is obviously important, but the core defines the performance of the GPU.

If NVIDIA has made improvements to performance and efficiency, we should see impressive numbers, but this might be a more iterative update than originally expected - which only gives AMD more of a chance to win market share with their upcoming Radeon 400-series GPUs. It should be an interesting summer.

Source: VideoCardz

Zotac Releases PCI-E x1 Version of NVIDIA GT 710 Graphics Card

Subject: Graphics Cards | April 21, 2016 - 08:37 AM |
Tagged:

Zotac has released a new variant of the low-power NVIDIA GeForce GT 710, and while this wouldn't normally be news, this card has a very important distinction: its PCI-E x1 interface.

zt-71304-20l_image1.jpg

With a single-slot, low-profile-ready design (a pair of brackets is included) and that PCI-E x1 interface, this card can go places where GPUs have never been able to go (AFAIK). Granted, you won't be doing much gaming on a GT 710, which features 192 CUDA cores and 1GB of DDR3 memory, but this card does provide support for up to 3 monitors via its DVI, HDMI, and VGA outputs. 

A PCI-E x1 GPU would certainly provide some interesting options for ultra-compact systems such as those based on thin mini-ITX, which does not offer a full-length PCI Express slot; or for adding additional monitor support for business machines that only offer a single PCI-E x16 slot, but have a x1 slot available.

zt-71304-20l_image5.jpg

Specifications from Zotac:

Zotac ZT-71304-20L

  • GPU: GeForce GT 710
  • CUDA cores: 192
  • Video Memory: 1GB DDR3
  • Memory Bus: 64-bit
  • Engine Clock: 954 MHz
  • Memory Clock: 1600 MHz
  • PCI Express: PCI-E x1
  • Display Outputs: DL-DVI, VGA, HDMI
  • HDCP Support: Yes
  • Multi Display Capability: 3
  • Recommended Power Supply: 300W
  • Power Consumption: 25W
  • Power Input: N/A
  • API Support: DirectX 12 (feature level 11_0), OpenGL 4.5
  • Cooling: Passive
  • Slot Size: Single Slot
  • SLI: N/A
  • Supported OS: Windows 10 / 8 / 7 / Vista / XP
  • Card Dimensions: 146.05mm x 111.15mm
  • Accessories: 2x low-profile I/O brackets, Driver Disk, User Manual

zt-71304-20l_image7.jpg

The card, which is listed with the model ZT-71304-20L, has not yet appeared on any U.S. sites for purchase (that I can find, anyway), so we will have to wait to see where pricing will be.

Source: Zotac

Sony plans PlayStation NEO with massive APU hardware upgrade

Subject: Graphics Cards, Processors | April 19, 2016 - 11:21 AM |
Tagged: sony, ps4, Playstation, neo, giant bomb, APU, amd

Based on a new report from Giant Bomb, Sony is set to release a new console this year with upgraded processing power and a focus on 4K capabilities, code-named NEO. We have been hearing for several weeks that both Microsoft and Sony were planning partial-generation upgrades, but it appears that details of Sony's update have started leaking out, if you believe the reports.

Giant Bomb isn't known for tossing around speculation and tends to only report details it can safely confirm. Austin Walker says "multiple sources have confirmed for us details of the project, which is internally referred to as the NEO." 

ps4gpu.jpg

The current PlayStation 4 APU
Image source: iFixIt.com

There are plenty of interesting details in the story, including Sony's determination to not split the user base with multiple consoles by forcing developers to have a mode for the "base" PS4 and one for NEO. But most interesting to us is the possible hardware upgrade.

The NEO will feature a higher clock speed than the original PS4, an improved GPU, and higher bandwidth on the memory. The documents we've received note that the HDD in the NEO is the same as that in the original PlayStation 4, but it's not clear if that means in terms of capacity or connection speed.

...

Games running in NEO mode will be able to use the hardware upgrades (and an additional 512 MiB in the memory budget) to offer increased and more stable frame rate and higher visual fidelity, at least when those games run at 1080p on HDTVs. The NEO will also support 4K image output, but games themselves are not required to be 4K native.

Giant Bomb even has details on the architectural changes.

                    Shipping PS4                 PS4 "NEO"
CPU                 8 Jaguar Cores @ 1.6 GHz     8 Jaguar Cores @ 2.1 GHz
GPU                 AMD GCN, 18 CUs @ 800 MHz    AMD GCN+, 36 CUs @ 911 MHz
Stream Processors   1152 SPs ~ HD 7870 equiv.    2304 SPs ~ R9 390 equiv.
Memory              8GB GDDR5 @ 176 GB/s         8GB GDDR5 @ 218 GB/s

(We actually did a full video teardown of the PS4 on launch day!)

If the Compute Unit count from the GB report is right, then the PS4 NEO system will have 2,304 stream processors running at 911 MHz, giving it performance nearing that of a consumer Radeon R9 390 graphics card. The R9 390 has 2,560 SPs running at around 1.0 GHz, so while the NEO would be slower, it would be a substantial upgrade over the current PS4 hardware and the Xbox One. Memory bandwidth on NEO is still much lower than that of a desktop add-in card (218 GB/s vs 384 GB/s).
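
Working those figures out (assuming GCN's 64 stream processors per CU, each retiring one FMA, or two floating-point ops, per clock) gives a rough sense of the compute gap:

```cpp
// Rough compute comparison built from the leaked clock and CU figures.
#include <cstdio>

int main()
{
    auto tflops = [](double sps, double ghz) { return sps * 2.0 * ghz / 1000.0; };

    printf("PS4:    %.2f TFLOPS\n", tflops(18 * 64, 0.800)); // ~1.84
    printf("NEO:    %.2f TFLOPS\n", tflops(36 * 64, 0.911)); // ~4.20
    printf("R9 390: %.2f TFLOPS\n", tflops(2560,    1.000)); // ~5.12
    return 0;
}
```

By that yardstick, NEO would land at a bit over twice the original PS4's compute and roughly 80% of an R9 390.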

DSC02539.jpg

Could Sony's NEO platform rival the R9 390?

If the NEO hardware is based on the Grenada / Hawaii GPU design, there are some interesting questions to ask. With the push into 4K that we expect with the upgraded PlayStation, it would be painful if the GPU didn't natively support HDMI 2.0 (4K @ 60 Hz). With the modularity of current semi-custom APU designs, it is likely that AMD could swap out the display controller on NEO for one that supports HDMI 2.0, even though no shipping consumer graphics card in the 300-series does so. 

It is also POSSIBLE that NEO is based on the upcoming AMD Polaris GPU architecture, which supports HDR and HDMI 2.0 natively. That would be a much more impressive feat for both Sony and AMD, as we have yet to see Polaris released in any consumer GPU. Couple that with the variables of 14/16nm FinFET process production and you have a complicated production pipeline that would need significant monitoring. It would potentially lower cost on the build side and lower power consumption for the NEO device, but I would be surprised if Sony wanted to take a chance on the first generation of tech from AMD / Samsung / GlobalFoundries.

However, if you look at recent rumors swirling about the June announcement of the Radeon R9 480 using the Polaris architecture, it is said to have 2,304 stream processors, perfectly matching the NEO specs above.

polaris-5.jpg

New features of the AMD Polaris architecture due this summer

There is a lot Sony and game developers could do with roughly twice the GPU compute capability on a console like NEO. This could make the PlayStation VR a much more comparable platform to the Oculus Rift and HTC Vive, though the necessity of working with the original PS4 platform might hinder the upgrade path. 

The other obvious use is to upgrade the image quality and/or rendering resolution of current games and games in development, or simply to improve the frame rate, an area where many current-generation consoles seem to have been slipping.

In the documents we’ve received, Sony offers suggestions for reaching 4K/UltraHD resolutions for NEO mode game builds, but they're also giving developers a degree of freedom with how to approach this. 4K TV owners should expect the NEO to upscale games to fit the format, but one place Sony is unwilling to bend is on frame rate. Throughout the documents, Sony repeatedly reminds developers that the frame rate of games in NEO Mode must meet or exceed the frame rate of the game on the original PS4 system.

There is still plenty to read in the Giant Bomb report, and I suggest you head over and do so. If you thought the summer was going to be interesting solely because of new GPU releases from AMD and NVIDIA, it appears that Sony and Microsoft have their own agenda as well.

Source: Giant Bomb

Report: NVIDIA GTX 1080 GPU Cooler Pictured

Subject: Graphics Cards | April 19, 2016 - 11:08 AM |
Tagged: rumor, report, nvidia, leak, GTX 1080, graphics card, gpu, geforce

Another reported photo of an upcoming GTX 1080 graphics card has appeared online, this time via a post on Baidu.

GTX1080.jpg

(Image credit: VR-Zone, via Baidu)

The image is typically low-resolution and features the slightly soft focus we've come to expect from alleged leaks. That doesn't mean it's not legitimate, and this isn't the first time we have seen this design. The image also appears to show only the cooler, without an actual graphics card board underneath.

We have reported on the upcoming GPU rumored to be named "GTX 1080" in the recent past, and while no official announcement has been made, it seems safe to assume that a successor to the current 900-series GPUs is forthcoming.

Source: VR-Zone