Subject: Graphics Cards | May 31, 2016 - 08:23 PM | Sebastian Peak
Tagged: video card, rumor, report, Radeon RX 480, radeon, Polaris, graphics card, gpu, amd
The Wall Street Journal is reporting that AMD's upcoming Polaris graphics cards will be priced no higher than $199, a startling move to say the least.
The report arrives via VideoCardz.com:
"According to WSJ article, Polaris GPUs will cost no more than 199 USD. First systems equipped with Polaris GPUs will be available end of June:
'Advanced Micro Devices Inc. is angling to lower the cost of virtual reality, targeting the field with a new line of graphics hardware priced at $199—half or less the cost of comparable products.
AMD said the first chips based on its new Polaris design are expected to arrive in graphics cards for personal computers at the end of June. The company aims to help push the starting cost of PCs that can deliver VR experiences as low as $799 from above $1,000.'"
The report lists the high-end Polaris card as the "RX 480", which would be a departure from the recent nomenclature (R9 290X, R9 390X). Pricing such a card this aggressively not only creates what one would hope to be an incredible price/performance ratio, but is likely an answer to NVIDIA's GTX 1080/1070 - especially considering NVIDIA's new GTX 1070 is as fast as a GTX 980 Ti.
Is the Radeon RX 480 really the top end card, or a lower-cost variant? Will there be a 490, or 490X? This report certainly doesn't answer any questions, but the possibility of a powerful new GPU for $199 is very appealing.
Subject: Graphics Cards, Cases and Cooling, Shows and Expos | May 31, 2016 - 07:22 PM | Jeremy Hellstrom
Tagged: supernova, SLI HB, gtx 1080 FTW, GTX 1080 classified, gtx 1070 SC, evga, Dawn, computex 2016, 850GS, 1000GX
Earlier in the week you saw a sneak peek at EVGA's GTX 1080 SC, and now we can confirm there will indeed be a GTX 1070 version bearing the same custom ACX 3.0 cooler, which Al showed to be an improvement over the Founders Edition, especially when you consider the price-to-performance equation.
That is not the only new card announced; there is also a brand new GTX 1080 Classified with the ACX 3.0 cooler. The specific overclock is not yet known, but you can bet it should be more than a handful of megahertz above the base. We know that this card will have 14 power phases, full RGB LEDs, and 10cm ACX fans.
The more curious of you might have noticed there is something odd about the back end of the card and the location of the PCIe power plugs and you are absolutely right. The EVGA Power Link adapter can be plugged into any card to move the position of the power plugs to give you better cable management abilities. It also sports an LED light on one side, as you can see in the picture of the GTX 1080 Classified card.
To round out the usual suspects, here you can see the EVGA GTX 1080 FTW Edition with some fancy LEDs. Those lights match the CPU waterblock in evidence just above the card; we didn't hear anything official about it, but perhaps that is yet another thing to look forward to in the coming year.
Speaking of adapters, here you can see EVGA's custom SLI HB bridges in three different sizes, of particular importance to those who plan on using a code to enable more than two Pascal cards to run in SLI. They will connect the cards at higher bandwidth than standard bridges and will sport LEDs which can be toggled between red, green, blue or white via a switch on the bridge.
Next up is an EVGA Gaming chassis bearing the names DG-87 and Dawn, hinting that there may be more than one model arriving in the near future. As you can see in the picture, the front panel, up to and including the power and reset buttons, has been moved to the side. USB ports (including a Type-C plug), HDMI and audio are all available at the sides, as well as an LCD which can display the speeds of two of your fans while allowing you control over those speeds. You can also set it to display a temperature, although it is unlikely you can reduce it the same way you can your fan speed. The case also obviously handles watercooling setups as stylishly as Jacob sports those jeans.
Along with the case come two new PSUs, the successors to the G2 models. The EVGA Supernova 1000GX not only provides 1000W of 80 PLUS Gold power, it will also be the smallest kilowatt-class PSU available at launch. It looks to have a single 12V rail which will provide up to 999.6W @ 83.3A.
If that is a little more power than you need, the Supernova 850GS might be more to your taste. It is also 80 PLUS Gold and fully modular, with up to 849.6W @ 70.8A, which should handle all but the most extreme GPU setups. That picture also shows off the certain glow your system will feel when powered by one of these PSUs.
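Those rail ratings are just volts-times-amps arithmetic; a quick sanity check using the figures above (nothing assumed beyond the spec numbers):

```python
# 12 V rail capacity: watts = volts x amps (spec figures, not measured load)
def rail_watts(volts, amps):
    return volts * amps

print(f"{rail_watts(12, 83.3):.1f} W")  # 999.6 W (Supernova 1000GX)
print(f"{rail_watts(12, 70.8):.1f} W")  # 849.6 W (Supernova 850GS)
```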
With all these lights and features it would be a shame to have boring PSU cables, now wouldn't it? That is why EVGA is also releasing PSU cables in a wide variety of colours. The ones shown below are only a small sampling of what you can choose from, more will be available from EVGA once they launch.
That is all from EVGA so far but stay tuned for more from Computex here at PC Perspective!
Subject: Graphics Cards | May 30, 2016 - 03:50 PM | Jeremy Hellstrom
Tagged: geforce, GP104, gtx 1070, nvidia, pascal
If we missed your favourite game, synthetic benchmark or a specific competitor's card in our review of the new GTX 1070, then perhaps one of the sites below might satisfy your cravings. For instance, if it is Ashes of the Singularity or The Division which you want to see benchmarked, then [H]ard|OCP has you covered. They also had a go at overclocking: with the new software they set the card's fan speed to 100%, the power target to 112%, and the GPU offset to +230. That resulted in a peak GPU speed of 2113MHz, although the average frequency over a 30-minute gaming session was 2052MHz; they will revisit the card to overclock the memory in the near future. Check out their full review here.
"The second video card in the NVIDIA next generation Pascal GPU architecture is finally here, we will explore the GeForce GTX 1070 Founders Edition video card. In this limited preview today we will look at performance in comparison to GeForce GTX 980 Ti and Radeon R9 Fury X as well as some preview overclocking."
Here are some more Graphics Card articles from around the web:
- GeForce GTX 1070 FCAT Frametime Analysis @ Guru of 3D
- NVIDIA GTX 1070 Review - The Revolution Continues @ HiTech Legion
- Nvidia GeForce GTX 1070 @ Legion Hardware
- NVIDIA GeForce GTX 1070 Founders Edition Review @ OCC
- Radeon Linux 4.6 + Mesa 11.3 vs. NVIDIA Linux Performance & Perf-Per-Watt @ Phoronix
- XFX Radeon R9 Fury Triple Dissipation @ [H]ard|OCP
- NVIDIA GeForce GTX 1080 vs Titan X vs R9 Fury vs GTX 980 Ti vs GTX 980 vs R9 390X @ HiTech Legion
- NVIDIA GeForce GTX 1080 Overclocking Review @ OCC
Subject: Graphics Cards | May 30, 2016 - 03:24 PM | Jeremy Hellstrom
Tagged: G1080, phanteks, computex 2016
Walnut, California, May 30th, 2016 - Phanteks, a leader in thermal cooling, is excited to launch its very first water block designed for the new NVIDIA GTX 1080 Founders Edition. The G1080 is made from premium materials with the finest standards of craftsmanship from Phanteks.
From military standard Viton O-ring to the RGB LED lighting, all of our exceptionally high-quality materials are carefully selected. The water block features a nickel-plated copper cold plate, acrylic top and sandblasted cover plates for an elegant look.
The G1080 uses a military-grade Viton O-ring, well known for its excellent heat resistance and durability, rather than the silicone O-rings used by other manufacturers. The G1080 also features RGB LED lighting that can sync with Phanteks cases that support RGB lighting, or with motherboards' RGB software by using the Phanteks RGB adapter.
Available at most local retailers - MSRP: $129.99 / €129,90 / £99.99 (VAT included)
Subject: Graphics Cards, Motherboards, Systems, Shows and Expos | May 30, 2016 - 08:04 AM | Ryan Shrout
Tagged: crazy people, concept, computex 2016, computex, avalon, asus
If you expected Computex to be bland and stale this year, ASUS has something that is going to change your mind. During the company's Republic of Gamers press conference, it revealed a concept PC design it has been working on dubbed Avalon. The goal of this project was to improve on the fundamental design of the PC; something that hasn't changed for decades. ASUS wanted to show that you could build a platform that would allow DIY machines to be "more modular, easier to build, and more tightly integrated."
The result is a proof of concept design that looks more like a high end turntable than a PC. In reality, you are looking at a machine that has been totally redesigned, from the power supply to motherboard and case integration to cooling considerations and more. ASUS has posted a great story that goes into a lot of detail on Avalon, and it's clear this is a project the team has been working on for some time.
The brainchild of Jonathan Chu, the Avalon concept takes a notebook-like approach to desktop design. The motherboard is designed in conjunction with the chassis to enable more seamless cooperation between the two.
The first example of changes to Avalon is something as simple as the front panel connectors on a case. Connecting them to your motherboard is the same today, basically, as it has ever been. But if you are the manufacturer or designer of both the chassis and the motherboard itself, it is trivial to have the buttons, lights and even additional capabilities built into a specific location on the PCB that matches with access points on the case.
Re-thinking the rear IO panel was another target: making it modular and connected to the system via PCI Express means you can swap connectivity options based on the user's needs. Multiple Gigabit NICs a requirement? Done. Maximum USB capability? Sure. Even better, by making the back panel IO a connected device, it can host storage and sound controllers on its own, allowing for improved audio solutions and flexible data configurations.
ASUS even worked in a prototype power supply that is based on the SFX form factor but that uses a server-style edge connector, removing wires from the equation. It then becomes the motherboard's responsibility to distribute power through the other components; which again is easy to work through if you are designing these things in tandem. Installing or swapping a power supply becomes as simple as pulling out a drive tray.
This is all made possible by an internal structure that looks like this:
Rethinking how a motherboard is built, how it connects to the outside world and to other components, means that ASUS was able to adjust and change just about everything. The only area that remains the same is for the discrete graphics card. These tend to draw too much power to use any kind of edge connector (though the ASUS story linked above says they are working on a solution) and thus you see short run cables from a break out on the motherboard to the standard ROG graphics card.
The ASUS EdgeUp story has some more images and details and I would encourage you to check it out if you find this topic compelling; I know I do. There are no prices, no release dates, no plans for sampling yet. ASUS has built a prototype that is "right on the edge of what’s possible" and they are looking for feedback from the community to see what direction they should go next.
Will the DIY PC in 2020 be a completely different thing than we build today? It seems ASUS is asking the same question.
Subject: Graphics Cards | May 29, 2016 - 05:46 PM | Scott Michaud
Tagged: msi, GTX 1080, sea hawk, gaming x, armor, Aero, nvidia
Beyond the Founders Edition, MSI has prepared six SKUs of the GTX 1080. These consist of four variants, two of which have an overclocked counterpart to make up the remaining two products. The product stack seems quite interesting, with a steady progression of user needs, but we'll need to wait for price and availability to know for sure.
We'll start at the bottom with the MSI GeForce GTX 1080 AERO 8G and MSI GeForce GTX 1080 AERO 8G OC. These are your typical blower designs that pull air in from within the case and exhaust it out the back after collecting a bunch of heat from the GPU. It will work, it should be one of the cheapest options for this card, and it will keep the GTX 1080's heat outside of the case. It has a little silver accent on it, too. The non-overclocked version is the standard 1607 MHz / 1733 MHz that NVIDIA advertises, and the OC SKU is a little higher: 1632 MHz / 1771 MHz.
Next up the product stack are the MSI GeForce GTX 1080 ARMOR 8G and MSI GeForce GTX 1080 ARMOR 8G OC versions. These use MSI's aftermarket, two-fan cooler that should provide much lower temperatures than AERO, but they exhaust back into the case. Personally? I don't really care about that. The only other thing that heats up in my case, to any concerning level at least, is my CPU, and I recently switched that to a closed-loop water cooler anyway. MSI added an extra, six-pin power connector to these cards (totaling 8-pin + 6-pin + slot power = up-to 300W, versus 8-pin + slot power's 225W). The non-overclocked version is NVIDIA's base 1607 MHz / 1733 MHz, but OC brings that up to 1657 MHz / 1797 MHz.
Speaking of closed-loop water coolers... The MSI GeForce GTX 1080 SEA HAWK takes the AERO design, which we mentioned earlier, and puts a Corsair self-contained water cooler inside it, too. Only one SKU of this is available, clocked at 1708 MHz base and 1847 MHz boost, but it should support overclocking fairly easily. That said, unlike other options that add a bonus six-pin connector, the SEA HAWK has just one, eight-pin connector. Good enough for the Founders Edition, but other SKUs (including three of the other cards in this post) suggest that there's a reason to up the power ceiling.
We now get to MSI's top, air-cooled SKU: the MSI GeForce GTX 1080 GAMING X 8G. This one has their new TWIN FROZR VI cooler, which they claim runs quieter, with fans that move more air at lower speeds than previous models. It, as you would assume from reading about ARMOR 8G, has an extra, six-pin power connector to provide more overclocking headroom. It has three modes: Silent, which clocks the card to the standard 1607 MHz / 1733 MHz levels; Gaming, which significantly raises that to 1683 MHz / 1822 MHz; and OC, which bumps that slightly further to 1708 MHz / 1847 MHz.
Currently, there is no pricing or availability information for any of these.
Subject: Graphics Cards | May 28, 2016 - 05:00 PM | Scott Michaud
Tagged: asus, ROG, strix, GTX 1080, nvidia
The Founders Edition versions of the GTX 1080 went on sale yesterday, but we're beginning to see the third-party variants being announced. In this case, the ASUS ROG Strix is a three-fan design that uses their DirectCU III heatsink. More interestingly, ASUS decided to increase the amount of wattage that this card can accept by adding an extra, six-pin PCIe power connector (totaling 8-pin + 6-pin). A Founders Edition card only requires a single, eight-pin connection over the 75W provided by the PCIe slot itself. This provides an extra 75W of play room for the ROG Strix card, raising the maximum power from 225W to 300W.
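The 225 W and 300 W ceilings mentioned above come straight from the PCIe power-delivery limits; a minimal sketch of the arithmetic:

```python
# PCIe power-delivery spec maximums (ceilings, not measured draw)
PCIE_SLOT_W = 75      # PCIe x16 slot
SIX_PIN_W = 75        # 6-pin PCIe power connector
EIGHT_PIN_W = 150     # 8-pin PCIe power connector

founders_edition = PCIE_SLOT_W + EIGHT_PIN_W             # 8-pin + slot
rog_strix = PCIE_SLOT_W + EIGHT_PIN_W + SIX_PIN_W        # extra 6-pin
print(founders_edition, rog_strix)  # 225 300
```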
Some of this power will be used for its on-card, RGB LED lighting, but I doubt that it was the reason for the extra 75W of headroom. The lights follow the edges of the card, acting like hats and bow-ties to the three fans. (Yes, you will never unsee that now.) The shroud is also modular, and ASUS provides the data for enthusiasts to 3D print their own modifications (albeit their warranty doesn't cover damage caused by this level of customization).
As for the actual performance, the card naturally comes with an overclock out of the box. The default “Gaming Mode” has a 1759 MHz base clock with an 1898 MHz boost. You can flip this into “OC Mode” for a slight, two-digit increase to 1784 MHz base and 1936 MHz boost. Both are significantly higher than the Founders Edition, which has a base clock of 1607 MHz that boosts to 1733 MHz. The extra power will likely help manual overclocks, but it will come down to the “silicon lottery” whether your specific chip is less affected by manufacturing variation. We also don't know yet whether the Pascal architecture, and the 16nm process it relies upon, has any physical limits that will increasingly resist overclocks past a certain frequency.
Pricing and availability have not yet been announced.
Subject: Graphics Cards | May 27, 2016 - 02:58 PM | Allyn Malventano
Tagged: sli, review, led, HB, gtx, evga, Bridge, ACX 3.0, 3dmark, 1080
...so the time when we manage to get multiple GTX 1080's in the office here would, of course, be when Ryan is on the other side of the planet. We are also missing some other semi-required items, like the new 'SLI HB' bridge, but we should be able to test on an older LED bridge at 2560x1440 (under the resolution where the newer style is absolutely necessary to avoid a sub-optimal experience). That said, surely the storage guy can squeeze out a quick run of 3DMark to check out the SLI scaling, right?
For this testing, I spent just a few minutes with EVGA's OC Scanner to take advantage of GPU Boost 3.0. I cranked the power limits and fans on both cards, ending up at a stable overclock hovering at right around 2 GHz on the pair. I'm leaving out the details of the second GPU we got in for testing, as it may be under NDA and I can't confirm that since all of the people to ask are in an opposite time zone (pfft - it has an aftermarket cooler). Then I simply ran Fire Strike (25x14) with SLI disabled:
...and then with it enabled:
That works out to a 92% gain in 3DMark score, with the FPS figures jumping by almost exactly 2x. Now remember, this is by no means a controlled test, and the boss will be cranking out a much more detailed piece with Frame Rating results galore in the future, but for now I just wanted to get some quick figures out to the masses for consumption and confirmation that 1080 SLI is a doable thing, even on an older bridge.
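For reference, that scaling figure is just the ratio of the two overall scores. The scores below are hypothetical placeholders (the real numbers are in the screenshots above), but a 92% gain works out like this:

```python
def sli_gain_pct(single_score, sli_score):
    """Percent improvement of the SLI run over a single-GPU run."""
    return (sli_score / single_score - 1) * 100

# Hypothetical 3DMark Fire Strike scores, for illustration only
print(round(sli_gain_pct(10000, 19200)))  # 92
```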
*edit* here's another teaser:
Aftermarket coolers are a good thing, as evidenced by the 47°C of that second GPU, but the Founders Edition blower-style cooler is still able to get past 2GHz just fine. Both cards had their fans at max speed in this example.
I was able to confirm we are not under NDA on the additional card we received. Behold:
This is the EVGA Superclocked edition with their ACX 3.0 cooler.
More to follow (yes, again)!
Subject: Graphics Cards | May 24, 2016 - 09:46 PM | Sebastian Peak
Tagged: vulkan, radeon, overwatch, graphics driver, Crimson Edition 16.5.3, crimson, amd
AMD has released new drivers for Overwatch (and more) with Radeon Software Crimson Edition 16.5.3.
"Radeon Software Crimson Edition is AMD's revolutionary new graphics software that delivers redesigned functionality, supercharged graphics performance, remarkable new features, and innovation that redefines the overall user experience. Every Radeon Software release strives to deliver new features, better performance and stability improvements."
AMD lists these highlights for Radeon Software Crimson Edition 16.5.3:
- Total War: Warhammer
- Dota 2 (with Vulkan API)
New AMD Crossfire profile available for:
- Total War: Warhammer
The driver is available from AMD from the following direct links:
- AMD Radeon Software Crimson Edition 16.5.3 Driver for Windows® 10, Windows 8.1 & Windows 7 64-bit
- AMD Radeon Software Crimson Edition 16.5.3 Driver for Windows® 10, Windows 8.1 & Windows 7 32-bit
The full release notes with fixed/known issues are available at the source link here.
Subject: Graphics Cards | May 24, 2016 - 06:36 PM | Scott Michaud
Tagged: nvidia, graphics drivers
Yesterday, NVIDIA released WHQL-certified drivers to align with the release of Overwatch. This version, 368.22, is the first public release of the 367 branch. Pascal is not listed in the documentation as a supported product, so it's unclear whether this will be the launch driver for it. The GTX 1080 comes out on Friday, but two drivers in a week would not be unprecedented for NVIDIA.
While NVIDIA has not communicated this very well, 368.22 will not install on Windows Vista. If you are still using that operating system, then you will not be able to upgrade your graphics drivers past 365.19. 367-branch (and later) drivers require Windows 7 and up.
Before I continue, I should note that I've experienced some issues getting these drivers to install through GeForce Experience. Long story short, it took two attempts (with a clean install each time) to end up with a successful boot into 368.22. I didn't try the standalone installer that you can download from NVIDIA's website. If the second attempt using GeForce Experience had failed, then I would have. That said, after I installed it, it seemed to work out well for me with my GTX 670.
While NVIDIA is a bit behind on documentation, the driver also rolls in other fixes. There were some GPU compute developers who had crashes and other failures in certain OpenCL and CUDA applications, which are now compatible with 368.22. I've also noticed that my taskbar hasn't been sliding around on its own anymore, but I've only been using the driver for a handful of hours.
You can get GeForce 368.22 drivers from GeForce Experience, but you might want to download the standalone installer (or skip a version or two if everything works fine).
Subject: Graphics Cards | May 18, 2016 - 06:11 PM | Ryan Shrout
Tagged: amd, radeon, market share
AMD sent out a note yesterday with some interesting news about how the graphics card market fared in Q1 of 2016. First, let's get to the bad news: sales of new discrete graphics solutions, in both mobile and desktop, dropped by 10.2% quarter to quarter, a decrease that was slightly higher than expected. Though details weren't given in the announcement or data I have from Mercury Research, it seems likely that expectations of upcoming new GPUs from both NVIDIA and AMD contributed to the slowdown of sales on some level.
Despite the shrinking pie, AMD grabbed more of it in Q1 2016 than it had in Q4 of 2015, gaining 3.2% of total market share for a total of 29.4%. That's a nice gain in a few short months, but it's still much lower than Radeon's share as recently as 2013. That 3.2% gain includes both notebook and desktop discrete GPUs, but let's break it down further.
| ||Q1'16 Desktop||Q1'16 Desktop Change||Q1'16 Mobile||Q1'16 Mobile Change|
|AMD||22.7%||+1.8%||38.7%||+7.3%|
AMD's gain in the desktop graphics card market was 1.8%, up to 22.7% of the market, while the notebook discrete graphics share jumped an astounding 7.3% to 38.7% of the total market.
NVIDIA obviously still has a commanding lead in desktop add-in cards with more than 75% of the market, but Mercury Research believes that a renewed focus on driver development and virtual reality, along with the creation of the Radeon Technologies Group, contributed to the increases in share for AMD.
Q3 of 2016 is where I think the future looks most interesting. Not only will NVIDIA's newly released GeForce GTX 1080 and upcoming GTX 1070 have time to settle in but the upcoming Polaris architecture based cards from AMD will have a chance to stretch their legs and attempt to continue pushing the needle in the upward direction.
Subject: Editorial, Graphics Cards | May 18, 2016 - 01:18 PM | Tim Verry
Tagged: rumor, Polaris, opinion, HDMI 2.0, gpu, gddr5x, GDDR5, GCN, amd, 4k
While NVIDIA's Pascal has held the spotlight in the news recently, it is not the only new GPU architecture debuting this year. AMD will soon be bringing its Polaris-based graphics cards to market for notebooks and mainstream desktop users. While several different code names have been thrown around for these new chips, they are generally referred to as Polaris 10 and Polaris 11. AMD's Raja Koduri stated in an interview with PC Perspective that the numbers used in the naming scheme hold no special significance, but eventually Polaris will be used across the entire performance lineup (low end to high end graphics).
Naturally, there are going to be many rumors and leaks as the launch gets closer. In fact, Tech Power Up recently obtained a number of interesting details about AMD's plans for Polaris-based graphics in 2016, including specifications and which areas of the market each chip is going to be aimed at.
Citing the usual "industry sources" familiar with the matter (take that for what it's worth, but the specifications do not seem out of the realm of possibility), Tech Power Up revealed that there are two lines of Polaris-based GPUs that will be made available this year. Polaris 10 will allegedly occupy the mid-range (mainstream) graphics option in desktops as well as being the basis for high end gaming notebook graphics chips. On the other hand, Polaris 11 will reportedly be a smaller chip aimed at thin-and-light notebooks and mainstream laptops.
Now, for the juicy bits of the leak: the rumored specifications!
AMD's "Polaris 10" GPU will feature 32 compute units (CUs) which TPU estimates – based on the assumption that each CU still contains 64 shaders on Polaris – works out to 2,048 shaders. The GPU further features a 256-bit memory interface along with a memory controller supporting GDDR5 and GDDR5X (though not at the same time heh). This would leave room for cheaper Polaris 10 derived products with less than 32 CUs and/or cheaper GDDR5 memory. Graphics cards would have as much as 8GB of memory initially clocked at 7 Gbps. Reportedly, the full 32 CU GPU is rated at 5.5 TFLOPS of single precision compute power and runs at a TDP of no more than 150 watts.
Compared to the existing Hawaii-based R9 390X, the upcoming R9 400 Polaris 10 series GPU has fewer shaders and less memory bandwidth. The memory is clocked 1 GHz higher, but the 256-bit GDDR5X memory bus is half the width of the 390X's 512-bit GDDR5 bus, which results in 224 GB/s of memory bandwidth for Polaris 10 versus 384 GB/s on Hawaii. The R9 390X has a slight edge in compute performance at 5.9 TFLOPS versus Polaris 10's 5.5 TFLOPS; however, the Polaris 10 GPU uses much less power and easily wins at performance per watt. It almost reaches the same level of single precision compute performance at nearly half the power, which is impressive if it holds true!
|R9 390X||R9 390||R9 380||R9 400-Series "Polaris 10"|
|GPU Code name||Grenada (Hawaii)||Grenada (Hawaii)||Antigua (Tonga)||Polaris 10|
|Rated Clock||1050 MHz||1000 MHz||970 MHz||~1343 MHz|
|Memory Clock||6000 MHz||6000 MHz||5700 MHz||7000 MHz|
|Memory Bandwidth||384 GB/s||384 GB/s||182.4 GB/s||224 GB/s|
|TDP||275 watts||275 watts||190 watts||150 watts (or less)|
|Peak Compute||5.9 TFLOPS||5.1 TFLOPS||3.48 TFLOPS||5.5 TFLOPS|
|MSRP (current)||~$400||~$310||~$199||$ unknown|
Note: Polaris GPU clocks estimated using the assumption of 5.5 TFLOPS peak compute and an accurate number of shaders. (Thanks Scott.)
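The clock estimate in the table follows from the standard peak-FP32 formula (2 FLOPs per shader per clock, from the fused multiply-add), and the bandwidth figure from the bus width and per-pin data rate; a quick sketch of both calculations:

```python
def est_clock_mhz(tflops, shaders):
    # Peak FP32 = 2 ops (FMA) x shaders x clock, solved for clock
    return tflops * 1e12 / (2 * shaders) / 1e6

def bandwidth_gb_s(gbps_per_pin, bus_width_bits):
    # Effective data rate per pin x bus width, converted from bits to bytes
    return gbps_per_pin * bus_width_bits / 8

print(round(est_clock_mhz(5.5, 2048)))  # 1343 (the table's ~1343 MHz)
print(bandwidth_gb_s(7, 256))           # 224.0 (GB/s, as reported)
```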
Another comparison that can be made is to the Radeon R9 380 which is a Tonga-based GPU with similar TDP. In this matchup, the Polaris 10 based chip will – at a slightly lower TDP – pack in more shaders, twice the amount of faster clocked memory with 23% more bandwidth, and provide a 58% increase in single precision compute horsepower. Not too shabby!
Likely, a good portion of these increases are made possible by the move to a smaller process node and utilizing FinFET "tri-gate" like transistors on the Samsung/Globalfoundries 14LPP FinFET manufacturing process, though AMD has also made some architecture tweaks and hardware additions to the GCN 4.0 based processors. A brief high level introduction is said to be made today in a webinar for their partners (though AMD has said preemptively that no technical nitty-gritty details will be divulged yet). (Update: Tech Altar summarized the partner webinar. Unfortunately there were no major reveals other than that AMD will not be limiting AIB partners from pushing for the highest factory overclocks they can get.)
Moving on from Polaris 10 for a bit, Polaris 11 is rumored to be a smaller GCN 4.0 chip that will top out at 14 CUs (estimated 896 shaders/stream processors) and 2.5 TFLOPS of single precision compute power. These chips aimed at mainstream and thin-and-light laptops will have 50W TDPs and will be paired with up to 4GB of GDDR5 memory. There is apparently no GDDR5X option for these, which makes sense at this price point and performance level. The 128-bit bus is a bit limiting, but this is a low end mobile chip we are talking about here...
|R7 370||R7 400 Series "Polaris 11"|
|GPU Code name||Trinidad (Pitcairn)||Polaris 11|
|Rated Clock||925 MHz base (975 MHz boost)||? MHz|
|Memory||2 or 4GB||4GB|
|Memory Clock||5600 MHz||? MHz|
|Memory Bandwidth||179.2 GB/s||? GB/s|
|TDP||110 watts||50 watts|
|Peak Compute||1.89 TFLOPS||2.5 TFLOPS|
|MSRP (current)||~$140 (less after rebates and sales)||$?|
Note: Polaris GPU clocks estimated using the assumption of 2.5 TFLOPS peak compute and an accurate number of shaders. (Thanks Scott.)
Fewer details were unveiled concerning Polaris 11, as you can see from the chart above. From what we know so far, it should be a promising successor to the R7 370 series even with the memory bus limitation and lower shader count: the GPU should be clocked higher (M-series mobile variants might also have more shaders than the 370-and-lower mobile series), and it carries a much lower TDP for at least equivalent, if not noticeably better, performance. The lower power usage in particular will be hugely welcomed in mobile devices, as it should result in longer battery life under the same workloads. I picked the R7 370 as the comparison because it has 4 gigabytes of memory and not that many more shaders, and, being a desktop chip, readers may be more widely familiar with it. Polaris 11 also appears to sit between the R7 360 and R7 370 in terms of shader count and other features, but is allegedly going to be faster than both of them while using (at least on paper) less than half the power.
Of course these are still rumors until AMD makes Polaris officially, well, official with a product launch. The claimed specifications appear reasonable though, and based on that there are a few important takeaways and thoughts I have.
The first thing on my mind is that AMD is taking an interesting direction here. While NVIDIA has chosen to start its new generation at the top, announcing "big Pascal" GP100 and actually launching the GP104-based GTX 1080 (one of its highest end consumer chips/cards) yesterday, then introducing lower end products over the course of the year, AMD has opted for the opposite approach. AMD will start closer to the lower end with a mainstream notebook chip and a high end notebook/mainstream desktop GPU (Polaris 11 and 10, respectively), then flesh out its product stack over the following year (remember, Raja Koduri stated Polaris and GCN 4 would be used across the entire product stack), building up to bigger and higher end GPUs and finally topping off with its highest end consumer (and professional) GPUs based on "Vega" in 2017.
This means (and I'm not sure if this was planned by either NVIDIA or AMD, or is just how it happened to work out with each following its own GPU philosophy, though I'm thinking the latter) that for some time after both architectures launch, AMD's and NVIDIA's newest GPUs will not be directly competing with each other. Eventually they should meet in the middle (maybe late this year?) with a mid-range desktop graphics card, and it will be interesting to see how they stack up at similar price points and hardware levels. Then, once "Vega" based GPUs hit (sadly, probably in time for NVIDIA's big Pascal to launch; I'm not sure if Vega is only a Fury X replacement or goes beyond that to a 1080 Ti or even GP100 competitor), we should see GCN 4 on the new smaller process node square up against NVIDIA's 16nm Pascal products across the board (entire lineup). Which will have the better performance, and which will win out in power usage, performance per watt and performance per dollar? All questions I wish I knew the answers to, but sadly do not!
Speaking of price and performance/$: Polaris is actually looking pretty good so far at hitting much lower TDPs and power usage targets while delivering at least similar performance, if not a good bit more. Both AMD and NVIDIA appear to be delivering bigger improvements in performance and power usage than I expected (these die shrinks have really helped, even though from here on out that trend is not really going to continue). I hope that AMD can at least match NVIDIA in these areas at the mid range, even if it does not have a high end GPU coming out soon (not until sometime after these cards launch, and not really until Vega, the high end GCN GPU successor). At least on paper, based on the leaked information, the GPUs look good. My only worry is pricing, which I think is going to make or break these cards. AMD will need to price them competitively and aggressively to ensure their adoption and success.
I hope that doing the rollout this way (starting with lower end chips) helps AMD iron out the new smaller process node and achieve good yields, so that it can be aggressive with pricing here and, eventually, at the high end!
I am looking forward to more information on AMD's Polaris architecture and the graphics cards based on it!
- AMD Capsaicin GDC Live Stream and Live Blog TODAY!!
- AMD GPU Roadmap: Capsaicin Names Upcoming Architectures
- AMD's Raja Koduri talks moving past CrossFire, smaller GPU dies, HBM2 and more.
- AMD High-End Polaris Expected for 2016
- CES 2016: AMD Shows Polaris Architecture and HDMI FreeSync Displays
I will admit that I am not 100% up on all the rumors and I apologize for that. With that said, I would love to hear what your thoughts are on AMD's upcoming GPUs and what you think about these latest rumors!
Subject: Graphics Cards | May 18, 2016 - 12:49 PM | Josh Walrath
Tagged: nvidia, pascal, gtx 1070, 1070, gtx, GTX 1080, 16nm FF+, TSMC, Founder's Edition
Several weeks ago when NVIDIA announced the new GTX 1000 series of products, we were given a quick glimpse of the GTX 1070. This upper-midrange card is to carry a $379 price tag in retail form while the "Founder's Edition" will hit the $449 mark. Today NVIDIA released the full specifications of this card on their website.
Interest in the GTX 1070 is incredibly high because of the potential performance of this card versus the previous generation. Price is also a big consideration here, as it is far easier to raise $379 than it is to make the jump to the GTX 1080 and shell out $599 once non-Founder's Edition cards are released. The GTX 1070 has all of the same features as the GTX 1080, but it takes a hit when it comes to clockspeed and shader units.
The GTX 1070 is a Pascal based part fabricated on TSMC's 16nm FF+ node. It shares the same overall transistor count as the GTX 1080, but it is partially disabled. The GTX 1070 contains 1920 CUDA cores as compared to the 2560 cores of the 1080; essentially, one full GPC is disabled to reach that number. The clockspeeds take a hit as well compared to the full GTX 1080, but the base clock for the 1070 is still an impressive 1506 MHz, and boost reaches 1683 MHz. This combination of shader count and clockspeed should make it a little bit faster than the older GTX 980 Ti. The rated TDP for the card is 150 watts with a single 8-pin PCI-E power connector, which means there should be some decent headroom when it comes to overclocking this card. Due to binning and yields, we may not see 2+ GHz overclocks with these cards, especially if NVIDIA cut down the power delivery system as compared to the GTX 1080. Time will tell on that one.
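As a quick sanity check, those core counts line up with GP104's published layout of 4 GPCs, each containing 5 SMs of 128 CUDA cores; disabling one full GPC takes the chip from 2560 cores down to the 1070's 1920:

```python
# Back-of-the-envelope check of GP104 core counts.
# Layout per NVIDIA's published Pascal specs: 4 GPCs x 5 SMs x 128 cores.
CORES_PER_SM = 128
SMS_PER_GPC = 5
GPCS_FULL = 4  # full GP104 (GTX 1080)

full_cores = GPCS_FULL * SMS_PER_GPC * CORES_PER_SM        # GTX 1080
cut_cores = (GPCS_FULL - 1) * SMS_PER_GPC * CORES_PER_SM   # one GPC disabled (GTX 1070)

print(full_cores, cut_cores)  # 2560 1920
```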
The memory technology that NVIDIA is using for this card is not the cutting edge GDDR5X or HBM, but rather the tried and true GDDR5. 8 GB of this memory sits on a 256-bit bus, but it is running at a very, very fast 8 Gbps. This gives overall bandwidth in the 256 GB/sec region. When we combine this figure with the memory compression techniques implemented in the Pascal architecture, we can see that the GTX 1070 will not be bandwidth starved. We have no information on whether this generation of products will mirror what we saw with the previous generation GTX 970 in terms of disabled memory controllers and the 3.5 GB/500 MB memory split caused by that unique memory subsystem.
Beyond those things, the GTX 1070 is identical to the GTX 1080 in terms of DirectX features, display specifications, decoding support, double bandwidth SLI, etc. There is an obvious amount of excitement for this card considering its potential performance and price point. These will supposedly be available in the Founder's Edition release on June 10 for the $449 MSRP. I know many people are considering using these cards in SLI to deliver performance for half the price of last year's GTX 980 Ti, and from all indications these cards will be a significant upgrade for anyone using GTX 970s in SLI. With greater access to monitors that hit 4K, as well as Surround Gaming, this could be a solid purchase for anyone looking to step up their game in these scenarios.
Subject: Graphics Cards | May 17, 2016 - 06:22 PM | Jeremy Hellstrom
Tagged: nvidia, pascal, video, GTX 1080, gtx, GP104, geforce, founders edition
Yes that's right, if you felt Ryan and Al somehow missed something in our review of the new GTX 1080, or you felt an obvious pro-Matrox bias was showing, here are the other reviews you can pick and choose from. Start off with [H]ard|OCP, who also tested Ashes of the Singularity and Doom as well as the old favourite Battlefield 4. Doom really showed itself off as a next generation game, its Nightmare mode scoffing at any GPU with less than 5GB of VRAM available and pushing the single 1080 hard. Read on to see how the competition stacked up ... or wait for the 1440 to come out some time in the future.
"NVIDIA's next generation video card is here, the GeForce GTX 1080 Founders Edition video card based on the new Pascal architecture will be explored. We will compare it against the GeForce GTX 980 Ti and Radeon R9 Fury X in many games to find out what it is capable of."
Here are some more Graphics Card articles from around the web:
- In the lab: Nvidia's GeForce GTX 1080 graphics card @ The Tech Report
- FCAT GeForce GTX 1080 Framepacing @ Guru of 3D
- NVIDIA GeForce GTX 1080 Review: A Look At 4K & Ultra-wide Gaming @ Techgage
- NVIDIA GeForce GTX 1080 Review - The Advent of Pascal @HiTech Legion
- NVIDIA GeForce GTX 1080 Founders Edition Review @ OCC
- NVIDIA GeForce GTX 1080 Founders Edition Review @ Neoseeker
- Nvidia GTX 1080 @ Kitguru
- NVIDIA GeForce GTX 1080 8 GB @ techPowerUp
- The NVIDIA GeForce GTX 1080 Review @ Hardware Canucks
Subject: Graphics Cards | May 16, 2016 - 03:52 PM | Jeremy Hellstrom
Tagged: amd, r9 380x, crossfire
A pair of R9 380Xs will cost you around $500, a bit more than $100 less than a single GTX 980 Ti and on par with, or a little less expensive than, a straight GTX 980. You have likely seen these cards compared, but how often have you seen them pitted against a pair of GTX 960s, which cost a little bit less than two 380X cards? [H]ard|OCP decided it was worth investigating, perhaps for those who currently have a single one of these cards and are considering a second if the price is right. The results are very tight: overall the two setups performed very similarly, with some games favouring AMD and others NVIDIA. Check out the full review here.
"We are evaluating two Radeon R9 380X video cards in CrossFire against two GeForce GTX 960 video cards in a SLI arrangement. We will overclock each setup to its highest, to experience the full gaming benefit each configuration has to offer. Additionally we will compare a Radeon R9 380 CrossFire setup to help determine the best value."
Here are some more Graphics Card articles from around the web:
Subject: General Tech, Graphics Cards | May 16, 2016 - 03:19 PM | Ryan Shrout
Tagged: video, tom petersen, pascal, nvidia, live, GTX 1080, gtx, GP104, geforce
Our review of the GeForce GTX 1080 is LIVE NOW, so be sure you check that out before today's live stream!!
Get yourself ready, it’s time for another GeForce GTX live stream hosted by PC Perspective’s Ryan Shrout and NVIDIA’s Tom Petersen. The general details about consumer Pascal and the GeForce GTX 1080 graphics card are already official, and based on the traffic to our stories and the response on Twitter and YouTube, there is more than a little pent-up excitement.
On hand to talk about the new graphics card and answer questions about technologies in the GeForce family, including Pascal, SLI, VR, Simultaneous Multi-Projection and more, will be Tom Petersen, well known in our community. We have done quite a few awesome live streams with Tom in the past; check them out if you haven't already.
NVIDIA GeForce GTX 1080 Live Stream
10am PT / 1pm ET - May 17th
Need a reminder? Join our live mailing list!
The event will take place Tuesday, May 17th at 1pm ET / 10am PT at http://www.pcper.com/live. There you’ll be able to catch the live video stream as well as use our chat room to interact with the audience, asking questions for me and Tom to answer live.
Tom has a history of being both informative and entertaining and these live streaming events are always full of fun and technical information that you can get literally nowhere else. Previous streams have produced news as well – including statements on support for Adaptive Sync, release dates for displays and first-ever demos of triple display G-Sync functionality. You never know what’s going to happen or what will be said!
UPDATE! UPDATE! UPDATE! This just in fellow gamers: Tom is going to be providing two GeForce GTX 1080 graphics cards to give away during the live stream! We won't be able to ship them until availability hits at the end of May, but two lucky viewers of the live stream will be able to get their paws on the fastest graphics card we have ever tested!! Make sure you are scheduled to be here on May 17th at 10am PT / 1pm ET!!
Don't you want to win me??!?
If you have questions, please leave them in the comments below and we'll look through them just before the start of the live stream. Of course you'll be able to tweet us questions @pcper, and we'll be keeping an eye on the IRC chat as well for more inquiries. What do you want to know and hear from Tom or me?
So join us! Set your calendar for this coming Tuesday at 1pm ET / 10am PT and be here at PC Perspective to catch it. If you are a forgetful type of person, sign up for the PC Perspective Live mailing list that we use exclusively to notify users of upcoming live streaming events including these types of specials and our regular live podcast. I promise, no spam will be had!
Subject: Graphics Cards | May 11, 2016 - 11:53 PM | Scott Michaud
Tagged: amd, crimson, graphics drivers
For the second time this month, hence the version number, AMD has released a driver to coincide with a major game release. This one is for DOOM, which will be available on Friday. Like the previous driver, which was aligned with Forza, it has not been WHQL-certified. That's okay, though. NVIDIA's Game Ready drivers didn't strive for WHQL certification until just recently, and, even then, WHQL certification doesn't mean what it used to.
But yeah, apart from game-specific optimizations for DOOM, 16.5.2 has a few extra reasons to be used. If you play Battleborn, which launched on May 3rd, then AMD has added a new CrossFire profile for that game. They have also fixed at least eleven issues (plus however many undocumented ones). It comes with ten known issues, but none of them seem particularly troubling; most appear to be CrossFire-related.
You can pick up the driver at AMD's website.
Subject: Graphics Cards | May 11, 2016 - 10:57 PM | Scott Michaud
Tagged: sli, nvidia, GTX 1080, GeForce GTX 1080
Update (May 12th, 1:45am): Okay so the post has been deleted, which was originally from Chris Bencivenga, Support Manager at EVGA. A screenshot of it is attached below. Note that Jacob Freeman later posted that "More info about SLI support will be coming soon, please stay tuned." I guess this means take the news with a grain of salt until an official word can be released.
Original Post Below
According to EVGA, NVIDIA will not support three- and four-way SLI on the GeForce GTX 1080. They state that, even if you use the old, multi-way connectors, it will still be limited to two-way. The new SLI connector (called SLI HB) will provide better performance “than 2-way SLI did in the past on previous series”. This suggests that the old SLI connectors can be used with the GTX 1080, although with less performance and only for two cards.
This is the only hard information that we have on this change, but I will elaborate a bit based on what I know about graphics APIs. Basically, SLI (and CrossFire) are simplifications of the multi-GPU load-balancing problems such that it is easy to do from within the driver, without the game's involvement. In DirectX 11 and earlier, the game cannot interface with the driver in that way at all. That does not apply to DirectX 12 and Vulkan, however. In those APIs, you will be able to explicitly load-balance by querying all graphics devices (including APUs) and split the commands yourself.
Even though a few DirectX 12 games exist, it's still unclear how SLI and CrossFire will be utilized in the context of DirectX 12 and Vulkan. DirectX 12 has the tier of multi-GPU called “implicit multi-adapter,” which allows the driver to load balance. How will this decision affect those APIs? Could inter-card bandwidth even be offloaded via SLI HB in DirectX 12 and Vulkan at all? Not sure yet (but you would think that they would at least add a Vulkan extension). You should be able to use three GTX 1080s in titles that manually load-balance to three or more mismatched GPUs, but only for those games.
If a title relies upon SLI, which covers everything DirectX 11 and earlier, then you cannot; two cards will be the limit.
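To make the explicit load-balancing idea from DirectX 12 and Vulkan concrete, here is a conceptual Python sketch (not real graphics API code; the device names and weights are hypothetical). The application enumerates all graphics devices itself and divides a frame's work among them in proportion to each device's estimated performance, which is exactly what SLI hides from the game under DirectX 11:

```python
# Conceptual illustration of explicit multi-GPU load balancing.
# In DX12/Vulkan the app would enumerate adapters and submit command
# lists per device; here we just compute the work split.
def split_frame(rows, devices):
    """Divide `rows` scanline rows among (name, weight) devices,
    proportional to weight. Returns (name, start_row, end_row) tuples."""
    total = sum(weight for _, weight in devices)
    splits, start = [], 0
    for i, (name, weight) in enumerate(devices):
        # The last device takes the remainder so every row is assigned once.
        count = rows - start if i == len(devices) - 1 else round(rows * weight / total)
        splits.append((name, start, start + count))
        start += count
    return splits

# Hypothetical mismatched setup: a fast discrete GPU plus a much slower APU.
print(split_frame(1080, [("GTX 1080", 0.85), ("APU", 0.15)]))
# [('GTX 1080', 0, 918), ('APU', 918, 1080)]
```

The point is that this split is the game's responsibility in the explicit model, so it works with any mix of GPUs, with or without an SLI bridge, but only in titles that bother to implement it.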
Subject: Graphics Cards | May 10, 2016 - 07:50 PM | Scott Michaud
Tagged: nvidia, maxwell, GTX 980 Ti, GTX 970, GTX 1080, geforce
The GTX 1080 announcement is starting to ripple into retailers, leading to price cuts on the previous generation, Maxwell-based SKUs. If you were interested in the GTX 1080, or an AMD graphics card of course, then you probably want to keep waiting. That said, you can take advantage of the discounts to get a VR-ready GPU or if you already have a Maxwell card that could use a cheap SLI buddy.
This tip comes from a NeoGAF thread. Microcenter has several cards on sale, but EVGA seems to have the biggest price cuts. This 980 Ti has dropped from $750 USD down to $499.99 (or $474.99 if you'll promise yourself to do that mail-in rebate). That's a whole third of its price slashed, and it puts the card about a hundred dollars under the GTX 1080. Granted, it will also be slower than the GTX 1080, with 2GB less video RAM, but the $100 savings might make that trade-off worth it for you.
Subject: Graphics Cards | May 10, 2016 - 07:29 PM | Ryan Shrout
Tagged: video, pascal, nvidia, GTX 1080, gtx 1070, geforce
After the live streamed event announcing the GeForce GTX 1080 and GTX 1070, Allyn and I spent a few minutes this afternoon going over the information as it was provided, discussing our excitement about the product and coming to grips with what in the world a "Founder's Edition" even is.
If you haven't yet done so, check out Scott's summary post on the GTX 1080 and GTX 1070 specs right here.