Phanteks Glacier G1080 Water Block

Subject: Graphics Cards | May 30, 2016 - 03:24 PM |
Tagged: G1080, phanteks, computex 2016

IMG_2016_05_27_0416.jpg

Walnut, California, May 30th, 2016 - Phanteks, a leader in thermal cooling, is excited to launch its very first water block, designed for the new NVIDIA GTX 1080 Founders Edition. The G1080 is made from premium materials with the finest standards of craftsmanship from Phanteks.

From the military-standard Viton O-ring to the RGB LED lighting, all of our exceptionally high-quality materials are carefully selected. The water block features a nickel-plated copper cold plate, an acrylic top, and sandblasted cover plates for an elegant look.

IMG_2016_05_27_0429.jpg

The G1080 uses a military-class Viton O-ring instead of the silicone O-rings used by other manufacturers; Viton is well known for its excellent heat resistance and durability. The G1080 also features RGB LED lighting that can sync with Phanteks cases that support RGB lighting, or with motherboards’ RGB software by using the Phanteks RGB adapter.

IMG_2016_05_27_0424.jpg

Available at most local retailers - MSRP: $129.99 / €129,90 / £99.99 (VAT included)

Source: Phanteks

ASUS Avalon concept PC merges desktops and DIY with cable-free mindset

Subject: Graphics Cards, Motherboards, Systems, Shows and Expos | May 30, 2016 - 08:04 AM |
Tagged: crazy people, concept, computex 2016, computex, avalon, asus

If you expected Computex to be bland and stale this year, ASUS has something that is going to change your mind. During the company's Republic of Gamers press conference, it revealed a concept PC design it has been working on dubbed Avalon. The goal of this project was to improve on the fundamental design of the PC; something that hasn't changed for decades. ASUS wanted to show that you could build a platform that would allow DIY machines to be "more modular, easier to build, and more tightly integrated."

system-closed.jpg

The result is a proof of concept design that looks more like a high end turntable than a PC. In reality, you are looking at a machine that has been totally redesigned, from the power supply to motherboard and case integration to cooling considerations and more. ASUS has posted a great story that goes into a lot of detail on Avalon, and it's clear this is a project the team has been working on for some time.

The brainchild of Jonathan Chu, the Avalon concept takes a notebook-like approach to desktop design. The motherboard is designed in conjunction with the chassis to enable more seamless cooperation between the two.

system-open.jpg

The first example of changes in Avalon is something as simple as the front panel connectors on a case. Connecting them to your motherboard is basically the same today as it has ever been. But if you are the manufacturer or designer of both the chassis and the motherboard itself, it is trivial to have the buttons, lights, and even additional capabilities built into a specific location on the PCB that matches up with access points on the case.

io.jpg

Re-thinking the rear IO panel was another target: making it modular and connected to the system via PCI Express means you can swap connectivity options based on the user's needs. Multiple Gigabit NICs a requirement? Done. Maximum USB capability? Sure. Even better, by making the back panel IO a connected device, it can host storage and sound controllers on its own, allowing for improved audio solutions and flexible data configurations. 

psu.jpg

ASUS even worked in a prototype power supply that is based on the SFX form factor but uses a server-style edge connector, removing wires from the equation. It then becomes the motherboard's responsibility to distribute power to the other components, which again is easy to work through if you are designing these things in tandem. Installing or swapping a power supply becomes as simple as pulling out a drive tray.

This is all made possible by an internal structure that looks like this:

guts1.jpg

Rethinking how a motherboard is built, how it connects to the outside world and to other components, means that ASUS was able to adjust and change just about everything. The only area that remains the same is for the discrete graphics card. These tend to draw too much power to use any kind of edge connector (though the ASUS story linked above says they are working on a solution) and thus you see short run cables from a break out on the motherboard to the standard ROG graphics card.

system-graphics.jpg

The ASUS EdgeUp story has some more images and details and I would encourage you to check it out if you find this topic compelling; I know I do. There are no prices, no release dates, no plans for sampling yet. ASUS has built a prototype that is "right on the edge of what’s possible" and they are looking for feedback from the community to see what direction they should go next.

Will the DIY PC in 2020 be a completely different thing than we build today? It seems ASUS is asking the same question.

Source: ASUS EdgeUp
Manufacturer: NVIDIA

GP104 Strikes Again

It’s only been three weeks since NVIDIA unveiled the GeForce GTX 1080 and GTX 1070 graphics cards at a live streaming event in Austin, TX. But it feels like those two GPUs, one of which hasn't even been reviewed until today, have already drastically shifted the landscape of graphics, VR and PC gaming.

nvidia1.jpg

Half of the “new GPU” stories are told, with AMD due to follow up soon with Polaris, but it was clear to anyone watching the enthusiast segment with a hint of history that a line was drawn in the sand that day. There is THEN, and there is NOW. Today’s detailed review of the GeForce GTX 1070 completes NVIDIA’s first wave of NOW products, following closely behind the GeForce GTX 1080.

Interestingly, and in a move that is very uncharacteristic of NVIDIA, detailed specifications of the GeForce GTX 1070 were released on GeForce.com well before today’s reviews. With information on the CUDA core count, clock speeds, and memory bandwidth it was possible to get a solid sense of where the GTX 1070 performed; and I imagine that many of you already did the napkin math to figure that out. There is no more guessing though - reviews and testing are all done, and I think you'll find that the GTX 1070 is as exciting, if not more so, than the GTX 1080 due to the performance and pricing combination that it provides.

Let’s dive in.

Continue reading our review of the GeForce GTX 1070 8GB Founders Edition!

MSI Announces Four Custom GTX 1080s (Six SKUs)

Subject: Graphics Cards | May 29, 2016 - 05:46 PM |
Tagged: msi, GTX 1080, sea hawk, gaming x, armor, Aero, nvidia

Beyond the Founders Edition, MSI has prepared six SKUs of the GTX 1080. These consist of four variants, two of which have an overclocked counterpart to make up the remaining two products. The product stack seems quite interesting, with a steady progression of user needs, but we'll need to wait for price and availability to know for sure.

msi-2016-gtx1080-aero.png

We'll start at the bottom with the MSI GeForce GTX 1080 AERO 8G and MSI GeForce GTX 1080 AERO 8G OC. These are your typical blower designs that pull air in from within the case and exhaust it out the back after collecting a bunch of heat from the GPU. It will work, it should be one of the cheapest options for this card, and it will keep the GTX 1080's heat outside of the case. It has a little silver accent on it, too. The non-overclocked version runs the standard 1607 MHz / 1733 MHz that NVIDIA advertises, and the OC SKU is a little higher: 1632 MHz / 1771 MHz.

msi-2016-gtx1080-armor.png

Next up the product stack are the MSI GeForce GTX 1080 ARMOR 8G and MSI GeForce GTX 1080 ARMOR 8G OC versions. These use MSI's aftermarket, two-fan cooler, which should provide much lower temperatures than the AERO but exhausts heat back into the case. Personally? I don't really care about that. The only other thing that heats up in my case, to any concerning level at least, is my CPU, and I recently switched that to a closed-loop water cooler anyway. MSI added an extra, six-pin power connector to these cards (totaling 8-pin + 6-pin + slot power = up to 300W, versus the 225W of 8-pin + slot power). The non-overclocked version is NVIDIA's base 1607 MHz / 1733 MHz, but the OC brings that up to 1657 MHz / 1797 MHz.

msi-2016-gtx1080-seahawk.png

Speaking of closed-loop water coolers... The MSI GeForce GTX 1080 SEA HAWK takes the AERO design, which we mentioned earlier, and puts a Corsair self-contained water cooler inside it, too. Only one SKU of this is available, clocked at 1708 MHz base and 1847 MHz boost, but it should support overclocking fairly easily. That said, unlike other options that add a bonus six-pin connector, the SEA HAWK has just one, eight-pin connector. Good enough for the Founders Edition, but other SKUs (including three of the other cards in this post) suggest that there's a reason to up the power ceiling.

msi-2016-gtx1080-gamingx.png

We now get to MSI's top air-cooled SKU: the MSI GeForce GTX 1080 GAMING X 8G. This one has their new TWIN FROZR VI cooler, which they claim runs quieter because its fans move more air and can therefore spin slower than previous models. It, as you would assume from reading about the ARMOR 8G, has an extra, six-pin power connector to provide more overclocking headroom. It has three modes: Silent, which clocks the card to the standard 1607 MHz / 1733 MHz levels; Gaming, which significantly raises that to 1683 MHz / 1822 MHz; and OC, which bumps that slightly further to 1708 MHz / 1847 MHz.

Currently, there is no pricing or availability information for any of these.

Source: MSI

ASUS Announces ROG Strix GTX 1080

Subject: Graphics Cards | May 28, 2016 - 05:00 PM |
Tagged: asus, ROG, strix, GTX 1080, nvidia

The Founders Edition versions of the GTX 1080 went on sale yesterday, but we're beginning to see the third-party variants being announced. In this case, the ASUS ROG Strix is a three-fan design that uses their DirectCU III heatsink. More interestingly, ASUS decided to increase the amount of wattage that this card can accept by adding an extra, six-pin PCIe power connector (totaling 8-pin + 6-pin). A Founders Edition card only requires a single, eight-pin connection in addition to the 75W provided by the PCIe slot itself. The extra connector gives the ROG Strix card another 75W of play room, raising the maximum power from 225W to 300W.
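As a sanity check on that math, the connector budgets rated by the PCIe specs (75W from the slot, 75W per six-pin, 150W per eight-pin) can be tallied with a quick sketch; the function name and structure here are just illustrative:

```python
# Spec-rated PCIe power sources, in watts (slot plus auxiliary connectors).
POWER_SOURCES_W = {"slot": 75, "6-pin": 75, "8-pin": 150}

def max_board_power(*aux_connectors: str) -> int:
    """Slot power plus the rated power of each auxiliary connector."""
    return POWER_SOURCES_W["slot"] + sum(POWER_SOURCES_W[c] for c in aux_connectors)

print(max_board_power("8-pin"))           # Founders Edition: 225
print(max_board_power("8-pin", "6-pin"))  # ROG Strix: 300
```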

asus-2016-1080strix-lighting.png

Some of this power will be used for its on-card, RGB LED lighting, but I doubt that it was the reason for the extra 75W of headroom. The lights follow the edges of the card, acting like hats and bow-ties to the three fans. (Yes, you will never unsee that now.) The shroud is also modular, and ASUS provides the data for enthusiasts to 3D print their own modifications (albeit their warranty doesn't cover damage caused by this level of customization).

asus-2016-1080strix-explode.png

As for the actual performance, the card naturally comes with an overclock out of the box. The default “Gaming Mode” has a 1759 MHz base clock with an 1898 MHz boost. You can flip this into “OC Mode” for a slight, two-digit increase to 1784 MHz base and 1936 MHz boost. Either mode is significantly higher than the Founders Edition, which has a base clock of 1607 MHz that boosts to 1733 MHz. The extra power will likely help manual overclocks, but it will come down to the “silicon lottery” whether your specific chip is less affected by manufacturing variation. We also don't know yet whether the Pascal architecture, and the 16nm process it relies upon, has any physical limits that increasingly resist overclocks past a certain frequency.

Pricing and availability have not yet been announced.

Source: ASUS

Teaser - GTX 1080s Tested in SLI - EVGA SC ACX 3.0

Subject: Graphics Cards | May 27, 2016 - 02:58 PM |
Tagged: sli, review, led, HB, gtx, evga, Bridge, ACX 3.0, 3dmark, 1080

...so the time when we manage to get multiple GTX 1080s in the office here would, of course, be when Ryan is on the other side of the planet. We are also missing some other semi-required items, like the new 'SLI HB' bridge, but we should be able to test on an older LED bridge at 2560x1440 (below the resolution where the newer style is absolutely necessary to avoid a sub-optimal experience). That said, surely the storage guy can squeeze out a quick run of 3DMark to check out the SLI scaling, right?

config.png

For this testing, I spent just a few minutes with EVGA's OC Scanner to take advantage of GPU Boost 3.0. I cranked the power limits and fans on both cards, ending up at a stable overclock hovering right around 2 GHz on the pair. I'm leaving out the details of the second GPU we got in for testing, as it may be under NDA and I can't confirm that since all of the people to ask are in an opposite time zone (pfft - it has an aftermarket cooler). Then I simply ran Firestrike (25x14) with SLI disabled:

3dmark-single.png

...and then with it enabled:

3dmark-sli.png

That works out to a 92% gain in 3DMark score, with the FPS figures jumping by almost exactly 2x. Now remember, this is by no means a controlled test, and the boss will be cranking out a much more detailed piece with Frame Rating results galore in the future, but for now I just wanted to get some quick figures out to the masses for consumption, and confirmation that 1080 SLI is a doable thing, even on an older bridge.
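For anyone following along with their own runs, the scaling math is just the ratio of the two scores; a minimal sketch (the scores below are hypothetical placeholders chosen to illustrate a ~92% gain, since the actual 3DMark numbers are in the screenshots):

```python
def scaling_gain_percent(single_score: float, sli_score: float) -> float:
    """Percent improvement of the SLI score over the single-GPU score."""
    return (sli_score / single_score - 1.0) * 100.0

# Hypothetical scores, rounded to the nearest whole percent.
print(round(scaling_gain_percent(10000, 19200)))  # 92
```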

*edit* here's another teaser:

heaven-oc-.png

Aftermarket coolers are a good thing, as evidenced by the 47°C of that second GPU, but the Founders Edition blower-style cooler is still able to get past 2 GHz just fine. Both cards had their fans at max speed in this example.

*edit again*

I was able to confirm we are not under NDA on the additional card we received. Behold:

IMG_1692.jpg

IMG_1698.jpg

This is the EVGA Superclocked edition with their ACX 3.0 cooler.

More to follow (yes, again)!

Manufacturer: NVIDIA

First, Some Background

 
TL;DR:
NVIDIA's Rumored GP102
 
Based on two rumors, NVIDIA seems to be planning a new GPU, called GP102, that sits between GP100 and GP104. This would change how their product stack has flowed since Fermi. GP102's performance, in both single precision and double precision, will likely signal NVIDIA's product plans going forward.
  • GP100's ideal 1 : 2 : 4 FP64 : FP32 : FP16 ratio is inefficient for gaming
  • GP102 either extends GP104's gaming lead or bridges GP104 and GP100
  • If GP102 is a bigger GP104, the future is unclear for smaller GPGPU devs
    • That is, unless GP100 can be significantly up-clocked for gaming.
  • If GP102 matches (or outperforms) GP100 in gaming, and has better than 1 : 32 double-precision performance, then GP100 would be the first time that NVIDIA designed an enterprise-only, high-end GPU.
 

 

When GP100 was announced, Josh and I discussed internally how it would make sense for the gaming industry. Recently, an article on WCCFTech cited anonymous sources, which should always be taken with a dash of salt, claiming NVIDIA was planning a second chip, GP102, that slots between GP104 and GP100. As I was writing this editorial relating that rumor to our own speculation about the physics of Pascal, VideoCardz claimed to have been contacted by the developers of AIDA64, seemingly on the record, also citing a GP102 design.

I will retell chunks of the rumor, but also add my opinion to it.

nvidia-titan-black-1.jpg

In the last few generations, each architecture had a flagship chip that was released in both gaming and professional SKUs. Neither audience had access to a chip that was larger than the other's largest of that generation. Clock rates and disabled portions varied by specific product, with gaming usually getting the more aggressive performance for slightly better benchmarks. Fermi had GF100/GF110, Kepler had GK110/GK210, and Maxwell had GM200. Each of these was available in Tesla, Quadro, and GeForce cards, especially Titans.

Maxwell was interesting, though. NVIDIA was unable to leave 28nm, the node Kepler launched on, so they created a second architecture for it. To increase performance without access to greater transistor density, you need to make your designs bigger, more optimized, or simpler. GM200 was giant and optimized but, to reach the performance levels it achieved, it also needed to be simpler. Something had to go, and double-precision (FP64) performance was the big omission. NVIDIA was upfront about it at the Titan X launch and told their GPU compute customers to keep purchasing Kepler if they valued FP64.

Fast-forward to Pascal.

AMD Releases Radeon Software Crimson Edition 16.5.3

Subject: Graphics Cards | May 24, 2016 - 09:46 PM |
Tagged: vulkan, radeon, overwatch, graphics driver, Crimson Edition 16.5.3, crimson, amd

AMD has released new drivers for Overwatch (and more) with Radeon Software Crimson Edition 16.5.3.

amd-2015-crimson-logo.png

"Radeon Software Crimson Edition is AMD's revolutionary new graphics software that delivers redesigned functionality, supercharged graphics performance, remarkable new features, and innovation that redefines the overall user experience. Every Radeon Software release strives to deliver new features, better performance and stability improvements."

AMD lists these highlights for Radeon Software Crimson Edition 16.5.3:

Support for:

  • Total War: Warhammer
  • Overwatch
  • Dota 2 (with Vulkan API)

New AMD Crossfire profile available for:

  • Total War: Warhammer
  • Overwatch

The driver is available from AMD from the following direct links:

The full release notes with fixed/known issues are available at the source link here.

Source: AMD

NVIDIA Releases 368.22 Drivers for Overwatch

Subject: Graphics Cards | May 24, 2016 - 06:36 PM |
Tagged: nvidia, graphics drivers

Yesterday, NVIDIA released WHQL-certified drivers to align with the release of Overwatch. This version, 368.22, is the first public release of the 367 branch. Pascal is not listed in the documentation as a supported product, so it's unclear whether this will be the launch driver for it. The GTX 1080 comes out on Friday, but two drivers in a week would not be unprecedented for NVIDIA.

While NVIDIA has not communicated this too well, 368.22 will not install on Windows Vista. If you are still using that operating system, then you will not be able to upgrade your graphics drivers past 365.19. 367-branch (and later) drivers will require Windows 7 and up.

nvidia-geforce.png

Before I continue, I should note that I've experienced some issues getting these drivers to install through GeForce Experience. Long story short, it took two attempts (with a clean install each time) to end up with a successful boot into 368.22. I didn't try the standalone installer that you can download from NVIDIA's website; if the second attempt through GeForce Experience had failed, then I would have. That said, after I installed it, it seemed to work well for me with my GTX 670.

While NVIDIA is a bit behind on documentation, the driver also rolls in other fixes. There were some GPU compute developers who had crashes and other failures in certain OpenCL and CUDA applications, which are now compatible with 368.22. I've also noticed that my taskbar hasn't been sliding around on its own anymore, but I've only been using the driver for a handful of hours.

You can get GeForce 368.22 drivers from GeForce Experience, but you might want to download the standalone installer (or skip a version or two if everything works fine).

Source: NVIDIA

AMD Gains Market Share in Q1'16 Discrete GPUs

Subject: Graphics Cards | May 18, 2016 - 06:11 PM |
Tagged: amd, radeon, market share

AMD sent out a note yesterday with some interesting news about how the graphics card market fared in Q1 of 2016. First, let's get to the bad news: sales of new discrete graphics solutions, in both mobile and desktop, dropped by 10.2% quarter to quarter, a decrease that was slightly higher than expected. Though details weren't given in the announcement or data I have from Mercury Research, it seems likely that expectations of upcoming new GPUs from both NVIDIA and AMD contributed to the slowdown of sales on some level.

Despite the shrinking pie, AMD grabbed more of it in Q1 2016 than it had in Q4 of 2015, gaining 3.2% of total market share for a total of 29.4%. That's a nice gain in a short few months, but it's still much lower than Radeon has been as recently as 2013. That 3.2% gain includes both notebook and desktop discrete GPUs, but let's break it down further.

                    Q1'16 Desktop   Desktop Change   Q1'16 Mobile   Mobile Change
  AMD                   22.7%           +1.8%           38.7%          +7.3%
  NVIDIA (assumed)      ~77%            -1.8%           ~61%           -7.3%

AMD's gain in the desktop graphics card market was 1.8%, up to 22.7% of the market, while the notebook discrete graphics share jumped an astounding 7.3% to 38.7% of the total market.
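Treating discrete graphics as an effectively two-vendor market (the assumption behind the "NVIDIA (assumed)" figures), the remainder math is trivial to sketch; the function name here is just illustrative:

```python
def rival_share(amd_share_percent: float) -> float:
    """Assuming a two-vendor discrete GPU market, the remainder goes to NVIDIA."""
    return round(100.0 - amd_share_percent, 1)

print(rival_share(22.7))  # desktop: 77.3
print(rival_share(38.7))  # mobile: 61.3
```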

NVIDIA obviously still has a commanding lead in desktop add-in cards with more than 75% of the market, but Mercury Research believes that a renewed focus on driver development, virtual reality, and the creation of the Radeon Technologies Group contributed to the increases in share for AMD.

Q3 of 2016 is where I think the future looks most interesting. Not only will NVIDIA's newly released GeForce GTX 1080 and upcoming GTX 1070 have had time to settle in, but the upcoming Polaris architecture based cards from AMD will have a chance to stretch their legs and attempt to keep pushing the needle upward.