FSP's new Hydro X series: technically the less expensive model?

Subject: Cases and Cooling | June 6, 2016 - 02:55 PM |
Tagged: fsp, PSU, Hydro X, 650W, 80 Plus Gold

With all the hoopla and brouhaha caused by Computex last week, some smaller launches were missed, such as the FSP Hydro X 650W PSU. This particular PSU is non-modular but carries a five year warranty, an 80 Plus Gold rating, and a single 12V rail capable of providing 649.92W @ 54.16A. [H]ard|OCP's testing showed it to be a solid PSU, providing stable power and meeting the claimed standards. Unfortunately there is currently a bit of a pricing problem, though FSP is working to resolve it: this PSU sells for $95, while the previous, fully modular model can be picked up for $85 or less, even though its MSRP is technically higher. [H] reached out to FSP about this, and you can see how the company plans to resolve it in the full review.
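
That oddly specific wattage is simply the rail's current limit multiplied by its voltage:

$$ P = V \times I = 12\,\mathrm{V} \times 54.16\,\mathrm{A} = 649.92\,\mathrm{W} $$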

1463006474rCJn1f6Y40_2_9_l.jpg

"FSP does not have much to say about its Hydro X in terms of marketing speak, but it does hit the high points that enthusiast system builders are looking for: "Silent operation, High efficiency ≧ 90%, Full Japan-made electrolytic capacitors, Powerful single +12V rail design, Ribbon cables, and Complete protection: OCP, OVP, SCP, OPP, OTP."

Source: [H]ard|OCP

What did we just tell you about bloatware?! Now ASUS Live Update is the risk of the day

Subject: General Tech | June 6, 2016 - 02:26 PM |
Tagged: asus, bloatware, security

After last week, when several laptop OEMs (including Lenovo, once again) were caught installing highly insecure bloatware on their laptops, you might hope that this week would be different. Sadly, you would be mistaken, as once again software preinstalled on laptops is in the news. In this case it is ASUS Live Update, which transmits requests for updates in plain text and does not check the software updates that come back for authenticity. This of course leaves you wide open to man-in-the-middle attacks, where someone posing as those update servers could feed you whatever installation files they desired. As the pull quote from The Inquirer below states, removing it immediately would be a very good idea.
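
For context, the two missing safeguards are simple ones. Nothing below is ASUS's code; it's a minimal sketch, with a made-up URL and digest, of what any updater should do before running something it downloaded: fetch over an authenticated channel, then verify the payload against a known-good hash.

```python
import hashlib
import urllib.request

# Hypothetical values, for illustration only.
UPDATE_URL = "https://updates.example.com/liveupdate/patch-1.2.exe"
EXPECTED_SHA256 = "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08"

def fetch_update_securely(url: str, expected_sha256: str) -> bytes:
    # 1) HTTPS authenticates the server and encrypts the transfer --
    #    unlike the plain-text requests ASUS Live Update reportedly makes.
    with urllib.request.urlopen(url) as response:
        payload = response.read()

    # 2) Check the payload against a digest obtained out-of-band (shipped
    #    with the updater, or vendor-signed), so a man-in-the-middle
    #    can't substitute their own installer.
    digest = hashlib.sha256(payload).hexdigest()
    if digest != expected_sha256:
        raise ValueError("update rejected: digest mismatch: " + digest)
    return payload
```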

a6e6087353a6c593afc6327b758650a6.jpg

"My advice to anyone who purchased an Asus device: remove LiveUpdate. It's really that simple. If you're an IT administrator, find devices making periodic calls to Asus's domains and blackhole them, get the user to come and see you,"

Source: The Inquirer

SilverStone Announces SX800-LTI 800W Titanium SFX-L Power Supply

Subject: Cases and Cooling | June 6, 2016 - 08:03 AM |
Tagged: SX800-LTI, small form-factor, Silverstone, SFX-L, SFX, SFF, PSU, power supply, computex 2016, computex, 80 Plus Titanium

SilverStone introduced a 700W SFX-L power supply at CES in January, and with that SX700-LPT PSU now officially released, the company has raised the bar again: no less than 800W is coming to the SFX-L form-factor.

SX800LTI_0.jpg

Image credit: TechPowerUp

SilverStone's SX800-LTI not only offers a massive 800W, but does so with an 80 PLUS Titanium certification (!). The power supply pushes all of that wattage along a single +12V rail, and the SX800-LTI features a fully-modular design for a clean build. An added benefit of the SFX-L form-factor, other than the potential for these powerful designs, is the use of a 120 mm fan, which allows for much quieter operation under load compared to the smaller SFX variant.

SX800LTI_1.jpg

Image credit: TechPowerUp

We are now approaching full ATX power with these SFX-L PSUs, and indeed the 800-850W range should be all most users would need for even a dual-GPU system (especially as we enter the era of 14-16nm GPUs with their lower power requirements).

No word yet on price or availability.

Source: TechPowerUp

Linux Gaming Is Growing on Us?

Subject: General Tech | June 6, 2016 - 07:46 AM |
Tagged: steam, pc gaming, linux

According to Phoronix, gaming on Linux has experienced exponential growth in recent times. Over the course of the last two years, Steam's catalog on the platform expanded from 500 games to over 2,200, a little over a 4.4x increase in two years. If I'm doing my high-school math correctly, and I seriously hope I am, that corresponds to an average increase of just under 2.1x year-over-year.

In other words, this is literally the trend, minus half-life. Snicker snicker snicker.
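
For the skeptical, here's the back-of-the-envelope math as a quick sanity check (Phoronix's figures, my arithmetic):

```python
# Steam-on-Linux catalog growth, per the Phoronix figures.
start, end, years = 500, 2200, 2

total_growth = end / start                    # 4.4x over two years
yearly_growth = total_growth ** (1 / years)   # geometric mean per year

print(f"{total_growth:.1f}x total, {yearly_growth:.3f}x year-over-year")
# -> 4.4x total, 2.098x year-over-year
```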

steam-os.png

The quantity of Linux's games catalog is a very different argument from its quality, of course. Still, you can find many interesting titles there. Valve has been porting their catalog to the OS, and other high-end titles have arrived as well, like Tomb Raider, Trine, Civilization V, Civilization: Beyond Earth, XCOM, and a couple of Borderlands titles. If you're interested in specifics, and you enjoy a sense of humor like you would see on our PC Perspective Podcast, check out LinuxGameCast for their reviews of specific titles.

Source: Phoronix

HSA 1.1 Released

Subject: Graphics Cards, Processors, Mobile | June 6, 2016 - 07:11 AM |
Tagged: hsa 1.1, hsa

The HSA Foundation released version 1.1 of their specification, which focuses on “multi-vendor” compatibility. In this case, multi-vendor doesn't refer to companies that refused to join the HSA Foundation, namely Intel and NVIDIA, but rather to multiple types of vendors. Rather than aligning with AMD's focus on CPU-GPU interactions, HSA 1.1 includes digital signal processors (DSPs), field-programmable gate arrays (FPGAs), and other accelerators. I can see this being useful in several places, especially on mobile, where cameras, sound processors, CPU cores, and a GPU regularly share video buffers.

HSA Foundation_Logo.png

That said, the specification also mentions “more efficient interoperation with non-HSA compliant devices”. I'm not quite sure what that specifically refers to, but it could be important to keep an eye on for future details -- whether it is relevant for Intel and NVIDIA hardware (and so forth).

Charlie, down at SemiAccurate, notes that HSA 1.1 will run on all HSA 1.0-compliant hardware. This makes sense, but I can't see where it is explicitly stated in the press release. I'm guessing that Charlie was given some time on a conference call (or face-to-face) regarding this, but it's also possible that he is mistaken. Or it is explicitly mentioned in the HSA Foundation's press blast and I just fail at reading comprehension.

If so, I'm sure that our comments will highlight my error.

Computex 2016: Corsair Announces Neutron XTI SSDs

Subject: Storage | June 6, 2016 - 03:40 AM |
Tagged: ssd, corsair, neutron, neutron xti, Neutron XT

Corsair announced a new line of SSDs at Computex. We didn't have boots on the ground there this year, and it's not yet on Corsair's website, so we needed to go with Tom's Hardware's coverage of the product. The Corsair Neutron XTI uses Toshiba's 15nm MLC flash and the Phison S10 controller “with expanded cache”. This added cache addresses some “performance consistency” issues that Corsair identified, though they didn't seem to elaborate on what those are. It is rated at up to 100,000 IOPS read and 90,000 IOPS write, but those figures obviously need to be tested to see when, how, and how often the drive actually hits them.

corsair-2016-neutron-xti-toms.jpg

Image Credit: Tom's Hardware

Speaking of tested Corsair Neutron SSDs, Allyn reviewed the previous model, the Corsair Neutron XT, all the way back in November 2014. He was impressed with the drive at the time, although, while it was super fast at low queue depths (~1-4 items), it slowed down above that. Since then, he has been developing some interesting testing methods to figure out whether slowdowns could be caused by individual hitches that get lost in benchmarks that aggregate results and implicitly average them out. He didn't have those methods back then, though, so it's unclear whether the queue depth issue was a symptom of a latency problem, and whether the “expanded cache” will help with that.
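
Allyn's actual methodology is far more sophisticated than this, but the core idea, that aggregated averages can bury individual hitches, is easy to demonstrate with made-up numbers:

```python
# Illustrative latencies only, not measured data: 1,000 fast I/Os
# at 50 us each, plus 20 hitches that each stall for 20 ms.
latencies_us = [50] * 1000 + [20000] * 20

mean = sum(latencies_us) / len(latencies_us)
p99 = sorted(latencies_us)[int(len(latencies_us) * 0.99)]

print(f"mean: {mean:.0f} us")  # ~441 us -- looks fine in a summary chart
print(f"p99:  {p99} us")       # 20000 us -- the hitches only show up here
```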

We'll see when it's launched. It will be available in 240, 480, and 960 GB varieties.

A Potentially More Harmful Coil Whine Issue

Subject: General Tech | June 5, 2016 - 02:18 PM |
Tagged: security, Cyber Security, coil whine

As new hardware launches, many readers ask whether it produces any noticeable coil whine. For instance, this is an issue for graphics cards that are outputting a very high frame rate; the electronics create sound as the oscillating current flows through them. It can be an issue for motherboards and power supplies as well. You can check out this fairly old video from LinusTechTips for a demonstration.

acm-2016-mic.jpg

Image Credit: ACM

It turns out that, because this whine is related to the signal flowing through the oscillating circuit, security researchers are looking into the kinds of information that can be inferred from it. In particular, the Association for Computing Machinery (ACM) published a paper called Physical Key Extraction Attacks on PCs. It discusses several methods of attacking a device, such as reading minor fluctuations in its grounding plug or monitoring induced radiation with an antenna. Its headlining method, though, is “Acoustic”: listening to the coil whine a computer produces while it decrypts RSA messages sent to it, in order to recover the RSA secret key.
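
The paper's signal processing is well beyond the scope of a news post, but the root cause is easy to sketch: textbook RSA decryption does a different amount of work depending on each bit of the secret exponent, so anything that leaks "how much work is happening right now" (sound, power draw, RF) leaks information about the key. This toy modular exponentiation (not the paper's code; real libraries use constant-time implementations to avoid exactly this) shows the data-dependent branch:

```python
def modexp_square_and_multiply(base, exponent, modulus):
    """Naive left-to-right square-and-multiply modular exponentiation."""
    result = 1
    for bit in bin(exponent)[2:]:               # walk exponent bits, MSB first
        result = (result * result) % modulus    # every bit: square
        if bit == "1":
            result = (result * base) % modulus  # 1-bits only: extra multiply
    return result

# The extra multiply on 1-bits changes the chip's workload -- and its
# coil whine -- in a key-dependent pattern. Sanity check vs. built-in pow():
assert modexp_square_and_multiply(42, 0b101101, 1009) == pow(42, 0b101101, 1009)
```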

While they have successfully demonstrated the attack using a parabolic microphone at 33 ft away, and again using a mobile phone at 1 ft away, the news should be taken with a grain of salt. Mostly, it's just interesting to realize that there's nothing really special about a computer. All it does is store and process data on whatever physical state we have available in the world. Currently, that's almost always electrical charge flowing through semiconductors. Whatever we use will have consequences. For instance, as transistors get smaller, to push more complex signals through a given surface area and power budget, we'll eventually run out of atoms.

This is just another, often forgotten side-effect: electric signals induce the transfer of energy. It could be electromagnetic, acoustic, or even thermal. In the realm of security, this could, itself, carry some of the data that we attached to our world's state, and allow others to access it (or sometimes modify it) without our knowledge or consent.

DigitalFoundry Records The Witcher 3: Blood and Wine at 4K

Subject: General Tech | June 5, 2016 - 03:44 AM |
Tagged: pc gaming, The Witcher 3

The Witcher 3 is one of the best looking games available, and its final DLC, Blood and Wine, is intended to raise that graphical bar slightly. Near the base game's initial launch, in early 2015, there was a bit of controversy surrounding its image quality and how it had sort-of rolled back from earlier footage. Righting that issue was apparently one of the design goals for this final DLC, leaving users with fonder memories of the title before CD Projekt Red moves on to newer projects. Granted, the memories weren't all that bad to begin with, but it was nice to address regardless.

cdprojekt-2016-witcher3-lastdlc.jpg

As you can see, this environment is bright, vibrant, and heavily saturated with color. The medieval city is alive with colored cobblestone, flowers, banners, and buildings all under a bright, blue sky. There was quite a bit of texture pop-in that I saw, even at 1080p, but it wasn't too distracting. This, again, is supposed to be the last time that CD Projekt adds substantial content to The Witcher franchise for the foreseeable future, but I hope that the mod community will keep the title alive.

Computex 2016: Gigabyte Grants Sneak Peek at GPU Dock

Subject: Graphics Cards, Mobile | June 5, 2016 - 02:02 AM |
Tagged: gigabyte, external gpu

External GPUs can be a good idea. If it is affordable, easy, and not too big, users can augment their laptop CPU, which is probably good enough to at least run most tasks, with a high-end GPU. While GPUs are more efficient than CPUs, the tasks they are expected to do are so much larger that a decent graphics chip is difficult to cram into a laptop form factor... for the most part.

gigabyte-2016-externalgpu-toms.jpg

Image Credit: Tom's Hardware

Preamble aside, the external GPU has been tried and dropped numerous times over the last decade, but the latest generation seems to be getting a little traction. Razer added the feature to their relatively popular Blade line of laptops, and AMD, one of the companies to try it several years ago, is pushing it now with their XConnect technology. Even Microsoft sort-of does this with their Surface Book, and it's been a small source of problems for them.

Now Gigabyte, at Computex, announced that they are investigating prototypes. According to Tom's Hardware, their current attempt stands upright, which is likely to take up less desk space. Looking at it, I could see it hiding in the space between my monitors and the corner of the room (because my desk slides into the corner). Of course, in my case, I have a desktop PC, so I'm not the target demographic, but who knows? It's possible that a laptop user might have a similar setup to me. It's still pretty big, though.

Currently, Gigabyte limits the power supply to 250W, which drops GPU support to under 175W TDP. In other words? Too small for a GeForce GTX 1080. The company did tell Tom's Hardware that they are considering upping that to 350W, which would allow around 260W of GPU load -- enough for any graphics card with a single 8-pin PCIe connector, and thus many (but not all) GTX 1080s.
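
For reference, that connector logic lines up with the standard PCIe power budget (75 W from the slot, 75 W from a six-pin, and 150 W from an eight-pin connector), so a card with a single eight-pin connector can draw at most

$$ 75\,\mathrm{W}_{\text{slot}} + 150\,\mathrm{W}_{\text{8-pin}} = 225\,\mathrm{W}, $$

comfortably inside the proposed 260 W ceiling. This is a back-of-the-envelope reading, not Gigabyte's published math.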

No pricing or availability yet, of course. It's just a prototype.

Antec Makes Large Coolers. No-one Tell Morry.

Subject: Cases and Cooling | June 4, 2016 - 11:32 PM |
Tagged: fanless, cpu cooler, antec

The folks down at FanlessTech found a giant heatsink that Antec showed off at Computex. It consists of three large stacks of aluminum, weighing about 3 lbs, with up to four fans moving air slowly across it. The original post doesn't mention whether it could be used in fanless mode, but come on. It should be able to cool something without a fan directly attached to it.

antec-2016-largecooler-gdm-or-jp.jpg

Image Credit: GDM.or.jp

They don't seem to have a price, availability, or even a model number yet, so details are scarce. It will come in at least three colors (black, gold, and red), though, so you have a choice about how it will look in your case. Well, at least mostly in your case.

Update (June 5th @ 2:20pm): Turns out that I forgot to add the dimensions and specifications of this cooler. Its total size is 165mm x 142mm x 159mm. Its intended fans spin at 800-1800 RPM. At 800 RPM, they push 12.36 cubic feet per minute of air at 17.5 dBA; at 1800 RPM, they push 65.23 CFM at 25.9 dBA.

Adobe XD Will Apparently Be a UWP Application

Subject: General Tech | June 4, 2016 - 10:55 PM |
Tagged: windows 10, uwp, Adobe

So a company that refuses to port its applications to Linux is experimenting with UWP for future products. Adobe's Experience Design (XD) CC is going to arrive on Windows later this year, and a representative from Adobe claimed on Twitter that it will use Microsoft's UWP platform. Granted, we're not talking about something like Photoshop or After Effects, but rather a UX mock-up tool, sort-of along the lines of Pencil Project.

It's unclear whether UWP will be a choice.

adobe-2016-xd.jpg

The logo looks like it's laughing at us with its tongue out.

I still find UWP concerning, as Microsoft, while responding to some feedback, still has key restrictions in play that limit free sharing. Until it becomes technically (or legally) infeasible for Microsoft to lock down the platform, there will always be the concern that they could, for instance, revoke people's ability to develop software, or remove (or prevent the installation of) existing software. Even if they don't want to do it themselves, someone with authority over them may compel it, such as a government that is against encryption.

If you build it, someone will abuse it. The only thing preventing Microsoft from realizing their Windows RT vision, if they still choose to, is the popularity of Win32 applications and how incompatible they are with that framework. We, as a society, want them to remain popular enough that Microsoft cannot afford to abandon them. They want to. They hate the stigma that Windows is where viruses are. That's reasonable, but they're not just throwing out the bathwater.

As an aside: they also want a platform that is less reliant upon x86, and could be recompiled for other hardware if Intel doesn't go where Microsoft wants to be. This is kind-of ironic if you think about it.

Source: WinBeta

LWJGL 3.0.0 Released

Subject: General Tech | June 4, 2016 - 09:29 PM |
Tagged: Java, lwjgl, vulkan

Don't be confused by the date on the LWJGL post -- its release date was June 3rd, as mentioned later in the thread, not February 27th. It looks like they disabled edit timestamps. Regardless, Lightweight Java Game Library (LWJGL) 3.0.0 has just been released. It is a library that binds Java code to APIs that are normally not directly accessible from that platform.

To be clear: LWJGL is not a library like, say, Qt, which simplifies common tasks into classes. Its goal is to connect you to whatever API you need, and otherwise leave you alone. Unless you're the type who wants full control over everything, or you're actually making a framework yourself, you will want to use existing frameworks, engines, and/or middleware for your projects. The advantage, of course, is that these frameworks, engines, and middleware now have access to newer APIs, and can justify deprecating old features.

java-logo.png

This release adds Vulkan support, which will provide a high-performance (and high-efficiency) base to build many other graphics and GPU compute abstractions on. As an industry, we are still working out DirectX 12 and Vulkan, but their mechanism is theoretically better, especially with multiple threads (and multiple graphics devices). They basically add a graphics layer to a GPU compute-style API, basing everything on lists of commands that start and end wherever the host code desires.

While Java has been taking a massive hit in public opinion lately, it is still a good platform for some applications. Gaming seems to be having a resurgence of native APIs, especially with “AAA” engines becoming available to the general public, but more frameworks aren't a bad thing.

Source: LWJGL

Computex 2016: Wooting One Analog Keyboard

Subject: Cases and Cooling, Systems | June 4, 2016 - 05:39 PM |
Tagged: gaming keyboard

Wooting, a start-up that is currently running an already-funded Kickstarter, is looking to produce a keyboard with analog inputs. This is not an entirely new concept. Ben Heck created one back in 2012 by modifying the WASD cluster to include Hall effect sensors, which were attached to the guts of an Xbox 360 controller to signal thumbstick offsets. The further you pressed a key, the more intense the input sent to the PC.

wooting-2016-wooting-one-premium-cropped.png

The Wooting One, which, again, is a Kickstarter campaign, does it a bit more... professionally. The keyboard uses the “Flaretech” switch, which I've never heard of before now, from Taiwanese manufacturer Adomax. Unlike Ben Heck's Hall effect sensors, this one measures offset optically. This raises a petty, pedantic argument about whether it's technically a mechanical keyboard, since the actuation isn't performed by a direct, mechanical process, but users typically equate “mechanical keyboard” with a level of quality and feel, which can be achieved by non-mechanical processes. Semantics aside, the light-sensing mechanism allows precise measurement of how far down the key is. From there, it's just a matter of mapping that distance to an input.
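
Conceptually, that last step is simple. Here's a minimal sketch of such a mapping; the travel and deadzone numbers are my own illustrative assumptions, not Wooting's specifications:

```python
def travel_to_axis(depth_mm, total_travel_mm=4.0, deadzone_mm=0.3):
    """Map a measured key depth to a normalized analog axis value (0.0-1.0).

    Assumes ~4 mm of total travel plus a small deadzone so resting
    fingers don't register -- illustrative numbers, not Wooting specs.
    """
    if depth_mm <= deadzone_mm:
        return 0.0
    # Rescale the usable range (deadzone..full travel) to 0.0..1.0.
    value = (depth_mm - deadzone_mm) / (total_travel_mm - deadzone_mm)
    return min(max(value, 0.0), 1.0)

# A half-pressed key yields roughly half throttle:
print(travel_to_axis(2.0))  # ~0.46
```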

wooting-2016-wooting-one-driver.png

This is where the Wooting One looks quite interesting. The firmware and driver will communicate over XInput and apparently other gamepad APIs, functioning in most games that allow simultaneous gamepad + keyboard input for a single player. They are also expecting to create an open-source system, with an API, that allows games to access the analog input of apparently every key on the board. This is interesting, because XInput is fairly restrictive, with about six axes of analog input (and the two axes corresponding to the triggers are lower precision and, with the Xbox One controller, joined into a single axis). A new API could circumvent all of this for gaming going forward, and one will be required for analog keyboards to get off the ground. It's not a difficult task in itself, as there is quite a bit of bandwidth in external IO connections these days, but getting an entire industry's worth of vendors to agree could be a task (unless you're, like, Microsoft). Hopefully it's open, with a permissive license, and a few big-name engine vendors add support to push it forward.

And, let's be honest -- XInput is limiting. A new API could be good for obscure gamepads, too.

wooting-2016-wooting-one-switch-inside.png

Outside of analog gaming, they are also milking this “know how far down the key is” feature as much as they can. For instance, they are allowing users to choose the actuation distance in digital mode. Users can set their own balance between rejecting partial presses and speed of input, based on their ability to touch type.

It's a European Kickstarter, and the lowest backer tier that includes the keyboard ships in November and costs 100 Euro (~$115 USD), which apparently includes tax and shipping for North America and Europe. That doesn't correlate to a retail price, if the product even gets off the ground, but it's a data point, however reliable. Tax-in and free shipping sounds a bit... sketchy for a crowdfunding campaign, but that could just be a sign that they're more affiliated with an existing company (and its supply chain) than they're letting on, rather than business naivety.

Rumor: GP104 Might Arrive in Laptops... Soon.

Subject: Graphics Cards, Mobile | June 4, 2016 - 04:28 PM |
Tagged: nvidia, GTX 1080, gtx 1070, pascal

Normally, when a GPU developer creates a laptop SKU, they re-use the desktop branding, add an M at the end, but release a very different, significantly slower part. This changed with the GTX 980, as NVIDIA cherry-picked the heck out of their production to find chips that could operate at full speed with a lower-than-usual TDP. With less power (and cooling) to consider, these were sent to laptop manufacturers and integrated into high-end designs.

nvidia-2016-980-not-m-laptop.jpg

They still had the lower-performance 980M, though, which was confusing for potential customers. You needed to know to avoid the M, and to trust the product page to correctly include the M where applicable. This is where PCGamer's scoop comes into play. Apparently, NVIDIA will stop “producing separate M versions of its desktop GPUs”. Also, they are expected to release their 10-series desktop GPUs to their laptop partners by late summer.

Whatttttt?

Last time, NVIDIA took almost a year to bin enough GPUs for laptops. While we don't know how long they've been stockpiling GP104 GPUs, this, if the rumors are true, would be only about three months of lead time for the desktop SKUs. Granted, Pascal is significantly more efficient than Maxwell: Maxwell tried to squeeze extra performance out of an existing fabrication node, while Pascal is a relatively small chip that benefits from the industry's double-shrink in process technology. It's possible that NVIDIA didn't need to drop the TDP threshold that far below what they accept for desktops.

For us desktop users, this also suggests that NVIDIA is not having too many issues with yields in general. I mean, if they were expecting GPU shortages to persist for months, you wouldn't expect them to cut their supply further with a new product segment, particularly one that should require both decent volume and well-binned chips. This, again, might mean that we'll see desktop GPUs restock soon. Either that, or NVIDIA significantly miscalculated demand for new GPUs and needs to fulfill partner obligations made before reality struck.

Call it wishful thinking, but I don't think it's the latter.

Source: PCGamer

The Wit.nes Runs in an NES Emulator

Subject: General Tech | June 4, 2016 - 03:24 PM |
Tagged: nes, the witness, the wit.nes, pc gaming

The Witness, from Thekla Inc. and Jonathan Blow, caught the attention of a few of us at PC Perspective... mostly Allyn. Anywho, it's set on an island that you explore, solving puzzles along the way. I'm not talking about puzzles in the “Space Quest”, point-and-click adventure sense, but, like, puzzles that you would expect to find in a newspaper, which unlock doors and turn on machinery when solved.

thewitness-2016-fan-NES-demake.png

If that sort of game is for you, then you might want to check out a “demake” of it, called The Wit.nes. It was created for NES emulators by an indie developer who goes by the name Dustmop. Being built for the NES platform, the entire ROM is currently 40 kB (NES titles varied between ~8 kB and ~1 MB). It plays from a top-down perspective in its exploration mode, rather than first-person, for what should be obvious reasons, but the puzzles are apparently quite faithful to the original style.

It's free and small, so check it out at their Itch.io page if you're interested.

Computex 2016: Can't Forget ZOTAC's Custom GTX 1080s

Subject: Graphics Cards | June 4, 2016 - 02:51 PM |
Tagged: nvidia, GTX 1080, zotac

(Most of) NVIDIA's AIB partners have been flooding out announcements of custom GTX 1080 designs. Looking over them, it seems like they fall into two camps: one believes 1x eight-pin PCIe power is sufficient for the GP104, and the other thinks that 1x eight-pin + 1x six-pin PCIe could be useful.

zotac-2016-gtx1080-amp-regular.jpg

ZOTAC, on the other hand, seems to believe that both are underestimating. Excluding the Founders Edition, both of their GTX 1080 designs utilize 2x eight-pin PCIe connectors. This gives their cards a theoretical maximum of 375W, versus the Founders Edition's 225W. At this point, considering the Founders Edition can reach 2.1 GHz with good enough binning, I'm guessing that the second connector is there either simply because they can, or because they didn't want to alter an existing design. Note that, if you only have 6-pin PCIe connectors on your power supply, ZOTAC provides dual-six-to-eight-pin adapters in the box.
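
Those ceilings follow from the standard PCIe power budget of 75 W through the slot plus 150 W per eight-pin connector:

$$ 75 + 2 \times 150 = 375\,\mathrm{W} \qquad \text{versus} \qquad 75 + 150 = 225\,\mathrm{W} $$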

zotac-2016-gtx1080-amp-extreme.jpg

The two SKUs that they are releasing, again, apart from the Founders Edition, vary by their heatsink. The ZOTAC GeForce GTX 1080 AMP has a dual-fan IceStorm cooler, while the ZOTAC GeForce GTX 1080 AMP Extreme has a triple-fan IceStorm cooler. (IceStorm is the brand name of ZOTAC's custom cooler.) Other than the 2x 8-pin PCIe connectors, there's not much else to mention. ZOTAC has not settled on default base, boost, or memory clocks, and you will probably be overclocking the card yourself (either manually or with an automatic overclocking tool) anyway. Both cards have a backplate, if that's something you're interested in.

Once again, no pricing or availability. It shouldn't be too long, though.

Source: ZOTAC

Computex 2016: Corsair Hydro GFX for GeForce GTX 1080

Subject: Graphics Cards | June 4, 2016 - 01:53 PM |
Tagged: nvidia, msi, hydro gfx, GTX 1080, corsair

Last week, we wrote about the MSI GeForce GTX 1080 SEA HAWK. This design took their AERO cooler and integrated a Corsair self-contained water cooler into it. In response, Corsair, not to be outdone by MSI's Corsair partnership, partnered with MSI to release their own graphics card, the GeForce GTX 1080 version of the Corsair Hydro GFX.

msi-2016-gtx1080-seahawk.png

The MSI SEA HAWK

Basically, as we saw with their previous Hydro GFX card, Corsair and MSI are each selling essentially the same graphics card, just with their own branding. It sounds like the two cards, MSI's SEA HAWK and Corsair's Hydro GFX, differ slightly in terms of LED lighting, but that might just be a mismatch between Tom's Hardware's Computex coverage and MSI's product page. Otherwise, I would guess that the choice between these SKUs comes down to the company you trust most for support, which I believe both Corsair and MSI hold a good reputation for, and the current price at the specific retailer you choose. Maybe some slight variation in clock rate?

corsair-2016-gfx-hydro-toms.jpg

The Corsair Hydro GFX at Computex
(Image Credit: Tom's Hardware)

For the record, both cards use a single eight-pin PCIe power connector, rather than the eight-pin plus six-pin we've seen a few high-end boards opt for.

No idea about pricing or availability. Corsair's page still refers to the GTX 980 Ti model.

Report: NVIDIA GTX 1080 Founders Edition Fan Speed Fix Coming Soon

Subject: Graphics Cards | June 4, 2016 - 12:35 PM |
Tagged: revving, report, nvidia, GTX 1080, gpu cooler, founders edition, fan speed, fan issue

VideoCardz is reporting (via PC Games Hardware, Reddit, and TechPowerUp) that an official fix to the reported Founders Edition fan issue is coming soon via driver update:

“NVIDIA has reportedly found the solution and the problem should be fixed with the next driver release. An NVIDIA rep confirmed that the software team was able to reproduce this problem, and their fix has already passed internal testing.”

Nvidia-Geforce-GTX-1080.jpg

Image credit: PC Games Hardware

On the NVIDIA forums, customer care representative Manuel Guzman has posted about the issue, and it now seems a fix will be provided with the next driver release:

“This thread is to keep users up to date on the status of the fan randomly spinning up and down rapidly that some users are reporting with their GeForce GTX 1080 Founders Edition card. Thank you for your patience.

Updates Sticky Post
Update 6/1/16 - We are testing a driver fix to address the random spin up/down fan issue.
Update 6/2/16 - Driver fix so far has passed internal testing. Fix will be part of our next driver release.”

For those who have experienced the “revving” issue, described in the post as a rapid rise and fall between 2000 RPM and 3000 RPM, this will doubtless come as welcome news. We will have to see how these cards perform once the updated driver has been released and is in users' hands.

Source: VideoCardz

Computex 2016: EVGA Is Making Custom SLI HB Bridges

Subject: Graphics Cards | June 4, 2016 - 02:39 AM |
Tagged: evga, sli, SLI HB, GTX 1080, nvidia, gtx 1070

Still no idea when these are coming out, or how much they'll cost, but EVGA will introduce their own custom SLI High-Bandwidth (HB) bridges. These are designed for GTX 1080 and GTX 1070 cards in two-way SLI, letting them share frames in high-speed 1440p, 4K, 5K, and Surround configurations. As TAP mentioned on our live stream, if the bridge is overloaded, the cards fall back to communicating over PCIe, which should be busy doing other things at the time.

nvidia-2016-dreamhack-newsli.png

NVIDIA's SLI HB connector.
We didn't have a presence at Computex, so you'll need to check out Fudzilla's photo for EVGA's.

As for EVGA's models? Without pricing and availability, all we can say is that they have a different aesthetic from NVIDIA's. They also, unlike NVIDIA's version, have RGB LEDs on them to add a splash (or another splash) of colored light inside your case. Three versions will be offered, varying by the distance between your cards, but, as the SLI HB spec demands, each of them only supports two cards at a time.

Source: Fudzilla

AMD Publishes AMD GPU-PRO Beta Driver (for Linux)

Subject: Graphics Cards | June 3, 2016 - 11:15 PM |
Tagged: linux, graphics drivers, AMDGPU, amd

On Windows, we really only have one graphics driver per GPU. On Linux, however, there is a choice between open drivers and closed, binary-only blobs. Open drivers allow users to perpetuate support, for either really old hardware or pre-release software, without needing the GPU vendor to step in. They can also be better for security, because open-source software can be audited, which is better (albeit how much better is up for debate) than just having a few eyes on it... if any at all.

AMD_2016-Logo.png

As we reported a few months ago, AMD has been shifting their driver structure. Rather than maintaining two completely different code bases, AMD now builds on AMDGPU, an open-source driver, officially supported by AMD, that communicates with the Linux kernel. This chunk is compliant with the GPL, so it can be bundled with the operating system. Above this, a user-space driver adds the various APIs, game-specific optimizations, and so forth. AMD calls this plug-in component AMD GPU-PRO.

This component has now been released for Ubuntu 16.04, which includes OpenGL 4.5, OpenCL 1.2, and Vulkan 1.0.

Open-source developers can create their own components, using the same AMDGPU hooks that AMD uses, and release them on their own. This is not a perfect solution, though. If, at any point, AMD disagrees with a necessary, proposed change, then the only way forward could be to fork the project, which AMD wouldn't support with their closed-source blob, leading right back to the previous situation. That said, AMD is putting a lot of effort into this, so it would stand to reason that they aren't intending to throw all of that away over a pull request.

Either way, you can get AMD GPU-PRO Beta from AMD's page for Ubuntu 16.04. SteamOS added AMD GPU-PRO with their 2.80 update last week.

Source: AMD