Subject: Cases and Cooling | June 6, 2016 - 08:01 PM | Jeremy Hellstrom
Tagged: water cooling, Tundra Series, TD02-E, Silverstone, cpu cooler, All-in-One cooler
A few years back you may remember that Morry did a review of the SilverStone Tundra Series TD02 AiO watercooler. More recently, Modders Inc reviewed the newer TD02-E, part of the company's high-performance line. The waterblock is compatible with most modern processors, but you will need a decent-sized case to accommodate the radiator, which measures 278 x 124 x 27 mm with two 120mm fans. The cooler performed admirably, especially for its ~$90 price tag, and did so at reasonable noise levels: going full out at 2500 RPM it measured 50.2 dBA, dropping to 38 dBA at a more modest 1400 RPM.
"Silverstone Technologies has made quite a career making cooling solutions for the PC DIY market. Their solutions are also quite often a unique alternative with out-of-the-box oriented ideas and far from having a "me too" design philosophy. With the all-in-one liquid cooling solution's popularity, Silverstone also has thrown their hat in the ring with alternatives from the typical Asetek OEM"
Here are some more Cases & Cooling reviews from around the web:
- CRYORIG A40 Hybrid Liquid Cooling System Review @ NikKTech
- Noctua NH-C14S @ techPowerUp
- Zalman Z9 NEO @ techPowerUp
- Thermaltake Core P5 Mid-Tower Review @ NikKTech
- InWin 303 Mid-Tower @ eTeknix
Subject: General Tech | June 6, 2016 - 07:46 PM | Scott Michaud
Tagged: windows, pc gaming, osx, linux
The next week-and-a-half should be good for video game enthusiasts. E3 2016 starts on June 14th, although EA, Bethesda, Microsoft, Ubisoft, Sony, and AMD (with PCGamer) have press conferences throughout the 12th and the 13th. Of course, so as not to get lost in the traffic, many companies are releasing their announcements prior to those conferences. For instance, Watch Dogs 2 will get a reveal this Wednesday, June 8th, five days prior to Ubisoft's press conference.
This post is about a Kickstarter project called Yooka-Laylee, though. The title is being created by Playtonic Games, a studio founded by several former Rare employees, apparently to create a proper Banjo-Kazooie-style platformer. It raised over two million British Pounds (~$3 million USD) and targeted an October 2016 release date. That has since slipped to Q1 2017, but that should be expected for a crowdfunded project, especially when the stretch goals start piling up. It is scheduled to be released on Windows, Mac, and Linux... and a few other boxes.
Of course, they couldn't resist making a Banjo-Kazooie: Nuts & Bolts joke at the end...
... I chuckled.
Subject: Cases and Cooling | June 6, 2016 - 06:55 PM | Jeremy Hellstrom
Tagged: fsp, PSU, Hydro X, 650W, 80 Plus Gold
With all the hoopla and brouhaha caused by Computex last week, some smaller launches were missed, such as the FSP Hydro X 650W PSU. This particular PSU is non-modular but does carry a five-year warranty, an 80 Plus Gold rating, and a single 12V rail capable of providing 649.92W @ 54.16A. [H]ard|OCP's testing showed it to be a solid PSU, providing stable power and meeting its claimed standards. Unfortunately there is currently a bit of a pricing issue, though FSP is working to resolve it: this PSU sells for $95, but the previous fully modular model can be picked up for $85 or less, even though its MSRP is technically higher. [H] reached out to FSP and you can see how they plan to resolve the issue in the full review.
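That oddly precise 649.92W figure is simply the rail's rated amperage multiplied by its voltage; a quick sanity check:

```python
# The single +12V rail's rated capacity: volts times amps.
volts, amps = 12.0, 54.16
print(round(volts * amps, 2))  # -> 649.92
```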
"FSP does not have much to say about its Hydro X in terms of marketing speak, but it does hit the high points that enthusiast system builders are looking for: "Silent operation, High efficiency ≧ 90%, Full Japan-made electrolytic capacitors, Powerful single +12V rail design, Ribbon cables, and Complete protection: OCP, OVP, SCP, OPP, OTP."
Here are some more Cases & Cooling reviews from around the web:
- FSP Hydro X Series 550 W @ techPowerUp
- FSP Hydro X 650 Power Supply @ Kitguru
- Silverstone SST-SX550 550W PSU @ Kitguru
- SilverStone Strider Titanium Series 800W SST-ST80F-TI Power Supply Unit Review @ NikKTech
Subject: General Tech | June 6, 2016 - 06:26 PM | Jeremy Hellstrom
Tagged: asus, bloatware, security
After last week, when several laptop OEMs, including Lenovo once again, were caught installing highly insecure bloatware on their laptops, you might hope that this week would be different. Sadly you would be mistaken, as once again software preinstalled on laptops is in the news. In this case it is ASUS Live Update, which transmits requests for updates in plain text and does not check the authenticity of any software updates that come back. This of course leaves you wide open to man-in-the-middle attacks, where someone posing as those update servers could feed you whatever installation files they desired. As the pull quote from The Inquirer below states, removing it immediately would be a very good idea.
"My advice to anyone who purchased an Asus device: remove LiveUpdate. It's really that simple. If you're an IT administrator, find devices making periodic calls to Asus's domains and blackhole them, get the user to come and see you,"
Here is some more Tech News from around the web:
- Siemens Now Commands An Army Of Spider Robots @ Slashdot
- Quieting Scary Web Browser SSL Alerts @ Linux.com
- Microsoft thinks it's fixed Windows Server mess its last fix 'fixed' @ The Register
- AMD Technologies Revealed at Computex 2016 @ Tech ARP
- Computex 2016 Live Coverage Day 5 @ Tech ARP
- Computex 2016 Live Coverage Day 4 @ Tech ARP
Subject: Cases and Cooling | June 6, 2016 - 12:03 PM | Sebastian Peak
Tagged: SX800-LTI, small form-factor, Silverstone, SFX-L, SFX, SFF, PSU, power supply, computex 2016, computex, 80 Plus Titanium
SilverStone introduced a 700W SFX-L power supply at CES in January, and with that SX700-LPT PSU now officially released, the company has raised the bar again as no less than 800W is coming to the SFX-L form-factor.
Image credit: TechPowerUp
SilverStone's SX800-LTI not only offers a massive 800W, but does so with an 80 PLUS Titanium certification (!). The power supply pushes this massive wattage along a single +12V rail, and the SX800-LTI features a fully-modular design for a clean build. An added benefit to the SFX-L form-factor, other than the potential for these powerful designs, is the use of a 120 mm fan, which allows for much quieter operation under load compared to the smaller SFX variant.
Image credit: TechPowerUp
We are now approaching full ATX power with these SFX-L PSUs, and indeed the 800-850W range should be all most users would need for even a dual-GPU system (especially as we enter the era of 14-16nm GPUs with their lower power requirements).
No word yet on price or availability.
Subject: General Tech | June 6, 2016 - 11:46 AM | Scott Michaud
Tagged: steam, pc gaming, linux
According to Phoronix, gaming on Linux has experienced exponential growth in recent times. Over the course of the last two years, Steam's catalog on the platform expanded from 500 games up to over 2200. This is a little over a 4.4x increase over two years. If I'm doing my high-school math correctly, and I seriously hope I am, this corresponds to an average increase of just under 2.1x year-over-year.
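For the curious, that year-over-year figure is just a geometric mean; a quick check of the numbers above:

```python
# Steam's Linux catalog grew from ~500 to ~2200 titles in two years.
start, end, years = 500, 2200, 2
total = end / start                 # 4.4x over the whole period
per_year = total ** (1 / years)     # geometric mean growth per year
print(f"{total:.1f}x total, {per_year:.2f}x per year")  # -> 4.4x total, 2.10x per year
```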
In other words, this is literally the trend, minus half-life. Snicker snicker snicker.
The quantity of Linux's games catalog is a very different argument from its quality, of course. Still, you can find many interesting titles there. Valve has been porting their catalog to the OS, and other high-end titles, like Tomb Raider, Trine, Civilization V, Civilization: Beyond Earth, XCOM, and a couple of Borderlands games, have made the jump as well. If you're interested in specifics, and you enjoy the sort of humor you would hear on our PC Perspective Podcast, check out LinuxGameCast for their reviews of specific titles.
Subject: Graphics Cards, Processors, Mobile | June 6, 2016 - 11:11 AM | Scott Michaud
Tagged: hsa 1.1, hsa
The HSA Foundation released version 1.1 of their specification, which focuses on “multi-vendor” compatibility. In this case, multi-vendor doesn't refer to companies that refused to join the HSA Foundation, namely Intel and NVIDIA, but rather multiple types of vendors. Rather than aligning with AMD's focus on CPU-GPU interactions, HSA 1.1 includes digital signal processors (DSPs), field-programmable gate arrays (FPGAs), and other accelerators. I can see this being useful in several places, especially on mobile, where cameras, sound processors, CPU cores, and a GPU regularly share video buffers.
That said, the specification also mentions “more efficient interoperation with non-HSA compliant devices”. I'm not quite sure what that specifically refers to, but it could be important to keep an eye on for future details -- whether it is relevant for Intel and NVIDIA hardware (and so forth).
Charlie, down at SemiAccurate, notes that HSA 1.1 will run on all HSA 1.0-compliant hardware. This makes sense, but I can't see where this is explicitly mentioned in their press release. I'm guessing that Charlie was given some time on a conference call (or face-to-face) regarding this, but it's also possible that he may be mistaken. It's also possible that it is explicitly mentioned in the HSA Foundation's press blast and I just fail at reading comprehension.
If so, I'm sure that our comments will highlight my error.
Subject: Storage | June 6, 2016 - 07:40 AM | Scott Michaud
Tagged: ssd, corsair, neutron, neutron xti, Neutron XT
Corsair announced a new line of SSDs at Computex. We didn't have boots on the ground there this year, and it's not yet on Corsair's website, so we needed to go with Tom's coverage of the product. The Corsair Neutron XTI uses Toshiba's 15nm MLC flash and the Phison S10 controller “with expanded cache”. This added cache addresses some “performance consistency” issues that Corsair identified, but they didn't seem to elaborate on what that is. It is rated at up to 100,000 IOPS Read and 90,000 IOPS Write, but that obviously needs to be tested to specify when, how, and how often.
Image Credit: Tom's Hardware
Speaking of tested Corsair Neutron SSDs, Allyn reviewed the previous model, the Corsair Neutron XT, all the way back in November 2014. He was impressed with the drive at the time, although, while it was super fast at low queue depths of ~1-4 items, it slowed down above that. Since then, he has been developing some interesting testing methods to figure out whether slowdowns could be related to individual hitches that would be lost in benchmarks that aggregate results and implicitly average them out. He didn't have those methods back then, though, so it's unclear whether the queue depth issue was a symptom of a latency problem, and whether the “expanded cache” will help with it.
We'll see when it's launched. It will be available in 240, 480, and 960 GB varieties.
Subject: General Tech | June 5, 2016 - 06:18 PM | Scott Michaud
Tagged: security, Cyber Security, coil whine
As new hardware launches, many readers ask whether it produces any noticeable coil whine. For instance, this is an issue for graphics cards that are outputting a very high frame rate. The electronics create sound from the current oscillating as it flows through them. It can be an issue for motherboards and power supplies as well. You can check out this fairly old video from LinusTechTips for a demonstration.
Image Credit: ACM
It turns out that, because this whine is related to the signal flowing through the oscillating circuit, security researchers are looking into the types of information that can be inferred from it. In particular, the Association for Computing Machinery (ACM) published a paper called Physical Key Extraction Attacks on PCs. It discusses several methods of attacking a device, such as reading minor fluctuations in its grounding plug or monitoring induced radiation with an antenna. Its headlining method, though, is “Acoustic”: listening to the coil whine a computer produces while it decrypts RSA messages sent to it, in order to recover the RSA secret key.
While they have successfully demonstrated the attack using a parabolic microphone at 33 ft away, and a second demonstration using a mobile phone at 1 ft away, the news should be taken with a grain of salt. Mostly, it's just interesting to realize that there's nothing really special about a computer. All it does is store and process data on whatever physical state we have available in the world. Currently, that's almost always radio-frequency radiation flowing through semiconductors. Whatever we use will have consequences. For instance, as transistors get smaller, to push more complex signals through a given surface area and power budget, we'll eventually run out of atoms.
This is just another, often forgotten side-effect: electric signals induce the transfer of energy. It could be electromagnetic, acoustic, or even thermal. In the realm of security, this could, itself, carry some of the data that we attached to our world's state, and allow others to access it (or sometimes modify it) without our knowledge or consent.
Subject: General Tech | June 5, 2016 - 07:44 AM | Scott Michaud
Tagged: pc gaming, The Witcher 3
The Witcher 3 is one of the best-looking games available, and its final DLC, Blood and Wine, was intended to raise that graphical bar slightly. Near the base game's initial launch, in early 2015, there was a bit of controversy surrounding the image quality and how it was sort-of rolled back. Righting this issue was apparently one of the design goals for this final DLC, leaving users with fonder memories of the title before CD Projekt Red moves on to newer projects. Granted, the memories weren't all that bad to begin with, but it was nice to address regardless.
As you can see, this environment is bright, vibrant, and heavily saturated with color. The medieval city is alive with colored cobblestone, flowers, banners, and buildings all under a bright, blue sky. There was quite a bit of texture pop-in that I saw, even at 1080p, but it wasn't too distracting. This, again, is supposed to be the last time that CD Projekt adds substantial content to The Witcher franchise for the foreseeable future, but I hope that the mod community will keep the title alive.
Subject: Graphics Cards, Mobile | June 5, 2016 - 06:02 AM | Scott Michaud
Tagged: gigabyte, external gpu
External GPUs can be a good idea. If it is affordable, easy, and not too big, an external enclosure lets users augment their laptop CPU, which is probably good enough to at least run most tasks, with a high-end GPU. While GPUs are more efficient than CPUs, the tasks they are expected to do are so much larger that a decent graphics chip is difficult to cram into a laptop form factor... for the most part.
Image Credit: Tom's Hardware
Preamble aside, the idea has been tried and dropped numerous times over the last decade, but the latest generation seems to be getting a little traction. Razer added the feature to their relatively popular Blade line of laptops, and AMD, one of the companies to try it several years ago, is pushing it now with their XConnect technology. Even Microsoft sort-of does this with their Surface Book, and it's been a small source of problems for them.
Now Gigabyte, at Computex, announced that they are investigating prototypes. According to Tom's Hardware, their current attempt stands upright, which is likely to take up less desk space. Looking at it, I could see it hiding in the space between my monitors and the corner of the room (because my desk slides into the corner). Of course, in my case, I have a desktop PC, so I'm not the target demographic, but who knows? It's possible that a laptop user might have a similar setup to me. It's still pretty big, though.
Currently, Gigabyte limits the power supply to 250W, which caps GPU support at under 175W TDP. In other words? Too small for a GeForce GTX 1080. The company did tell Tom's Hardware that they are considering upping that to 350W, which would allow 260W of load, enough for any card with a single 8-pin PCIe connector, and thus many (but not all) GTX 1080s.
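To put some rough numbers on that derating (the reserve percentages below are simply implied by Gigabyte's stated figures; the company has not published an official formula):

```python
# Headroom implied by Gigabyte's stated PSU capacities and GPU limits.
for psu_w, gpu_w in [(250, 175), (350, 260)]:
    reserve = psu_w - gpu_w
    print(f"{psu_w}W PSU -> {gpu_w}W GPU load, "
          f"{reserve}W ({reserve / psu_w:.0%}) held in reserve")
# -> 250W PSU -> 175W GPU load, 75W (30%) held in reserve
# -> 350W PSU -> 260W GPU load, 90W (26%) held in reserve
```

So roughly a quarter to a third of the enclosure's capacity is kept free for conversion losses and the rest of the system.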
No pricing or availability yet, of course. It's just a prototype.
Subject: Cases and Cooling | June 5, 2016 - 03:32 AM | Scott Michaud
Tagged: fanless, cpu cooler, antec
The folks down at FanlessTech found a giant heatsink that Antec showed off at Computex. It consists of three large stacks of aluminum, weighing about 3lbs, with potentially four fans moving air slowly across it. The original post doesn't mention whether it could be used in a fanless mode, but come on. It should be able to cool something without a fan directly attached to it.
Image Credit: GDM.or.jp
They don't seem to have a price, availability, or even a model number yet, so details are scarce. It will come in at least three colors, black, gold, and red, though, so you have a choice about how it will look in your case. Well, at least mostly in your case.
Update (June 5th @ 2:20pm): Turns out that I forgot to add the dimensions and specifications of this cooler. Its total size is 165mm x 142mm x 159mm. Its intended fans spin at 800-1800 RPM. At 800 RPM, they push 12.36 cubic feet per minute of air at 17.5 dBA. At 1800 RPM, they push 65.23 cubic feet per minute at 25.9 dBA.
Subject: General Tech | June 5, 2016 - 02:55 AM | Scott Michaud
Tagged: windows 10, uwp, Adobe
So a company that refuses to port its applications to Linux is experimenting with UWP for future products. Adobe's Experience Design (XD) CC is going to arrive on Windows later this year, and a representative from Adobe claimed on Twitter that it will use Microsoft's UWP platform. Granted, we're not talking about something like Photoshop or After Effects, but rather a UX mock-up tool, sort-of along the lines of Pencil Project.
It's unclear whether UWP will be a choice.
The logo looks like it's laughing at us with its tongue out.
I still find UWP a concern, as Microsoft, while responding to some feedback, still has some key restrictions in play that limit free sharing. Until it becomes technically (or legally) infeasible for Microsoft to lock down the platform, there will always be the concern that they could, for instance, revoke people's ability to develop software or remove (or prevent installation of) existing software. Even if they don't want to do it themselves, someone with authority over them may compel it, such as a government that is against encryption.
If you build it, someone will abuse it. The only thing preventing Microsoft from realizing their Windows RT vision, if they still choose to, is the popularity of Win32 applications and how incompatible they are with that framework. We, as a society, want them to remain popular enough that Microsoft cannot afford to abandon it. They want to. They hate the stigma that Windows is where viruses are. That's reasonable, but they're not just throwing out the bathwater.
As an aside: they also want a platform that is less reliant upon x86, and could be recompiled for other hardware if Intel doesn't go where Microsoft wants to be. This is kind-of ironic if you think about it.
Subject: General Tech | June 5, 2016 - 01:29 AM | Scott Michaud
Tagged: Java, lwjgl, vulkan
Don't be confused by the date on the LWJGL post -- its release date was June 3rd, as mentioned later in the thread, not February 27th. It looks like they disabled edit timestamps. Regardless, Lightweight Java Game Library (LWJGL) 3.0.0 was just released, which is a library that binds Java code to APIs that are, normally, not directly accessible through that platform.
To be clear: LWJGL is not a library like, say, Qt, which simplifies common tasks into classes. Its goal is to connect you to whatever API you need, and otherwise leave you alone. Unless you're the type who wants full control over everything, or you're actually making a framework yourself, you will want to use existing frameworks, engines, and/or middleware for your projects. The advantage, of course, is that these frameworks, engines, and middleware now have access to newer APIs, and can justify deprecating old features.
This release adds Vulkan support, which will provide a high-performance (and high-efficiency) base to abstract many other graphics and GPU compute tasks on. DirectX 12 and Vulkan are still being worked out, as an industry, but their mechanism is theoretically better, especially with multiple threads (and multiple graphics devices). They basically add a graphics layer to a GPU compute-style API, basing everything on lists of commands that start and end wherever the host code desires.
While Java has been taking a massive hit in public opinion lately, it is still a good platform for some applications. Gaming seems to be having a resurgence of native APIs, especially with “AAA” engines becoming available to the general public, but more frameworks aren't a bad thing.
Subject: Cases and Cooling, Systems | June 4, 2016 - 09:39 PM | Scott Michaud
Tagged: gaming keyboard
Wooting, a start-up that is currently running an already-funded Kickstarter, is looking to produce a keyboard with analog inputs. This is not an entirely new concept. Ben Heck created one back in 2012 by modifying the WASD cluster to include Hall effect sensors, which were attached to the guts of an Xbox 360 controller to signal thumbstick offsets. The further you press the key, the more intense an input is sent to the PC.
The Wooting One, which, again, is a Kickstarter campaign, does it a bit more... professionally. The keyboard uses the “Flaretech” switch, which I've never heard of before now, from Taiwanese manufacturer Adomax. Unlike Ben Heck's Hall Effect sensors, this one measures offset with light sensing. This raises a petty, pedantic argument about whether it's technically a mechanical keyboard, since the activation isn't performed by a direct, mechanical process, but users typically equate “mechanical keyboard” with its quality and feel, which could be achieved with non-mechanical processes. Semantics aside, the light-sensing mechanism allows precise measurement of how far down the key is. From there, it's just a matter of mapping that distance to an input.
This is where the Wooting One looks quite interesting. The firmware and driver will communicate over XInput and apparently other gamepad APIs, functioning under most games that allow simultaneous gamepad + keyboard input for a single player. They are also expecting to create an open-source system, with an API, that allows games to access the analog input of apparently every key on the board. This is interesting, because XInput has fairly restrictive limitations of about six axes of analog input (although the two axes corresponding to the triggers are lower precision and, with the Xbox One controller, joined into a single axis). A new API can circumvent all of this for gaming going forward, and it will be required for analog keyboards to get off the ground. It's not a difficult task in itself, as there is quite a bit of bandwidth in external IO connections these days, but getting an entire industry's worth of vendors to agree could be a task (unless you're, like, Microsoft). Hopefully it's open, with a permissive license, and a few big-name engine vendors add support to push it forward.
And, let's be honest -- XInput is limiting. A new API could be good for obscure gamepads, too.
Outside of analog gaming, they are also milking this “know how far down the key is” feature as much as they can. For instance, they are allowing users to choose the activation distance in digital mode. Users can set their balance between rejecting partial presses and speed of input based on their ability to touch type.
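Both modes boil down to reading the key's travel and interpreting it; here is a minimal sketch, assuming made-up travel, deadzone, and actuation figures rather than anything Wooting has specified:

```python
# Hypothetical analog switch: the sensor reports key travel in mm; we
# map it to a gamepad-style axis and, separately, to a digital on/off
# with a user-selected actuation point.
TOTAL_TRAVEL_MM = 4.0   # assumed full key travel
DEADZONE_MM = 0.3       # ignore tiny presses and sensor noise

def analog_axis(travel_mm: float) -> float:
    """Map key travel to a 0.0-1.0 axis value, like a gamepad trigger."""
    if travel_mm <= DEADZONE_MM:
        return 0.0
    span = TOTAL_TRAVEL_MM - DEADZONE_MM
    return min((travel_mm - DEADZONE_MM) / span, 1.0)

def digital_press(travel_mm: float, actuation_mm: float = 2.0) -> bool:
    """Digital mode: the user picks how far down the key must go."""
    return travel_mm >= actuation_mm

print(analog_axis(2.15))        # roughly 0.5: about half travel
print(digital_press(1.5, 1.0))  # -> True with a hair-trigger setting
```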
It's a European Kickstarter, and the lowest backer tier that includes the keyboard ships in November and is worth 100 Euro (~$115 USD), which apparently includes tax and shipping for North America and Europe. That doesn't correlate to a retail price, if the product even gets off the ground, but it's a data point, however reliable. Tax-in and free shipping sounds a bit... sketchy for a crowdfunding campaign... but that could just be a sign that they're more affiliated with an existing company (and its supply chain) than they're letting on, rather than business naivety.
Subject: Graphics Cards, Mobile | June 4, 2016 - 08:28 PM | Scott Michaud
Tagged: nvidia, GTX 1080, gtx 1070, pascal
Normally, when a GPU developer creates a laptop SKU, they re-use the desktop branding, add an M at the end, but release a very different, significantly slower part. This changed with the GTX 980, as NVIDIA cherry-picked the heck out of their production to find chips that could operate full-speed at a lower-than-usual TDP. With less power (and cooling) to consider, they were sent to laptop manufacturers and integrated into high-end designs.
They still had the lower-performance 980M, though, which was confusing for potential customers. You needed to know to avoid the M, and trust the product page to correctly add the M as applicable. This is where PCGamer's scoop comes into play. Apparently, NVIDIA will stop “producing separate M versions of its desktop GPUs”. Also, they are expected to release their 10-series desktop GPUs to their laptop partners by late-summer.
Last time, NVIDIA took almost a year to bin enough GPUs for laptops. While we don't know how long they've been stockpiling GP104 GPUs, this, if the rumors are true, would just be about three months of lead-time for the desktop SKUs. Granted, Pascal is significantly more efficient than Maxwell. Maxwell tried to squeeze extra performance out of an existing fabrication node, while Pascal is a relatively smaller chip, benefiting from the industry's double-shrink in process technology. It's possible that they didn't need to drop the TDP threshold that far below what they accept for desktop.
For us desktop users, this also suggests that NVIDIA is not having too many issues with yield in general. I mean, if they were expecting GPU shortages to persist for months, you wouldn't expect that they would cut their supply further with a new product segment, particularly one that should require both decent volume and well-binned chips. This, again, might mean that we'll see desktop GPUs restock soon. Either that, or NVIDIA significantly miscalculated demand for new GPUs, and they needed to fulfill partner obligations that they made before reality struck.
Call it wishful thinking, but I don't think it's the latter.
Subject: General Tech | June 4, 2016 - 07:24 PM | Scott Michaud
Tagged: nes, the witness, the wit.nes, pc gaming
The Witness, from Thekla Inc. and Jonathan Blow, caught the attention of a few of us at PC Perspective... mostly Allyn. Anywho, it's set on an island that you explore, solving puzzles along the way. I'm not talking about puzzles in the “Space Quest”, point-and-click adventure sense, but, like, puzzles that you would expect to find in a newspaper, which unlock doors and turn on machinery when solved.
If that sort of game is for you, then you might want to check out a “demake” of it, called The Wit.nes. It was created for NES emulators by an indie developer who goes by the name Dustmop. Being a game built for the NES platform, the entire virtual ROM is currently 40 KB (NES titles varied between ~8 KB and ~1 MB). It plays from a top-down perspective in its exploration mode, rather than first-person, for what should be obvious reasons, but the puzzles are apparently quite faithful to the original style.
It's free and small, so check it out at their Itch.io page if you're interested.
Subject: Graphics Cards | June 4, 2016 - 06:51 PM | Scott Michaud
Tagged: nvidia, GTX 1080, zotac
(Most of) NVIDIA's AIB partners have been flooding out announcements of custom GTX 1080 designs. Looking over them, it seems like they fall into two camps: one believes 1x eight-pin PCIe power is sufficient for the GP104, and the other thinks that 1x eight-pin + 1x six-pin PCIe could be useful.
ZOTAC, on the other hand, seems to believe that both are underestimating. Excluding the Founders Edition, both of their GTX 1080 designs utilize 2x eight-pin PCIe connectors. This gives their cards a theoretical maximum of 375W, versus the 225W of the Founders Edition. At this point, considering the Founders Edition can reach 2.1 GHz with good enough binning, I'm guessing that it's either there simply because they can, or they just didn't want to alter their existing design. Note that, if you only have 6-pin PCIe connectors on your power supply, ZOTAC provides dual-six-to-eight-pin adapters in the box.
The two SKUs that they are releasing, again, apart from the Founders Edition, vary by their heatsink. The ZOTAC GeForce GTX 1080 AMP has a dual-fan IceStorm cooler, while the ZOTAC GeForce GTX 1080 AMP Extreme has a triple-fan IceStorm cooler. (IceStorm is the brand name of ZOTAC's custom cooler.) Other than the 2x 8-pin PCIe connector, there's not much else to mention. ZOTAC has not settled on a default base, boost, or memory clock, and you will probably be overclocking it yourself (either manually or by using an automatic overclocker) anyway. Both cards have a back plate, if that's something you're interested in.
Once again, no pricing or availability. It shouldn't be too long, though.
Subject: Graphics Cards | June 4, 2016 - 05:53 PM | Scott Michaud
Tagged: nvidia, msi, hydro gfx, GTX 1080, corsair
Last week, we wrote about the MSI GeForce GTX 1080 SEA HAWK. This design took their AERO cooler and integrated a Corsair self-contained water cooler into it. In response, Corsair, not to be outdone by MSI's Corsair partnership, partnered with MSI to release their own graphics card, the GeForce GTX 1080 version of the Corsair Hydro GFX.
The MSI SEA HAWK
Like we saw with their previous Hydro GFX card, Corsair and MSI are each selling basically the same graphics card, just with their own branding. It sounds like the two cards, MSI's SEA HAWK and Corsair's Hydro GFX, differ slightly in terms of LED lighting, but that might just be a mismatch between Tom's Hardware's Computex coverage and MSI's product page. Otherwise, I would guess that the choice between these SKUs comes down to the company you trust most for support, which I believe both Corsair and MSI hold a good reputation for, and the current price at the specific retailer you choose. Maybe some slight variation in clock rate?
The Corsair Hydro GFX at Computex
(Image Credit: Tom's Hardware)
For the record, both cards use a single eight-pin PCIe power connector, rather than an eight-pin and a six-pin as we've seen a few high-end boards opt for.
No idea about pricing or availability. Corsair's page still refers to the GTX 980 Ti model.
Subject: Graphics Cards | June 4, 2016 - 04:35 PM | Sebastian Peak
Tagged: revving, report, nvidia, GTX 1080, gpu cooler, founders edition, fan speed, fan issue
“NVIDIA has reportedly found the solution and the problem will be fixed with the next driver release. An NVIDIA rep confirmed that the software team was able to reproduce this problem, and their fix has already passed internal testing.”
Image credit: PC Games Hardware
On the NVIDIA forums customer care representative Manuel Guzman has posted about the issue, and now it seems a fix will be provided with the next driver release:
“This thread is to keep users up to date on the status of the fan randomly spinning up and down rapidly that some users are reporting with their GeForce GTX 1080 Founders Edition card. Thank you for your patience.
Updates Sticky Post
Update 6/1/16 - We are testing a driver fix to address the random spin up/down fan issue.
Update 6/2/16 - Driver fix so far has passed internal testing. Fix will be part of our next driver release.”
For those who have experienced the “revving” issue, described as a rapid rise and fall from 2000 RPM to 3000 RPM in the post, this will doubtless come as welcome news. We will have to see how these cards perform once the updated driver has been released and is in user hands.