Subject: Graphics Cards | September 17, 2015 - 01:14 PM | Sebastian Peak
Tagged: nvidia, msi, liquid cooled, GTX980Ti SEA HAWK, GTX 980 Ti, graphics card, corsair
We reported last night on Corsair's new Hydro GFX, a liquid-cooled GTX 980 Ti built around an MSI card, and now MSI has its own new product based on the same concept.
"The MSI GTX 980Ti SEA HAWK utilizes the popular Corsair H55 closed loop liquid-cooling solution. The micro-fin copper base takes care of an efficient heat transfer to the high-speed circulation pump. The low-profile aluminum radiator is easy to install and equipped with a super silent 120 mm fan with variable speeds based on the GPU temperature. However, to get the best performance, the memory and VRM need top-notch cooling as well. Therefore, the GTX 980Ti SEA HAWK is armed with a ball-bearing radial fan and a custom shroud design to ensure the best cooling performance for all components."
The MSI GTX 980 Ti Sea Hawk actually appears identical to the Corsair Hydro GFX, and a look through the specs confirms the similarities:
With a 1190 MHz Base and 1291 MHz Boost clock the SEA HAWK has the same factory overclock speeds as the Corsair-branded unit, and MSI is also advertising the card's potential to go further:
"Even though the GTX 980Ti SEA HAWK boasts some serious clock speeds out-of-the-box, the MSI Afterburner overclocking utility allows users to go even further. Explore the limits with Triple Overvoltage, custom profiles and real-time hardware monitoring."
I imagine the availability of this MSI-branded product will be greater than the Corsair-branded equivalent, but in either case you get a GTX 980 Ti with the potential to run as fast and cool as a custom-cooled solution, without any of the extra work. Pricing wasn't immediately available this morning, but expect something close to the $739 MSRP we saw from Corsair.
Subject: Graphics Cards | September 17, 2015 - 01:00 AM | Sebastian Peak
Tagged: nvidia, msi, liquid cooler, GTX 980 Ti, geforce, corsair, AIO
A GPU with an attached closed-loop liquid cooler is a little more mainstream these days, with AMD's Fury X a high-profile example, and now a partnership between Corsair and MSI is bringing a very powerful NVIDIA option to the market.
The new product is called the Hydro GFX, with NVIDIA's GeForce GTX 980 Ti supplying the GPU horsepower. Of course the advantage of a closed-loop cooler would be higher (sustained) clocks and lower temps/noise, which in turn means much better performance. Corsair explains:
"Hydro GFX consists of a MSI GeForce GTX 980 Ti card with an integrated aluminum bracket cooled by a Corsair Hydro Series H55 liquid cooler.
Liquid cooling keeps the card’s hottest, most critical components - the GPU, memory, and power circuitry - 30% cooler than standard cards while running at higher clock speeds with no throttling, boosting the GPU clock 20% and graphics performance up to 15%.
The Hydro Series H55 micro-fin copper cooling block and 120mm radiator expels the heat from the PC reducing overall system temperature and noise. The result is faster, smoother frame rates at resolutions of 4K and beyond at whisper quiet levels."
The factory overclock on this 980 Ti is pretty substantial out of the box, with a 1190 MHz base (stock 1000 MHz) and 1291 MHz boost clock (stock 1075 MHz). Memory is not overclocked (running at the default 7096 MHz), so there should still be some headroom for overclocking thanks to the air cooling for the RAM/VRM.
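For the curious, Corsair's "20% faster" claim checks out against NVIDIA's reference clocks. A quick sanity check of the arithmetic (clock figures taken from the specs above):

```python
# Compare the Hydro GFX factory clocks against NVIDIA's reference GTX 980 Ti.
reference = {"base": 1000, "boost": 1075}   # MHz, stock GTX 980 Ti
hydro_gfx = {"base": 1190, "boost": 1291}   # MHz, per Corsair's spec sheet

for clock in ("base", "boost"):
    gain = (hydro_gfx[clock] / reference[clock] - 1) * 100
    print(f"{clock}: {reference[clock]} -> {hydro_gfx[clock]} MHz (+{gain:.1f}%)")
```

That works out to a 19.0% base and 20.1% boost increase, right in line with the marketing figure.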
A look at the box - and the Corsair branding
Specs from Corsair:
- NVIDIA GeForce GTX 980 Ti GPU with Maxwell 2.0 microarchitecture
- 1190/1291 MHz base/boost clock
- Clocked 20% faster than standard GeForce GTX 980 Ti cards for up to a 15% performance boost.
- Integrated liquid cooling technology keeps GPU, video RAM, and voltage regulator 30% cooler than standard cards
- Corsair Hydro Series H55 liquid cooler with micro-fin copper block, 120mm radiator/fan
- Memory: 6GB GDDR5, 7096 MHz, 384-bit interface
- Outputs: 3x DisplayPort 1.2, HDMI 2.0, and Dual Link DVI
- Power: 250 watts (600 watt PSU required)
- Requirements: PCI Express 3.0 16x dual-width slot, 8+6-pin power connector, 600 watt PSU
- Dimensions: 10.5 x 4.376 inches
- Warranty: 3 years
- MSRP: $739.99
As far as pricing/availability goes Corsair says the new card will debut in October in the U.S. with an MSRP of $739.99.
Subject: Graphics Cards | September 16, 2015 - 01:16 PM | Sebastian Peak
Tagged: TSMC, Samsung, pascal, nvidia, hbm, graphics card, gpu
According to a report by BusinessKorea TSMC has been selected to produce the upcoming Pascal GPU after initially competing with Samsung for the contract.
Though some had considered the possibility of both Samsung and TSMC sharing production (albeit on two different process nodes, as Samsung is on 14 nm FinFET), in the end the duties fall on TSMC's 16 nm FinFET alone if this report is accurate. The move is not too surprising considering the longstanding position TSMC has maintained as a fab for GPU makers and Samsung's lack of experience in this area.
The report didn't make the release date for Pascal any more clear, naming it "next year" for the new HBM-powered GPU, which will also reportedly feature 16 GB of HBM 2 memory for the flagship version of the card. This would potentially be the first GPU released at 16 nm (unless AMD has something in the works before Pascal's release), as all current AMD and NVIDIA GPUs are manufactured at 28 nm.
Subject: General Tech, Shows and Expos | September 15, 2015 - 05:07 PM | Sebastian Peak
Tagged: VR, virtual reality, Tilt Brush, PAX Prime 2015, paint, nvidia, art
A group of six artists from the gaming industry were brought together at this month's PAX Prime event in Seattle in a joint venture between NVIDIA, Valve, Google and HTC. The idea? To use virtual reality to create art. The result was very interesting, to say the least.
Wearing HTC’s VR headset the artists had 30 minutes each to create their work using Tilt Brush. What is Tilt Brush, exactly?
"Tilt Brush uses the HTC Vive’s unique hand controllers and positional tracking to allow artists to paint in three dimensions. The software includes a remarkable digital palette, letting users draw GPU-powered real-time effects like fire, smoke and light."
The artists included Chandana Ekanayake from Uber Entertainment, Lee Petty from Double Fine Productions, Michael Shilliday from Whiterend Creative, Mike Krahulik from Penny Arcade, Sarah Northway from Northway Games and Tristan Reidford from Valve.
NVIDIA is hosting a contest to pick the winner on their Facebook page; so what's in it for you? "The artist with the most votes will win ultimate bragging rights, and voters will be entered to win a new GeForce GTX 980 Ti!" Not bad.
This is certainly a novel application of VR, but serves to illustrate (pun intended) that the tech really does provide endless possibilities - far beyond 3D art or gameplay immersion.
Subject: Graphics Cards | September 8, 2015 - 09:56 PM | Jeremy Hellstrom
Tagged: STRIX DirectCU III OC, nvidia, factory overclocked, asus, 980 Ti
The ASUS GTX 980 Ti STRIX DCIII OC comes with the newest custom cooler from ASUS and a fairly respectable factory overclock: 1216MHz base, 1317MHz boost, and a 7.2GHz effective clock on the impressive 6GB of VRAM. Once [H]ard|OCP had a chance to use GPUTweak II, manual tweaking increased those values to 1291MHz base and 1392MHz boost along with a higher VRAM clock. For those who prefer automated OCing there are three modes, ranging from Silent to OC mode, that will instantly get the card ready to use. With an MSRP of $690 and a street price usually over $700 you have to be ready to invest a lot of hard-earned cash in this card, but at 4K resolutions it does outperform the Fury X by a noticeable margin.
"Today we have the custom built ASUS GTX 980 Ti STRIX DirectCU III OC 6GB video card. It features a factory overclock, extreme cooling capabilities and state of the art voltage regulation. We compare it to the AMD Radeon R9 Fury, and overclock the ASUS GTX 980 Ti STRIX DCIII to its highest potential and look at some 4K playability."
Here are some more Graphics Card articles from around the web:
- EVGA GTX 980 Ti Classified ACX 2.0+ @ Kitguru
- Gigabyte G1 Gaming GTX 980Ti 6GB @ eTeknix
- Colorful iGame GTX 980 Ti 6GB @ techPowerUp
- MSI GTX 980 Ti Lightning Review @ OCC
- PNY GTX 980 XLR8 Review @ OCC
- MSI GeForce GTX 950 Gaming 2 GB @ techPowerUp
- GTX 780 Ti vs R9 290X; The Rematch @ Hardware Canucks
- ARCTIC Accelero Hybrid III-140 vga cooler @ HardwareOverclock
- AMD Linux Graphics: The Latest Open-Source RadeonSI Driver Moves On To Smacking Catalyst @ Phoronix
- Running The AMD Radeon R9 Fury With AMD's New Open-Source Linux Driver @ Phoronix
- HIS R7 360 iCooler OC 2GB Video Card Review @ Madshrimps
- PowerColor Radeon R9 380 PCS+ Graphics Card Review @ Techgage
- Tiny Radeon R9 Nano to pack a wallop at $650 @ The Tech Report
To the Max?
Much of the PC enthusiast internet, including our comments section, has been abuzz with “Asynchronous Shader” discussion. Normally, I would explain what it is and then outline the issues that surround it, but I would like to swap that order this time. Basically, the Ashes of the Singularity benchmark utilizes Asynchronous Shaders in DirectX 12, but they disable it (by Vendor ID) for NVIDIA hardware. They say that this is because, while the driver reports compatibility, “attempting to use it was an unmitigated disaster in terms of performance and conformance”.
AMD's Robert Hallock claims that NVIDIA GPUs, including Maxwell, cannot support the feature in hardware at all, while all AMD GCN graphics cards do. NVIDIA has yet to respond to our requests for an official statement, although we haven't poked every one of our contacts yet. We will certainly update and/or follow up if we hear from them. For now though, we have no idea whether this is a hardware or software issue. Either way, it seems more than just politics.
So what is it?
Simply put, Asynchronous Shaders allows a graphics driver to cram workloads into portions of the GPU that are idle but not otherwise being utilized. For instance, if a graphics task is hammering the ROPs, the driver would be able to toss an independent physics or post-processing task into the shader units alongside it. Kollock from Oxide Games used the analogy of HyperThreading, which allows two CPU threads to be executed on the same core at the same time, as long as it has the capacity for it.
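The HyperThreading analogy can be sketched in toy form: a greedy scheduler that hands idle shader units to an independent compute task while a ROP-bound graphics task is running. The unit counts and task names below are invented for illustration, not a model of any real GPU:

```python
# Toy model of async shaders: a GPU with a fixed pool of shader units.
# A ROP-bound graphics pass leaves most units idle, so an independent
# compute task (physics, post-processing) can run alongside it.
TOTAL_UNITS = 16

def schedule(tasks):
    """Greedy fill: grant each task as many of the remaining idle units as it wants."""
    free = TOTAL_UNITS
    placement = {}
    for name, wanted in tasks:
        granted = min(wanted, free)
        placement[name] = granted
        free -= granted
    return placement, free

# The graphics pass only keeps 4 shader units busy in this time slice;
# the async physics task soaks up 10 of the remaining 12.
placement, idle = schedule([("graphics (ROP-bound)", 4), ("async physics", 10)])
print(placement, f"{idle} units idle")
```

Without the second task, 12 of the 16 units would sit idle for the whole slice; with it, only 2 do, which is the entire point of the feature.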
Kollock also notes that compute is becoming more important in the graphics pipeline, and it is possible to completely bypass graphics altogether. The fixed-function bits may never go away, but it's possible that at least some engines will completely bypass it -- maybe even their engine, several years down the road.
But, like always, you will not get an infinite amount of performance by reducing your waste. You are always bound by the theoretical limits of your components, and you cannot optimize past that (except for obviously changing the workload itself). The interesting part is: you can measure that. You can absolutely observe how long a GPU is idle, and represent it as a percentage of a time-span (typically a frame).
And, of course, game developers profile GPUs from time to time...
According to Kollock, he has heard of some console developers getting up to 30% increases in performance using Asynchronous Shaders. Again, this is on console hardware, so this amount may increase or decrease on the PC. In an informal chat with a developer at Epic Games (so a massive grain of salt is required), his late-night, ballpark, "totally speculative" guesstimate was that, on the Xbox One, the GPU could theoretically accept a maximum of ~10-25% more work in Unreal Engine 4, depending on the scene. He also said that memory bandwidth gets in the way, which Asynchronous Shaders would be fighting against. It is something that they are interested in and investigating, though.
This is where I speculate on drivers. When Mantle was announced, I looked at its features and said “wow, this is everything that a high-end game developer wants, and a graphics developer absolutely does not”. From the OpenCL-like multiple GPU model taking much of the QA out of SLI and CrossFire, to the memory and resource binding management, this should make graphics drivers so much easier.
It might not be free, though. Graphics drivers might still have a bunch of games to play to make sure that work is stuffed through the GPU as tightly packed as possible. We might continue to see "Game Ready" drivers in the coming years, even though much of that burden has been shifted to the game developers. On the other hand, maybe these APIs will level the whole playing field and let all players focus on chip design and efficient ingestion of shader code. As always, painfully always, time will tell.
Subject: Systems, Mobile | September 2, 2015 - 10:00 AM | Sebastian Peak
Tagged: V Nitro, Skylake, NVMe, nvidia, notebook, mu-mimo, laptop, IFA 2015, geforce, aspire V, acer
Acer’s updated V Nitro notebook series has been announced, and the notebooks have received the newest Intel mobile processors along with the latest connectivity and some advanced wireless tech.
The Aspire V 13
"The refreshed Aspire V Nitro Series notebooks and Aspire V 13 support the latest USB 3.1 Type-C port, while 'Black Edition' Aspire V Nitro models support Thunderbolt 3, which brings Thunderbolt to USB Type-C at speeds up to 40Gbps. All models include Qualcomm VIVE 2x2 802.11ac Wi-Fi with Qualcomm MU | EFX MU-MIMO technology."
MU-MIMO devices are just starting to hit the market and the tech promises to eliminate bottlenecks when multiple devices are in use on the same network – with compatible adapters/routers, that is.
The Aspire V 15 Nitro
What kind of hardware will be offered? Here’s a brief overview:
- 6th Gen Intel Core processors
- Up to 32GB DDR4 system memory
- NVIDIA GeForce graphics
- (SATA) SSD/SSHD/HDD storage options
- Touchscreen option added for the 15-inch model
Additionally, the “Black Edition” models offer a 4K 100% Adobe RGB display option, NVIDIA GeForce GTX 960M up to 4GB, NVMe SSDs, and something called “AeroBlade” thermal exhaust, which Acer said has “the world’s thinnest metallic blades of just 0.1mm thin, which are stronger and quieter”.
The Aspire V 17 Nitro
Pricing will start at $599 for the V Nitro 13, $999 for the V Nitro 15, and $1099 for the V Nitro 17. All versions will be available in the U.S. in October.
Subject: Systems, Mobile | September 2, 2015 - 07:00 AM | Sebastian Peak
Tagged: nvidia, notebooks, Lenovo, laptops, Intel Skylake, Intel Braswell, IFA 2015, ideapad 500S, ideapad 300S, ideapad 100S, Ideapad, gtx, APU, amd
Lenovo has unveiled its reinvented ideapad (now all lowercase) lineup at IFA 2015 in Berlin, and the new laptops feature updated processors including Intel Braswell and Skylake, as well as some discrete AMD and NVIDIA GPU options.
At the entry-level price-point we find the ideapad 100S which does not contain one of the new Intel chips, instead running an Intel Atom Z3735F CPU and priced accordingly at just $189 for the 11.6” version and $259 for the 14” model. While low-end specs (2GB RAM, 32GB/64GB eMMC storage, 1366x768 screen) aren’t going to blow anyone away, these at least provide a Windows 10 alternative to a Chromebook at about the same cost, and to add some style Lenovo is offering the laptop in four colors: blue, red, white, and silver.
Moving up to the 300S we find a 14” laptop (offered in red, black, or white) with Intel Pentium Braswell processors up to the quad-core N3700, and the option of a FHD 1920x1080 display. Memory and storage options will range up to 8GB DDR3L and up to either 256GB SSD or 1TB HDD/SSHD. At 0.86" thick the 300S weighs 2.9 lbs, and prices will start at $479.
A lower-cost ideapad 300, without the “S” and with more basic styling, will be available in sizes ranging from 14” to 17” and prices starting between $399 and $549 for the respective models. A major distinction will be the inclusion of both Braswell and Intel 6th Gen Skylake CPUs, as well as the option of a discrete AMD GPU (R5 330M).
Last we have the ideapad 500S, available in 13.3”, 14”, and 15.6” versions. With Intel 6th Gen processors up to Core i7 like the 300S, these also offer optional NVIDIA GPUs (GTX 920M for the 13.3", 940M for the 14"+) and up to FHD screen resolution. Memory and storage options range up to 8GB DDR3L and up to either 256GB SSD or 1TB HDD/SSHD, and the 500S is a bit thinner and lighter than the 300S, with the 13.3” version 0.76” thick and 3.4 lbs, moving up to 0.81” and 4.6 lbs with the 15.6” version.
A non-S version of the ideapad 500 will also be available, and this will be the sole AMD CPU representative with the option of an all-AMD solution powered by up to the A10-7300 APU, or a combination of R7 350M graphics along with 6th Gen Intel Core processors. 14” and 15” models will be available starting at $399 for the APU model and $499 with an Intel CPU.
All of the new laptops ship with Windows 10 as Microsoft’s newest OS arrived just in time for the back-to-school season.
Subject: Graphics Cards | August 31, 2015 - 11:19 PM | Scott Michaud
Tagged: nvidia, graphics drivers, geforce, drivers
Unlike last week's 355.80 Hotfix, today's driver is fully certified by both NVIDIA and Microsoft (WHQL). According to users on GeForce Forums, this driver includes the hotfix changes, although I am still seeing a few users complain about memory issues under SLI. The general consensus seems to be that a number of bugs were fixed, and that driver quality is steadily increasing. This is also a “Game Ready” driver for Mad Max and Metal Gear Solid V: The Phantom Pain.
NVIDIA's GeForce Game Ready 355.82 WHQL Mad Max and Metal Gear Solid V: The Phantom Pain drivers (inhale, exhale, inhale) are now available for download at their website. Note that Windows 10 drivers are separate from Windows 7 and Windows 8.x ones, so be sure to not take shortcuts when filling out the “select your driver” form. That, or just use GeForce Experience.
Subject: Graphics Cards | August 27, 2015 - 09:23 PM | Scott Michaud
Tagged: windows 10, nvidia, geforce, drivers, graphics drivers
While GeForce Hotfix driver 355.80 is not certified, or even beta, I know that a lot of our readers have issues with SLI in Windows 10. Especially in games like Battlefield 4, memory usage would expand until, apparently, a crash occurred. Since I run a single GPU, I have not experienced this issue and so I cannot comment on what happens. I just know that it was very common in the GeForce forums and in our comment section, so it was probably a big problem for many users.
If you are not experiencing this problem, then you probably should not install this driver. This is a hotfix that, as stated above, was released outside of NVIDIA's typical update process. You might experience new, unknown issues. Affected users, on the other hand, have the choice to install the fix now, which could very well be stable, or wait for a certified release later.
You can pick it up from NVIDIA's support site.