Computex 2017: NVIDIA GeForce GTX Max-Q Design Notebooks are Thinner, Lighter

Subject: Graphics Cards, Mobile | May 30, 2017 - 12:48 AM |
Tagged: nvidia, mobile, max-q design, max-q, GTX 1080, geforce

During CEO Jensen Huang’s keynote at Computex tonight, NVIDIA announced a new initiative called GeForce GTX with Max-Q Design, targeting the mobile gaming market with products that are lighter and thinner, yet more powerful, than previously available gaming notebooks.

slide1.jpg

The idea behind this technology differentiation centers on gaming notebooks, which have seen limited evolution in form factor and design over the last several years. The biggest stereotype of gaming notebooks today is that they must be big, bulky, and heavy to provide a gaming experience competitive with desktop computers. NVIDIA is taking it upon itself to help drive innovation forward in this market, in some ways similar to how Intel created the Ultrabook.

slide2.jpg

Using “typical” specifications from previous machines built around a GeForce GTX 880M (admittedly a part that came out in early 2014), NVIDIA claims that Max-Q Designs will offer compelling gaming notebooks with half the weight and nearly a third of the thickness, yet 3x the performance. Utilizing a GeForce GTX 1080 GP104 GPU, the team is focusing on four specific hardware data points to achieve this goal.

slide3.jpg

First, NVIDIA is setting the specifications of the GPUs in this design so they run at their maximum efficiency point, allowing the notebook to get the best possible gaming performance from Pascal with the smallest amount of power draw. This is an obvious move and is likely something that has been occurring for a while, though further down the product stack. It’s also likely that NVIDIA is aggressively binning GP104 parts, filtering for those that require the least power to hit the performance target of Max-Q Designs.

Second, NVIDIA is depending on GeForce Experience software to set in-game settings optimally for power consumption. Though details are light, this likely means running games with a frame rate limiter enabled, keeping gamers from rendering at frame rates well above their screen’s refresh rate (static or G-Sync), which is an unnecessary power drain. It could also mean lower quality settings than we might normally associate with a GeForce GTX 1080 graphics card.
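
To illustrate why capping helps, here is a minimal sketch of a frame-rate-limited render loop. This is my example, not NVIDIA's implementation (the actual GeForce Experience mechanism is not public), and runFrameLimited and renderFrame are hypothetical names:

```cpp
#include <chrono>
#include <thread>

// Sketch: cap the frame rate at the panel's refresh rate. Time spent
// sleeping is time the GPU and CPU can spend in low-power states instead
// of rendering frames the display can never show.
void runFrameLimited(double targetHz, int frameCount)
{
    using clock = std::chrono::steady_clock;
    const auto budget = std::chrono::duration_cast<clock::duration>(
        std::chrono::duration<double>(1.0 / targetHz));
    auto deadline = clock::now() + budget;
    for (int i = 0; i < frameCount; ++i) {
        // renderFrame();                        // hypothetical game work
        std::this_thread::sleep_until(deadline); // idle until the next frame slot
        deadline += budget;
    }
}
```

A 60Hz cap on a GPU capable of rendering well over 100 FPS can roughly halve the render workload, which is exactly the kind of thermal headroom a thin chassis needs.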

systemcomparison.jpg

Comparing a 3-year old notebook versus a Max-Q Design

The third and fourth points are heavily related: using the best possible cooling solutions and integrating the most efficient power regulators available. The former allows the GPU to be cooled quickly and quietly (with a quoted sub-40 dBA goal), keeping the GTX 1080 on its peak efficiency curve. And putting the GPU in that state would be wasted effort with inefficient power delivery hardware, so NVIDIA is setting standards there too.

UPDATE: From the NVIDIA news release just posted on the company's website, we learned of a couple of new additions to Max-Q Design:

NVIDIA WhisperMode Technology
NVIDIA also introduced WhisperMode technology, which makes laptops run much quieter while gaming. WhisperMode intelligently paces the game's frame rate while simultaneously configuring the graphics settings for optimal power efficiency. This reduces the overall acoustic level for gaming laptops. Completely user adjustable and available for all Pascal GPU-based laptops, WhisperMode will be available soon through a GeForce Experience software update.

Availability
Max-Q designed gaming laptops equipped with GeForce GTX 1080, 1070 and 1060 GPUs will be available starting June 27 from the world's leading laptop OEMs and system builders, including Acer, Aftershock, Alienware, ASUS, Clevo, Dream Machine, ECT, Gigabyte, Hasee, HP, LDLC, Lenovo, Machenike, Maingear, Mechrevo, MSI, Multicom, Origin PC, PC Specialist, Sager, Scan, Terrans Force, Tronic'5, and XoticPC. Features, pricing and availability may vary.

Jensen showed an upcoming ASUS Republic of Gamers notebook called Zephyrus that hits all of these targets; ASUS is likely NVIDIA’s initial build partner. On it, they demonstrated Project CARS 2, certainly an impressive-looking title. No information was given on image quality settings, resolution, frame rate, etc.

zephyrus.jpg

The ASUS ROG Zephyrus Max-Q Design Gaming Notebook

This design standard is impressive, and though I assume many gamers and OEMs will worry about an outside party setting requirements for upcoming designs, I err on the side of this being a necessary step. If you remember notebooks before Intel’s Ultrabook push, they were stagnant and uninspiring. Intel’s somewhat forceful move to make OEMs innovate and compete in a new way changed the ecosystem at a fundamental level. It is very possible that GeForce GTX with Max-Q Design will do the same for gaming notebooks.

An initiative like this continues NVIDIA’s apparent goal of establishing itself as the “PC brand”, competing more with Xbox and PlayStation than with Radeon. Jensen claimed that more than 10 million GeForce gaming notebooks were sold in the last year, exceeding sales of Xbox hardware over the same time frame. He also called out the ASUS prototype notebook as having compute capability 60% higher than that of the PS4 Pro. It’s clear that NVIDIA wants to be more than just the add-in card leader, more than just the leader in computer graphics. Owning the ecosystem vertical gives the company more control and more power to drive the direction of software and hardware.
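
For context on that last claim (my arithmetic, using the commonly cited 4.2 TFLOPS figure for the PS4 Pro): a 60% advantage works out to roughly 4.2 × 1.6 ≈ 6.7 TFLOPS of single-precision compute, plausible for a GTX 1080 running at Max-Q’s efficiency-first clocks.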

asusrog.jpg

The ASUS ROG Zephyrus Max-Q Design Gaming Notebook

So, does the Max-Q Design technology change anything? Considering the Razer Blade is already under 18mm thick, the argument could be made that the market was already heading down this path and NVIDIA is simply jumping in to take credit for the move. Though Razer is a great partner for NVIDIA, it is likely irked that NVIDIA is pushing all OEMs in this direction, stealing some of the thunder from a type of design that Razer started and evangelized.

That political discussion aside, Max-Q Design will bring new and better gaming notebook options to market from many OEMs, lowering the price of entry for these flagship designs. NVIDIA did not mention anything about cost requirements or market segments for Max-Q, so I expect the first wave of these machines to sit at the premium end of the scale. Over time, as cost-cutting measures take hold and the appeal of thinner, lighter gaming notebooks is well understood, Max-Q Designs could find themselves in a wide range of price segments.

Source: NVIDIA

Computex 2017: EVGA Unveils GTX 1080 Ti Kingpin With Guaranteed 2GHz+ Overclock

Subject: Graphics Cards | May 29, 2017 - 08:30 PM |
Tagged: Kingpin, gtx 1080 ti, gpu, evga, computex 2017

EVGA today took the wraps off its latest and highest-end NVIDIA GPU with the announcement of the EVGA GeForce GTX 1080 Ti Kingpin Edition. Part of the company's continuing line of "K|NGP|N" licensed graphics cards, the 1080 Ti Kingpin includes performance, cooling, and stability-minded features that are intended to set it apart from all of the other 1080 Ti models currently available.

evga-1080-ti-kingpin-top.jpg

From a design standpoint, the 1080 Ti Kingpin features an oversized PCB, triple-fan iCX cooler, an expansive copper heat sink, and right-edge PCIe connectors (2 x 8pin), meaning that those with an obsession for cable management won't need to pick up something like the EVGA PowerLink. The card's design is also thin enough that owners can convert it into a true single-slot card by removing the iCX cooler, allowing enthusiasts to pack more water- or liquid nitrogen-cooled GPUs into a single chassis.

The GTX 1080 Ti Kingpin also features a unique array of display outputs, with dual-link DVI, HDMI 2.0, and three Mini DisplayPort 1.3 connectors. This compares with the three full-size DisplayPort and single HDMI outputs found on the 1080 Ti reference design. The presence of the DVI port on the Kingpin edition also directly addresses the concerns of some NVIDIA customers who weren't fans of NVIDIA's decision to ditch the "legacy" connector.

evga-1080-ti-kingpin-power.jpg

With its overbuilt PCB and enhanced cooling, EVGA claims that users will be able to achieve greater performance from the Kingpin Edition compared to any other currently shipping GTX 1080 Ti. That includes a "guaranteed" overclock of at least 2025MHz right out of the box, which compares to the 1480MHz base / 1600MHz boost clock advertised for the 1080 Ti's reference design (although it's important to note that NVIDIA's advertised boost clocks have become quite conservative in recent years, and many 1080 Ti owners are able to easily exceed 1600MHz with modest overclocking).
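
Some quick math on that guarantee (mine, not EVGA's): 2025MHz is roughly 37% above the advertised 1480MHz base clock and about 27% above the 1600MHz boost clock, a sizeable factory-backed margin even before any manual overclocking.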

EVGA has yet to confirm an exact release date for the GeForce GTX 1080 Ti Kingpin, but it is expected to launch in late June or July. As for price, EVGA has also declined to provide specifics, but interested enthusiasts should start saving their pennies now. Based on previous iterations of the "K|NGP|N" flagship model, expect a price premium of anywhere between $100 and $400.

Source: EVGA

SoftBank Invests $4 Billion In NVIDIA, Becomes Fourth Largest Shareholder

Subject: General Tech, Graphics Cards | May 27, 2017 - 12:18 AM |
Tagged: vision fund, softbank, nvidia, iot, HPC, ai

SoftBank, the Tokyo-based Japanese telecom and internet technology company, has reportedly and quietly amassed a 4.9% stake in graphics chip giant NVIDIA. Bloomberg reports that SoftBank has carefully invested $4 billion in NVIDIA, keeping its stake under 5% of the company to avoid the need for regulatory approval in the US. SoftBank has promised the current administration that it will invest $50 billion in US tech companies, and it seems that NVIDIA is the first major part of that plan.
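
As a rough sanity check on those figures (my arithmetic, not Bloomberg's): a $4 billion purchase amounting to a 4.9% stake implies a market capitalization of about $4 billion ÷ 0.049 ≈ $82 billion, which squares with NVIDIA's valuation at the time.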

SXM2-VoltaChipDetails.png

NVIDIA's Tesla V100 GPU.

Led by Chairman and CEO Masayoshi Son, SoftBank is not afraid to invest in technology companies it believes in with major past acquisitions and investments in companies like ARM Holdings, Sprint, Alibaba, and game company Supercell.

The $4 billion investment makes SoftBank the fourth largest shareholder in NVIDIA, whose stock has rallied on SoftBank’s purchases and vote of confidence. The $100 billion Vision Fund (currently at $93 billion in commitments) may also follow SoftBank’s lead in acquiring a stake in NVIDIA, which is involved in graphics, HPC, AI, deep learning, and gaming.

Overall, this is good news for NVIDIA and its shareholders. I am curious what other plays SoftBank will make for US tech companies.

What are your thoughts on SoftBank investing heavily in NVIDIA?

EVGA's Hydro Copper waterblock for GTX 1080

Subject: Graphics Cards | May 26, 2017 - 03:56 PM |
Tagged: evga, Hydro Copper GTX 1080, water cooler, nvidia

EVGA's Hydro Copper GTX 1080 is purpose-built to fit any GTX 1080 on the market, with thermal pads for the memory and VRMs already attached and a tube of EVGA Frostbite thermal paste included for the GPU. The ports that connect into your watercooling loop are further apart than usual, something TechPowerUp was initially skeptical about; once they tested the cooler those doubts disappeared, though they had other concerns about the design. Check out the review for the full details on this cooler's performance.

block-2.jpg

"The EVGA Hydro Copper GTX 1080 is a full-cover waterblock that offers integrated lighting with no cable management needed, a six-port I/O port manifold, and an aluminum front cover for aesthetics and rigidity alike. It also aims to simplify installation by incorporating pre-installed thermal pads out of the box."


Source: TechPowerUp

ZOTAC also announced an External VGA Box

Subject: Graphics Cards, Shows and Expos | May 25, 2017 - 07:24 PM |
Tagged: external gpu, zotac, thunderbolt 3, computex 2017

zotac-external-vga-box_image01.jpg

ZOTAC hasn't given us much detail, but as you would expect, the external GPU box connects a graphics card to your system via a Thunderbolt 3 connection, letting you add GPU power to a mobile system or any other computer that needs a little boost to its graphics. You can fit cards of up to 9" in length, which makes it a perfect match for the two Mini GPUs just below, or other lower-powered cards that are not as well endowed as your average GTX 1080 or 1080 Ti. It also adds four USB 3.0 ports and a Quick Charge 3.0 port to your system, so you can leave the box set up at home, simply attach your laptop via the Thunderbolt cable, and get right to gaming.

zotac-external-vga-box_image02.jpg

Source: Zotac

Zotac announces a pair of really Mini GTX 1080 Ti's

Subject: Graphics Cards, Shows and Expos | May 25, 2017 - 07:00 PM |
Tagged: zotac, GTX 1080 Ti Mini, GTX 1080 Ti Arctic Storm Mini, gtx 1080 ti, computex 2017

ZOTAC is claiming bragging rights for the size of its new GTX 1080 Ti's: they are the smallest of their kind. The two new cards measure a minuscule 210.8mm (8.3") in length, and in the case of the Arctic Storm Mini, it is also the lightest watercooled GPU on the market.

ZT-P10810G-10P_image2.jpg

You can gauge the size of the ZOTAC GeForce GTX 1080 Ti Mini by how much of its length is taken up by the PCIe connector; most 1080 Ti's are over a foot long. This card is not even long enough to fit a third fan.

1080Ti-ArcticStorm-Mini-05.png

The Arctic Storm version is the same size as the air-cooled model but opts for the world's lightest watercooler. That may mean you will want a powerful pump in your loop, as there is less metal to transfer heat, but it also means small, silent builds can pack a lot of graphical power.

Both of these cards use dual 8-pin PCIe power connectors; expect to see more of them at Computex.


Source: Zotac

Manufacturer: The Khronos Group

The Right People to Interview

Last week, we reported that OpenCL’s roadmap would be merging into Vulkan, and OpenCL would, starting at some unspecified time in the future, be based “on an extended version of the Vulkan API”. This was based on quotes from several emails between myself and the Khronos Group.

Since that post, I had the opportunity to have a phone interview with Neil Trevett, president of the Khronos Group and chairman of the OpenCL working group, and Tom Olson, chairman of the Vulkan working group. We spent a little over half an hour going over Neil’s International Workshop on OpenCL (IWOCL) presentation, discussing the decision, and answering a few lingering questions. This post presents the results of that conference call in a clean, readable way.

khronos-officiallogo-Vulkan_500px_Dec16.png

First and foremost, while OpenCL is planning to merge into the Vulkan API, the Khronos Group wants to make it clear that “all of the merging” is coming from the OpenCL working group. The Vulkan API roadmap is not affected by this decision. Of course, the Vulkan working group will be able to take advantage of technologies that are dropping into their lap, but those discussions have not even begun yet.

Neil: Vulkan has its mission and its roadmap, and it’s going ahead on that. OpenCL is doing all of the merging. We’re kind-of coming in to head in the Vulkan direction.

Does that mean, in the future, that there’s a bigger wealth of opportunity to figure out how we can take advantage of all this kind of mutual work? The answer is yes, but we haven’t started those discussions yet. I’m actually excited to have those discussions, as are many people, but that’s a clarity. We haven’t started yet on how Vulkan, itself, is changed (if at all) by this. So that’s kind-of the clarity that I think is important for everyone out there trying to understand what’s going on.

Tom also prepared an opening statement. It’s not as easy to abbreviate, so it’s here unabridged.

Tom: I think that’s fair. From the Vulkan point of view, the way the working group thinks about this is that Vulkan is an abstract machine, or at least there’s an abstract machine underlying it. We have a programming language for it, called SPIR-V, and we have an interface controlling it, called the API. And that machine, in its full glory… it’s a GPU, basically, and it’s got lots of graphics functionality. But you don’t have to use that. And the API and the programming language are very general. And you can build lots of things with them. So it’s great, from our point of view, that the OpenCL group, with their special expertise, can use that and leverage that. That’s terrific, and we’re fully behind it, and we’ll help them all we can. We do have our own constituency to serve, which is the high-performance game developer first and foremost, and we are going to continue to serve them as our main mission.

So we’re not changing our roadmap so much as trying to make sure we’re a good platform for other functionality to be built on.
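
Tom’s “abstract machine” framing is easier to see with a concrete fragment. Below is a minimal, hypothetical C++ sketch (mine, not Khronos’; recordComputeDispatch is an invented name, and pipeline and descriptor setup are assumed to exist elsewhere) showing Vulkan driving a pure compute workload with no graphics state at all, the territory an OpenCL-style workload would occupy:

```cpp
#include <vulkan/vulkan.h>

// Records a compute-only dispatch into a command buffer that is already
// in the recording state. No render pass, no graphics pipeline -- just
// SPIR-V compute work driven through the Vulkan API.
void recordComputeDispatch(VkCommandBuffer cmd,
                           VkPipeline computePipeline,
                           VkPipelineLayout layout,
                           VkDescriptorSet descriptors,
                           uint32_t groupCountX)
{
    vkCmdBindPipeline(cmd, VK_PIPELINE_BIND_POINT_COMPUTE, computePipeline);
    vkCmdBindDescriptorSets(cmd, VK_PIPELINE_BIND_POINT_COMPUTE, layout,
                            0, 1, &descriptors, 0, nullptr);
    vkCmdDispatch(cmd, groupCountX, 1, 1); // workgroups of a SPIR-V kernel
}
```

The shader bound to computePipeline would be SPIR-V compiled from GLSL today, or, presumably, from OpenCL-style kernels once the merged roadmap matures.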

Neil then went on to mention that the decision to merge OpenCL’s roadmap into the Vulkan API took place only a couple of weeks ago. The purpose of the press release was to reach OpenCL developers and get their feedback. According to him, they did a show of hands at the conference, in a room full of a hundred OpenCL developers, and no one was against moving to the Vulkan API. This gives them confidence that developers will accept the decision, and that their needs will be served by it.

Next up is the why. Read on for more.

EK is releasing new GeForce GTX FE Full-Cover water blocks

Subject: Graphics Cards | May 23, 2017 - 03:58 PM |
Tagged: ek cooling, pascal, nvidia, waterblock, GTX FE

The current series of EK Cooling waterblocks for Pascal-based GPUs, up to and including the new Titan X, is being replaced with a new family of coolers. The new GTX FE water blocks will be compatible with the previous generation of backplates, so you can do a partial upgrade or keep an eye out for discounts on the outgoing models.

ek.jpg

These new coolers will fit any Founders Edition reference card, from the GTX 1060 through to the Titan X; the compatibility list currently stands at 106 unique graphics cards, so your card is likely to be covered. You can choose between four models: a plain design, one with acetal, one with nickel, and one with both acetal and nickel. Whichever one you choose, it will run you 109.95€ / $125 USD.

EK-FC1080_GTX_FE_CP_PCB+Coolant.jpg

Full PR is below.

EK Water Blocks, the Slovenia-based premium computer liquid cooling gear manufacturer, is releasing several new EK-FC GeForce GTX FE water blocks that are compatible with multiple reference design Founders Edition NVIDIA® GeForce GTX 1060, 1070, 1080, 1080 Ti, Titan X Pascal and Titan Xp based graphics cards. All the water blocks feature the recently introduced aesthetic terminal cover as well! FE blocks come as a replacement to the current GeForce GTX 10x0 / TITAN X Series of water blocks.

All current GeForce GTX 10x0 / TITAN X Series of water blocks are going to be discontinued after the stock runs out and FE blocks come as a complete replacement. FE blocks are designed to fit all reference design Founders Edition NVIDIA GeForce GTX 1060, 1070, 1080, 1080 Ti, Titan X Pascal and Titan Xp based graphics cards. The current compatibility list rounds up a total of 106 graphics cards that are on the market, but as always, we recommend that you refer to the EK Cooling Configurator for a precise compatibility match.

The new EK-FC GeForce GTX FE water blocks are also backward compatible with all EK-FC1080 GTX Backplates, EK-FC1080 GTX Ti Backplates, and EK-FC Titan X Pascal Backplates.

Availability and pricing
These water blocks are made in Slovenia, Europe and are available for purchase through EK Webshop and Partner Reseller Network. In the table below you can see manufacturer suggested retail price (MSRP) with VAT included.

Capture.PNG


AMD Releases Radeon Software Crimson ReLive 17.5.2

Subject: Graphics Cards | May 20, 2017 - 07:01 AM |
Tagged: graphics drivers, amd

The second graphics driver of the month from AMD, Radeon Software Crimson ReLive 17.5.2, adds optimizations for Bethesda’s new shooter, Prey. AMD claims that it will yield up to a 4.5% performance improvement, as measured on an RX 580 (versus the same card with 17.5.1). This is over and above the up to 4.7% increase that 17.5.1 had over 17.4.4.
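
Compounding those two "up to" figures (my arithmetic, and it assumes both gains were measured the same way on the RX 580): 1.045 × 1.047 ≈ 1.094, or up to roughly 9.4% more Prey performance than on 17.4.4.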

amd-2016-crimson-relive-logo.png

Outside of that game, 17.5.2 also addresses four issues. The first is a crash in NieR: Automata. The second is long load times in Forza Horizon 3. The third is a system hang with the RX 550 when going to sleep. The fourth fixed issue is a bit more complicated: apparently, in a multi-GPU system where monitors are attached to multiple graphics cards, the primary graphics card can appear disabled in Radeon Settings. All four are now fixed, so, if they affect you, pick up the driver.

As always, they are available from AMD’s website.

Source: AMD

Google Daydream Standalone VR Headset Powered by Snapdragon 835

Subject: Graphics Cards, Mobile | May 17, 2017 - 02:30 PM |
Tagged: snapdragon 835, snapdragon, qualcomm, google io 2017, google, daydream

During the Google I/O keynote, Google and Qualcomm announced a partnership to create a reference design for a standalone Daydream VR headset using the Snapdragon 835, enabling an ecosystem of partners to have deliverable hardware in consumers’ hands by the end of 2017. The timeline is aggressive, impressively so, thanks in large part to the previous work Qualcomm had done with the Snapdragon-based VR reference design we first saw in September 2016. At the time, the Qualcomm platform was powered by the Snapdragon 820. Since then, Qualcomm has updated the design to integrate the Snapdragon 835 processor and platform, improving performance and efficiency along the way.

Google has now taken that reference platform, made some modifications to integrate Daydream support, and will offer it to partners to showcase what a standalone, untethered VR solution can do. Even though Google Daydream has been shipping in the form of slot-in phones with a “dummy” headset, integrating the whole package into a dedicated device offers several advantages.

First, I expect the standalone units to have better performance than the phones used in slot-in solutions. With the ability to tune the device to higher thermal limits, Qualcomm and Google will be able to ramp up the clocks on the GPU and SoC to get optimal performance. And, because there is room for a larger battery in a headset design, there should be an advantage in battery life along with the increase in performance.

sd835vr.jpg

The Qualcomm Snapdragon 835 VR Reference Device

It is also likely that the device will have better thermal properties than setups using high-end smartphones today. In other words, with more space there is more area for cooling, and the unit shouldn’t get as warm on the consumer’s face.

I would assume as well that the standalone units will have improved hardware over the smartphone iterations: better gyros, cameras, sensors, etc. that could lead to improved capability in this form factor. Better hardware, tighter and more focused integration, and better software support should mean lower latency and better VR gaming across the board, assuming everything is implemented as it should be.

The only major change that Google has made to this reference platform is the move away from Qualcomm’s 6DOF technology (six degrees of freedom, allowing you to move in real space with all necessary tracking done on the headset itself) to what Google calls WorldSense. Based on Google’s Project Tango technology, this is the one area I have questions about going forward. I have used three different Tango-enabled devices in long-term personal testing and can say that while the possibilities were astounding, the implementations have been…slow. For VR, that absolutely cannot be the case. I don’t yet know how different this integration is from what Qualcomm had done previously, but hopefully Google will leverage the work Qualcomm has already done with its platform.

Google is claiming that consumers will have hardware based on this reference design in 2017, but no pricing has been shared with me yet. I wouldn’t expect it to be inexpensive, though – we are talking about all the hardware that goes into a flagship smartphone, plus a little extra for the VR goodness. We’ll see how aggressive Google wants its partners to be, and whether it is willing to absorb any of the upfront costs through subsidies.

Let me know if this is the direction you hope to see VR move – away from tethered PC-based solutions and into the world of standalone units.

Source: Qualcomm