AMD Will Sell Wraith Max CPU Cooler Separately

Subject: Memory | August 6, 2017 - 11:41 AM |
Tagged: wraith max, Wraith, ryzen, fm2, amd, AM4

Amidst all the big AMD announcements recently, the company quietly revealed that it will begin selling the Wraith Max CPU cooler separately at retail. The Wraith Max heatsink and fan combo was previously available only in OEM systems and in boxed SKUs of the highest-end Ryzen processors (mainly the 1700X and 1800X). The cooler is a refreshed and upgraded version of the company’s original Wraith cooler. It measures 105 x 105 x 85mm and features a boxy horizontal design with a copper baseplate and heatpipes, a shrouded 92mm fan, and an RGB LED ring around the fan that can be controlled via motherboard software.


The Wraith Max is rated at 140W TDP and connects to the system using a fan header and USB (for controlling the lighting). AMD further rates the cooler at a fairly quiet 38 dBA. The Wraith Max supports all of the usual AMD sockets including AM4, AM3, and FM2 (no Threadripper support, of course), but there is no official support for Intel sockets.

The Wraith Max cooler will retail for $59 USD. I have been keeping an eye on the usual online retailers and have not yet seen it listed, but it should be available soon. Hopefully there will be more reviews of the cooler now that it is a standalone retail product, and maybe we can get Sebastian to take a look at it and compare it to the original Wraith cooler he reviewed last year (and his usual lineup, of course).

Source: AMD


August 6, 2017 | 03:18 PM - Posted by TumBillWeedz (not verified)

"Jul 28, 2017 - AMD sends flagship Wraith Max CPU cooler to retail for $59" from PCGamer!

This is rather old news by now, with the more interesting news not noticed.

Here is some more interesting news that needs more attention and discussion concerning Vega and Vega's HBCC/HBC, and this has more implications for any discrete mobile/laptop GPU based on the Vega micro-arch:

"But it's the High Bandwidth Cache and High Bandwidth Controller silicon which looks the most exciting and that's all related to moving outside of the limits of the graphics card's video memory. In normal GPUs, developers have to fit all the data they need to render into the frame buffer, meaning all the polygons, shaders, and textures have to squeeze into your card's VRAM.

That can be restrictive and devs have to find clever workarounds for large, open-world games. The revolution with AMD's Vega design is to break free of those limits. The High Bandwidth Cache and High Bandwidth Controller mean the GPU can now stream in rendering data from your PC's system memory or even an SSD, meaning it doesn't have to come via the card's frame buffer.

AMD Vega High Bandwidth Cache and Controller

"You are no longer limited by the amount of graphics memory you have on the chip," Wasson explains. "It's only limited by the amount of memory or storage you attach to your sytem."

The Vega architecture is capable of scaling right up to a maximum of 512TB as the virtual address space available to the graphics silicon. AMD are calling Vega 'the world's most scalable GPU memory architecture' and they look to be on the money with that claim at first glance. But it will still depend on just how many developers will jump onboard with the new programming techniques when Vega launches. "(1)


"AMD Vega - release date, price, specs, and performance"

August 6, 2017 | 03:32 PM - Posted by TumBillWeedz (not verified)

P.S. Here is another interesting part of the same PCGamer article. Any discrete mobile Vega GPU SKUs with 2GB/4GB of HBM2/cache are not going to be as limited by having less video memory/HBM2 under Vega with that HBCC/HBC IP:

"In order to show what HBCC can offer games AMD showed a version of Vega limited to just 2GB of video memory where one card had HBCC enabled and one with the memory caching tech turned off. This was to simulate where the new technology can help out when the frame buffer is being maxed out.

The Rise of the Tomb Raider demo showed a massive difference in both the average and minimum frame rates outputted by the same spec of GPU. It's the minimums that are the most interesting part of this, though, with the 2GB card without HBCC bottoming out at 13.7fps while the equivalent GPU running the new HBCC tech scores 46.5fps. That's between 2x and 3x the minimum frame rate which will have a huge impact on just how smooth a game feels to play."

August 9, 2017 | 05:12 PM - Posted by Just an Nvidia User (not verified)

As for the article on topic, I just don't think many people will buy this for an astounding $59 when there are many cheaper options available. Double the price and you can get water cooling. Or get a more premium cooler for slightly more. This screams fanboy product more than anything.

Off Topic: I've been gaming on a 4 gig card since 2013. HBCC comparison with a 2 gig card is pathetic. Why didn't they show how effective it is against a card with 4 gigs or more? I'm guessing it wouldn't look impressive at all.

It has more to do with memory hogging settings used in the game rather than the effectiveness of HBCC. Games won't start using massive amounts of memory just because they make cards with it.

11 gigs on the 1080 Ti is overkill. 8 gigs is overkill. Unless games start being 200gb, I don't see more being needed. Games are usually no more than 50 gigs for storage medium reasons. They fit on one disc compressed, and devs don't want to take up too much hdd/ssd storage, for example.

HBCC doesn't seem like worthwhile tech. Video cards are already able to use system memory for a cache so this just extends it to possibly slower cache sources such as hard drives or ssds.

Unless they start including it on said 2 gig cards for $150 or less instead of putting it on a card that already has 8 gigs at cost of $400 or more, I don't see this helping the consumer market much.

August 10, 2017 | 02:16 PM - Posted by fanbois suck (not verified)

Single-rad AIO coolers have been proven to be some of the worst coolers on the market. Fitting the typical EVO 212 tower into a small form factor setup will not work, and even if it did, it provides zero cooling to the VRMs on the mobo. Most quality down-blowing coolers like the Noctua NH-L9x65 provide similar performance for roughly the same price minus the LED factor (I have one on my Xeon rig). This is a fairly priced, high quality cooler. "Just an Nvidia User"...your name screams fanboy.

August 8, 2017 | 12:34 AM - Posted by James

It doesn't seem like there is any new information there. I believe GPUs have had a form of virtual memory for a while. It doesn't seem to have been used much; I think it required developer support to make use of it. The implementation in Vega looks like a full-fledged, transparent virtual memory system. It should reduce memory consumption considerably, and it may mean developers don't have to micro-manage memory as they do now.

I want more specific information, though. I would want to know what the page (or whatever they are calling it) size is. Also, they must have implemented prefetching of some kind. I would wonder if it is almost exactly the same as the virtual memory management unit in Ryzen. Vega may also have an essentially exact copy of the fabric and router from Ryzen.

I have seen rumors of some APUs that look like they have a 16-core Ryzen connected to a small Vega-based GPU with HBM, using 4 links for the CPU to GPU connection. All of that could be placed onto an MCM. If it is a real product, it will make an excellent GPU compute device. They would be great made into a blade system.
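The transparent paging idea described above can be sketched in a few lines. This is a hypothetical toy model (not AMD's actual implementation): a small "VRAM" cache backed by a larger store standing in for system RAM or an SSD, with pages fetched on demand and the least-recently-used page evicted when the cache fills.

```python
from collections import OrderedDict

class PagedCache:
    """Toy demand-paged cache: small fast memory backed by a big slow store."""

    def __init__(self, capacity_pages, backing_store):
        self.capacity = capacity_pages
        self.backing = backing_store   # simulates system RAM / SSD
        self.cache = OrderedDict()     # simulates on-card memory (LRU order)
        self.misses = 0

    def read(self, page_id):
        if page_id in self.cache:
            self.cache.move_to_end(page_id)   # hit: mark as recently used
            return self.cache[page_id]
        self.misses += 1                      # "page fault": stream it in
        if len(self.cache) >= self.capacity:
            self.cache.popitem(last=False)    # evict least-recently-used page
        data = self.backing[page_id]
        self.cache[page_id] = data
        return data

# A working set of 3 pages hammered through a 2-page "VRAM"
backing = {i: f"page-{i}" for i in range(8)}
vram = PagedCache(capacity_pages=2, backing_store=backing)
for page in [0, 1, 0, 2, 0, 1]:
    vram.read(page)
print(vram.misses)  # 4
```

The GPU-side hardware would do this with real page tables and prefetching rather than a Python dict, but the access pattern above shows why minimum frame rates are where the benefit shows up: without any caching, every one of those six reads would have gone to the slow store.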

August 7, 2017 | 03:39 AM - Posted by JohnGR

If someone has a look at eBay and how much people are asking for original old AMD coolers, maybe the price is not so high.

OK, seriously, AMD is not trying to sell many coolers here. You don't put a $59 price tag on it if you want to sell. This price tag is more about marketing: it makes the free cooler bundled with some AMD processors look like a very expensive gift, making those processors even more attractive than they already are.

August 7, 2017 | 08:48 AM - Posted by collie

If it were me, I'd add some brackets for Intel boards, just as a big middle finger. The LED is a nice touch.

August 7, 2017 | 03:04 PM - Posted by agello24 (not verified)

If my Hyper 212 can't keep my proc under 59, then I know the AMD cooler won't help much.

August 8, 2017 | 09:21 AM - Posted by malurt

OK. I guess trolling was a great post.
