Subject: General Tech | June 12, 2015 - 01:14 PM | Jeremy Hellstrom
Tagged: hbm, leak, fury x, amd, Fiji, radeon, 390x
The rumours are flying today, with some purportedly leaked performance results of AMD's upcoming Fiji XT based card, the Fury X. The leak at Videocardz shows 3DMark Firestrike Ultra and Extreme results for an "AMD Radeon Graphics Processor" in single card configuration, plus Crossfire results for Extreme only. The results show a card that can keep up with the Titan X and, by extension, the new GTX 980 Ti as well. At 1440p in the Firestrike Extreme benchmark, the new AMD card seems to lag slightly behind NVIDIA in single and dual GPU configurations, but not by much, while in the Ultra test at 4K the AMD GPU pulls ahead, likely thanks to the new HBM-1 memory.
They also claim to have a source who has run the new GPU through the CompuBench suite, which gives us more information about the general architecture. The tests show a card with 64 Compute Units, which translates into 4096 Stream Cores if it is designed similarly to current Radeons. The tests also confirm the 1050MHz core clock and, more interestingly, that the 4GB of HBM-1 will run at a 500MHz memory clock on a 4096-bit bus, which is good news for those who like their resolutions as high as they can go. Nothing is confirmed yet, but these numbers bode well for the new Radeon architecture if they are true.
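Those two figures are enough to estimate peak memory bandwidth. HBM-1 is double data rate, so a 500MHz memory clock moves two bits per pin per cycle. A quick back-of-the-envelope sketch (illustrative arithmetic only, not an official spec):

```python
# Rough memory-bandwidth estimate from the leaked figures.
bus_width_bits = 4096       # leaked bus width
memory_clock_hz = 500e6     # leaked 500MHz memory clock
transfers_per_clock = 2     # HBM-1 is double data rate

bandwidth_bytes = bus_width_bits / 8 * memory_clock_hz * transfers_per_clock
print(bandwidth_bytes / 1e9)  # -> 512.0 (GB/s)
```

For comparison, a 512-bit GDDR5 card like the R9 290X tops out around 320 GB/s, so if these numbers hold, the bandwidth jump is substantial.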
(Image credit: VideoCardz.com)
Subject: Graphics Cards | June 10, 2015 - 02:34 AM | Sebastian Peak
Tagged: rumor, Radeon 390X, radeon 390, radeon, leak, Hawaii XT, hawaii, amd
Here we go again...
Image credit: WCCFtech
Even more information has allegedly leaked out ahead of AMD’s official announcement of new 300-series Radeon GPUs, this time from rumor site WCCFtech. This information is totally unverified, at least by any public source, but it is very specific about both price and GPU.
Here is the list published by WCCFtech in their report:
| Card | GPU | Price |
| --- | --- | --- |
| R9 390X 8GB | Enhanced Hawaii XT | $389 |
| R9 390 8GB | Enhanced Hawaii Pro | $329 |
| R9 380X 3GB/6GB | Tonga XT (NOT CONFIRMED) | |
| R9 380 4GB | Tonga Pro | $235 |
| R9 380 2GB | Tonga Pro | $195 |
| R7 370 4GB | Pitcairn | $175 |
| R7 370 2GB | Pitcairn | $135 |
| R7 360 2GB | Bonaire | $107 |
As to whether this comes via leaked slides or is complete guesswork, we’ll likely have no answer until the official unveiling. Such an announcement is likely the purpose of the AMD gaming event at E3 which is now just days away. We can only hope that Fiji will in fact be making an appearance at the show as it does not appear on this list (again, if accurate).
Subject: Graphics Cards | June 3, 2015 - 08:39 PM | Ryan Shrout
Tagged: amd, Fiji, radeon, R9, 390x, maybe
Sorry for all of these single item news posts I keep making, but this is how the information is coming out about AMD's upcoming Fiji GPU using new HBM (high bandwidth memory) technology. (And make no mistake this is exactly the way that AMD marketing dreamed it would happen.) Below we have an image of Fiji: the GPU die, the interposer and the four stacks of HBM.
That package is massive, quite simply, measuring about 70mm x 70mm based on the information presented during our HBM technical session last month. That is gigantic compared to a bare GPU die, but it is smaller than a previous-generation GPU plus the required memory chips spread separately across the PCB.
In case you missed it earlier today, AMD also released a teaser video of a CG Radeon card using Fiji. We'll know everything (maybe?) about AMD's latest flagship on June 16th.
Subject: Graphics Cards | June 3, 2015 - 03:25 PM | Ryan Shrout
Tagged: radeon, hype train, hbm, Fiji, amd
The AMD Fiji hype train keeps rolling. Here is a several-second teaser video that AMD posted to its AMD Radeon Graphics Twitter account.
— AMD Radeon Graphics (@AMDRadeon) June 3, 2015
First and foremost, it looks like all of the leaks about the cooler and card design of Fiji were at least mostly accurate. Also, note that AMD included the #AMD300 tag in the tweet, leading us to believe that the R9 390X is indeed going to be the branding.
Looks like we'll know more on June 16th during E3.
Subject: Graphics Cards | May 29, 2015 - 11:05 AM | Sebastian Peak
Tagged: rumors, radeon, hbm, graphics, gpu, Fury, Fiji, amd
Another rumor has emerged about an upcoming GPU from AMD, and this time it's a possible name for the HBM-powered Fiji card a lot of us have been speculating about.
The rumor from VideoCardz via Expreview (have to love the multiple layers of reporting here) states that the new card will be named Radeon Fury:
"Radeon Fury would be AMD’s response to growing popularity of TITAN series. It is yet unclear how AMD is planning to adopt Fury naming schema. Are we going to see Fury XT or Fury PRO? Well, let’s just wait and see. This rumor also means that Radeon R9 390X will be a direct rebrand of R9 290X with 8GB memory."
Of course this is completely unsubstantiated, and Fury is a branding scheme from the ATI days, but who knows? I can only hope that if true, AMD will adopt all caps: TITAN! FURY! Feel the excitement. What do you think of this possible name for the upcoming AMD flagship GPU?
Subject: Graphics Cards | May 22, 2015 - 09:39 AM | Ryan Shrout
Tagged: wce, radeon, Fiji, amd, 390x
UPDATE (5/22/15): Johan Andersson tweeted out this photo this morning, with the line: "This new island is one seriously impressive and sweet GPU. wow & thanks @AMDRadeon ! They will be put to good use :)" Looks like we can confirm that at least one of the parts AMD is releasing does have the design of the images we showed you before, though the water cooling implementation is missing or altered.
File this under "rumor" for sure, but a cool one nonetheless...
After yesterday's official tidbit of information surrounding AMD's upcoming flagship graphics card for enthusiasts and its use of HBM (high bandwidth memory), it appears we have another leak on our hands. The guys over at Chiphell have apparently acquired some stock footage of the new Fiji flagship card (whether or not it will be called the 390X has yet to be seen) and it looks...awesome.
In that post from yesterday I noted that with an HBM design AMD could in theory build an add-in card that is of a different form factor than anything we have previously seen for a high end part. Based on the image above, if this turns out to be the high end Fiji offering, it appears the PCB will indeed be quite small as it no longer requires memory surrounding the GPU itself. You can also see that it will in fact be water cooled though it looks like it has barb inlets rather than a pre-attached cooler in this image.
The second leaked image shows display outputs consisting of three full-size DisplayPort connections and a single HDMI port.
All of this could be faked of course, but if it is, the joker did a damn good job of compiling all the information into one design. If it's real, I think AMD might finally have a match for the look and styling of the high-end GeForce offerings.
What do you think: real or fake? Cool or meh? Let us know!
Subject: Graphics Cards, Processors, Displays, Systems | May 15, 2015 - 03:02 PM | Scott Michaud
Tagged: Oculus, oculus vr, nvidia, amd, geforce, radeon, Intel, core i5
Today, Oculus published a list of what they believe should drive their VR headset. The Oculus Rift will, of course, run on lesser hardware. Their minimum specifications, published last month and focused on the Development Kit 2, did not even list a specific CPU or GPU, just a DVI-D or HDMI output. They then went on to say that you really should use a graphics card that can handle your game at 1080p at 75 fps or more.
The current list is a little different:
- NVIDIA GeForce GTX 970 / AMD Radeon R9 290 (or higher)
- Intel Core i5-4590 (or higher)
- 8GB RAM (or higher)
- A compatible HDMI 1.3 output
- 2x USB 3.0 ports
- Windows 7 SP1 (or newer).
I am guessing that, unlike the previous list, Oculus now has a clearer vision for a development target. They were a little unclear about whether this refers to the consumer version or the current needs of developers. In either case, it would likely serve as a guide for what they believe developers should target when the consumer version launches.
This post also coincides with the release of the Oculus PC SDK 0.6.0. This version pushes distortion rendering to the Oculus Server process, rather than the application. It also allows multiple canvases to be sent to the SDK, which means developers can render text and other noticeable content at full resolution, but scale back in places that the user is less likely to notice. They can also be updated at different frequencies, such as sleeping the HUD redraw unless a value changes.
The Oculus PC SDK (0.6.0) is now available at the Oculus Developer Center.
Subject: Graphics Cards | May 14, 2015 - 07:00 AM | Scott Michaud
Tagged: tonga, radeon, R9, pitcairn, Fiji, bonaire, amd
Benchlife.info, via WCCFTech, believes that AMD's Radeon R9 300-series GPUs will launch in late June. Specifically, the R9 380, the R7 370, and the R7 360 will arrive on the 18th of June. These are listed as OEM parts, as we have mentioned on the podcast, which Ryan speculates could mean that the flagship Fiji XT might go by a different name. Benchlife.info seems to think that it will be called by the R9 390(X) though, and that it will be released on the 24th of June.
WCCFTech is a bit more timid, calling it simply “Fiji XT”.
In relation to industry events, this has the OEM lineup launching on the last day of E3 and Fiji XT launching in the middle of the following week. This feels a little weird, especially because AMD's E3 event with PC Gamer is on the 16th. While it makes sense for AMD to announce the launch a few days before it happens, that doesn't make sense for OEM parts unless they were going to announce a line of pre-built PCs. The most likely candidate to launch gaming PCs is Valve, and they're one of the few companies that are absent from AMD's event.
And this is where I run out of ideas. Launching a line of OEM parts at E3 is weird unless it was to open the flood gates for OEMs to make their own announcements. Unless Valve is scheduled to make an announcement earlier in the day, or a surprise appearance at the event, that seems unlikely. Something seems up, though.
Subject: Graphics Cards | May 6, 2015 - 02:21 PM | Ryan Shrout
Tagged: amd, hbm, radeon, gpu
During today's 2015 AMD Financial Analyst Day, CEO Dr. Lisa Su discussed some of the details of the upcoming enthusiast Radeon graphics product. Though it wasn't given a name, she repeatedly said that the product would be announced "in the coming weeks...at upcoming industry events."
You won't find specifications here, but understanding the goals and targets that AMD has for this new flagship product will help tell the story of this new Radeon product. Dr. Su sees AMD investing at very specific inflection points, the most recent of which are DirectX 12, 4K displays and VR technology. With the adoption of HBM (high bandwidth memory), which sits on the same package as the GPU rather than spread across a physical PCB, we will see both a reduction in power consumption and a significant increase in GPU memory bandwidth.
HBM will accelerate the performance improvements at those key inflection points Dr. Su mentioned. Additional memory bandwidth will aid the ability for discrete GPUs to push out 4K resolutions and beyond, no longer limited by texture sizes. AMD's LiquidVR software, in conjunction with HBM, will be able to improve latency and reduce performance concerns on current and future generations of virtual reality hardware.
One interesting comment made during the conference was that HBM would enable new form factors for GPUs now that you no longer need to have memory spread out on a PCB. While there isn't much room in the add-in card market for differentiation, in the mobile space that could mean some very interesting things for higher performance gaming notebooks.
Mark Papermaster, AMD CTO, said earlier in the conference call that HBM would aid in performance but, maybe more importantly, will lower power and improve total GPU efficiency. HBM will offer more than 3x improved performance/watt compared to GDDR5 while also running at more than 50% lower power than GDDR5. Lower power and higher performance upgrades don't happen often, so I am really excited to see what AMD does with it.
There weren't any more details on the next flagship Radeon GPU but it doesn't look like we'll have to wait much longer.
It's more than just a branding issue
As a part of my look at the first wave of AMD FreeSync monitors hitting the market, I wrote an analysis of how the competing technologies of FreeSync and G-Sync differ from one another. It was a complex topic that I tried to state in as succinct a fashion as possible given the time constraints and that the article subject was on FreeSync specifically. I'm going to include a portion of that discussion here, to recap:
First, we need to look inside the VRR window, the zone in which the monitor and AMD claim that variable refresh should work without tears and without stutter. On the LG 34UM67, for example, that range is 48-75 Hz, so frame rates between 48 FPS and 75 FPS should be smooth. Next we want to look above the window, at frame rates above the 75 Hz maximum refresh rate. Finally, and maybe most importantly, we need to look below the window, at frame rates under the minimum rated variable refresh target (48 FPS in this example).
AMD FreeSync offers the gamer more flexibility than G-Sync around this VRR window. Both above and below the variable refresh range, AMD lets gamers select a VSync enabled or disabled setting, and that setting is honored just as it is today whenever your game's frame rate extends outside the VRR window. So, for our 34UM67 monitor example, if your game renders at 85 FPS then you will either see tearing on your screen (if you have VSync disabled) or you will get a static frame rate of 75 FPS, matching the top refresh rate of the panel itself. If your game renders at 40 FPS, below the minimum of the VRR window, then you will again see tearing (with VSync off) or the potential for stutter and hitching (with VSync on).
But what happens with this FreeSync monitor, and a theoretical G-Sync monitor, below the window? AMD’s implementation means that you get the option of disabling or enabling VSync. On the 34UM67, as soon as your game's frame rate drops under 48 FPS you will either see tearing on your screen or begin to see hints of stutter and judder as the typical (and previously mentioned) VSync concerns crop up again. At these lower frame rates below the window, those artifacts impact your gaming experience much more dramatically than they do at higher frame rates above it.
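The FreeSync behavior described above boils down to a simple decision table. Here is an illustrative sketch using the 34UM67's 48-75 Hz window; this is not actual driver code, just a summary of the outcomes:

```python
def freesync_behavior(fps, vrr_min=48, vrr_max=75, vsync=True):
    """What a FreeSync panel shows for a given game frame rate.
    Illustrative only; the driver handles this internally."""
    if vrr_min <= fps <= vrr_max:
        return "variable refresh (smooth)"  # inside the VRR window
    if fps > vrr_max:
        # above the window: cap with VSync on, tear with it off
        return f"capped at {vrr_max} FPS" if vsync else "tearing"
    # below the window: classic VSync trade-off returns
    return "stutter/judder" if vsync else "tearing"

print(freesync_behavior(60))               # smooth
print(freesync_behavior(85, vsync=False))  # tearing
print(freesync_behavior(40, vsync=True))   # stutter/judder
```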
G-Sync treats this “below the window” scenario very differently. Rather than reverting to VSync on or off, the module in the G-Sync display auto-refreshes the screen when the frame rate dips below the minimum refresh of the panel, which would otherwise be affected by flicker. So, on a 30-144 Hz G-Sync monitor, we have measured that when the frame rate drops to 29 FPS the display refreshes at 58 Hz, each frame being “drawn” one extra time to avoid pixel flicker while still maintaining tear-free and stutter-free animation. If the frame rate dips to 25 FPS, the screen draws at 50 Hz. If the frame rate drops to something more extreme like 14 FPS, we actually see the module quadruple-drawing each frame, taking the refresh rate back up to 56 Hz. It’s a clever trick that preserves the VRR goals and prevents a degradation of the gaming experience. But this method requires a local frame buffer and logic on the display controller to work, hence the current implementation in a G-Sync module.
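One policy consistent with our measurements (29 FPS → 58 Hz, 25 → 50, 14 → 56) is to keep doubling the redraw multiplier until the effective refresh rate is back inside the panel's range. NVIDIA's module firmware is proprietary, so treat this as a guess that merely reproduces the numbers above:

```python
def effective_refresh(fps, min_hz=30, max_hz=144):
    """Plausible G-Sync below-window policy: repeat each frame,
    doubling the multiplier, until the resulting refresh rate is
    back above the panel's minimum. Speculative reconstruction."""
    if fps >= min_hz:
        return min(fps, max_hz)  # inside the window: refresh 1:1
    mult = 1
    while fps * mult < min_hz and fps * mult * 2 <= max_hz:
        mult *= 2  # draw each frame 2x, 4x, 8x...
    return fps * mult

print(effective_refresh(29))  # -> 58 (each frame drawn twice)
print(effective_refresh(25))  # -> 50
print(effective_refresh(14))  # -> 56 (each frame drawn four times)
```

Note that 14 x 3 = 42 Hz would also clear the 30 Hz minimum, but we observed 56 Hz, which is why this sketch doubles rather than stepping the multiplier by one.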
As you can see, the topic is complicated. So Allyn and I (and an aging analog oscilloscope) decided to take it upon ourselves to try to understand and teach the implementation differences with the help of some science. The video below is where the heart of this story is focused, though I have some visual aids embedded after it.
Still not clear on what this means for frame rates and refresh rates on current FreeSync and G-Sync monitors? Maybe this will help.