Subject: Graphics Cards | May 14, 2015 - 07:00 AM | Scott Michaud
Tagged: tonga, radeon, R9, pitcairn, Fiji, bonaire, amd
Benchlife.info, via WCCFTech, believes that AMD's Radeon R9 300-series GPUs will launch in late June. Specifically, the R9 380, the R7 370, and the R7 360 will arrive on the 18th of June. These are listed as OEM parts, as we have mentioned on the podcast, which Ryan speculates could mean that the flagship Fiji XT might go by a different name. Benchlife.info seems to think that it will be called the R9 390(X), though, and that it will be released on the 24th of June.
WCCFTech is a bit more timid, calling it simply “Fiji XT”.
In relation to industry events, this has the OEM lineup launching on the last day of E3 and Fiji XT launching in the middle of the following week. This feels a little weird, especially because AMD's E3 event with PC Gamer is on the 16th. While it makes sense for AMD to announce the launch a few days before it happens, that doesn't make sense for OEM parts unless they were going to announce a line of pre-built PCs. The most likely candidate to launch gaming PCs is Valve, and they're one of the few companies that are absent from AMD's event.
And this is where I run out of ideas. Launching a line of OEM parts at E3 is weird unless it is meant to open the floodgates for OEMs to make their own announcements. Unless Valve is scheduled to make an announcement earlier in the day, or a surprise appearance at the event, that seems unlikely. Something seems up, though.
Subject: Graphics Cards | May 6, 2015 - 02:21 PM | Ryan Shrout
Tagged: amd, hbm, radeon, gpu
During today's 2015 AMD Financial Analyst Day, CEO Dr. Lisa Su discussed some of the details of the upcoming enthusiast Radeon graphics product. Though it wasn't given a name, she repeatedly said that the product would be announced "in the coming weeks...at upcoming industry events."
You won't find specifications here, but understanding the goals and targets AMD has for this new flagship product will help tell the story of this new Radeon product. Dr. Su sees AMD investing at very specific inflection points, the most recent of which are DirectX 12, 4K displays, and VR technology. With the adoption of HBM (high bandwidth memory) that sits on-package with the GPU, rather than spread across a physical PCB, we will see both a reduction in power consumption and a significant increase in GPU memory bandwidth.
HBM will accelerate the performance improvements at those key inflection points Dr. Su mentioned. Additional memory bandwidth will aid the ability for discrete GPUs to push out 4K resolutions and beyond, no longer limited by texture sizes. AMD's LiquidVR software, in conjunction with HBM, will be able to improve latency and reduce performance concerns on current and future generations of virtual reality hardware.
One interesting comment made during the conference was that HBM will enable new form factors for GPUs now that memory no longer needs to be spread out on a PCB. While there isn't much room in the add-in card market for differentiation, in the mobile space that could mean some very interesting things for higher performance gaming notebooks.
Mark Papermaster, AMD's CTO, said earlier in the conference that HBM would aid performance but, maybe more importantly, will lower power and improve total GPU efficiency. HBM will offer more than 3x the performance/watt of GDDR5 while also drawing more than 50% less power. Lower power and higher performance don't often arrive in the same upgrade, so I am really excited to see what AMD does with it.
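Those two claims together imply a healthy raw bandwidth uplift, not just an efficiency one. Here's a quick sanity check of the arithmetic; the GDDR5 baseline wattage is a hypothetical figure chosen purely for illustration:

```python
# Sanity-check AMD's claims: >3x performance/watt and >50% lower power
# than GDDR5. Baseline figures below are illustrative assumptions.
gddr5_bandwidth_gb_s = 224.0   # e.g. a 256-bit, 7 Gbps GDDR5 configuration
gddr5_power_w = 30.0           # assumed memory-subsystem power (hypothetical)

hbm_power_w = gddr5_power_w * 0.5  # ">50% lower power"

# Bandwidth ratio = (perf/watt ratio) x (power ratio) = 3 x 0.5 = 1.5
hbm_bandwidth_gb_s = gddr5_bandwidth_gb_s * 3 * (hbm_power_w / gddr5_power_w)

print(hbm_bandwidth_gb_s)  # 336.0 GB/s: 1.5x the bandwidth at half the power
```

In other words, even at the conservative edges of both claims, the memory subsystem gets half the power budget and still delivers 50% more bandwidth.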
There weren't any more details on the next flagship Radeon GPU but it doesn't look like we'll have to wait much longer.
It's more than just a branding issue
As a part of my look at the first wave of AMD FreeSync monitors hitting the market, I wrote an analysis of how the competing technologies of FreeSync and G-Sync differ from one another. It was a complex topic that I tried to cover as succinctly as possible, given the time constraints and the fact that the article was focused on FreeSync specifically. I'm going to include a portion of that discussion here, to recap:
First, we need to look inside the VRR window, the zone in which the monitor and AMD claim that variable refresh should work without tears and without stutter. On the LG 34UM67, for example, that range is 48-75 Hz, so frame rates between 48 FPS and 75 FPS should be smooth. Next we want to look above the window, at frame rates above the 75 Hz maximum refresh rate. Finally, and maybe most importantly, we need to look below the window, at frame rates under the minimum rated variable refresh target; in this example, 48 FPS.
AMD FreeSync offers the gamer more flexibility than G-Sync around this VRR window. Both above and below the variable refresh range, AMD allows gamers to select a VSync enabled or disabled setting, and that setting is honored just as it is today whenever the game's frame rate falls outside the VRR window. So, for our 34UM67 example, if your game renders at 85 FPS you will either see tearing on your screen (VSync disabled) or get a static 75 FPS, matching the top refresh rate of the panel itself. If your game renders at 40 FPS, below the minimum of the VRR window, you will again see either tearing (VSync off) or the potential for stutter and hitching (VSync on).
But what happens with this FreeSync monitor, and a theoretical G-Sync monitor, below the window? AMD’s implementation means you get the option of disabling or enabling VSync. On the 34UM67, as soon as your game frame rate drops under 48 FPS you will either see tearing on your screen or begin to see hints of stutter and judder as the typical (and previously mentioned) VSync concerns rear their head again. At these lower frame rates (below the window), the artifacts will impact your gaming experience much more dramatically than they do at higher frame rates (above the window).
G-Sync treats this “below the window” scenario very differently. Rather than reverting to VSync on or off, the module in the G-Sync display is responsible for auto-refreshing the screen when the frame rate dips below the minimum refresh of the panel, which would otherwise flicker. So, on a 30-144 Hz G-Sync monitor, we have measured that when the frame rate drops to 29 FPS, the display is actually refreshing at 58 Hz, each frame “drawn” one extra time to avoid pixel flicker while still maintaining tear-free and stutter-free animation. If the frame rate dips to 25 FPS, the screen draws at 50 Hz. If the frame rate drops to something more extreme like 14 FPS, we actually see the module quadruple-drawing the frame, taking the refresh rate back up to 56 Hz. It’s a clever trick that preserves the VRR goals and prevents a degradation of the gaming experience. But this method requires a local frame buffer and logic on the display controller to work; hence the current implementation in a G-Sync module.
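The redraw behavior we measured can be modeled with a simple loop. To be clear, this is our own sketch, not NVIDIA's actual algorithm; the 45 Hz floor is an assumption we inferred from the measurements above (29 FPS → 58 Hz, 25 FPS → 50 Hz, 14 FPS → 56 Hz):

```python
def effective_refresh(fps, floor_hz=45):
    """Model of G-Sync's below-window behavior: redraw each frame
    enough times that the panel's effective refresh clears floor_hz.
    floor_hz is an assumed value inferred from our measurements,
    not a figure NVIDIA has published."""
    refresh = fps
    while refresh < floor_hz:
        refresh += fps  # draw the same frame one more time
    return refresh

# Matches what we measured on a 30-144 Hz G-Sync panel:
print(effective_refresh(29))  # 58 Hz (each frame drawn twice)
print(effective_refresh(25))  # 50 Hz (twice)
print(effective_refresh(14))  # 56 Hz (four times)
print(effective_refresh(60))  # 60 Hz (inside the window, no redraws)
```

The key design point is that the decision happens in the display module, per refresh, which is why it needs a local frame buffer: the panel must be able to re-scan the last frame without the GPU re-sending it.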
As you can see, the topic is complicated. So Allyn and I (and an aging analog oscilloscope) decided to take it upon ourselves to try and understand and teach the implementation differences with the help of some science. The video below is where the heart of this story is focused, though I have some visual aids embedded after it.
Still not clear on what this means for frame rates and refresh rates on current FreeSync and G-Sync monitors? Maybe this will help.
Subject: Graphics Cards | March 5, 2015 - 04:46 PM | Ryan Shrout
Tagged: GDC, gdc 15, amd, radeon, R9, 390x, VR, Oculus
Don't get too excited about this news, but AMD tells me that its next flagship Radeon R9 graphics card is up and running at GDC, powering an Oculus-based Epic "Showdown" demo.
Inside the box...
During my meeting with AMD today I was told that inside that little PC sits the "upcoming flagship Radeon R9 graphics card" but, of course, no other information was given. The following is an approximate transcript of the exchange:
Ryan: Can I see it?
Ryan: I can't even take the side panel off it?
Ryan: How can I know you're telling the truth then? Can I open up the driver or anything?
Ryan: GPU-Z? Anything?
Well, I tried.
Is this the rumored R9 390X with the integrated water cooler? Is it something else completely? AMD wouldn't even let me behind the system to look for a radiator so I'm afraid that is where my speculation will end.
Hooked up to the system was a Crescent Bay Oculus headset running the well-received Epic "Showdown" demo. The experience was smooth, though of course there were no frame rate indicators or the like running while it was going on. After our discussion with AMD earlier in the week about its LiquidVR SDK, AMD is clearly taking the VR transition seriously. NVIDIA's GPUs might be dominating the show-floor demos, but AMD wanted to be sure it wasn't left out of the discussion.
Can I just get this Fiji card already??
Subject: Graphics Cards | February 21, 2015 - 12:18 PM | Ryan Shrout
Tagged: radeon, nvidia, marketshare, market share, geforce, amd
One of the perennial firms that measures GPU market share, Jon Peddie Research, has come out with a report on Q4 2014 this weekend, and the results are eye-opening. According to the data, NVIDIA and AMD each took dramatic swings from Q4 of 2013 to Q4 of 2014.
| |Q4 2014|Q3 2014|Q4 2013|Year-to-year Change|
|---|---|---|---|---|
|NVIDIA|76%|n/a|64.9%|+11.1 points|
|AMD|24%|n/a|35%|-11.0 points|
Data source: Jon Peddie Research
Here is the JPR commentary to start us out:
JPR's AIB Report tracks computer add-in graphics boards, which carry discrete graphics chips. AIBs are used in desktop PCs, workstations, servers, and other devices such as scientific instruments. They are sold directly to customers as aftermarket products, or are factory installed. In all cases, AIBs represent the higher end of the graphics industry, using discrete chips and private high-speed memory, as compared to the integrated GPUs in CPUs that share slower system memory.
The news was encouraging and seasonally understandable: quarter-to-quarter, the market decreased 0.68% (compared to the desktop PC market, which decreased 3.53%).
On a year-to-year basis, we found that total AIB shipments during the quarter fell 17.52%, a steeper decline than desktop PCs, which fell 0.72%.
However, in spite of the overall decline, somewhat due to tablets and embedded graphics, the PC gaming momentum continues to build and is the bright spot in the AIB market.
NVIDIA's Maxwell GPU
The overall PC desktop market increased quarter-to-quarter, including double-attach (the adding of a second or third AIB to a system with integrated processor graphics) and, to a lesser extent, dual AIBs in performance desktop machines using either AMD's CrossFire or NVIDIA's SLI technology.
The attach rate of AIBs to desktop PCs has declined from a high of 63% in Q1 2008 to 36% this quarter.
The year-to-year change that JPR is reporting is substantial, showing a swing of more than 20 points in the market-share gap in NVIDIA's favor. According to this data, AMD's market share has dropped from 35% at the end of 2013 to just 24% at the end of 2014. Meanwhile, NVIDIA continues to truck forward, going from 64.9% at the end of 2013 to 76% at the end of 2014.
The Radeon R9 285 release didn't have the impact AMD had hoped
Clearly the release of NVIDIA's Maxwell GPUs, the GeForce GTX 750 Ti, GTX 970, and GTX 980, has impacted the market even more than we initially expected. In recent weeks the GTX 970 has been getting a lot of negative press over the memory issue, and I will be curious to see what effect this has on sales in the near future. But the 12-month swing you see in the table above is the likely cause of the sudden departure of John Byrne, Collette LaForce, and Raj Naik.
AMD has good products, even better pricing, and a team of PR and marketing folks that are talented and aggressive. So how can the company recover from this? Products, people: new products. Will the rumors circling around the Radeon R9 390X develop into such a product?
Hopefully 2015 will provide it.
Subject: General Tech | February 12, 2015 - 01:46 PM | Ken Addison
Tagged: podcast, video, gtx 960, plextor, m6e black edition, M6e, r9 390, amd, radeon, nvidia, Silverstone, tegra, tx1, Tegra X1, corsair, H100i GTX, H80i GT
PC Perspective Podcast #336 - 02/12/2015
Join us this week as we discuss GTX 960 Overclocking, Plextor M6e Black Edition, AMD R9 3xx Rumors and more!
The URL for the podcast is: http://pcper.com/podcast - Share with your friends!
- iTunes - Subscribe to the podcast directly through the Store
- RSS - Subscribe through your regular RSS reader
- MP3 - Direct download link to the MP3 file
Hosts: Ryan Shrout, Jeremy Hellstrom, Josh Walrath, and Allyn Malventano
Program length: 1:11:53
Week in Review:
News item of interest:
0:46:10 NVIDIA Event on March 3rd. Why?
Hardware/Software Picks of the Week:
Subject: Graphics Cards | February 7, 2015 - 06:03 PM | Scott Michaud
Tagged: amd, R9, r9 300, r9 390x, radeon
According to WCCFTech, AMD commented on Facebook that they are “putting the finishing touches on the 300 series to make sure they live up to expectation”. I tried looking through AMD's “Posts to Page” for February 3rd and did not see it listed, so a grain of salt is necessary (either with WCCFTech or with my lack of Facebook skills).
Image Credit: WCCFTech
The current rumors claim that Fiji XT will have 4096 graphics cores fed by a high-bandwidth, stacked memory architecture, which is supposedly rated at 640 GB/s (versus 224 GB/s on the GeForce GTX 980). When you're dealing with data sets at the scale that GPUs do, bandwidth is a precious resource. GPUs also have caches and other methods to reduce this dependency, but let's just say that if you offer a graphics vendor a free, nearly threefold speed-up in memory bandwidth, you will have a friend, and possibly one for life. Need a couch moved? No problem!
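Both bandwidth figures fall straight out of bus width times per-pin data rate. A quick sketch shows the arithmetic; note that the HBM configuration here (four 1024-bit stacks at 1.25 Gbps per pin) is our assumption chosen to match the rumored 640 GB/s, not a confirmed spec:

```python
def bandwidth_gb_s(bus_width_bits, data_rate_gbps):
    """Peak memory bandwidth: number of pins times per-pin data rate,
    divided by 8 to convert bits to bytes."""
    return bus_width_bits * data_rate_gbps / 8

# GeForce GTX 980: 256-bit bus, 7 Gbps GDDR5
print(bandwidth_gb_s(256, 7.0))        # 224.0 GB/s

# Rumored Fiji XT: assumed four 1024-bit HBM stacks at 1.25 Gbps per pin
print(bandwidth_gb_s(4 * 1024, 1.25))  # 640.0 GB/s
```

The interesting part is how HBM gets there: a vastly wider bus at a much lower per-pin clock, which is exactly why it can come in at lower power than GDDR5's narrow, high-clock approach.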
The R9 300 series is expected to launch next quarter, which could be as soon as about a month from now.
Subject: General Tech, Systems | January 30, 2015 - 03:03 AM | Tim Verry
Tagged: visionx, SFF, radeon, m270x, haswell, asrock, amd
ASRock has unleashed an update to its small form factor VisionX series. The new VisionX 471D adds a faster Haswell processor and dedicated Radeon mobile graphics to the mini PC.
The 7.9” x 7.9” x 2.8” PC chassis comes in black or silver with rounded corners. External I/O is quite expansive, with a DVD optical drive, two audio jacks, one USB 3.0 port, one MHL-compatible port (carrying both data and video), and an SD card reader on the front. Further, the back of the PC holds the following ports:
- 5 x Analog audio jacks
- 1 x Optical audio out
- 1 x DVI
- 1 x HDMI
- 1 x Gigabit Ethernet jack
- 802.11ac (2 antennas)
- 5 x USB 3.0
- 1 x USB 2.0
- 1 x eSATA
ASRock has gone with the Intel Core i7-4712MQ processor, a 37W Haswell quad core (with eight threads) clocked at up to 3.3 GHz. Graphics are handled by the AMD Radeon R9 M270X, a mobile “Venus” GCN-based GPU with 1GB of memory. The 28nm GPU, with 640 cores, 40 TMUs, and 16 ROPs, is clocked at 725 MHz base and up to 775 MHz boost. The PC further supports two SO-DIMMs, two 2.5” drives, one mSATA connector, and the above-mentioned DVD drive (a DL-8A4SH-01 comes pre-installed).
The VisionX 471D is a “barebones” system where you will have to provide your own OS but does come with bundled storage and memory. Specifically, for $999, the SFF computer comes with 8GB of DDR3 memory, a 2TB mechanical hard drive, and a 256GB mSATA SSD (the ASint SSDMSK256G-M1 using a JMF667 controller and 64GB 20nm IMFT NAND). This leaves room for one additional 2.5” drive for expansion. Although it comes without an operating system, it does ship with a Windows Media Center compatible remote.
This latest addition to the VisionX series succeeds the 420D and features a faster processor. At the time of this writing, the PC is not available for purchase, but it is in the hands of reviewers (such as this review from AnandTech) and will be coming soon to retailers for $999 USD.
The price is on the steep side, especially compared to some other recent tiny PCs, but you are getting a top-end mobile Haswell chip and good I/O for a small system, with enough hardware to possibly be "enough" PC for many people (or at least a second PC or an HTPC in the living room).
Subject: Graphics Cards | January 13, 2015 - 12:22 PM | Ryan Shrout
Tagged: rumor, radeon, r9 380x, 380x
Spotted over at TechReport.com this morning and sourced from a post at 3dcenter.org, it appears that some additional information about the future Radeon R9 380X is starting to leak out through AMD employee LinkedIn pages.
Ilana Shternshain is an ASIC physical design engineer at AMD with more than 18 years of experience, 7-8 years of that with AMD. Under the background section is the line "Backend engineer and team leader at Intel and AMD, responsible for taping out state of the art products like Intel Pentium Processor with MMX technology and AMD R9 290X and 380X GPUs." A bit further down is an experience listing of the PlayStation 4 APU as well as "AMD R9 380X GPUs (largest in “King of the hill” line of products)."
Interesting - though not entirely enlightening. More interesting were the details found on Linglan Zhang's LinkedIn page (since removed):
Developed the world’s first 300W 2.5D discrete GPU SOC using stacked die High Bandwidth Memory and silicon interposer.
Now we have something to work with! A 300 watt TDP would make the R9 380X more power hungry than the current R9 290X Hawaii GPU. High bandwidth memory likely implies memory located on the same package as the GPU itself, similar in spirit to what exists on the Xbox One APU, though configurations could differ in considerable ways. A bit of research on the silicon interposer reveals it as an implementation method for 2.5D chips:
There are two classes of true 3D chips which are being developed today. The first is known as 2½D where a so-called silicon interposer is created. The interposer does not contain any active transistors, only interconnect (and perhaps decoupling capacitors), thus avoiding the issue of threshold shift mentioned above. The chips are attached to the interposer by flipping them so that the active chips do not require any TSVs to be created. True 3D chips have TSVs going through active chips and, in the future, have potential to be stacked several die high (first for low-power memories where the heat and power distribution issues are less critical).
An interposer would allow the GPU and stacked die memory to be built on different process technologies, for example, but could also make the chips more fragile during final assembly. Obviously there are a lot more questions than answers based on these rumors sourced from LinkedIn, but it's interesting to attempt to gauge where AMD is headed in its continued quest to take back market share from NVIDIA.
Subject: Graphics Cards, Displays, Shows and Expos | January 7, 2015 - 03:11 AM | Ryan Shrout
Tagged: video, radeon, monitor, g-sync, freesync, ces 2015, CES, amd
It finally happened, later than I had expected: we got hands-on with nearly-ready FreeSync monitors! That's right, AMD's alternative to G-Sync will bring variable refresh gaming technology to Radeon gamers later this quarter, and AMD had the monitors on hand to prove it. On display were an LG 34UM67 running at 2560x1080 on IPS technology, a Samsung UE590 with a 4K resolution and AHVA panel, and a BenQ XL2730Z with a 2560x1440 TN screen.
The three monitors sampled at the AMD booth showcase the wide array of units that will be available this year using FreeSync, possibly even this quarter. The LG 34UM67 uses the 21:9 aspect ratio that is growing in popularity, along with solid IPS panel technology and a 60 Hz top refresh rate. However, there is a new specification to be aware of with FreeSync as well: minimum refresh rate. This is the refresh rate the monitor needs to maintain to avoid artifacting and flickering that would be visible to the end user. For the LG monitor it is 40 Hz.
What happens below that limit, and above it, differs from what NVIDIA has decided to do. For FreeSync (and the Adaptive Sync standard as a whole), when a game renders at a frame rate above or below this VRR window, the V-Sync setting is enforced. That means on a 60 Hz panel, if your game runs at 70 FPS, you have the option to enable or disable V-Sync; you can either force a 60 FPS top limit or allow 70 FPS with screen tearing. If your game runs under the 40 Hz bottom limit, say at 30 FPS, you get the same choice: V-Sync on or V-Sync off. With it off, you get tearing but optimal input/display latency; with it on, you reintroduce frame judder as you cross between V-Sync steps.
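The decision tree AMD described can be sketched in a few lines. This is purely our illustration of the behavior, using the LG panel's 40-60 Hz window; the function and its return strings are ours, not anything from AMD's driver:

```python
def freesync_outcome(fps, vrr_min=40, vrr_max=60, vsync_on=False):
    """What the user sees on an Adaptive Sync panel at a given frame rate.
    Our illustration of AMD's described behavior, not actual driver code.
    Defaults model the LG 34UM67's 40-60 Hz window."""
    if vrr_min <= fps <= vrr_max:
        return "smooth VRR (panel refresh follows the frame rate)"
    if not vsync_on:
        return "tearing (frames swapped mid-refresh)"
    if fps > vrr_max:
        return "capped at the panel's max refresh"
    return "judder (frame rate quantized to V-Sync steps)"

print(freesync_outcome(50))                  # inside the window
print(freesync_outcome(70, vsync_on=True))   # above: capped at 60 FPS
print(freesync_outcome(70, vsync_on=False))  # above: tearing
print(freesync_outcome(30, vsync_on=True))   # below: judder returns
```

Note that inside the window the vsync_on flag is irrelevant; the whole point of VRR is that the panel simply waits for the next frame.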
There are potential pitfalls to this solution though; what happens when you cross into that top or bottom region can cause issues depending on the specific implementation. We'll be researching this very soon.
Notice this screen shows FreeSync Enabled and V-Sync Disabled, and we see a tear.
FreeSync monitors have the benefit of using industry-standard scalers, which means they won't be limited to a single DisplayPort input. Expect to see a range of inputs including HDMI and DVI, though the VRR technology will only work over DisplayPort.
We have much more to learn and much more to experience with FreeSync but we are eager to get one in the office for testing. I know, I know, we say that quite often it seems.
Follow all of our coverage of the show at http://pcper.com/ces!