Oops: Leaked AMD APUs. A8-6500T and A10-6700T specs. Richland to hit 45W in quad-core APUs?

Subject: General Tech, Processors | July 3, 2013 - 03:12 AM |
Tagged: Richland, APU, amd

Accidents happen. AMD has been rolling out its Richland APUs for the last month, and partners have been keeping up with supporting products. Rolling releases are common, but they invite confusion over what has and has not actually been released. Unfortunately for MSI, their support chart for FM2 CPUs includes a couple of products which are news to us.

[Image: amd-new.png]

AMD will apparently hit a 45W TDP with the upcoming A8-6500T and A10-6700T APUs. Tom's Hardware seemed to have slightly different information; their chart does not exactly jibe with the one posted by MSI. For instance, they claimed the T suffix implied a low-power variant, while MSI's chart confirms the 45W TDP fairly loud and clear. As such, the table below is my best attempt at combining both charts along with a bit more leaked GPU information from TechPowerUP.

                      A8-6500T              A10-6700T
Core Count            4                     4
Base Clock            2.1 GHz               2.5 GHz
Boost Clock           (Unknown)             (Unknown)
TDP                   45W                   45W
L2 Cache              4 MB                  4 MB
L3 Cache              0 MB (N/A)            0 MB (N/A)
GPU Model             Radeon HD 8550D       Radeon HD 8650D
                      (not HD 8650D)
GPU Clock             720 MHz               720 MHz
GPU Boost Clock       844 MHz (???)         844 MHz (???)
GPU Shader Count      256                   384
TMU/ROP/Compute       16/8/4                24/8/4

It is impossible to know the expected price, the release window, or even whether these products still exist. For that, we will need to wait for an official unveiling... or at least another unofficial one.

Source: MSI

AMD Catalyst for Windows 8.1 Release Preview

Subject: General Tech, Graphics Cards | June 26, 2013 - 06:05 PM |
Tagged: Windows 8.1, radeon, amd

You should be extremely cautious about upgrading to the Windows 8.1 Release Preview. Each of your apps, and all of your desktop software, must be reinstalled when the final code is released later this year; it is a detour to a dead end.

[Image: AMD-Catalyst.jpg]

If curiosity overwhelms reason, and your graphics card was made by AMD within the last few years, you will at least have a driver available.

It would be a good idea to refer to the AMD article to ensure that your specific model is supported. The driver covers many graphics cards from the Radeon, APU, and FirePro product categories. Many models are certified against Windows Display Driver Model version 1.3 (WDDM 1.3), although some (pre-Graphics Core Next parts, as far as I can tell) are left behind on WDDM 1.2, which was introduced with Windows 8.

WDDM 1.3, new to Windows 8.1, allows for a few new developer features:

  • Enumerating GPU engine capabilities
    • A DirectX interface to query card capabilities
    • Helps schedule work, especially in "Linked Display Adapter" (LDA, think Crossfire) configurations.
  • Using cross-adapter resources in a hybrid system
    • For systems with both discrete and embedded GPUs, such as an APU and a Radeon Card
    • Allows for automatic loading of both GPUs simultaneously for appropriate applications
    • Cool, but I've already loaded separate OpenCL kernels simultaneously on a GTX 670 and an Intel HD 4000 in Windows 7 (see the sketch after this list). It would be nice if it were officially supported functionality, though.
  • Choice in YUV format ranges, studio or extended, for Microsoft Media Foundation (MMF)
    • Formerly, MMF video processing assumed the 16-235 (studio) range for black to white, which professional studios use.
    • Webcams and point-and-shoot cameras use the full 0-255 range (a full byte), which is now processed properly.
  • Wireless Display (Miracast)
    • Attach your PC wirelessly to a Miracast adapter connected to a TV by HDMI, or whatever.
  • Multiplane overlay support
    • Allows GPU to perform complicated compositing, such as video over a website.
    • If it's the same as proposed for Linux, will also allow translucency.
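
On that cross-adapter point: nothing stops you from driving two different vendors' GPUs from one application today if you manage them yourself in OpenCL, which is what the GTX 670 + Intel HD 4000 remark above was about. Below is a minimal, self-contained sketch of that do-it-yourself approach; it is my own illustration, not anything from AMD's driver or the new WDDM 1.3 interfaces, and the trivial "fill" kernel plus the skipped error handling are just for brevity.

```cpp
#include <CL/cl.h>
#include <cstdio>
#include <vector>

// Minimal sketch: find every GPU the OpenCL runtimes expose (e.g. a discrete
// GeForce plus an Intel HD iGPU), give each its own context and queue, and
// launch the same trivial kernel on all of them concurrently.
static const char* kSource =
    "__kernel void fill(__global float* out) {"
    "    out[get_global_id(0)] = (float)get_global_id(0);"
    "}";

int main() {
    cl_uint num_platforms = 0;
    clGetPlatformIDs(0, nullptr, &num_platforms);
    std::vector<cl_platform_id> platforms(num_platforms);
    clGetPlatformIDs(num_platforms, platforms.data(), nullptr);

    for (cl_platform_id platform : platforms) {
        cl_device_id device;
        cl_uint num_devices = 0;
        if (clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, &num_devices) != CL_SUCCESS
            || num_devices == 0)
            continue;  // this platform has no GPU (e.g. a CPU-only runtime)

        char name[256] = {};
        clGetDeviceInfo(device, CL_DEVICE_NAME, sizeof(name), name, nullptr);
        std::printf("Dispatching to: %s\n", name);

        cl_int err = CL_SUCCESS;
        cl_context ctx = clCreateContext(nullptr, 1, &device, nullptr, nullptr, &err);
        cl_command_queue queue = clCreateCommandQueue(ctx, device, 0, &err);
        cl_program program = clCreateProgramWithSource(ctx, 1, &kSource, nullptr, &err);
        clBuildProgram(program, 1, &device, nullptr, nullptr, nullptr);
        cl_kernel kernel = clCreateKernel(program, "fill", &err);

        const size_t n = 1024;
        cl_mem buf = clCreateBuffer(ctx, CL_MEM_WRITE_ONLY, n * sizeof(float), nullptr, &err);
        clSetKernelArg(kernel, 0, sizeof(buf), &buf);
        clEnqueueNDRangeKernel(queue, kernel, 1, nullptr, &n, nullptr, 0, nullptr, nullptr);
        clFlush(queue);  // kernels on different devices now run side by side
        // ... later: clFinish(queue) and release the per-device objects.
    }
    return 0;
}
```

Each device gets its own context, queue, and program build; once the work is flushed, the GPUs crunch away independently. As I understand it, what WDDM 1.3's cross-adapter resources add is the ability to share surfaces between those adapters without hand-managed round trips through system memory.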

AMD's advertised enhancements for Windows 8.1 are:

  • Wireless Display
    • Already covered, a part of WDDM 1.3.
  • 48 Hz Dynamic Refresh rates for Video Playback
    • Not a clue, unless it is part of an upcoming HFR format for consumers.
  • Aggressive V-sync interrupt optimization
    • Again, not a clue, but it sounds like something to be Frame Rated?
  • Skype/Lync video conferencing acceleration
    • ... just when we move to a dual-machine Skype broadcasting setup...
  • DX 11.1 feature: Tiled Resources
    • Some sources claim it is actually a DirectX 11.2 feature???
    • Renders the most visually apparent details to the player at higher quality.

If you own Windows 8, you can check out 8.1 by downloading it from the Windows Store... if you dare. By tomorrow, Microsoft will also provide an ISO so that users who want to fresh-install to a, hopefully unimportant, machine can create their own install media.

The drivers, along with (again) the list of supported cards, are available at AMD.

Source: AMD

Frame Rating: AMD plans driver release to address frame pacing for July 31st

Subject: Graphics Cards | June 20, 2013 - 04:05 PM |
Tagged: radeon, nvidia, geforce, frame rating, fcat, crossfire, amd

Well, the date has been set.  AMD publicly stated on its @AMDRadeon Twitter account that a new version of the prototype driver we originally previewed with the release of the Radeon HD 7990 in April will be released to the public on July 31st - all for a problem that many in the industry didn't think existed.

Since that April release, AMD has been very quiet about its driver changes and has actually refused to send me updated prototypes over the spring.  Either they have it figured out or they are worried they haven't - but it looks like we'll find out at the end of next month, and I feel pretty confident that the team will be able to address the issues we brought to light.

For those of you who might have missed the discussion, our series of Frame Rating stories will tell you all about the issues with frame pacing and stutter in AMD's CrossFire multi-GPU technology.

AMD gave the media a prototype driver in April to test with the Radeon HD 7990, a card that depends on CrossFire to work correctly, and the improvements were pretty drastic.

[Image: BF3_2560x1440_PLOT_0.png]

So what can we expect on July 31st?  A driver that will give users the option to disable or enable the frame pacing technology they are developing - though I am still of the mindset that disabling is never advantageous.  More to come in the next 30 days!
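
To make "frame pacing" a little more concrete: the idea is that the driver delays some frames slightly so that successive frames reach the display at roughly even intervals, instead of a long frame being chased by a "runt" that is visible for almost no time. Here is a deliberately simplified C++ sketch of that concept - my own toy illustration, not AMD's driver code, and every name in it is made up.

```cpp
#include <chrono>
#include <ratio>
#include <thread>

using clk = std::chrono::steady_clock;

// Toy frame pacer: delays each present so that frames are spaced by at least
// a smoothed (moving-average) frame time.
class FramePacer {
public:
    void before_present() {
        const auto now = clk::now();
        if (have_last_) {
            const double last_ms =
                std::chrono::duration<double, std::milli>(now - last_present_).count();
            avg_ms_ = 0.9 * avg_ms_ + 0.1 * last_ms;  // smooth the frame time
            const auto target = last_present_ +
                std::chrono::duration_cast<clk::duration>(
                    std::chrono::duration<double, std::milli>(avg_ms_));
            std::this_thread::sleep_until(target);  // no-op if target is already past
        }
        last_present_ = clk::now();
        have_last_ = true;
    }

private:
    clk::time_point last_present_{};
    bool have_last_ = false;
    double avg_ms_ = 16.7;  // start near 60 FPS
};

// Usage in a render loop (render() and present() are placeholders):
//   FramePacer pacer;
//   while (running) { render(); pacer.before_present(); present(); }
```

A real driver does this far closer to the metal, and per GPU in a CrossFire pair, but the principle - smooth out presentation times at the cost of a small amount of added latency - is the same, which is presumably why AMD is exposing it as a toggle.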

Source: Twitter

Running at 8.0GHz on All Four Cylinders

Subject: General Tech, Processors | June 19, 2013 - 08:37 PM |
Tagged: overclock, amd

Thankfully, they were not "firing" on all four cylinders; while Ryan does prefer thermite, overclockers tend to prefer liquid nitrogen. There are some distinct advantages of ice over fire; the main one for computer users is the potential for massive bumps in frequency and voltage. Of course, you cannot really get any effective use out of a machine that relies on a steady stream of fluid cold enough that it takes fewer digits to write out its temperature in Kelvin, but a large bump makes for good bragging rights.

[Image: AMD-Overclock-CPUz.png]

How about an A10-6800K overclocked to just over 8.0 GHz, with all four cores enabled?

Finnish overclocker "The Stilt" managed to push his four-core part to 8000.39 MHz just long enough to have CPU-Z validate the accomplishment. With a frequency multiplier of 63.0 atop a bus speed of 126.99 MHz, this result comes within about 800 MHz of the AMD FX-8350 record (run on just one module, with 6 of 8 cores disabled) set by ASUS late last year.
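
For the curious, the validated frequency is just the multiplier times the reference (bus) clock; a trivial check of the math, with the values taken from the CPU-Z validation above:

```cpp
#include <cstdio>

int main() {
    // Effective core clock = multiplier x reference (bus) clock
    const double multiplier = 63.0;
    const double bus_mhz    = 126.99;  // MHz, as reported by CPU-Z
    std::printf("%.2f MHz\n", multiplier * bus_mhz);  // prints 8000.37 MHz
    return 0;
}
```

The last 0.02 MHz relative to the validated 8000.39 MHz is presumably just CPU-Z rounding the bus clock to two decimal places.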

But no, it will probably not run Crysis.

AMD's plans to keep their ARMs in the server room

Subject: General Tech | June 19, 2013 - 03:02 PM |
Tagged: amd, Kyoto, berlin, seattle, warsaw, arm

DigiTimes named the four new families of server chips that AMD will be using to keep its products in the server room.  Kyoto, known as the Opteron X-series, is available now; it is based on Jaguar and offers GPU compute enhancements as well as increased CPU performance.  The Seattle family will replace these CPUs in the near future and will represent a new era for AMD, as these chips will be clusters of ARM Cortex-A57 cores on AMD's advanced Freedom Fabric.  Berlin will be a true x86 AMD chip with the new Steamroller architecture, which will replace Piledriver and support HSA-compliant optimizations.  Last is Warsaw, the most powerful chip, uniting 12 or 16 Piledriver cores in a package compatible with the current Socket G34 used by the Opteron 6300 family, offering a simple drop-in upgrade path.

[Image: roadmap.jpg]

"AMD has publicly disclosed its strategy and roadmap to recapture market share in enterprise and data center servers by unveiling products that address key technologies and meet the requirements of the fastest-growing data center and cloud computing workloads."

Here is some more Tech News from around the web:

Tech Talk

Source: DigiTimes
Manufacturer: Adobe

OpenCL Support in a Meaningful Way

Adobe has had OpenCL support since last year. You would never have benefited from its inclusion unless you ran one of two AMD mobility chips under Mac OS X Lion, but it was there. Creative Cloud, predictably, furthers this trend with additional GPGPU support for applications like Photoshop and Premiere Pro.

This leads to some interesting points:

  • How OpenCL is changing the landscape between Intel and AMD
  • What GPU support is curiously absent from Adobe CC for one reason or another
  • Which GPUs are supported despite not... existing, officially.

[Image: adobe-cs-products.jpg]

This should be very big news for our readers who do production work, whether professionally or as a hobby. If not, how about a little information about certain GPUs that are designed to compete with the GeForce 700-series?

Read on for our thoughts, after the break.

Rumor: AMD Gets Exclusive Optimization for all Frostbite 3 Games

Subject: Graphics Cards | June 18, 2013 - 03:39 PM |
Tagged: radeon, nvidia, geforce, frostbite 3, ea, dice, amd

UPDATE #3

The original source article at IGN.com has been updated with some new information.  Now they are saying the agreement between AMD and EA is "non-exclusive and gamers using other components will be supported." 

The quote from an EA rep reads as follows:

"DICE has a partnership with AMD specifically for Battlefield 4 on PC to showcase and optimize the game for AMD hardware," an EA spokesperson said. "This does not exclude DICE from working with other partners to ensure players have a great experience across a wide set of PCs for all their titles."

END UPDATE #3

This could be a huge deal for NVIDIA and AMD in the coming months - according to a story at IGN.com, AMD has entered into an agreement with EA that will give it exclusive rights to optimization for all games based on the Frostbite 3 engine.  That includes Battlefield 4, Mirror's Edge 2, Need for Speed Rivals and many more games due out this year and in 2014.  Here is the quote that is getting my attention:

Starting with the release of Battlefield 4, all current and future titles using the Frostbite 3 engine — Need for Speed Rivals, Mirror's Edge 2, etc. — will ship optimized exclusively for AMD GPUs and CPUs. While Nvidia-based systems will be supported, the company won't be able to develop and distribute updated drivers until after each game is released.

bf4.jpg

Battlefield 4 will be exclusively optimized for AMD hardware.

This is huge news for AMD, as the Frostbite 3 engine will be used for all EA-published games going forward with the exception of sports titles.  The three mentioned above are huge, but this also includes Star Wars Battlefront, Dragon Age and even the next Mass Effect, so I can't really emphasize enough how big of a win this could be for AMD's marketing and developer relations teams.

I am particularly interested in this line as well:

While Nvidia-based systems will be supported, the company won't be able to develop and distribute updated drivers until after each game is released.

The world of PC optimizations and partnerships has been around for a long time, so this isn't a huge surprise for anyone that follows PC gaming.  What is bothersome to me is that EA and AMD are rumored to have agreed that NVIDIA won't get access to the game as it is being developed - something that is CRUCIAL for day-of driver releases and performance tweaks for GeForce card owners.  In most cases, both AMD and NVIDIA developer relations teams get early access to game builds for PC titles in order to validate compatibility and to improve performance of these games for the public release.  Without these builds, NVIDIA would be at a big disadvantage.  This is exactly what happened with the recent Tomb Raider release.

UPDATE

AMD called me to reiterate their stance that competition does not automatically mean cutting out the other guy.  In the Tomb Raider story linked above, Neil Robison, AMD's Senior Director of Consumer and Graphics Alliances, states quite plainly: "The thing that angers me the most is when I see a request to debilitate a game. I understand winning, I get that, and I understand aggressive companies, I get that. Why would you ever want to introduce a feature on purpose that would make a game not good for half the gaming audience?"

So what do we take away from that statement, made in a story published in March, and today's rumor?  We have to take AMD at its word until we see solid evidence otherwise, or until enough cases of this occur that I feel like I am being duped.  AMD wants us all to know that they are playing the game the "right way" - a stance that just happens to run counter to this rumor.

END UPDATE

[Image: tombraider.jpg]

NVIDIA had performance and compatibility issues with Tomb Raider upon release.

The irony in all of this is that AMD has been accusing NVIDIA of doing this exact thing for years - though without any public statements from developers, publishers or NVIDIA.  When Batman: Arkham Asylum was launched, AMD basically said that NVIDIA had locked them out of supporting antialiasing.  In 2008, Assassin's Creed dropped DX 10.1 support supposedly because NVIDIA, whose GeForce cards did not support it at the time, asked them to.  Or remember the claim that NVIDIA was disabling cores for PhysX CPU support to help prop up GeForce sales.  At the time, AMD PR spun this as the worst possible thing a company could do in the name of gamers, that it was bad for the industry, etc.  But times change as opportunity changes.

The cold truth is that this is why AMD decided to take the chance that NVIDIA was allegedly unwilling to take and pursue the console design wins that are often noted as being "bad business."  If settling for razor-thin margins on the consoles is the risk, the reward AMD is hoping for is exactly this: benefits in other markets thanks to better relationships with game developers.

[Image: ps4controller.jpg]

Will the advantage be with AMD thanks to PS4 and Xbox One hardware?

At E3 I spoke in-depth with both NVIDIA and AMD executives about this debate, and as you might expect the two have very different opinions about what is going to transpire in the next 12-24 months.  AMD views this advantage (being in the consoles) as the big bet that is going to pay off in the more profitable PC space.  NVIDIA thinks that AMD still doesn't have what it takes to truly support developers in the long run and doesn't have the engineers to innovate on the technology side.  In my view, having Radeon-based processors in the Xbox One and PlayStation 4 (as well as the Wii U, I guess) gives AMD a head start but won't win them the race for the hearts and minds of PC gamers. There is still a lot of work to be done for that.

Before this story broke I was planning on outlining another editorial on this subject and it looks like it just got promoted to a top priority.  There appear to be a lot of proverbial shoes left to drop in this battle, but it definitely needs more research and discussion. 

[Image: batmanaa.jpg]

Remember the issues with Batman: Arkham Asylum?  I do.

I asked both NVIDIA and AMD for feedback on this story, but only AMD has replied thus far.  Robert Hallock, PR manager for gaming and graphics in the Graphics Business Unit at AMD, sent me this:

It makes sense that game developers would focus on AMD hardware with AMD hardware being the backbone of the next console generation. At this time, though, our relationship with EA is exclusively focused on Battlefield 4 and its hardware optimizations for AMD CPUs, GPUs and APUs.

Not much there, but he is also not denying the original report from IGN.  It might just be too early for a more official statement.  I will update this story with information from NVIDIA if I hear anything else.

What do YOU think about this announcement, though?  Is this good news for AMD and bad news for NVIDIA?  Is it good or bad for the gamer and, in particular, the PC gamer?  Your input will help guide our upcoming continued talks with NVIDIA and AMD on the subject.

UPDATE #2

Just so we all have some clarification on this and on the potential validity of the rumor, this is where I sourced the story from this afternoon:

[Image: taylorquote.png]

END UPDATE #2

Source: IGN

More information on AMD's mysterious 5GHz chip

Subject: General Tech | June 13, 2013 - 04:45 PM |
Tagged: vishera, piledriver, FX-9590, FX-9370, Centurion, amd

The Tech Report managed to get some more information out of AMD about the new FX-9000 series that the net has been buzzing about.  We now have confirmation that the base clocks for the FX-9590 and FX-9370 are 4.7 GHz and 4.4 GHz.  They also confirmed that the 220W TDP figure is relatively accurate, which will make these the hottest chips on the market.  While you won't see these chips officially for sale outside of specially built systems, there is a chance a few might pop up on eBay, and if you are curious how they might perform, there is a link in The Tech Report's article to an overclocked Vishera which will give you a rough idea.

[Image: fx_logo.jpg]

"On Tuesday, AMD introduced its new FX-9000-series processors. The company quoted their peak Turbo speeds (5GHz for the FX-9590, 4.7GHz for the FX-9370) and a rough time frame for availability ("this summer"), but it revealed little else. We were left wondering about base clocks, power envelopes, and potential retail availability."

Here is some more Tech News from around the web:

Tech Talk

Podcast #255 - AMD's 5 GHz Processor, 1080p Oculus Rift, and more news from Computex!

Subject: General Tech | June 13, 2013 - 02:33 PM |
Tagged: wwdc, video, titan, podcast, oculus rift, nvidia, FX, apple, amd, a10-6800k, 5ghz

PC Perspective Podcast #255 - 06/13/2013

Join us this week as we discuss AMD's 5 GHz Processor, 1080p Oculus Rift, and more news from Computex!

You can subscribe to us through iTunes and you can still access it directly through the RSS page HERE.

The URL for the podcast is: http://pcper.com/podcast - Share with your friends!

  • iTunes - Subscribe to the podcast directly through the iTunes Store
  • RSS - Subscribe through your regular RSS reader
  • MP3 - Direct download link to the MP3 file

Hosts: Jeremy Hellstrom, Josh Walrath and Morry Teitelman

Program length: 57:27

  1. Week in Review:
  2. News items of interest:
    1. 0:40:40
  3. 0:49:00 Hardware/Software Picks of the Week:
    1. Ryan: LA Traffic
    2. Jeremy: The mighty can of air
    3. Allyn: Cold Medication
    4. Morry: more pump for your pump - Swiftech MCP35X
    5. Scott: Now with 100% more compelling. Alienware X51
  4. 1-888-38-PCPER or podcast@pcper.com

 

E3 2013: AMD tells the press their gaming initiatives

Subject: General Tech, Graphics Cards, Processors, Shows and Expos | June 13, 2013 - 02:26 AM |
Tagged: E3, E3 13, amd

The Electronic Entertainment Expo (E3) is the biggest event of the year for millions of gamers. The majority of coverage ends up gawking at the latest news out of Microsoft, Sony, or Nintendo, and we will certainly provide our insights in those areas if we believe they have been insufficiently explained, but E3 is a big time for PC gamers too.

[Image: AMD_fx.jpg]

5 GHz and unlocked to go from there.

AMD, specifically, has a lot to say this year. In the year of the next-gen console reveals, AMD provides the CPU architecture for two of the three devices and has also designed each of the three GPUs. This just leaves a slight win for IBM, which is responsible for the Wii U main processor, for whatever that is worth. Unless the Steam Box comes to light without ties to AMD, it is about as close to a clean sweep as any hardware manufacturer could get.

But for the PCs among us...

If you watched the EA press conference, you probably saw lots of sports. If you stuck around after the sports, you probably saw Battlefield 4 being played by 64 players on stage. AMD has been pushing, very strongly, for developer relations over the last year. DICE, formerly known for being an NVIDIA-friendly developer, did not exhibit Battlefield 4 "The Way It's Meant to be Played" at the EA conference. According to one of AMD's Twitter accounts:

[Embedded tweet]

On the topic of "Gaming Evolved" titles, AMD is partnering with Square Enix to optimize Thief for GCN and A-Series APUs. The press release specifically mentioned Eyefinity and Crossfire support along with a DirectX 11 rendering engine; of course, the enhancements with real, interesting effects are usually the seemingly boring ones they do not mention.

The last major point from their E3 event was the launch of their 5 GHz FX processors. For more information on that part, check out Josh's thoughts from a couple of days ago.

Source: AMD