Frame Rating: AMD plans driver release to address frame pacing for July 31st

Subject: Graphics Cards | June 20, 2013 - 04:05 PM |
Tagged: radeon, nvidia, geforce, frame rating, fcat, crossfire, amd

Well, the date has been set.  AMD publicly stated on its @AMDRadeon Twitter account that a new version of the prototype driver we originally previewed with the release of the Radeon HD 7990 in April will be released to the public on July 31st.  And all this for a problem that many in the industry didn't think existed.

Since that April release AMD has been very quiet about its driver changes and has actually refused to send me updated prototypes over the spring.  Either they have it figured out or they are worried they haven't - but it looks like we'll find out at the end of next month, and I feel pretty confident that the team will be able to address the issues we brought to light.

For those of you who might have missed the discussion, our series of Frame Rating stories will tell you all about the issues with frame pacing and stutter in AMD's CrossFire multi-GPU technology.

AMD gave the media a prototype driver in April to test with the Radeon HD 7990, a card that depends on CrossFire to work correctly, and the improvements were pretty drastic.

BF3_2560x1440_PLOT_0.png

So what can we expect on July 31st?  A driver that will give users the option to enable or disable the frame pacing technology AMD is developing - though I am still of the mindset that disabling it is never advantageous.  More to come in the next 30 days!
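The problem the driver targets is easy to see in raw frame-time data.  Below is a minimal, illustrative Python sketch (not AMD's driver logic, and the timestamps are invented): unpaced multi-GPU output delivers frames in alternating short/long intervals, which produces a large spread in frame-to-frame times even though the average frame rate matches a perfectly paced sequence.

```python
# Illustrative only: how frame pacing problems show up in frame-time data.
# Input is a list of per-frame presentation timestamps in milliseconds.

def frame_times(timestamps_ms):
    """Frame-to-frame deltas from a list of presentation timestamps."""
    return [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]

def pacing_spread(timestamps_ms):
    """Max minus min frame time: 0 for perfectly paced output."""
    deltas = frame_times(timestamps_ms)
    return max(deltas) - min(deltas)

# Unpaced: frames arrive in bursts (5 ms, then 28 ms) -- visible stutter,
# even though both sequences average ~60 FPS over the run.
unpaced = [0, 5, 33, 38, 66, 71, 99]
# Paced: the same seven frames spaced evenly at 16.5 ms.
paced = [0, 16.5, 33, 49.5, 66, 82.5, 99]

print(pacing_spread(unpaced))  # 23 -> stutter
print(pacing_spread(paced))    # 0  -> smooth
```

Both runs cover 99 ms with seven frames, so an FPS counter reports them as identical; only the frame-time spread reveals the difference.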

Source: Twitter

Podcast #256 - Mobile Frame Rating, NVIDIA licensing Kepler, Xbox One DRM and more!

Subject: General Tech | June 20, 2013 - 02:03 PM |
Tagged: video, podcast, 780m, frame rating, nvidia, kepler, xbox one, Adobe, CC, opencl

PC Perspective Podcast #256 - 06/20/2013

Join us this week as we discuss Mobile Frame Rating, NVIDIA licensing Kepler, Xbox One DRM and more!

You can subscribe to us through iTunes and you can still access it directly through the RSS page HERE.

The URL for the podcast is: http://pcper.com/podcast - Share with your friends!

  • iTunes - Subscribe to the podcast directly through the iTunes Store
  • RSS - Subscribe through your regular RSS reader
  • MP3 - Direct download link to the MP3 file

Hosts: Ryan Shrout, Josh Walrath, Allyn Malventano and Morry Teitelman

Program length: 1:33:43

  1. Week in Review:
  2. News items of interest:
    1. 0:43:30 Ryan's summary of E3
      1. Oculus 1080p, Razer Blade, Monoprice, SHIELD
  3. 1:22:00 Hardware/Software Picks of the Week:
    1. Allyn: Swiss-Tech Keychain Tools
  4. 1-888-38-PCPER or podcast@pcper.com
  5. Closing/outro

Rumored NVIDIA GTX 760 Specifications Leaked: The First Mid-Range 700-Series GPU Option

Subject: Graphics Cards | June 20, 2013 - 12:28 PM |
Tagged: nvidia, kepler, gtx 760, GK104, gk-104, gaming

There have been rumors of a new mid-range Kepler-based graphics card that will be the next entry in the GTX 700-series. This new GPU is rumored to be called the GeForce GTX 760. If the specifications are true, the card will fit between the existing GTX 660 and GTX 660 Ti graphics cards in both hardware specifications and pricing. While it will be part of the GTX 700-series, it will not have the faster 7 Gbps memory clockspeed of the other 700-series cards.

As far as specifications go, Videocardz claims to have the final specifications list in a recent news post. The GTX 760 is rumored to be the latest graphics card to use NVIDIA's GK104 "Kepler" GPU. The GTX 760 will have some units disabled, leaving a GPU with 1,152 CUDA cores, 96 Texture Mapping Units (TMUs), and 32 Raster Operations Processors (ROPs). The GPU supports NVIDIA's latest GPU Boost 2.0 technology, which automatically ratchets up the Boost clockspeed so long as temperature allows. It has a base clockspeed of 980 MHz and a boost clockspeed of 1,033 MHz.
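NVIDIA's actual boost algorithm is not public, but the behavior described above can be sketched as a simple control loop.  The sketch below is a hypothetical, heavily simplified model: the temperature target and step size are invented for illustration; only the base and boost clocks come from the rumored GTX 760 specs.

```python
# Hypothetical model of temperature-limited boosting in the spirit of
# GPU Boost 2.0 (NOT NVIDIA's real algorithm): step the clock up toward the
# boost ceiling while the GPU stays under a temperature target, and back
# down toward base clock otherwise.

BASE_MHZ, BOOST_MHZ = 980, 1033   # rumored GTX 760 clocks
TEMP_TARGET_C = 80                # illustrative target, not an NVIDIA spec
STEP_MHZ = 13                     # illustrative clock bin size

def next_clock(current_mhz, temp_c):
    """One control step: boost if thermals allow, throttle if not."""
    if temp_c < TEMP_TARGET_C:
        return min(current_mhz + STEP_MHZ, BOOST_MHZ)
    return max(current_mhz - STEP_MHZ, BASE_MHZ)

clock = BASE_MHZ
for temp in [60, 65, 70, 72, 85, 85]:   # simulated temperature readings
    clock = next_clock(clock, temp)
print(clock)  # 1006: climbed toward boost, then throttled on the hot samples
```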

GTX 760 graphics cards will allegedly come in both 2GB and 4GB GDDR5 memory flavors. NVIDIA is clocking the memory at 6 Gbps (1502 MHz), which makes it the first 700-series part not to take advantage of faster memory chips. However, there is a bit of a saving grace, as NVIDIA has moved to a 256-bit memory bus. This still gives the card a respectable bump in memory bandwidth: 192 GB/s on the GTX 760 versus the GTX 660/GTX 660 Ti's 144.2 GB/s.
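The bandwidth figures above follow directly from the bus width and memory clock.  GDDR5 transfers four bits per pin per command-clock cycle, so a 1502 MHz memory clock works out to an effective 6008 MT/s ("6 Gbps" in marketing shorthand); a quick back-of-the-envelope check:

```python
# Sanity check of the quoted bandwidth numbers for a GDDR5 memory subsystem.

def gddr5_bandwidth_gbs(memory_clock_mhz, bus_width_bits):
    """Peak bandwidth in GB/s: effective transfer rate times bus width in bytes."""
    effective_mts = memory_clock_mhz * 4          # GDDR5 is quad data rate
    return effective_mts * (bus_width_bits / 8) / 1000

print(round(gddr5_bandwidth_gbs(1502, 256), 1))  # GTX 760: ~192.3 GB/s
print(round(gddr5_bandwidth_gbs(1502, 192), 1))  # GTX 660/660 Ti: 144.2 GB/s
```

(The small gap between 192.3 and the quoted 192 GB/s comes from rounding 6008 MT/s down to an even 6 Gbps.)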

Compared to the existing mid-range 600-series cards, the GTX 760 has base and boost GPU clockspeeds equal to the GTX 660's (and faster than the GTX 660 Ti's). Memory clockspeed is also unchanged on the new card, though it has a wider memory bus. The GTX 760 has 192 more CUDA cores than the GTX 660, but 192 fewer than the GTX 660 Ti. The TMU count also sits evenly between the two 600-series cards, but the GTX 760 does have 8 more ROPs enabled than either the 660 or 660 Ti.

Graphics cards with the upcoming GTX 760 GPU will be powered by two 6-pin PCI-E power connectors and carry a 170W TDP. That power consumption puts the card between the 150W GTX 660 Ti and the higher-end 230W GTX 770. It appears that the card will not come with the high-end stock metallic cooler used on the other 700-series cards, though the various AIBs are likely to fit the GPU with their own custom aftermarket coolers. Video outputs on the cards will include DVI-I, DVI-D, HDMI, and DisplayPort.

The chart below compares the specifications between the GTX 660, GTX 660 Ti, GTX 770, and the rumored GTX 760.

                GTX 760     GTX 660     GTX 660 Ti   GTX 770
  CUDA Cores    1,152       960         1,344        1,536
  TMUs          96          80          112          128
  ROPs          32          24          24           32
  GPU Base      980 MHz     980 MHz     915 MHz      1046 MHz
  GPU Boost     1033 MHz    1033 MHz    980 MHz      1085 MHz
  Memory Bus    256-bit     192-bit     192-bit      256-bit
  Memory Clock  1502 MHz    1502 MHz    1502 MHz     1752 MHz
  Bandwidth     192 GB/s    144.2 GB/s  144.2 GB/s   224 GB/s
  TDP           170 W       140 W       150 W        230 W
  Architecture  GK104       GK106       GK104        GK104

The card is supposedly going to be released on June 25th for around $300. It will compete with AMD's Radeon HD 7950 with Boost. Further, the card will be an alternative to NVIDIA's own GTX 660 Ti and an upgrade for gamers still running GTX 560 cards based on the company's older Fermi architecture.

Source: Videocardz

NVIDIA Enters the Licensing World: My Quick Analysis

Subject: General Tech | June 19, 2013 - 09:51 PM |
Tagged: Volta, nvidia, maxwell, licensing, kepler, Denver, Blogs, arm

Yesterday we all saw the blog piece from NVIDIA that stated that they were going to start licensing their IP to interested third parties.  Obviously, there was a lot of discussion about this particular move.  Some were in favor, some were opposed, and others yet thought that NVIDIA is now simply roadkill.  I believe that it is an interesting move, but we are not yet sure of the exact details or the repercussions of such a decision on NVIDIA’s part.

The biggest bombshell of the entire post was that NVIDIA would be licensing out their latest architecture to interested clients.  The Kepler architecture powers the very latest GTX 700 series of cards and at the top end it is considered one of the fastest and most efficient architectures out there.  Seemingly, there is a price for this though.  Time to dig a little deeper.

keplerdieshot.jpg

Kepler will be the first technology licensed to third-party manufacturers.  We will not see full GPUs; the licensed designs will only be integrated into mobile products.

The very latest Tegra parts from NVIDIA do not feature the Kepler architecture for the graphics portion.  Instead, the units featured in Tegra can almost be described as GeForce 7000 series parts.  The computational units are split between pixel shaders and vertex shaders.  They support at most D3D feature level 9_3 and OpenGL ES 2.0.  This is a far cry from a unified shader architecture and support for the latest D3D 11 and OpenGL ES 3.0 specifications.  Other mobile units feature the latest Mali and Adreno series of graphics units, which are unified and support DX11 and OpenGL ES 3.0.

So why exactly do the latest Tegras not share the Kepler architecture?  Hard to say.  It could be a variety of factors, including time to market, available engineering teams, and simulations that could dictate whether power and performance are better served by a less complex unit.  Kepler is not simple.  A Kepler unit occupying the same die space could potentially consume more power at any given workload, or conversely it could perform poorly in the same power envelope.

We can look at the desktop side of this argument for some kind of proof.  At the top end Kepler is a champ.  The GTX 680/770 has outstanding performance and consumes far less power than the competition from AMD.  When we move down a notch to the GTX 660 Ti/HD 7800 series of cards, we see much greater parity in performance and power consumption.  Comparing the HD 7790 to the 650 Ti Boost, the Boost part performs slightly better but consumes significantly more power.  Then we move down to the 650 and 650 Ti; these parts do not consume any more power than the competing AMD parts, but they also perform much worse.  I know these are some pretty hefty generalizations, and the engineers at NVIDIA could very effectively port Kepler over to mobile applications without significant performance or power penalties.  But so far, we have not seen this work.

Power, performance, and die area aside, there is another issue to factor in.  NVIDIA just announced that they are doing this.  We have no idea how long this effort has been going on, but it is very likely that it has only been worked on for the past six months.  In that time NVIDIA needs to hammer out how they are going to license the technology, how much manpower they must provide licensees to get those parts up and running, and what kind of fees they are going to charge.  There is a lot of work to be done here, and this is not a simple undertaking.

So let us assume that some three months ago an interested partner such as Rockchip or Samsung came knocking on NVIDIA’s door.  They work out the licensing agreements, and this takes several months.  Then we start to see the transfer of technology between the companies.  Obviously Samsung and Rockchip are not going to apply this graphics architecture to currently shipping products, but will instead bundle it into a next-generation ARM based design.  These designs are not spun out overnight.  For example, the 64-bit ARMv8 designs have been finalized for around a year, and we do not expect to see initial parts shipping until late 1H 2014.  So any partner that decides to utilize NVIDIA’s Kepler architecture for such an application will not see that part released until 1H 2015 at the very earliest.

project-shield.jpg

SHIELD is still based on a GPU possessing separate pixel and vertex shaders.  DX11 and OpenGL ES 3.0?  Nope!

If someone decides to license this technology, it will be of little concern to NVIDIA.  The next generation of NVIDIA graphics will already be out by that time, and we could very well be approaching the next iteration on the desktop side.  NVIDIA plans on releasing a Kepler-based mobile unit in 2014 (Logan), which would be a full year in advance of any competing licensed product.  In 2015 NVIDIA is planning on releasing an ARM product based on the Denver CPU and Maxwell GPU.  So we can easily see that NVIDIA will only be licensing out an older-generation product, so that it will not face direct competition when it comes to GPUs.  NVIDIA obviously is hoping that their GPU tech will still be a step ahead of that of ARM (Mali), Qualcomm (Adreno), and Imagination Technologies (PowerVR).

This is an easy and relatively pain-free way to test the waters that ARM, Imagination Technologies, and AMD are already treading.  ARM licenses only IP and has shown the world that a company can not only succeed at that model, but thrive.  Imagination Tech used to produce its own chips much like NVIDIA does, but it changed direction and continues to be profitable.  AMD recently opened up about its semi-custom design group that will design specific products for customers and then license those designs out.  I do not think this is a desperation move by NVIDIA, but it certainly is one that probably is a little late in coming.  The mobile market is exploding, and we are approaching a time where nearly every electricity-based item will have some kind of logic included in it; billions of chips a year will be sold.  NVIDIA obviously wants a piece of that market.  Even a small piece of “billions” is going to be significant to the bottom line.

Source: NVIDIA
Manufacturer: Adobe

OpenCL Support in a Meaningful Way

Adobe has had OpenCL support since last year.  You would never have benefited from its inclusion unless you ran one of two AMD mobility chips under OS X Lion, but it was there.  Creative Cloud, predictably, furthers this trend with additional GPGPU support for applications like Photoshop and Premiere Pro.

This leads to some interesting points:

  • How OpenCL is changing the landscape between Intel and AMD
  • What GPU support is curiously absent from Adobe CC for one reason or another
  • Which GPUs are supported despite not... existing, officially.

adobe-cs-products.jpg

This should be very big news for our readers who do production work, whether professionally or as a hobby.  If not, how about a little information about certain GPUs that are designed to compete with the GeForce 700-series?

Read on for our thoughts, after the break.

Rumor: AMD Gets Exclusive Optimization for all Frostbite 3 Games

Subject: Graphics Cards | June 18, 2013 - 03:39 PM |
Tagged: radeon, nvidia, geforce, frostbite 3, ea, dice, amd

UPDATE #3

The original source article at IGN.com has been updated with some new information.  Now they are saying the agreement between AMD and EA is "non-exclusive and gamers using other components will be supported." 

The quote from an EA rep reads as follows:

"DICE has a partnership with AMD specifically for Battlefield 4 on PC to showcase and optimize the game for AMD hardware," an EA spokesperson said. "This does not exclude DICE from working with other partners to ensure players have a great experience across a wide set of PCs for all their titles."

END UPDATE #3

This could be a huge deal for NVIDIA and AMD in the coming months - according to a story at IGN.com, AMD has entered into an agreement with EA that will allow them exclusive rights to optimization for all games based around the Frostbite 3 engine.  That includes Battlefield 4, Mirror's Edge 2, Need for Speed Rivals and many more games due out this year and in 2014.  Here is the quote that is getting my attention:

Starting with the release of Battlefield 4, all current and future titles using the Frostbite 3 engine — Need for Speed Rivals, Mirror's Edge 2, etc. — will ship optimized exclusively for AMD GPUs and CPUs. While Nvidia-based systems will be supported, the company won't be able to develop and distribute updated drivers until after each game is released.

bf4.jpg

Battlefield 4 will be exclusively optimized for AMD hardware.

This is huge news for AMD, as the Frostbite 3 engine will be used for all EA-published games going forward with the exception of sports titles.  The three mentioned above are huge, but the list also includes Star Wars Battlefront, Dragon Age and even the next Mass Effect, so I can't really emphasize enough how big of a win this could be for AMD's marketing and developer relations teams.

I am particularly interested in this line as well:

While Nvidia-based systems will be supported, the company won't be able to develop and distribute updated drivers until after each game is released.

The world of PC optimizations and partnerships has been around for a long time, so this isn't a huge surprise for anyone who follows PC gaming.  What is bothersome to me is that both EA and AMD are rumored to have agreed that NVIDIA won't get access to the game as it is being developed - something that is CRUCIAL for day-of driver releases and performance tweaks for GeForce card owners.  In most cases, both AMD and NVIDIA developer relations teams get early access to game builds for PC titles in order to validate compatibility and to improve performance of these games for the public release.  Without these builds, NVIDIA would be at a big disadvantage.  This is exactly what happened with the recent Tomb Raider release.

UPDATE

AMD called me to reiterate their stance that competition does not automatically mean cutting out the other guy.  In the Tomb Raider story linked above, Neil Robison, AMD's Senior Director of Consumer and Graphics Alliances, states quite plainly: "The thing that angers me the most is when I see a request to debilitate a game. I understand winning, I get that, and I understand aggressive companies, I get that. Why would you ever want to introduce a feature on purpose that would make a game not good for half the gaming audience?"

So what do we take away from that statement, made in a story published in March, and today's rumor?  We have to take AMD at its word until we see solid evidence otherwise, or enough cases of this occurring to feel like we are being duped.  AMD wants us all to know that it is playing the game the "right way" - a stance that just happens to run counter to this rumor.

END UPDATE

tombraider.jpg

NVIDIA had performance and compatibility issues with Tomb Raider upon release.

The irony in all of this is that AMD has been accusing NVIDIA of doing this exact thing for years - though without any public statements from developers, publishers or NVIDIA.  When Batman: Arkham Asylum was launched, AMD basically said that NVIDIA had locked it out of supporting antialiasing.  In 2008, Assassin's Creed dropped DX 10.1 support, supposedly at the request of NVIDIA, which didn't have support for it in GeForce cards at the time.  Or recall the claim that NVIDIA was disabling cores for PhysX CPU support to help prop up GeForce sales.  At the time, AMD PR spun this as the worst possible thing a company could do in the name of gamers, that it was bad for the industry, etc.  But times change as opportunity changes.

The cold truth is that this is why AMD decided to take the chance NVIDIA was allegedly unwilling to: the console design wins that are often noted as being "bad business."  If settling for razor-thin margins on the consoles is the risk, the reward AMD is hoping for is exactly this: benefits in other markets thanks to better relationships with game developers.

ps4controller.jpg

Will the advantage be with AMD thanks to PS4 and Xbox One hardware?

At E3 I spoke in-depth with both NVIDIA and AMD executives about this debate, and as you might expect both have very different opinions about what is going to transpire in the next 12-24 months.  AMD views this advantage (being in the consoles) as the big bet that is going to pay off in the more profitable PC space.  NVIDIA thinks that AMD still doesn't have what it takes to truly support developers in the long run and doesn't have the engineers to innovate on the technology side.  In my view, having Radeon-based processors in the Xbox One and PlayStation 4 (as well as the Wii U, I guess) gives AMD a head start but won't win them the race for the hearts and minds of PC gamers.  There is still a lot of work to be done for that.

Before this story broke I was planning on outlining another editorial on this subject and it looks like it just got promoted to a top priority.  There appear to be a lot of proverbial shoes left to drop in this battle, but it definitely needs more research and discussion. 

batmanaa.jpg

Remember the issues with Batman: Arkham Asylum?  I do.

I asked both NVIDIA and AMD for feedback on this story but only AMD has replied thus far.  Robert Hallock, PR manager for gaming and graphics, Graphics Business Unit at AMD sent me this:

It makes sense that game developers would focus on AMD hardware with AMD hardware being the backbone of the next console generation. At this time, though, our relationship with EA is exclusively focused on Battlefield 4 and its hardware optimizations for AMD CPUs, GPUs and APUs.

Not much there, but he is also not denying the original report from IGN.  It might just be too early for a more official statement.  I will update this story with information from NVIDIA if I hear anything else.

What do YOU think about this announcement, though?  Is this good news for AMD and bad news for NVIDIA?  Is it good or bad for the gamer and, in particular, the PC gamer?  Your input will help guide our upcoming continued talks with NVIDIA and AMD on the subject.

UPDATE #2

Just so we all have some clarification on this and on the potential for validity of the rumor, this is where I sourced the story from this afternoon:

taylorquote.png

END UPDATE #2

Source: IGN
Author:
Manufacturer: NVIDIA

Kepler-based Mobile GPUs

Late last month, just before the tech world blew up from the mess that is Computex, NVIDIA announced a new line of mobility discrete graphics parts under the GTX 700M series label.  At the time we simply posted some news and specifications about the new products but left performance evaluation for a later time.  Today we have that for the highest end offering, the GeForce GTX 780M. 

As with most mobility GPU releases, the GTX 700M series is not really a new GPU and only offers cursory feature improvements.  Based completely on the Kepler line of parts, the GTX 700M series ranges from 1536 CUDA cores on the GTX 780M down to 768 cores on the GTX 760M.

slide2.jpg

The flagship GTX 780M is essentially a desktop GTX 680 in a mobile form factor with lower clock speeds.  With 1536 CUDA cores running at 823 MHz (boosting higher depending on the notebook configuration) and a 256-bit memory controller running at 5 GHz, the GTX 780M will likely be the fastest mobile GPU you can buy.  (And we’ll be testing that in the coming pages.)

The GTX 760M, 765M and 770M offer a range of performance that scales down to 768 cores at 657 MHz.  NVIDIA claims we’ll see the GTX 760M in systems as small as 14-inch and below, weighing around 2 kg, from vendors like MSI and Acer.  For Ultrabooks and thinner machines you’ll have to step down to smaller, less power-hungry GPUs like the GT 750 and 740, but even then we expect NVIDIA to have much faster gaming performance than the Haswell-based processor graphics.

Continue reading our performance review of the new NVIDIA GeForce GTX 780M mobility GPU!!

Podcast #255 - AMD's 5 GHz Processor, 1080p Oculus Rift, and more news from Computex!

Subject: General Tech | June 13, 2013 - 02:33 PM |
Tagged: wwdc, video, titan, podcast, oculus rift, nvidia, FX, apple, amd, a10-6800k, 5ghz

PC Perspective Podcast #255 - 06/13/2013

Join us this week as we discuss AMD's 5 GHz Processor, 1080p Oculus Rift, and more news from Computex!

You can subscribe to us through iTunes and you can still access it directly through the RSS page HERE.

The URL for the podcast is: http://pcper.com/podcast - Share with your friends!

  • iTunes - Subscribe to the podcast directly through the iTunes Store
  • RSS - Subscribe through your regular RSS reader
  • MP3 - Direct download link to the MP3 file

Hosts: Jeremy Hellstrom, Josh Walrath and Morry Teitelman

Program length: 57:27

  1. Week in Review:
  2. News items of interest:
    1. 0:40:40
  3. 0:49:00 Hardware/Software Picks of the Week:
    1. Ryan: LA Traffic
    2. Jeremy: The mighty can of air
    3. Allyn: Cold Medication
    4. Morry: more pump for your pump - Swiftech MCP35X
    5. Scott: Now with 100% more compelling. Alienware X51
  4. 1-888-38-PCPER or podcast@pcper.com

 

Author:
Manufacturer: NVIDIA

A necessary gesture

NVIDIA views the gaming landscape as a constantly shifting medium that starts with the PC, but the company also sees mobile gaming, cloud gaming and even console gaming as part of the overall ecosystem.  That ecosystem is tied together by an investment in content – the game developers and game publishers that make the games we play on PCs, tablets, phones and consoles.

nv14.jpg

The slide above shows NVIDIA’s targeting for each segment – except for consoles, obviously.  NVIDIA GRID will address the cloud gaming infrastructure, GeForce and the GeForce Experience will continue with PC systems, and NVIDIA SHIELD and the Tegra SoC will get the focus for the mobile and tablet spaces.  I find it interesting that NVIDIA has specifically called out Steam under the PC – maybe a hint at the future for the upcoming Steam Box?

The primary point of focus for today’s press meeting was to talk about the commitment that NVIDIA has to the gaming world and to developers.  AMD has been talking up their 4-point attack on gaming that starts really with the dominance in the console markets.  But NVIDIA has been the leader in the PC world for many years and doesn’t see that changing.

nv02.jpg

With several global testing facilities, the most impressive of which exists in Russia, NVIDIA tests more games, more hardware and more settings combinations than you can possibly imagine.  They tune drivers and find optimal playing settings for more than 100 games that are now wrapped up into the GeForce Experience software.  They write tools for developers to find software bottlenecks and test for game streaming latency (with the upcoming SHIELD). They invest more in those areas than any other hardware vendor.

nv03.jpg

This is a list of technologies that NVIDIA claims they invented or developed – an impressive list that includes things like programmable shaders, GPU compute, Boost technology and more. 

nv04.jpg

Many of these turned out to be very important in the development and advancement of gaming – not just for PCs but for ALL gaming. 

Continue reading our editorial on NVIDIA's stance on its future in PC gaming!!

Podcast #254 - NVIDIA GTX 770, Haswell and Z87 Reviews, AMD Richland APUs and a ton of Computex news!

Subject: General Tech | June 6, 2013 - 01:42 PM |
Tagged: podcast, video, haswell, gtx 770, amd, Richland, nvidia, computex, asus, Transformer, 4k

PC Perspective Podcast #254 - 06/06/2013

Join us this week as we discuss the NVIDIA GTX 770, Haswell and Z87 Reviews, AMD Richland APUs and a ton of Computex news!

You can subscribe to us through iTunes and you can still access it directly through the RSS page HERE.

The URL for the podcast is: http://pcper.com/podcast - Share with your friends!

  • iTunes - Subscribe to the podcast directly through the iTunes Store
  • RSS - Subscribe through your regular RSS reader
  • MP3 - Direct download link to the MP3 file

Hosts: Ryan Shrout, Jeremy Hellstrom, Josh Walrath, Allyn Malventano and Morry Teitelman

Program length: 1:41:16

  1. Week in Review:
  2. News items of interest:
    1. Jeremy: Something I have to test for work
  3. 1-888-38-PCPER or podcast@pcper.com
  4. Closing/outro