OpenCL Support in a Meaningful Way
Adobe has had OpenCL support since last year. You would never have benefited from its inclusion unless you ran one of two AMD mobility chips under Mac OS X Lion, but it was there. Creative Cloud, predictably, furthers this trend with additional GPGPU support for applications like Photoshop and Premiere Pro.
This leads to some interesting points:
- How OpenCL is changing the landscape between Intel and AMD
- What GPU support is curiously absent from Adobe CC for one reason or another
- Which GPUs are supported despite not... existing, officially.
This should be very big news for our readers who do production work, whether professionally or as a hobby. If not, how about a little information about certain GPUs that are designed to compete with the GeForce 700-series?
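For readers wondering what "OpenCL support" actually means under the hood: before an application like Photoshop can offload work to the GPU, it has to enumerate the OpenCL-capable devices on the system and pick one. Below is a minimal sketch using the standard OpenCL C API - purely illustrative, and certainly not Adobe's actual detection code:

```c
#include <stdio.h>
#ifdef __APPLE__
#include <OpenCL/opencl.h>
#else
#include <CL/cl.h>
#endif

int main(void)
{
    cl_platform_id platforms[8];
    cl_uint num_platforms = 0;
    clGetPlatformIDs(8, platforms, &num_platforms);

    for (cl_uint p = 0; p < num_platforms; p++) {
        cl_device_id devices[8];
        cl_uint num_devices = 0;

        /* Ask this platform for GPU devices only; skip it if none exist */
        if (clGetDeviceIDs(platforms[p], CL_DEVICE_TYPE_GPU,
                           8, devices, &num_devices) != CL_SUCCESS)
            continue;

        for (cl_uint d = 0; d < num_devices; d++) {
            char name[256];
            clGetDeviceInfo(devices[d], CL_DEVICE_NAME,
                            sizeof(name), name, NULL);
            printf("OpenCL GPU %u: %s\n", d, name);
        }
    }
    return 0;
}
```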
Subject: Graphics Cards | June 18, 2013 - 03:39 PM | Ryan Shrout
Tagged: radeon, nvidia, geforce, frostbite 3, ea, dice, amd
The original source article at IGN.com has been updated with some new information. Now they are saying the agreement between AMD and EA is "non-exclusive and gamers using other components will be supported."
The quote from an EA rep reads as follows:
"DICE has a partnership with AMD specifically for Battlefield 4 on PC to showcase and optimize the game for AMD hardware," an EA spokesperson said. "This does not exclude DICE from working with other partners to ensure players have a great experience across a wide set of PCs for all their titles."
END UPDATE #3
This could be a huge deal for NVIDIA and AMD in the coming months - according to a story at IGN.com, AMD has entered into an agreement with EA that will allow them exclusive rights to optimization for all games based around the Frostbite 3 engine. That includes Battlefield 4, Mirror's Edge 2, Need for Speed Rivals and many more games due out this year and in 2014. Here is the quote that is getting my attention:
Starting with the release of Battlefield 4, all current and future titles using the Frostbite 3 engine — Need for Speed Rivals, Mirror's Edge 2, etc. — will ship optimized exclusively for AMD GPUs and CPUs. While Nvidia-based systems will be supported, the company won't be able to develop and distribute updated drivers until after each game is released.
Battlefield 4 will be exclusively optimized for AMD hardware.
This is huge news for AMD, as the Frostbite 3 engine will be used for all EA-published games going forward, with the exception of sports titles. The three mentioned above are huge, but this also includes Star Wars Battlefront, Dragon Age and even the next Mass Effect, so I can't emphasize enough how big of a win this could be for AMD's marketing and developer relations teams.
I am particularly interested in this line as well:
While Nvidia-based systems will be supported, the company won't be able to develop and distribute updated drivers until after each game is released.
The world of PC optimizations and partnerships has been around for a long time, so this isn't a huge surprise for anyone who follows PC gaming. What is bothersome to me is that both EA and AMD are rumored to have agreed that NVIDIA won't get access to the game as it is being developed - something that is CRUCIAL for day-of driver releases and performance tweaks for GeForce card owners. In most cases, both AMD and NVIDIA developer relations teams get early access to game builds for PC titles in order to validate compatibility and to improve performance of these games for the public release. Without these builds, NVIDIA would be at a big disadvantage. This is exactly what happened with the recent Tomb Raider release.
AMD called me to reiterate their stance that competition does not automatically mean cutting out the other guy. In the Tomb Raider story linked above, Neil Robison, AMD's Senior Director of Consumer and Graphics Alliances, states quite plainly: "The thing that angers me the most is when I see a request to debilitate a game. I understand winning, I get that, and I understand aggressive companies, I get that. Why would you ever want to introduce a feature on purpose that would make a game not good for half the gaming audience?"
So what do we take away from that statement, made in a story published in March, and today's rumor? We have to take AMD at its word until we see solid evidence otherwise, or until enough cases of this occur that we feel we are being duped. AMD wants us all to know that it is playing the game the "right way" - a stance that just happens to run counter to this rumor.
NVIDIA had performance and compatibility issues with Tomb Raider upon release.
The irony in all of this is that AMD has been accusing NVIDIA of doing this exact thing for years - though without any public statements from developers, publishers or NVIDIA. When Batman: Arkham Asylum launched, AMD basically said that NVIDIA had locked it out of supporting antialiasing. In 2008, Assassin's Creed dropped DX 10.1 support, supposedly because NVIDIA, whose GeForce cards didn't support the feature at the time, asked for its removal. And there was the claim that NVIDIA was crippling multi-core CPU support in PhysX to help prop up GeForce sales. At the time, AMD PR spun each of these as the worst possible thing a company could do in the name of gamers, that it was bad for the industry, etc. But times change as opportunity changes.
The cold truth is that this is why AMD decided to take the chance that NVIDIA was allegedly unwilling to take - pursuing the console design wins that are often noted as being "bad business." If settling for razor-thin margins on the consoles is the risk, the reward AMD is hoping for is exactly this: benefits in other markets thanks to better relationships with game developers.
Will the advantage be with AMD thanks to PS4 and Xbox One hardware?
At E3 I spoke in depth with both NVIDIA and AMD executives about this debate and, as you might expect, both have very different opinions about what is going to transpire in the next 12-24 months. AMD views this advantage (being in the consoles) as the big bet that is going to pay off in the more profitable PC space. NVIDIA thinks that AMD still doesn't have what it takes to truly support developers in the long run and lacks the engineers to innovate on the technology side. In my view, having Radeon-based processors in the Xbox One and PlayStation 4 (as well as the Wii U, I guess) gives AMD a head start but won't win it the race for the hearts and minds of PC gamers. There is still a lot of work to be done for that.
Before this story broke I was planning on outlining another editorial on this subject and it looks like it just got promoted to a top priority. There appear to be a lot of proverbial shoes left to drop in this battle, but it definitely needs more research and discussion.
Remember the issues with Batman: Arkham Asylum? I do.
I asked both NVIDIA and AMD for feedback on this story but only AMD has replied thus far. Robert Hallock, PR manager for gaming and graphics in AMD's Graphics Business Unit, sent me this:
It makes sense that game developers would focus on AMD hardware with AMD hardware being the backbone of the next console generation. At this time, though, our relationship with EA is exclusively focused on Battlefield 4 and its hardware optimizations for AMD CPUs, GPUs and APUs.
Not much there, but he is also not denying the original report from IGN. It might just be too early for a more official statement. I will update this story with information from NVIDIA if I hear anything else.
What do YOU think about this announcement though? Is this good news for AMD and bad news for NVIDIA? Is it good or bad for the gamer and, in particular, the PC gamer? Your input will help guide our upcoming continued talks with NVIDIA and AMD on the subject.
Just so we all have some clarification on this and on the potential validity of the rumor, this is where I sourced the story from this afternoon:
END UPDATE #2
Kepler-based Mobile GPUs
Late last month, just before the tech world blew up from the mess that is Computex, NVIDIA announced a new line of mobility discrete graphics parts under the GTX 700M series label. At the time we simply posted some news and specifications about the new products but left performance evaluation for a later time. Today we have that for the highest end offering, the GeForce GTX 780M.
As with most mobility GPU releases, it seems, the GTX 700M series is not really a new GPU and offers only cursory feature improvements. Based completely on the Kepler line of parts, the GTX 700M series will range from 1536 CUDA cores on the GTX 780M down to 768 cores on the GTX 760M.
The flagship GTX 780M is essentially a desktop GTX 680 in a mobile form factor with lower clock speeds. With 1536 CUDA cores running at 823 MHz (boosting to higher speeds depending on the notebook configuration) and a 256-bit memory controller running at 5 GHz effective, the GTX 780M will likely be the fastest mobile GPU you can buy. (And we’ll be testing that in the coming pages.)
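As a quick sanity check on that memory spec: peak GDDR5 bandwidth is just the bus width in bytes multiplied by the effective data rate, so 256 bits at 5 GHz works out to 160 GB/s. A back-of-the-envelope sketch of the arithmetic:

```c
#include <stdio.h>

int main(void)
{
    /* Peak bandwidth = bus width in bytes x effective data rate */
    const double bus_bytes = 256.0 / 8.0; /* 256-bit bus = 32 bytes/transfer */
    const double rate_gtps = 5.0;         /* 5 GHz effective (GT/s)          */

    printf("GTX 780M peak bandwidth: %.0f GB/s\n", bus_bytes * rate_gtps);
    return 0; /* prints 160 GB/s */
}
```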
The GTX 760M, 765M and 770M offer a range of performance that scales down to 768 cores at 657 MHz. NVIDIA claims we’ll see the GTX 760M in systems as small as 14 inches and below, weighing 2 kg or so, from vendors like MSI and Acer. For Ultrabooks and thinner machines you’ll have to step down to smaller, less power-hungry GPUs like the GT 750M and 740M, but even then we expect NVIDIA to deliver much faster gaming performance than Haswell-based processor graphics.
Subject: General Tech, Graphics Cards, Processors, Shows and Expos | June 13, 2013 - 02:26 AM | Scott Michaud
Tagged: E3, E3 13, amd
The Electronic Entertainment Expo (E3) is the biggest event of the year for millions of gamers. The majority of coverage ends up gawking over the latest news out of Microsoft, Sony, or Nintendo, and we will certainly provide our insights in those areas if we believe they have been insufficiently explained, but E3 is a big time for PC gamers too.
5 GHz and unlocked to go from there.
AMD, specifically, has a lot to say this year. In the year of the next-gen console reveals, AMD provides the CPU architecture for two of the three devices and has also designed each of the three GPUs. That just leaves a slight win for IBM, which is responsible for the Wii U main processor, for whatever that is worth. Unless the Steam Box comes to light without ties to AMD, this is about as close to a clean sweep as any hardware manufacturer could get.
But for the PCs among us...
If you watched the EA press conference, you probably saw lots of sports. If you stuck around after the sports, you probably saw Battlefield 4 being played by 64 players on stage. AMD has been pushing very strongly on developer relations over the last year. DICE, formerly known as an NVIDIA-friendly developer, did not exhibit Battlefield 4 "The Way It's Meant to be Played" at the EA conference. According to one of AMD's Twitter accounts:
— AMD Radeon Graphics (@AMDRadeon) June 12, 2013
On the topic of "Gaming Evolved" titles, AMD is partnering with Square Enix to optimize Thief for GCN and A-Series APUs. The press release specifically mentioned Eyefinity and CrossFire support along with a DirectX 11 rendering engine; of course, the enhancements with real, interesting effects are the seemingly boring ones they do not mention.
The last major point from their E3 event was the launch of their 5 GHz FX processors. For more information on that part, check out Josh's thoughts from a couple of days ago.
A necessary gesture
NVIDIA views the gaming landscape as a constantly shifting medium that starts with the PC. But the company also sees mobile gaming, cloud gaming and even console gaming as part of the overall ecosystem. But that is all tied together by an investment in content – the game developers and game publishers that make the games that we play on PCs, tablets, phones and consoles.
The slide above shows NVIDIA's targets for each segment – except for consoles, obviously. NVIDIA GRID will address the cloud gaming infrastructure, GeForce and the GeForce Experience will continue with PC systems, and NVIDIA SHIELD and the Tegra SoC will get the focus for the mobile and tablet spaces. I find it interesting that NVIDIA has specifically called out Steam under the PC – maybe a hint of the future for the upcoming Steam Box?
The primary point of focus for today’s press meeting was the commitment NVIDIA has to the gaming world and to developers. AMD has been talking up its 4-point attack on gaming that really starts with its dominance in the console markets. But NVIDIA has been the leader in the PC world for many years and doesn’t see that changing.
With several global testing facilities, the most impressive of which exists in Russia, NVIDIA tests more games, more hardware and more settings combinations than you can possibly imagine. They tune drivers and find optimal playing settings for more than 100 games that are now wrapped up into the GeForce Experience software. They write tools for developers to find software bottlenecks and test for game streaming latency (with the upcoming SHIELD). They invest more in those areas than any other hardware vendor.
This is a list of technologies that NVIDIA claims they invented or developed – an impressive list that includes things like programmable shaders, GPU compute, Boost technology and more.
Many of these turned out to be very important in the development and advancement of gaming – not just for PCs but for ALL gaming.
Subject: Graphics Cards | June 10, 2013 - 07:28 PM | Jeremy Hellstrom
Tagged: gtx 770, msi N770 Lightning, overclocking
[H]ard|OCP liked the new GTX 770 Lightning from MSI but thought they would like it better overclocked, perhaps even more than a GTX 680 or HD 7970. The triplets below are, from top to bottom, the GTX 680, the GTX 770 and the HD 7970, all from the overclocked Lightning family. Using MSI's Afterburner utility, [H] pushed the card to 1241 MHz on the core and 7.8 GHz effective for the RAM, higher than the factory overclock. That speed boost put its performance on par with the overclocked GTX 680, but it seems the impressive speeds the 7970 Lightning is capable of leave it comfortably in the lead.
"We take the new MSI N770 Lightning and overclock it to its maximum potential. We will compare it with a highly overclocked MSI GeForce GTX 680 Lightning and GIGABYTE Radeon HD 7970. Each GPU is getting its best chance to show us how well it can perform, as all of these GPUs are highly overclocked."
Here are some more Graphics Card articles from around the web:
- GALAXY GeForce GTX 650 Ti Boost @ [H]ard|OCP
- NVIDIA GTX 770 2GB @ eTeknix
- NVidia GTX 770 Video Card Review @ Ninjalane
- Nvidia GeForce GTX 770 @ Bjorn3D
- EVGA GTX 770 SC 2GB with ACX Cooler Video Card Review @ HiTech Legion
- Gigabyte GTX 650Ti BOOST 2GB OC Video Card @ HiTech Legion
- MSI N770 Lightning Overclocking @ [H]ard|OCP
- NVIDIA GeForce GTX 770 Reviewed in 2-Way SLI and NVIDIA Surround @ Legit Reviews
- Inno3D iChill GeForce GTX 780 review: almost a Titan @ Hardware.info
- ASUS GTX 670 Direct CU Mini @ Kitguru
- Inno3D iChill GeForce GTX 660 @ Hardware.info
- EVGA GeForce GTX 780 Superclocked ACX Cooling Video Card Review @ Legit Reviews
- Gigabyte GTX 780 WindForce OC 3GB Video Card Review @ Madshrimps
- Gigabyte GeForce GTX 770 OC WindForce 3x 2GB @ eTeknix
- iXBT Labs Review: i3DSpeed, May 2013
- Gigabyte HD 7790 2GB @ Bjorn3D
Subject: Editorial, General Tech, Graphics Cards, Shows and Expos | June 10, 2013 - 02:49 AM | Scott Michaud
Tagged: Ultra, geforce titan, computex
So long to Computex 2013, we barely knew thee. You poured stories all over our news feed for more than a whole week. What say you, another story for the... metaphorical road... between here... and... Taipei? Okay, so the metaphorical road is bumpy and unpaved, work with me.
It was substantially more difficult to decipher the name of a video card a number of years ago. Back then, products would be classified by their model numbers and often assigned a suffix like "Ultra", "Pro", or "LE". These suffixes actually meant a lot: a suffixed card could perform noticeably better (or maybe worse) than its suffix-less sibling and could even overlap with other number classes.
Just when they were gone long enough for us to miss them, the suffixes might make some measure of a return. On the show floor, Colorful exhibited the NVIDIA GeForce GTX Titan Ultra Edition. This card uses a standard, slightly-disabled GK110-based GeForce GTX Titan GPU with the usual 2688 CUDA cores and 6GB of GDDR5. While the GK110 chip has the potential for 2880 CUDA cores, NVIDIA has not released any product (not even Tesla or Quadro) with more than 2688 CUDA cores enabled. Colorful's Titan Ultra and the reference Titan are electrically identical; this "Ultra" version just adds a water block for cooling and ships with some amount of factory overclock.
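Those core counts fall directly out of Kepler's SMX organization: each SMX contains 192 CUDA cores, a full GK110 carries 15 SMX units, and every shipping Titan has one SMX fused off. A quick sketch of the arithmetic:

```c
#include <stdio.h>

int main(void)
{
    const int cores_per_smx = 192; /* CUDA cores in one Kepler SMX */
    const int full_smx      = 15;  /* SMX units on a full GK110    */
    const int titan_smx     = 14;  /* GTX Titan ships with one off */

    printf("Full GK110: %d cores\n", full_smx * cores_per_smx);  /* 2880 */
    printf("GTX Titan:  %d cores\n", titan_smx * cores_per_smx); /* 2688 */
    return 0;
}
```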
But, this is not the first time we have heard of a Titan Ultra...
Back in April, ExtremeTech found a leak for two official products: the GTX Titan LE and the GTX Titan Ultra. While the LE would be slightly stripped down compared to the full GTX Titan, the GTX Titan Ultra would be NVIDIA's first release of a GK110 part without any CUDA cores disabled.
So if that rumor ends up being true, you could choose between Colorful's GTX Titan Ultra with its partially disabled GK110 based on the full GTX Titan design; or, you could choose the reference GTX Titan Ultra based on a full GK110 GPU unlike the partially disabled GK110 on the full GTX Titan.
If you are feeling nostalgic... that might actually be confusion... as this is why suffixes went away.
Subject: Graphics Cards | June 7, 2013 - 02:33 PM | Ryan Shrout
Tagged: amd, radeon, hd 7970 ghz edition, HD 7970, never settle
AMD just passed me a note that I found to be very interesting. In an obvious response to the release of the NVIDIA GeForce GTX 770 that offers the GK104 GPU (previously only in the GTX 680) for a lower price of $399, AMD wants you to know that at least ONE Radeon HD 7970 GHz Edition card is priced lower than the others.
The Sapphire Vapor-X HD 7970 GHz Edition is currently listed on Newegg.com for $419, a cool $30 less than the other HD 7970 GHz Edition cards. This is not a card-wide price drop to $419, though. AMD had this to say:
In late May I noted that we would be working with our partners to improve channel supply of the AMD Radeon™ HD 7970 GHz Edition to North American resellers like Newegg.com. Today I’m mailing to let you know that this process has begun to bear fruit, with the Sapphire Vapor-X HD 7970 GHz Edition now listing for the AMD SEP of $419 US. Of course, this GPU is also eligible for the Never Settle Reloaded AND Level Up programs!
Improving supply is an ongoing process, of course, but we’re pleased with the initial results of our efforts and hope you might pass word to your readers if you get a chance.
This "ongoing process" might mean that we'll see other partners' card sell for this lower price but it also might not. In AMD's defense, our testing proves that in single GPU configurations, the Radeon HD 7970 GHz Edition does very well compared to the GTX 770, especially at higher resolutions.
I did ask AMD for more answers regarding what its other partners think about one partner getting unique treatment to offer this lower-priced unit, but I haven't received an answer yet. I'll update here when I do!
For today though, if you are looking for a Radeon HD 7970 GHz Edition that also comes with the AMD Never Settle game bundle (Crysis 3, Bioshock Infinite, Far Cry 3: Blood Dragon and Tomb Raider), it's hard to go wrong with that $419 option.
Computex 2013: Gigabyte Preparing Custom GTX Titan With WindForce 450W Cooler... Some Assembly Required
Subject: Graphics Cards | June 7, 2013 - 01:20 PM | Tim Verry
Tagged: windforce 450w, windforce, gtx titan, gk110, gigabyte
Back in April, Gigabyte showed off its new custom WindForce 450W GPU HSF, but did not name which specific high end graphics cards it would be used with. So far, NVIDIA's highest-end single GPU solution, the GTX Titan, has been off limits for GPU manufacturers as far as putting custom air coolers on the cards (NVIDIA has restricted designs to its reference cooler or factory installed water blocks).
It seems that Gigabyte has found a solution to the cooler restriction, however. The company will be selling a GTX TITAN with model number GV-NTITAN-6GDB later this year that will come with NVIDIA's reference cooler pre-installed along with a bundled WindForce 3X 450W cooler and instructions for switching out the coolers.
Gigabyte is showing off the custom GTX Titan at Computex, as discovered by TechPowerUp.
Users who take Gigabyte up on its offer to switch to the custom WindForce cooler will still be covered under the company's standard warranty policy, which is a good thing. The kit is likely to be more expensive than a standard TITAN, though, as Gigabyte has to sell the card with two coolers and shoulder increased support costs. On the other hand, users could swap the coolers and then sell the unused TITAN reference cooler to offset some of the cost of the kit.
Gigabyte is actually showing off the new graphics card with the WindForce 3X 450W cooler at Computex this week. The dual-slot WindForce cooler is said to keep a GTX 680 2°C cooler and 23.3 dB quieter than the reference cooler when running the Furmark benchmark. The major benefit of the WindForce is not pure cooling performance but its three large fans, which can spin at lower RPMs to deliver the same cooling as the reference NVIDIA design at a much lower noise level. Should you be looking to push the TITAN to the extreme, a water block would be your best bet, but for many users I think the allure of a quieter air-cooled TITAN may be enough for Gigabyte to snag a few adventurous enthusiasts willing to put up with assembling the new card themselves.
More information on the WindForce 3X 450W cooler can be found here.
Subject: Graphics Cards | June 4, 2013 - 12:04 AM | Tim Verry
Tagged: poseidon, nvidia, kepler, gtx 770, gk-104, computex 2013, computex, ASUS ROG, asus
NVIDIA took the wraps off of its latest-generation GeForce GTX 770 GPU last week, and manufacturers have begun announcing not only reference designs but also custom and factory-overclocked versions of this GK104 "Kepler" GPU refresh. One card in particular that caught my attention is the ASUS GTX 770 Poseidon, which combines NVIDIA's GK104 GPU with a hybrid heatsink-and-fan combo that allows the simultaneous use of water and air cooling!
According to the branding, and a hands-on report by TechPowerUp at Computex in Taipei, Taiwan, the GTX 770 Poseidon is part of the company's Republic of Gamers (ROG) line and likely sports beefy VRM hardware and a factory GPU overclock. Of course, the GTX 770 uses NVIDIA's Kepler architecture and is essentially the GTX 680 with seriously overclocked memory and refined GPU Boost technology. That means 1,536 CUDA cores, 128 texture units, and 32 ROPs (raster operation units) within 4 GPCs (Graphics Processing Clusters). This is the full GK104 chip, despite the x70 name. For more information on the GTX 770 GPU, check out our recent review of the NVIDIA GTX 770 card.
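For the curious, those numbers line up exactly with GK104's layout of 4 GPCs, each holding 2 SMX units of 192 CUDA cores, and the reference GTX 770's 7 GHz effective memory clock on that same 256-bit bus works out to roughly 224 GB/s. A quick sketch of the math:

```c
#include <stdio.h>

int main(void)
{
    /* Full GK104: 4 GPCs x 2 SMX per GPC x 192 CUDA cores per SMX */
    const int gpcs = 4, smx_per_gpc = 2, cores_per_smx = 192;
    printf("CUDA cores: %d\n", gpcs * smx_per_gpc * cores_per_smx); /* 1536 */

    /* Reference GTX 770 memory: 7 GHz effective on a 256-bit bus */
    printf("Peak bandwidth: %.0f GB/s\n", (256.0 / 8.0) * 7.0);     /* 224 GB/s */
    return 0;
}
```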
Update: ASUS has just launched the new ROG graphics cards at a Computex press conference. According to the ASUS press release:
"ROG Poseidon graphics card with hybrid DirectCU H2O cooling
The new ROG Poseidon graphics card features an NVIDIA® GeForce® GTX 700 Series GPU and a hybrid DirectCU H2O thermal design that supports both air and liquid cooling. Developed by ASUS, its CoolTech fan combines blower and axial fans in one design, forcing air in multiple directions over the heatsink to maximize heat dissipation. Liquid cooling reduces operating temperatures by up to 31 degrees Celsius for cooler running with even greater overclocking potential. ROG Poseidon also features a red pulsing ROG logo for a distinctive dash of style."
Back on the Poseidon specifically: the card is a short GTX 770 with a distinctive cooler built around a full-cover water block spanning the GPU, memory, and VRM areas. ASUS further added a more traditional air cooler above the GPU itself to help dissipate heat - a circular aluminum fin array with a fan that sits in the middle. The entire hybrid cooler is then covered by a ROG-themed shroud with a configurable LED-backlit Republic of Gamers logo on the side that can be controlled via software.
The water-cooling portion acts like any other full-cover water block, allowing cool water to move heat away from the metal contact plate (the bottom of the block) touching the various components. The inlet and outlet poke out from the side of the card, which is a bit odd, but the shroud prevents them from coming out at 90 degrees like typical blocks. If your case width is tight, you may need to get creative and fit a 90-degree barb extender (I apologize if that's not the technical term) onto the existing tubing connectors (heh). The cooler can be operated with the air cooler's fan running with or without being connected to a water loop. When water cooling is used, the fan can be turned off to reduce noise or left on to allow for higher overclocks and/or lower temperatures.
Unfortunately, that is all of the information that is currently available, as ASUS has not yet officially launched the custom GTX 770 graphics card. Pricing, availability, and clockspeed details are still unknown.
For more information, stay tuned to the press.asus.com/events livestream page, as the card might be announced at a Computex press conference this week since the company is showing off the hardware at the show!