Author:
Manufacturer: NVIDIA

Another Wrench – GeForce GTX 760M Results

Just recently, I evaluated some of the current processor-integrated graphics options with our new Frame Rating performance metric. The results were very interesting, showing that Intel has done some great work with its new HD 5000 graphics option for Ultrabooks. You might have noticed that the MSI GE40 didn't just come with integrated HD 4600 graphics but also included a discrete NVIDIA GeForce GTX 760M on board.  While that previous article was meant to focus on the integrated graphics of Haswell, Trinity, and Richland, I did find some noteworthy results with the GTX 760M that I wanted to investigate and present.

IMG_0141.JPG

The MSI GE40 is a new Haswell-based notebook that includes the Core i7-4702MQ quad-core processor and Intel HD 4600 graphics.  Along with it, MSI has included the Kepler-based GeForce GTX 760M discrete GPU.

760mspecs.png

This GPU offers 768 CUDA cores running at a 657 MHz base clock but can stretch higher with GPU Boost technology.  It is configured with 2GB of GDDR5 memory running at 2.0 GHz.

If you didn't read the previous integrated graphics article linked above, some of the data presented there will be spoiled for you here, so you might want to get a baseline of information by reading through that first.  Also, remember that we are using our Frame Rating performance evaluation system for this testing – a key differentiator from most other mobile GPU testing.  In fact, it is that difference that allowed us to spot an interesting issue with the configuration we are showing you today.

If you are not familiar with the Frame Rating methodology, and how we had to change some things for mobile GPU testing, I would really encourage you to read this page of the previous mobility Frame Rating article for the scoop.  The data presented below depends on that background knowledge!
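
To make that background a little more concrete, here is a minimal sketch of the kind of math involved – written in Python purely for illustration, not our actual FCAT/Frame Rating capture toolchain, and using made-up timestamps: take the times at which frames actually reach the display, derive per-frame times, and measure how much they swing from one frame to the next.

```python
# Purely illustrative frame time analysis - NOT the actual FCAT / Frame Rating
# toolchain.  Timestamp values below are made up for the example.

def frame_times_ms(display_timestamps_ms):
    """Turn the timestamps at which frames hit the display into per-frame times."""
    return [b - a for a, b in zip(display_timestamps_ms, display_timestamps_ms[1:])]

def pacing_report(display_timestamps_ms):
    times = frame_times_ms(display_timestamps_ms)
    avg = sum(times) / len(times)
    # Average change between consecutive frame times: a large value means uneven
    # pacing (stutter) even when the average frame rate looks perfectly healthy.
    jitter = sum(abs(b - a) for a, b in zip(times, times[1:])) / (len(times) - 1)
    return {"avg_frame_time_ms": round(avg, 1),
            "avg_fps": round(1000.0 / avg, 1),
            "frame_to_frame_jitter_ms": round(jitter, 1)}

# Hypothetical capture alternating 10 ms and 23 ms frames:
print(pacing_report([0, 10, 33, 43, 66, 76, 99]))
# {'avg_frame_time_ms': 16.5, 'avg_fps': 60.6, 'frame_to_frame_jitter_ms': 13.0}
```

The point of the toy example: an alternating 10 ms / 23 ms cadence still averages about 60 FPS, but it won't feel like 60 FPS – and that gap is exactly what simple frame counters miss.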

Okay, you’ve been warned – on to the results.

Continue reading our story about GeForce GTX 760M Frame Rating results and Haswell Optimus issues!!

NVIDIA launches the GTX 760 @ $250; let the price wars begin again!

Subject: Graphics Cards | June 25, 2013 - 01:28 PM |
Tagged: geforce, GK104, gtx 760, nvidia, msi, MSI N760 TF 2GD5/OC

To start off with the good news, the GTX 760 is now available between $250 and $260 for the MSI model that [H]ard|OCP reviewed.  This is no paper launch, nor another $400+ card for you to dream about, but instead a solid-performing card at a decent price.  Power is provided by an 8-pin and a 6-pin PCIe power connector, perhaps a little more than the card needs but perfect for overclockers who need the extra juice.  Performance-wise, the card trumps the GTX 660 Ti and matches the GTX 670 and HD 7950 Boost in almost every test, for a good $50-75 less.  Even better news is that certain sites testing Frame Rating and SLI performance saw great scaling in real performance.

Read on to get the whole picture from [H]ard|OCP.

H_gtx760.jpg

"Today NVIDIA is launching the GeForce GTX 760. The GeForce GTX 760 will be replacing a video card and offering what use to be high-end memory performance, at a mainstream price. We will evaluate a retail MSI N760 TF 2GD5/OC video card with comparisons to find out whether or not this is a true value."

Here are some more Graphics Card articles from around the web:

Graphics Cards

Source: [H]ard|OCP
Author:
Manufacturer: NVIDIA

Getting even more life from GK104

Have you guys heard about this new GPU from NVIDIA?  It's called GK104, and it turns out the damn thing has found its way into yet another graphics card this year – the new GeForce GTX 760.  Yup, you read that right: what NVIDIA says is the last update to the GeForce lineup through Fall 2013 is going to be based on the same GK104 design that we have previously discussed in reviews of the GTX 680, GTX 670, GTX 660 Ti, GTX 690 and, more recently, the GTX 770. This isn't a bad thing though!  GK104 has done a fantastic job in every field and market segment that NVIDIA has tossed it into, with solid performance and even better performance per watt than the competition.  It does mean, however, that talking up the architecture is kind of mind-numbing at this point…

block.jpg

If you are curious about the Kepler graphics architecture and GK104 in particular, I'm not going to stop you from going back and reading over my initial review of the GTX 680 from March of 2012.  The new GTX 760 takes the same GPU, adds a new and improved version of GPU Boost (the same version we saw in the GTX 770) and lowers the specifications a bit to enable NVIDIA to hit a new price point.  The GTX 760 will be replacing the GTX 660 Ti – that card will be falling into the ether, but the GTX 660 will remain, as will everything below it, including the GTX 650 Ti Boost, 650 Ti and plain old 650.  The GTX 670 went the way of the dodo with the release of the GTX 770.

01.jpg

Even though the GTX 690 isn't on this list, NVIDIA says it isn't EOL

As for the GeForce GTX 760, it will ship with 1152 CUDA cores running at a base clock of 980 MHz and a typical boost clock of 1033 MHz.  The memory speed remains at 6.0 GHz on a 256-bit memory bus, and you can expect to find both 2GB and 4GB frame buffer options from retail partners at launch.  The 1152 CUDA cores are broken up over 6 SMX units, which means you'll see some parts with 3 GPCs and others with 4 – NVIDIA claims any performance delta between them will be negligible.
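
As a quick back-of-the-envelope check on those numbers – my own arithmetic below, not an official NVIDIA figure – the usual cores x 2 FLOPs x clock estimate puts the GTX 760 at roughly 2.26 TFLOPS of peak single-precision compute at the base clock:

```python
# Back-of-the-envelope peak single-precision throughput for the GTX 760 using
# the usual cores x 2 FLOPs (one fused multiply-add) x clock estimate.
# My own arithmetic, not an official NVIDIA specification.

cores = 1152
flops_per_core_per_clock = 2  # one FMA per CUDA core per clock counts as two FLOPs

for label, clock_mhz in (("base", 980), ("typical boost", 1033)):
    gflops = cores * flops_per_core_per_clock * clock_mhz / 1000.0
    print(f"{label}: {gflops:.0f} GFLOPS (~{gflops / 1000:.2f} TFLOPS)")
# base: 2258 GFLOPS (~2.26 TFLOPS)
# typical boost: 2380 GFLOPS (~2.38 TFLOPS)
```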

Continue reading our review of the NVIDIA GeForce GTX 760 2GB Graphics Card!!

Frame Rating: AMD plans driver release to address frame pacing for July 31st

Subject: Graphics Cards | June 20, 2013 - 04:05 PM |
Tagged: radeon, nvidia, geforce, frame rating, fcat, crossfire, amd

Well, the date has been set.  AMD publicly stated on its @AMDRadeon Twitter account that a new version of the prototype driver we originally previewed with the release of the Radeon HD 7990 in April will be released to the public on July 31st.  All this for a problem that many in the industry didn't think existed.

Since that April release, AMD has been very quiet about its driver changes and has actually refused to send me updated prototypes over the spring.  Either they have it figured out or they are worried they haven't – but it looks like we'll find out at the end of next month, and I feel pretty confident that the team will be able to address the issues we brought to light.

For those of you that might have missed the discussion, our series of Frame Rating stories will tell you all about the issues with frame pacing and stutter in regards to AMD's CrossFire multi-GPU technology. 

AMD gave the media a prototype driver in April to test with the Radeon HD 7990, a card that depends on CrossFire to work correctly, and the improvements were pretty drastic.

BF3_2560x1440_PLOT_0.png

So what can we expect on July 31st?  A driver that will give users the option to disable or enable the frame pacing technology they are developing - though I am still of the mindset that disabling is never advantageous.  More to come in the next 30 days!
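
If you are wondering what "frame pacing" actually means in practice, here is a conceptual sketch – my own Python illustration of the basic idea, definitely not AMD's driver code: when one frame in an AFR pair is ready well ahead of the target cadence, hold it back so presents reach the display at roughly even intervals.

```python
import time

# Conceptual sketch of frame pacing - an illustration only, not AMD's driver logic.
# Idea: when a frame is ready well ahead of the target cadence (as the short frame
# in an AFR CrossFire pair often is), delay its presentation so that frames reach
# the display at roughly even intervals.

TARGET_INTERVAL = 1.0 / 60.0  # aim for ~16.7 ms between presents

def paced_present_loop(render_times_s):
    """render_times_s: hypothetical per-frame GPU render times, in seconds."""
    last_present = time.perf_counter()
    for render_time in render_times_s:
        time.sleep(render_time)                       # stand-in for the GPU rendering a frame
        since_last = time.perf_counter() - last_present
        if since_last < TARGET_INTERVAL:
            time.sleep(TARGET_INTERVAL - since_last)  # the pacing step: hold the early frame back
        now = time.perf_counter()
        print(f"present-to-present gap: {(now - last_present) * 1000:.1f} ms")
        last_present = now

# Alternate-frame rendering often produces one quick and one slow frame per pair;
# with pacing, the quick frame gets delayed toward the 16.7 ms target.
paced_present_loop([0.004, 0.020] * 5)
```

The obvious trade-off is a little extra latency on the frames that get held back, which is presumably part of why AMD is exposing pacing as a toggle rather than forcing it on.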

Source: Twitter

Rumor: AMD Gets Exclusive Optimization for all Frostbite 3 Games

Subject: Graphics Cards | June 18, 2013 - 03:39 PM |
Tagged: radeon, nvidia, geforce, frostbite 3, ea, dice, amd

UPDATE #3

The original source article at IGN.com has been updated with some new information.  Now they are saying the agreement between AMD and EA is "non-exclusive and gamers using other components will be supported." 

The quote from an EA rep says as follows:

"DICE has a partnership with AMD specifically for Battlefield 4 on PC to showcase and optimize the game for AMD hardware," an EA spokesperson said. "This does not exclude DICE from working with other partners to ensure players have a great experience across a wide set of PCs for all their titles."

END UPDATE #3

This could be a huge deal for NVIDIA and AMD in the coming months – according to a story at IGN.com, AMD has entered into an agreement with EA that will give them exclusive rights to optimization for all games based on the Frostbite 3 engine.  That includes Battlefield 4, Mirror's Edge 2, Need for Speed Rivals and many more games due out this year and in 2014.  Here is the quote that is getting my attention:

Starting with the release of Battlefield 4, all current and future titles using the Frostbite 3 engine — Need for Speed Rivals, Mirror's Edge 2, etc. — will ship optimized exclusively for AMD GPUs and CPUs. While Nvidia-based systems will be supported, the company won't be able to develop and distribute updated drivers until after each game is released.

bf4.jpg

Battlefield 4 will be exclusively optimized for AMD hardware.

This is huge news for AMD, as the Frostbite 3 engine will be used for all EA-published games going forward with the exception of sports titles.  The three mentioned above are huge, but this also includes Star Wars Battlefront, Dragon Age and even the next Mass Effect, so I can't really emphasize enough how big of a win this could be for AMD's marketing and developer relations teams.

I am particularly interested in this line as well:

While Nvidia-based systems will be supported, the company won't be able to develop and distribute updated drivers until after each game is released.

The world of PC optimizations and partnerships has been around for a long time, so this isn't a huge surprise for anyone that follows PC gaming.  What is bothersome to me is that EA and AMD are rumored to have agreed that NVIDIA won't get access to the game as it is being developed – something that is CRUCIAL for day-of driver releases and performance tweaks for GeForce card owners.  In most cases, both AMD and NVIDIA developer relations teams get early access to game builds for PC titles in order to validate compatibility and to improve performance of these games for the public release.  Without these builds, NVIDIA would be at a big disadvantage.  This is exactly what happened with the recent Tomb Raider release.

UPDATE

AMD called me to reiterate their stance that competition does not automatically mean cutting out the other guy.  In the Tomb Raider story linked above, Neil Robison, AMD's Senior Director of Consumer and Graphics Alliances, states quite plainly: "The thing that angers me the most is when I see a request to debilitate a game. I understand winning, I get that, and I understand aggressive companies, I get that. Why would you ever want to introduce a feature on purpose that would make a game not good for half the gaming audience?"

So what do we take away from that statement, made in a story published in March, and today's rumor?  We have to take AMD at its word until we see solid evidence otherwise, or enough cases of this occurring that I feel like I am being duped.  For now, AMD wants us all to know that they are playing the game the "right way" – a stance that just happens to run counter to this rumor.

END UPDATE

tombraider.jpg

NVIDIA had performance and compatibility issues with Tomb Raider upon release.

The irony in all of this is that AMD has been accusing NVIDIA of doing this exact thing for years – though without any public statements from developers, publishers or NVIDIA.  When Batman: Arkham Asylum launched, AMD basically said that NVIDIA had locked them out of supporting antialiasing.  In 2008, Assassin's Creed dropped DX 10.1 support, supposedly because NVIDIA – which didn't support the feature in its GeForce cards at the time – asked for it.  There was even the claim that NVIDIA was disabling CPU cores for PhysX processing to help prop up GeForce sales.  At the time, AMD PR spun all of this as the worst possible thing a company could do in the name of gamers, that it was bad for the industry, etc.  But times change as opportunity changes.

The cold truth is that this is why AMD decided to take the chance that NVIDIA was allegedly unwilling to take and chase the console design wins that are often described as "bad business."  If settling for razor-thin margins on the consoles is the risk, the reward that AMD is hoping for is exactly this: benefits in other markets thanks to better relationships with game developers.

ps4controller.jpg

Will the advantage be with AMD thanks to PS4 and Xbox One hardware?

At E3 I spoke in-depth with both NVIDIA and AMD executives about this debate and as you might expect both have very different opinions about what is going to transpire in the next 12-24 months.  AMD views this advantage (being in the consoles) as the big bet that is going to pay off for the more profitable PC space.  NVIDIA thinks that AMD still doesn't have what it takes to truly support developers in the long run and they don't have the engineers to innovate on the technology side.  In my view, having Radeon-based processors in the Xbox One and Playstation 4 (as well as the Wii U I guess) gives AMD a head start but won't win them the race for the hearts and minds of PC gamers. There is still a lot of work to be done for that.

Before this story broke I was planning on outlining another editorial on this subject and it looks like it just got promoted to a top priority.  There appear to be a lot of proverbial shoes left to drop in this battle, but it definitely needs more research and discussion. 

batmanaa.jpg

Remember the issues with Batman: Arkham Asylum?  I do.

I asked both NVIDIA and AMD for feedback on this story but only AMD has replied thus far.  Robert Hallock, PR manager for gaming and graphics in the Graphics Business Unit at AMD, sent me this:

It makes sense that game developers would focus on AMD hardware with AMD hardware being the backbone of the next console generation. At this time, though, our relationship with EA is exclusively focused on Battlefield 4 and its hardware optimizations for AMD CPUs, GPUs and APUs.

Not much there, but he is also not denying the original report from IGN.  It might just be too early for a more official statement.  I will update this story with information from NVIDIA if I hear anything else.

What do YOU think about this announcement though?  Is this good news for AMD and bad news for NVIDIA?  Is it good or bad for the gamer and, in particular, the PC gamer?  Your input will help guide our upcoming continued talks with NVIDIA and AMD on the subject.

UPDATE #2

Just so we all have some clarification on this and on the potential for validity of the rumor, this is where I sourced the story from this afternoon:

taylorquote.png

END UPDATE #2

Source: IGN

Razer Blade Haswell Gaming Notebook is Damn Sexy, Powerful

Subject: Mobile, Shows and Expos | June 12, 2013 - 08:47 PM |
Tagged: E3, razer, blade, haswell, gtx 765m, geforce

With the launch of Intel's Haswell processor, accessory maker-turned notebook vendor Razer announced a pretty slick machine, the Blade.  Based on a quad-core, 37 watt Core i7 Haswell CPU and a GeForce GTX 765M GPU, the Razer Blade packs a lot of punch.

razer1.jpg

It also includes 8GB of DDR3-1600 memory, an mSATA SSD and a 14-in 1600x900 display.  The design of the unit looks very similar to that of the MacBook Pro, but the black metal finish is really an attractive style change.

razer2.jpg

The embedded battery is fairly large at 70 Whr, and Razer claims this will equate to 6 hours of battery life under non-gaming workloads.  With a weight just barely creeping past 4 lbs, the Razer Blade seems to be both portable and powerful.

razer3.jpg

The price tag starts at $1799 so you won't be able to pick one of these up on the cheap, but for users like me that are willing to pay a bit more for performance and style in a slim chassis, the Blade seems like a very compelling option.  There are a lot of questions left to answer on this notebook including the thermal concerns of packing that much high frequency silicon into a thin and light form factor.  Does the unit get hot in bad places?  Can the screen quality match the performance of Haswell + Kepler? 

We are working with Razer to get a model in very soon to put it to the test, and I am looking forward to finding out whether we have found the best gaming portable on the market.

Author:
Manufacturer: NVIDIA

A necessary gesture

NVIDIA views the gaming landscape as a constantly shifting medium that starts with the PC, but the company also sees mobile gaming, cloud gaming and even console gaming as part of the overall ecosystem.  All of that is tied together by an investment in content – the game developers and game publishers that make the games we play on PCs, tablets, phones and consoles.

nv14.jpg

The slide above shows NVIDIA's target for each segment – except for consoles, obviously.  NVIDIA GRID will address the cloud gaming infrastructure, GeForce and the GeForce Experience will continue with PC systems, and NVIDIA SHIELD and the Tegra SoC will get the focus for the mobile and tablet spaces.  I find it interesting that NVIDIA has specifically called out Steam under the PC – maybe a hint of the future for the upcoming Steam Box?

The primary point of focus for today's press meeting was to talk about the commitment that NVIDIA has to the gaming world and to developers.  AMD has been talking up its 4-point attack on gaming, which really starts with its dominance in the console markets.  But NVIDIA has been the leader in the PC world for many years and doesn't see that changing.

nv02.jpg

With several global testing facilities, the most impressive of which exists in Russia, NVIDIA tests more games, more hardware and more settings combinations than you can possibly imagine.  They tune drivers and find optimal playing settings for more than 100 games that are now wrapped up into the GeForce Experience software.  They write tools for developers to find software bottlenecks and test for game streaming latency (with the upcoming SHIELD). They invest more in those areas than any other hardware vendor.

nv03.jpg

This is a list of technologies that NVIDIA claims they invented or developed – an impressive list that includes things like programmable shaders, GPU compute, Boost technology and more. 

nv04.jpg

Many of these turned out to be very important in the development and advancement of gaming – not for PCs but for ALL gaming. 

Continue reading our editorial on NVIDIA's stance on its future in PC gaming!!

Never mind the 780; here comes the GTX 770

Subject: Graphics Cards | May 30, 2013 - 02:55 PM |
Tagged: nvidia, kepler, gtx 770, gtx 680, GK104, geforce, MSI GTX660 HAWK

$400 is a tempting number, much less expensive than the $650 price tag on the GTX 780 and right in line with the existing GTX 670 as well as AMD's HD 7970.  You will probably not see many at that price, though; $450 is more likely, as there will be very few reference cards released and all manufacturers will be putting their own spins on the design of these cards, which brings the price in line with the GTX 680.  Performance-wise, these cards outpace the two current single-GPU flagship cards – not by enough to make it worth upgrading from a 7970 or 680, but certainly enough to attract owners of previous-generation cards.  [H]ard|OCP reviewed MSI's Lightning model, with dual fans, an overclock of 104 MHz on the base clock and 117 MHz on the boost clock, plus a completely unlocked BIOS for even more tweaking choices.

If you want to see how well it fares on our new Frame Rating metric you will have to read Ryan's full review here.

H770.jpg

"NVIDIA debuts the "new" GeForce GTX 770 today. The GeForce GTX 770 is poised to provide refreshed performance, for a surprising price. We evaluate a retail MSI GeForce GTX 770 Lightning flagship video card from MSI with specifications that will make any enthusiast smile. The $399 price point just got a kick in the pants."

Here are some more Graphics Card articles from around the web:

Graphics Cards

Source: [H]ard|OCP
Author:
Manufacturer: NVIDIA

GK104 gets cheaper and faster

A week ago today we posted our review of the GeForce GTX 780, NVIDIA's attempt to split the difference between the GTX 680 and the GTX Titan graphics cards in terms of performance and pricing.  Today NVIDIA launches the GeForce GTX 770 that, even though it has a fancy new name, is a card and a GPU that you are very familiar with.

arch01.png

The NVIDIA GK104 GPU Diagram

Based on GK104, the same GPU that powers the GTX 680 (released in March 2012), the GTX 670 and the GTX 690 (though in a pair), the new GeForce GTX 770 has very few changes from the previous models that are really worth noting.  NVIDIA has updated the GPU Boost technology to 2.0 (more granular, better controls in software), but the real changes come in the clock speeds.

specs2.png

The GTX 770 is still built around 4 GPCs and 8 SMXs for a grand total of 1536 CUDA cores, 128 texture units and 32 ROPs.  The clock speeds have increased from 1006 MHz base clock and 1058 MHz Boost up to 1046 MHz base and 1085 MHz Boost.  That is a pretty minor speed bump in reality, an increase of just 4% or so over the previous clock speeds. 

NVIDIA did bump up the GDDR5 memory speed considerably though, going from 6.0 Gbps to 7.0 Gbps, or 1750 MHz actual clock.  The memory bus width remains 256 bits wide, but the total memory bandwidth has jumped up to 224.3 GB/s.
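
For anyone who likes to see where that number comes from, the arithmetic is simple (this is my math, not an NVIDIA-supplied calculation): peak bandwidth is just the effective per-pin data rate multiplied by the bus width.

```python
# Quick sanity check on the GTX 770 bandwidth figure - my arithmetic, not NVIDIA's.

bus_width_bits = 256
data_rate_gbps = 7.0                  # effective per-pin GDDR5 data rate

bandwidth_gb_s = data_rate_gbps * bus_width_bits / 8
print(f"{bandwidth_gb_s:.1f} GB/s")   # 224.0 GB/s at a flat 7.0 Gbps

# The quoted 224.3 GB/s implies an effective data rate just over 7 Gbps
# (about 7.01 Gbps): 7.01 * 256 / 8 = 224.3 GB/s.
```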

Maybe the best change for PC gamers is the new starting MSRP for the GeForce GTX 770 at $399 – a full $50-60 less than the GTX 680 was selling for as of yesterday.  If you happened to pick up a GTX 680 recently, you are going to want to look into your return options, as this will surely annoy the crap out of you.

If you want more information on the architecture design of the GK104 GPU, check out our initial article from the chip's release last year.  Otherwise, with those few specification changes out of the way, let's move on to some interesting information.

The NVIDIA GeForce GTX 770 2GB Reference Card

Tired of this design yet?  If so, you'll want to look into some of the non-reference options I'll show you on the next page from other vendors, but I for one am still taken with the design of these cards.  You will find a handful of vendors offering up re-branded GTX 770 options at the outset of release but most will have their own SKUs to showcase.

IMG_9918.JPG

Continue reading our review of the NVIDIA GeForce GTX 770 graphics card!!

Trimming the TITAN; NVIDIA's GTX 780

Subject: Graphics Cards | May 24, 2013 - 06:10 PM |
Tagged: nvidia, gtx 780, gk110, geforce

With 768 more CUDA cores than the 680 but 384 fewer than the TITAN, the 780 offers improvements over the previous generation and will be available for about $350 less than the TITAN.  As you can see in [H]ard|OCP's testing, it does outperform the 680 and 7970, but not by a huge margin, which hurts the price-to-performance ratio and makes it more attractive for 680 owners to pick up a second card for SLI.  AMD owners with previous-generation cards and deep pockets might be tempted to pick up a pair of these cards, as they show very good frame rating results in Ryan's review.

H_780.jpg

"NVIDIA's new GeForce GTX 780 video card has finally been unveiled. We review the GTX 780 with real world gaming with the most intense 3D games, including Metro: Last Light. If the GTX TITAN had you excited but was a bit out of your price range, the GTX 780 should hold your excitement while being a lot less expensive."

Here are some more Graphics Card articles from around the web:

Graphics Cards

Source: [H]ard|OCP