Frame Rating: GTX 760M Results and NVIDIA Optimus Issues with Haswell

Author: Ryan Shrout
Manufacturer: NVIDIA

Bioshock Infinite

[Frame rating graphs for Bioshock Infinite: FRAPS FPS, Observed FPS, frame times, and FPS by percentile]

First, notice that there is definitely a difference between our FRAPS-based reported frame rates and our capture-based frame rates.  The blue line on the Observed FPS graph has several places where it dips below the FRAPS FPS graph, an indicator that we are seeing some runt or dropped frames (check the final 10 seconds or so).  Even though the average frame rate is drastically improved at over 110 FPS, the frame times graph shows some strange drops DOWN to nearly 0 ms.  Further investigation was needed.
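To make the runt detection idea concrete, here is a minimal Python sketch (my own illustration with made-up numbers and an assumed 1 ms runt threshold, not our actual capture analysis tools) that flags frames whose capture-measured display time is nearly zero:

    # Minimal sketch: flag "runt" frames from a list of capture-measured
    # frame display times in milliseconds. The 1.0 ms threshold is an
    # assumption for illustration, not the value the real tooling uses.
    def find_runts(frame_times_ms, runt_threshold_ms=1.0):
        return [i for i, t in enumerate(frame_times_ms) if t < runt_threshold_ms]

    # Example: a run of healthy ~9 ms frames with one near-zero runt.
    times = [9.1, 8.8, 9.3, 0.2, 9.0]
    print(find_runts(times))  # -> [3]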

Well, it appears that NVIDIA’s GeForce GTX 760M, running in tandem with the Intel Core i7-4702MQ processor, is exhibiting some frame delivery issues.  As a quick reminder, NVIDIA uses a technology called Optimus to seamlessly transition between the integrated and discrete graphics options on board.  For this to work, the frame buffer from the NVIDIA GTX 760M must be copied to the main memory of the system, where the frame buffer for the IGP resides.  This allows the images rendered by the GTX 760M to “pass through” the integrated processor graphics and reach the same output / display.  That is a simplification of the process, of course, but it gets at what I think is the problem here.
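To put rough numbers on that copy (a back-of-the-envelope estimate with assumed values, not figures from NVIDIA): a 1920x1080 frame at 32 bits per pixel is about 8.3 MB, so sustaining the 110+ FPS seen above means roughly 0.9 GB/s of extra traffic into system memory on top of the normal rendering workload.

    # Back-of-the-envelope estimate (assumed values, not measurements):
    # size of one frame Optimus must copy from the GTX 760M's memory into
    # system memory, and the copy bandwidth needed at a given frame rate.
    width, height = 1920, 1080   # assumed render resolution
    bytes_per_pixel = 4          # 32-bit color
    fps = 110                    # roughly the average frame rate seen above

    frame_bytes = width * height * bytes_per_pixel
    bandwidth_gb_per_s = frame_bytes * fps / 1e9

    print(f"Frame size: {frame_bytes / 1e6:.1f} MB")         # ~8.3 MB
    print(f"Copy bandwidth: {bandwidth_gb_per_s:.2f} GB/s")  # ~0.91 GB/s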

Let’s see another example, though, with DiRT 3.

July 10, 2013 | 03:16 PM - Posted by truk007

Does it matter what the game settings were for these tests?

July 10, 2013 | 04:53 PM - Posted by ervinshiznit (not verified)

How come the frame buffer has to be copied? Why can't the internal display have two inputs (one from IGP and one from Nvidia) and when you launch a game you switch the display to use the Nvidia output?

July 11, 2013 | 09:26 AM - Posted by Ryan Shrout

Just not how it works...

July 11, 2013 | 03:45 PM - Posted by ervinshiznit (not verified)

As in it's not technically feasible or they just didn't implement it that way?

July 10, 2013 | 09:57 PM - Posted by pdjblum

"Then, by sorting these frame times and plotting them in a percentile form we can get an interesting look at potential stutter."

Ryan,

No doubt I am an idiot, but I just do not get what the percentiles represent. I have read all the frame rating articles, so it is not for a lack of trying. Please explain so an idiot can understand. Thanks much.

My only guess is that it is the absolute number of results that fall in that percentile. Why is the most variance always in the largest percentile?

July 11, 2013 | 09:31 AM - Posted by Ryan Shrout

Percentiles tell you what fraction of the data points occur at or above a given result.

For example, on the Bioshock page, look at the FPS by Percentile graph.  The blue line representing the GE40 with the GTX 760M enabled plots frame rates / frame times by percentile.  So, look at the 80th percentile and look up: the blue line is hitting about 95 FPS.  That means the GTX 760M is running at LESS THAN 95 FPS (to the right) for 20% of the time.  And thus, 80% of the time, the frame rate is higher than 95 FPS (to the left).

That help?
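As a concrete illustration of how an FPS-by-percentile value can be read off, here is a minimal Python sketch (my own example with made-up frame times, not our actual analysis scripts): sort the per-frame FPS values from fastest to slowest, then index in at the chosen percentile.

    # Minimal sketch of reading an FPS-by-percentile value (made-up data).
    def fps_at_percentile(frame_times_ms, percentile):
        # Convert each frame time to an instantaneous FPS, sort best-first,
        # then index in: at the 80th percentile, roughly 80% of frames are
        # at least this fast.
        fps = sorted((1000.0 / t for t in frame_times_ms), reverse=True)
        index = min(int(len(fps) * percentile / 100.0), len(fps) - 1)
        return fps[index]

    # Made-up frame times: mostly fast frames with a few slower ones.
    times = [9.0] * 80 + [12.0] * 15 + [25.0] * 5
    print(round(fps_at_percentile(times, 80)))  # 83 -> 80% of frames run faster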

July 11, 2013 | 11:34 AM - Posted by pdjblum

Got it. Just before I read your explanation, I thought of it in terms of test scores. With those, of course, if you are in the 90th percentile, your score is in the top 10%; alternatively, your score is higher than 90% of the rest of the scores. Your explanation confirms this of course. I should have thought of this first. Thanks much and keep up the awesome work.

July 10, 2013 | 11:14 PM - Posted by praack

so if i am reading this report correctly - the software choice to run an SLI-like pairing of the laptop's integrated graphics and the NVIDIA discrete graphics card has now introduced frame rating issues like AMD has.

Most probably because the Intel CPU does not contain the hardware segment that prevents the issues when you SLI parts - though i am probably talking out of my butt - since no one's really thought out the issues of CrossFire X yet ...

but the scenario is the same as you would probably see on an AMD platform running CrossFire X - though that setup would be worse than the NVIDIA issue.

SO after all that talking - am i right or wrong - i really hate being wrong all the time.....

July 11, 2013 | 09:35 AM - Posted by Ryan Shrout

It's not quite the same problem that exists with AMD on the desktop side...not nearly as dramatic.

As for if AMD sees the same problem in notebooks...we are working on getting a platform to test!

July 11, 2013 | 05:28 AM - Posted by gamerk2 (not verified)

Memory bottleneck during the transfer from the discrete GPU's VRAM to main memory, perhaps? Possibly paging related as well (I note the HDD isn't listed; was an SSD used, or a traditional HDD?)

July 11, 2013 | 10:13 AM - Posted by Ryan Shrout

Possibly on the memory transfer speed, but definitely not paging related.

NVIDIA has told me a fix is in the works.

January 5, 2014 | 04:23 PM - Posted by Anonymous (not verified)

So NVIDIA acknowledged the issue and told you a fix was in the works back in July. Has this problem been resolved? I recently purchased a Sager NP8235 that runs a GTX 770M in Optimus configuration, and I believe I am seeing the same issues. I am running the latest drivers, so I do not believe a fix has been implemented. Could you talk to the people you know at NVIDIA again to get an update on this issue?

This issue is pretty huge, since pretty much every notebook sold with an NVIDIA GPU today runs an Optimus configuration. This essentially means that IF this issue is not actually fixable through a driver update, then all of the NVIDIA gaming laptops being sold today (and the majority that have been sold over the past couple of years) are useless!

July 11, 2013 | 09:59 AM - Posted by madhatter256 (not verified)

I hope the MSI has an option in the BIOS to just "turn off" the IGP so Windows won't detect it and the system runs entirely off the NVIDIA GPU, but that defeats the purpose of long battery life for portability....

Optimus still has its issues, but I'm sure it's more stable than it is on my laptop, which has a GT 540M; I've had to reinstall the OS a few times already because of Optimus not kicking on the GPU...

July 14, 2013 | 01:48 PM - Posted by Jon Pennington (not verified)

I think that testing 1080p gaming on a notebook is a mistake. I know that PCPer caters to gamers and DIY and gearhead types, but if you go to Newegg or Tiger Direct, 75% of the notebooks being sold (under $1200) have a resolution of 1366x768 or less. Anybody can attach an external monitor, but how many people actually do?

If the only difference between an A8 notebook and an i3 notebook was the framerate of BF3, I'd want to know what the framerate is like when I'm sitting in the commons at school, not plugged in at my lab at home.

July 15, 2013 | 05:14 PM - Posted by Jeff R. (not verified)

I just picked up a Razer blade with a similar setup (haswell/gtx765) and noticed an abnormal amount of tearing immediately. I hope they fix this, it's quite obvious.

July 16, 2013 | 04:25 PM - Posted by Mschlenker (not verified)

This is the same problem my EeePC 1215N has (Atom D525 + ION2 with Optimus). Many people reported tearing and stuttering on the laptop's LCD but no problems when connected to an external LCD over HDMI. The top theory was that the PCIe x1 interface between the ION and the Atom was the culprit, but that doesn't make sense if HDMI out is good. BTW, neither ASUS nor NVIDIA could really fix the problem, and it left many disappointed owners. I typically game on my GE40 connected to an external LCD, so I haven't had a problem yet (other than Steam's app forcing the NVIDIA GPU on all the time). But as of this post, neither MSI nor NVIDIA has a BIOS over 311 that supports the GTX 7XX series.

July 17, 2013 | 02:17 PM - Posted by Anonymous (not verified)

What settings were used for all of these tests?

August 23, 2013 | 09:25 AM - Posted by mokhtar (not verified)

I just bought a GE40. In the system properties it's shown that a GTX 760M is installed. However, in the display properties it's not shown, and all that's listed is the shared system memory.

Is this normal for these new VGAs?

Anyone please advise, thanks.
