Bioshock Infinite Results
Is it finally time for you to upgrade your Sandy Bridge gaming rig?
Our Intel Skylake launch coverage is intense! Make sure you hit up all the stories and videos that interest you!
- The Intel Core i7-6700K Review – Skylake First for Enthusiasts (Video)
- Skylake vs. Sandy Bridge: Discrete GPU Showdown (Video)
- ASUS Z170-A Motherboard Preview
- Intel Skylake / Z170 Rapid Storage Technology Tested – PCIe and SATA RAID
Today marks the release of Intel's newest CPU architecture, codenamed Skylake. I already posted my full review of the Core i7-6700K processor, so if you are looking for CPU performance and specification details on that part, you should start there. What we are looking at in this story is the answer to a very simple, but also very important, question:
Is it time for gamers using Sandy Bridge systems to finally bite the bullet and upgrade?
I think you'll find the answer depends on a few things, including your gaming resolution and your appetite for multi-GPU configurations, but even I was surprised by the differences I saw in testing.
Our testing scenario was quite simple: compare the gaming performance of an Intel Core i7-6700K processor and Z170 motherboard, running both a single GTX 980 and a pair of GTX 980s in SLI, against an Intel Core i7-2600K and Z68 motherboard using the same GPUs. I installed both the latest NVIDIA GeForce drivers and the latest Intel system drivers for each platform.
| | Skylake System | Sandy Bridge System |
|---|---|---|
| Processor | Intel Core i7-6700K | Intel Core i7-2600K |
| Motherboard | ASUS Z170-Deluxe | Gigabyte Z68-UD3H B3 |
| Memory | 16GB DDR4-2133 | 8GB DDR3-1600 |
| Graphics Card | 1x GeForce GTX 980 / 2x GeForce GTX 980 (SLI) | 1x GeForce GTX 980 / 2x GeForce GTX 980 (SLI) |
| OS | Windows 8.1 | Windows 8.1 |
Our testing methodology follows our Frame Rating system, which uses a capture-based approach to measure frame times at the screen (rather than trusting the software's interpretation).
If you aren't familiar with it, you should probably do a little research into our testing methodology, as it is quite different from others you may see online. Rather than using FRAPS to measure frame rates or frame times, we use a secondary PC to capture the output of the tested graphics card directly, then use post-processing on the resulting video to determine frame rates, frame times, frame variance and much more.
This amount of data can be pretty confusing if you are attempting to read it without the proper background, but I strongly believe that the results we present paint a much more thorough picture of performance than other options. So please, read up on the full discussion about our Frame Rating methods before moving forward!
While there are literally dozens of files created for each "run" of benchmarks, there are several resulting graphs that FCAT produces, as well as several more that we generate with additional code of our own.
If you need some more background on how we evaluate gaming performance on PCs, just check out my most recent GPU review for a full breakdown.
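For readers who want a more concrete picture of what that post-processing step produces, here is a minimal sketch in Python of how per-frame display timestamps from a capture could be turned into the frame time, average FPS, and frame variance figures we talk about below. The file name and column layout here are hypothetical stand-ins, not the actual output of our FCAT-based tooling.

```python
# Minimal sketch: turning per-frame display timestamps from a capture into
# frame times, average FPS, and a rough frame-to-frame variance metric.
# "capture_run1.csv" and the "timestamp_ms" column are hypothetical names,
# not the actual files our FCAT-based pipeline produces.
import csv

def load_timestamps(path):
    """Read one display timestamp (in milliseconds) per frame from a CSV."""
    with open(path, newline="") as f:
        return [float(row["timestamp_ms"]) for row in csv.DictReader(f)]

def frame_times(timestamps_ms):
    """Frame time = gap between consecutive frames arriving at the screen."""
    return [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]

def average_fps(times_ms):
    """Average FPS over the run, derived from the mean frame time."""
    return 1000.0 / (sum(times_ms) / len(times_ms))

def frame_variance(times_ms):
    """Mean absolute change between consecutive frame times -- a simple
    proxy for the frame-to-frame consistency we discuss in the graphs."""
    deltas = [abs(b - a) for a, b in zip(times_ms, times_ms[1:])]
    return sum(deltas) / len(deltas)

if __name__ == "__main__":
    ts = load_timestamps("capture_run1.csv")
    ft = frame_times(ts)
    print(f"Average FPS: {average_fps(ft):.1f}")
    print(f"Frame-to-frame variance: {frame_variance(ft):.2f} ms")
```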
I only had time to test four different PC titles:
- Bioshock Infinite
- Grand Theft Auto V
- GRID 2
- Metro: Last Light
Bioshock Infinite (DirectX 11)
BioShock Infinite is a first-person shooter like you’ve never seen. Just ask the judges from E3 2011, where the Irrational Games title won over 85 editorial awards, including the Game Critics Awards’ Best of Show. Set in 1912, players assume the role of former Pinkerton agent Booker DeWitt, sent to the flying city of Columbia on a rescue mission. His target? Elizabeth, imprisoned since childhood. During their daring escape, Booker and Elizabeth form a powerful bond -- one that lets Booker augment his own abilities with her world-altering control over the environment. Together, they fight from high-speed Sky-Lines, in the streets and houses of Columbia, on giant zeppelins, and in the clouds, all while learning to harness an expanding arsenal of weapons and abilities, and immersing players in a story that is not only steeped in profound thrills and surprises, but also invests its characters with what Game Informer called “An amazing experience from beginning to end."
Our Settings for Bioshock Infinite
The GTX 980 on the Core i7-6700K is 8% faster than it is on the 2600K, and you can also see some differences in frame time consistency. Look at the FPS by Percentile graph, where the orange line representing Sandy Bridge tails off sooner than the black line representing Skylake.
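If that graph type is new to you, the idea behind an FPS-by-percentile curve is straightforward: sort every observed frame time for the run, then convert the frame time at each percentile into an instantaneous frame rate, so the right-hand side of the curve shows how much the slowest frames drag performance down. Here is a short illustrative sketch, using made-up frame times rather than any of the data behind these graphs:

```python
# Illustration of an FPS-by-percentile curve: sort the observed frame times
# and report the instantaneous FPS at each percentile. The example frame
# times below are invented for demonstration only.
def fps_by_percentile(frame_times_ms, percentiles=range(0, 101, 10)):
    ordered = sorted(frame_times_ms)                # fastest frames first
    curve = []
    for p in percentiles:
        idx = min(len(ordered) - 1, int(p / 100.0 * len(ordered)))
        curve.append((p, 1000.0 / ordered[idx]))    # frame time -> FPS
    return curve

example = [8.3, 8.5, 9.1, 8.4, 12.0, 8.6, 9.0, 16.7, 8.8, 9.4]
for pct, fps in fps_by_percentile(example):
    print(f"{pct:3d}th percentile: {fps:5.1f} FPS")
```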
At 2560x1440, all of this deviation between Skylake and Sandy Bridge is essentially gone, thanks to the added load put on the GPU by the additional pixels. This isn't entirely surprising, but it does hint that gamers with older systems who are running higher resolution screens may not need to upgrade just yet.
GTX 980 SLI Results - 2560x1440
Well, this is interesting: as we dive into the world of SLI and the hiccups of multi-GPU, even at 2560x1440, things start to look better for Skylake. In this case the Core i7-6700K has about a 10% higher average frame rate than Sandy Bridge, along with much tighter frame time consistency.
Let's see how the other games in our testing fare.
I’d love to see this test include a Q6600 as well, for more perspective.
It’s going to be 10 years old soon. Comparing it to the newer generation would be neat.
I appreciate the testing very much but at the same time I strongly disagree with your conclusion.
Do you recommend people spend $600 for a new graphics card that delivers somewhere between 7% and 25% improvement depending on scenario?
Because that’s what you’re recommending for the motherboard, CPU and RAM combination.
7-25% isn’t huge? What do you want??? Geez. 50%? 100%? Ridiculous expectations out of people. This chip is great.
7-25% is pathetic for that price.
Do you buy a new $600 graphics card for that kind of pitiful improvement?
I have a feeling that the difference wouldn't be nearly as large if both were clocked the same, and that things would be far more GPU bound if they were both clocked north of 4.5GHz.
This article is basically useless for the intended audience.
I would guess that most PCPers who are still using their SB chips are doing so because they are great OCers, with most people getting around 4.5GHz.
To not factor that into the analysis of an article whose subtitle is “Is it time for gamers using Sandy Bridge system to finally bite the bullet and upgrade?” is a pretty glaring oversight, and turns what could have been an extremely helpful and useful article into a curiosity.
My 3930K @ 4.6 still spoils me with insane FPS. Upgrading from this chip is a really bad idea right now. Just went to 980ti SLI @ 1440p/144hz and it's so fast it just spoils me really. If Skylake-E manages to impress, maybe I'll consider upgrading to that just for the lulz.
Looks like I'll have to scratch my upgrade itch next gen.
I'm running a 2600k@4.2 with a gtx980 at 4k and it runs fine for me. But if I win the contest for that Gigabyte rig….GOODBYE SANDY BRIDGE!!!!
First off, I'm running an i5 2500k at 4.5 GHz and an AMD R9 280. If you are playing the games you want to play without any problems, smooth gameplay, etc., why would you even consider upgrading any part of your computer? What is the real world difference between a stable 60fps and 120fps visually? No difference! Stop being ridiculous in recommending upgrades that don't benefit the vast majority of gamers operating at 1080p, because they won't see a difference worth a $600 price tag. As someone else mentioned, it looks like you are being paid by Intel to make these suggestions and comparisons when it has very little impact on 1080p gamers. I won't be upgrading my processor anytime soon. The video card will be my next upgrade, when my gameplay gets closer to 30fps with new games, which I estimate will be at least 2 years from now according to the trends. DX12 improvements might even push that upgrade further into the future. So, let's stick to real world comparisons in the future!
You are high, there is a huge difference between 60 and 120 fps 😛
I’ll keep my 4GHz 2500k with 16GB of DDR3 thanks.
Running both chips at the same clock speed and with the same amount of RAM would have shown the true generational improvement. Pretty disappointing that this was not done.
I definitely don't think the 5% improvement is worth the cost of upgrading to the Skylake platform and DDR4 though.
The test wasn't fair. Memory matters in CPU limited benchmarks and the CPUs were not OC'd. Who the hell gets a 6700k or a 2700k and does not OC? That is sort of the point of the K series.
Yes, I have to agree with everyone here who has stated that the benchmarks/tests were not performed at overclocked speeds or with the same amount of memory. Just to show you the difference, the percentage increase with my 2600k going from stock (your apples-to-apples Skylake comparison) shows a dramatic improvement in my SiSoft Dhrystone and Whetstone scores at 4750 MHz, a 35.71% faster clock speed: Dhrystone went from 116.86 @ 3500 MHz to 186.59 @ 4750 MHz, an amazing 59.6% increase, while Whetstone went from 73.44 @ 3500 MHz to 97.41 @ 4750 MHz, a 32.64% increase, which is closer to the clock speed increase than the Dhrystone result. I cannot account for that 59.6% figure, since I was expecting it to be close to the 35.71% clock speed change, so I would be grateful to anyone who can explain the difference. Again, this is just the percentage increase going from 3.5 GHz to 4.75 GHz on the 2600k.
Also, I am repeating what everyone else said: no one buys a 2500k/2600k to run it at stock speeds. Heck, if you give me that Skylake platform I will give you my 5.1+ GHz capable 2600k, which I keep cool with a nice and simple but great performing Cooler Master Nepton 140XL with push-pull fans at 40% moving air through its 38mm thick radiator, which is silent for the most part. It out-cools most 240mm (not 280mm) radiators with its powerful pump and very large copper cold plate that has more microfins than any DIY water cooling CPU block, according to FrostyTech, I believe.
I have to give you a huge thumbs up for adding SLI to the test, even though you did not overclock the 5-year-old king of mainstream CPUs, good ole Sandy Bridge, for the tests. The Sandy 2600k is the best CPU I have ever purchased and I do not think I will ever again get the payback that CPU has given me, and is still giving me, on a 24/7 daily basis.
Another fantastic thing about Skylake is that they removed the idiotic on-die VRMs and they are back on the motherboard, nice and big. I feel Haswell's on-die VRMs have caused more CPU failures and chip degradation problems; I rarely if ever heard about those issues until Intel put those tiny VRMs in the CPU die itself, where they are too small to put many volts through and they add heat to the die. Luckily Skylake has them back on the motherboard where they can be cooled correctly, and it can even help you choose a motherboard: if you have two motherboards that have everything you need and need something to make up your mind, pick the one with the most VRMs, and thus the best power delivery system for the CPU, leading to less vDroop, etc.
I am not going to do any more SiSoft tests, because it is making me want to clock my 2600k to 5100 MHz, clock my SLI'ed EVGA GTX 770 Classified cards to 1400 MHz cores and 8000 memory clocks, and do some benchmarking, which I do not really need since I am using an LG 34UM95 34″ 21:9 3440x1440 IPS monitor with 8-bit to 10-bit color by dithering and a 60 Hz cap (tried overclocking the panel with no luck). So I do not use Vsync, but I do set a frame rate target of 67fps with EVGA's Precision software (it saves me a lot of unneeded power use and keeps the cards cool, since they are not pushing out every frame they can possibly put out, 120+fps, all the time) and I get no tearing or stuttering and every game is buttery smooth. Yes, I did do a 30 minute test with the G-Sync enabled 34″ 3440x1440 Predator monitor, but nothing I had time to play ran under 60fps, and my current rig runs everything above 60fps, with my main game right now being War Thunder Ground Forces, which has fantastic graphics and gameplay…blows World of Tanks outta the water, plus if you're into it you can fly planes also. It includes
I listen to your podcast pretty frequently and like you guys BUT, yep, pretty obvious you guys got paid for this one. Shame on you!!!
You got us, we each received a portion of this island chain in Dubai.
I think this test isn't exactly even. First off, you didn't test the performance of these CPUs in their overclocked state. As these are K processors, chances are the people who would want these comparisons will be overclocking them, so a base vs base comparison is already biased since they have vastly different base clocks.
I can speak from experience that most 2600k chips can reach 4.5GHz, and of those most can reach 4.7GHz with decent cooling. I mention this because the higher the CPU speed goes, the less likely the CPU will be to bottleneck the GPUs.
The next thing I noticed in your test is the difference in RAM: 8GB vs 16GB. While some might not think this matters, I have seen some of these games eat up over 8GB easily when run at 1440p or higher resolutions.
So my point here is your test beds were not as similar as possible. You might not be able to use the same RAM or speed of RAM, but you should have at least matched the amount of RAM on both machines.
I think if you were to do the two things I suggest you would see different results. There might still be an advantage to Skylake, but I think the gap will be much more minimal. 1-4% would be my guess.