The Haswell Review - Intel Core i7-4770K Performance and Architecture

Author: Ryan Shrout
Subject: Processors
Manufacturer: Intel

Z87 Chipset Changes, New Socket, Core i7-4770K Sample

As with most new processor releases, Intel has prepared a new chipset to support Haswell and its desktop debut.  The flagship part will be the Z87, and every motherboard vendor you can think of has a dozen or more of them waiting to sell you.

Realistically, not much changes with the move from the Z77 to the Z87 platform, as you might expect for a processor that doesn't change dramatically from Ivy Bridge to Haswell.  You still have 16 lanes of PCI Express 3.0 coming from the processor, and the chipset is still connected over the same DMI 2.0 bus.  The chipset now supports six full SATA 6G connections and six USB 3.0 ports as well.

This table provided by Intel shows off the changes to the Z87 chipset, and there isn't much to them.  No more PCI!  That will either be a total mind-melt for you or you won't even notice; either way, it had to happen sooner or later.  (Oh, but don't worry, boards can still include third-party PCI controllers.)

The Z87 chipset does use a new socket, LGA1150, so don't plan on trying to smash a Haswell processor into your Z77 platform or an Ivy Bridge part into the new Z87 options.  Other than the pin count, though, the socket works in exactly the same fashion.

For our testing we got in a TON of motherboards (with which our very own Morry Teitelman is currently inundated), but I decided to use the Intel DZ87KLT-75K for our initial testing.  It wasn't just a matter of convenience; it also turned out to be a rock solid and fun motherboard to use - too bad it will be the last from the desktop board team at Intel...

There she is, the 1.4 billion transistor baby that is the Core i7-4770K Haswell processor.  Housed in the LGA1150 socket and clamped down, it really looks no different from the Sandy Bridge and Ivy Bridge parts we have lying around here - which is why a Sharpie is always on hand for identification purposes.

As listed on the previous page, the specifications are very familiar to enthusiasts:

  • 3.5 GHz base clock
  • 3.9 GHz Max Turbo
  • 8MB L3 Cache
  • Dual channel DDR3-1333/1600 support
  • Intel HD Graphics 4600 - up to 1250 MHz
  • Fully unlocked multipliers
  • $339

Speaking of those multipliers...

June 1, 2013 | 10:31 AM - Posted by Anonymous (not verified)

Thanks for the review, I just wish you'd use 3930K instead of 3970X.

June 1, 2013 | 10:41 AM - Posted by Anonymous (not verified)

Are there any OpenCL benchmarks forthcoming, and are there any gaming engines that will be able to utilize Haswell GPGPU + CPU cores for gaming physics while simultaneously using a discrete GPU for gaming graphics? Also, are any Lucid GPU virtualization software benchmarks going to be available for Haswell within the next few months? For desktop gaming, Haswell CPUs are always going to be paired with a discrete GPU, and being able to utilize the Haswell GPU for extra gaming compute would be a great boost, short of a 6 core Haswell appearing for the desktop!

June 1, 2013 | 11:15 AM - Posted by djGrrr

Does anyone else see the problem with having 6 SATA3 ports?
They have not changed the 20Gbit DMI 2.0 connection between the CPU and chipset, so the performance of all these ports, if actually being utilized, is going to be crap. How can you expect to get anywhere close to the 36Gbit that the SATA3 ports should offer? And that's before you even take into account the other IO, such as the extra SATA3 ports that some boards offer from add-on controllers, which likely use some of the PCI Express lanes from the chipset. It's all going to be incredibly bottlenecked by the DMI 2.0 20Gbit bus connecting the CPU to the chipset.
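
For reference, here is a quick back-of-the-envelope sketch of the numbers in that comment (raw link rates only; both figures ignore 8b/10b encoding and protocol overhead, so real throughput is lower on both sides):

    # Rough comparison of aggregate SATA 6G link rate vs. the DMI 2.0 uplink.
    sata_ports = 6
    sata_link_gbps = 6.0   # SATA 6G raw link rate per port
    dmi_gbps = 20.0        # DMI 2.0: 4 lanes x 5 GT/s

    aggregate_sata = sata_ports * sata_link_gbps
    print(f"Aggregate SATA link rate: {aggregate_sata:.0f} Gbit/s")       # 36 Gbit/s
    print(f"DMI 2.0 uplink:           {dmi_gbps:.0f} Gbit/s")             # 20 Gbit/s
    print(f"Oversubscription factor:  {aggregate_sata / dmi_gbps:.1f}x")  # 1.8x

Even as a loose upper bound, six fully loaded SATA 6G ports could ask for almost twice what the DMI 2.0 uplink can carry, though few real workloads hit all six ports at full rate at once.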

June 1, 2013 | 11:20 AM - Posted by Robogeoff (not verified)

Does Intel no longer have anything to offer desktop enthusiasts? I've been reading the reviews for each generation of the i7 since my 920, and I still haven't seen a compelling reason to upgrade.

That's 4 generations of "evolution" that have yielded such small improvements in performance. "Tick-tock" is misleading, as it really feels like "tick-tock-tock-tock-tock..."

June 14, 2013 | 03:05 AM - Posted by Panta

I'm exactly in the same situation; so pleased
with my 920 OC & load temps I don't see any reason to upgrade.

In fact I will be waiting for X89, hoping it will be at least as good as the X58 & the i7 920 CPU!

June 1, 2013 | 11:25 AM - Posted by snowbound999

Closing thoughts page regarding power consumption has ("remember, they are different sockets not)". Aren't you missing something after "not"?

June 1, 2013 | 12:12 PM - Posted by Anonymous (not verified)

What about overclocking the locked parts? Do you still have access to the 5x turbo increase at least?

June 1, 2013 | 12:18 PM - Posted by Anonymous (not verified)

so is a bulldozer i think the next gen will be better

June 1, 2013 | 12:30 PM - Posted by AMD64 (not verified)

@Jml: how about that, Mr. Jml? Intel Haswell sucks and you suck as well!

June 1, 2013 | 12:44 PM - Posted by windwalker

Yawn, what a pathetic showing from Intel.
What was the point of that cringe-worthy denial of stagnation next to an admission of a 5% improvement?
Isn't it high time to face the music when the efforts of thousands of brilliant and highly educated people and billions in expenses yield a 5% improvement?

June 1, 2013 | 01:35 PM - Posted by GPU: Support for fp64? (not verified)

Hi guys. Thanks for the wonderful review.

1. Do you know if any of the GPU SKUs supports FP64, particularly under OpenCL?

2. Is it possible for you to post the OpenCL extensions supported on the HD 4600? You can use a utility like "GPU Caps Viewer" from Geeks3D.

June 1, 2013 | 01:40 PM - Posted by Rahul (not verified)

For GPU Caps Viewer, go to the OpenCL tab, select the GPU device, then go to "More OpenCL information". That will display the exact list of OpenCL extensions supported. Your help will be greatly appreciated :)
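
For anyone who would rather query this programmatically than read it out of GPU Caps Viewer, a minimal sketch using pyopencl (assuming the Intel OpenCL runtime and the pyopencl package are installed) looks something like this:

    # List every OpenCL device, dump its extension string, and flag FP64 support.
    import pyopencl as cl

    for platform in cl.get_platforms():
        for dev in platform.get_devices():
            exts = dev.extensions.split()
            print(f"{platform.name} / {dev.name}")
            print("  cl_khr_fp64:", "yes" if "cl_khr_fp64" in exts else "no")
            for ext in sorted(exts):
                print("   ", ext)

This reads the same CL_DEVICE_EXTENSIONS string that GPU Caps Viewer shows under "More OpenCL information".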

June 3, 2013 | 03:20 PM - Posted by Adrian (not verified)

No, just like Ivy, the GPU does not have OpenCL Khronos ARB FP64 certification. Nor has Intel provided a custom extension like AMD.

It does support FP64 under DirectX ComputeShader.

So it does support FP64 but not precise enough for OpenCL.

June 1, 2013 | 01:37 PM - Posted by Rahul (not verified)

Also, I'm wondering about TSX support. Has Intel posted a list of which SKUs support the new transactional extensions (TSX)?
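
One way to check TSX support on a specific chip, rather than waiting on an official SKU list, is to look for the 'hle' and 'rtm' CPUID feature flags; on Linux they show up in /proc/cpuinfo. A minimal, Linux-only sketch:

    # Report whether the Haswell TSX feature flags (HLE and RTM) are present.
    def cpu_flags(path="/proc/cpuinfo"):
        with open(path) as f:
            for line in f:
                if line.startswith("flags"):
                    return set(line.split(":", 1)[1].split())
        return set()

    flags = cpu_flags()
    print("HLE:", "hle" in flags)
    print("RTM:", "rtm" in flags)

As a later comment here notes, the retail Core i7-4770K does not expose TSX; it shipped on certain non-K Haswell SKUs instead.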

June 1, 2013 | 02:09 PM - Posted by Anonymous (not verified)

Appreciate the time to write up the review, Ryan; it's just a shame Intel is teasing the desktop market with empty promises and a pointless iGPU that nobody cares about. I have yet to meet someone buying an i5 or i7 for their desktop who screams, "Oh man, it's got this kick ass HD 4000 iGPU, man!"

AMD may be weak in the market, but at least they don't waste their time and effort creating an all-in-one chip with half the die being wasted adding unnecessary heat. Intel could start pushing 6 core chips into the top i5/i7 parts instead and use that extra space to push 8 core Extreme parts, but they don't.

Do. Not. Under. Stand. Intel.

June 1, 2013 | 03:15 PM - Posted by Anonymous (not verified)

It is so true: Intel's integrated GPU IP will not, for the foreseeable future, be able to keep up with AMD's offerings, as all AMD would have to do is increase the integrated GPU execution resources of its current technology to easily overcome any Haswell gains! AMD's next generation hUMA APUs will leave Intel's marketing spin pros with the hard task of putting that much more lipstick on an overpriced integrated GPU pig! It is no wonder Intel marketing had to come up with the ultrabook form factor to get their Ivy Bridge HD 4000 and Haswell GT3 Crystalwell integrated graphics into products other than Apple laptops. Yes, let's build a form factor so thin that the only way to meet the thermal budget is to use Intel's CPU/(anemic)GPU product. AMD will upstage Intel on this front at a much lower cost! I am just fine with a regular form factor laptop and a discrete GPU, and would be better served if I could get more CPU cores, as opposed to an overpriced Ultrabook with an overpriced CPU/GPU!

June 1, 2013 | 03:12 PM - Posted by Anonymous (not verified)

Looking at how AMD leads in price/performance (even the pretty old $100 A10-5800 is a better value than the $350 i7-4770!!!)...

Now we know why Intel CEO Otellini planned to officially jump ship on May 31, 2013: the Haswell benchmarks would show what a terrible investment it was, billions of dollars with little to show for it.

June 1, 2013 | 03:47 PM - Posted by Anonymous (not verified)

Let's not forget Intel's poor graphics driver record, or the terrible OEM-customized Intel HD graphics driver update issues from Intel's OEM partners! Paul (Chip Pimp) Otellini is gone after pulling that golden rip cord and bailing out! Intel, like M$, has had too much market share for too long, and this PC/laptop user has had enough of this WINTEL madness! I will stay with my Sandy Bridge and W7 laptop, and look to AMD's HSA offerings and Linux! An Ultrabook without a discrete GPU is an Ultra Joke!

June 1, 2013 | 03:31 PM - Posted by JwolfTech (not verified)

Great overall review. Detailed to say the least. Till there are 8 core Intel processors I don't see a need to upgrade from a 3930K for years.

The Core i5 unlocked version should be interesting based on the price point. That's what most will be looking at.

June 1, 2013 | 07:09 PM - Posted by Anonymous (not verified)

Hey, where are your FCAT results there, buddy? Intel pay you enough to omit them? Pathetic.

June 1, 2013 | 07:52 PM - Posted by Anonymous (not verified)

You're apparently too stupid to read. It says in the article they'll be testing the graphics later.

I'm personally interested to see how the GT3e GPUs do compared to mid-range Nvidia GPUs.

June 1, 2013 | 11:16 PM - Posted by raxx (not verified)

Why wasn't an overclocked i7-3770K (4.5 GHz) included in the benchmarks? It would be nice to see how the chips compare at that level.

June 2, 2013 | 01:40 AM - Posted by Anonymous (not verified)

Would like to see an article looking at power consumption compared to a i7 920. There's a few of us out there with the good old 920 overclocked to 4GHz+ burning up a heap of power. I'm wondering if it's worth the upgrade to Haswell to reduce power consumption and see how long it'll take to pay off the upgrade.
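
Pending a proper write-up, a rough payback estimate is easy to run yourself; every number in the sketch below is an assumption to be swapped for your own measurements, usage, and electricity rate:

    # Rough upgrade-payback estimate from power savings alone (all inputs assumed).
    watts_saved = 80          # assumed average system-level power delta, W
    hours_per_day = 8         # assumed daily usage
    price_per_kwh = 0.15      # assumed electricity price, $/kWh
    upgrade_cost = 600.0      # assumed CPU + board + RAM cost, $

    kwh_per_year = watts_saved / 1000 * hours_per_day * 365
    savings_per_year = kwh_per_year * price_per_kwh
    print(f"Energy saved:  {kwh_per_year:.0f} kWh/year")                  # ~234 kWh
    print(f"Money saved:   ${savings_per_year:.2f}/year")                 # ~$35
    print(f"Payback time:  {upgrade_cost / savings_per_year:.1f} years")  # ~17 years

With those placeholder numbers the electricity savings alone take well over a decade to cover the upgrade, which is why the performance gain matters far more than the power bill.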

June 2, 2013 | 03:21 AM - Posted by oscarbg (not verified)

Please post the report from the glewinfo executable.
It should show experimental OpenGL extensions not reported in GPU Caps Viewer, and I suspect most of the OGL 4.2 entry points should be exposed.

June 2, 2013 | 06:11 AM - Posted by Tri Wahyudianto (not verified)

I hope that very soon Intel gives us another processor option without Intel integrated graphics, at a cut-down price.

That's because their IGP is so useless, but still, AMD's Bulldozer and APUs are such a bottleneck when used with a discrete GPU, whether AMD or Nvidia.

June 2, 2013 | 06:33 AM - Posted by Anonymous (not verified)

Hmmmm... Seems Haswell is Intel's "bulldozer" fiasco. Haswell should be called Failwell... or Hasbeen.

June 2, 2013 | 07:00 AM - Posted by BiggieShady

In every graph 3570K is named i7 instead of i5. Damn you copy paste :)

June 2, 2013 | 09:04 AM - Posted by mAxius

So as expected it's a big MEH.

June 2, 2013 | 01:52 PM - Posted by boothman

Micro Center has the i7-4770K for $279, $70 cheaper than Newegg.

June 2, 2013 | 06:36 PM - Posted by Anonymous (not verified)

Even better. They dropped the price on the 3770K by $130 to $229.

WTF Newegg/resellers. Have you really been gouging people this long?

June 3, 2013 | 11:31 PM - Posted by Anonymous (not verified)

It's been that way for a long time if you have a local Micro Center to pick one up at; otherwise it's still ~$320 at most etailers.

June 5, 2013 | 04:11 PM - Posted by Anonymous (not verified)

They are not making any money at that price. DUH

June 2, 2013 | 07:58 PM - Posted by Anonymous (not verified)

I'm going Haswell when Blue hits.
Also, I saw on other tech sites that the 4770 is not the top performing Haswell chip that will be released.
Will there be others with more GPU horsepower?

June 2, 2013 | 07:59 PM - Posted by Anonymous (not verified)

I presently have a C2Q 9550 with 12 MB L2 cache... would I get much benefit?

June 3, 2013 | 02:28 AM - Posted by Anonymous (not verified)

Stepping up from a C2Q 9550 (same chip I have now) to just an i7 920 would be a huge leap, let alone SB being another sizable jump. With the 5% from both Haswell and IB, I think it's safe to say you will see a major performance boost, even with a 1 GHz OC on the chip you have now.

I haven't gone out to upgrade myself because I was a believer in the Haswell empty promises that wasted my time, but I work with machines that are SB i5's and they are smoking smooth, quiet, cool, and fast.

June 3, 2013 | 02:26 AM - Posted by Anonymous (not verified)

I've only heard of a lower TDP 65W model that has the eDRAM onboard (flagship iGPU) that is supposed to be comparable to the i7 4770K, but I really don't see how that is possible.

Anyways, I wouldn't call anything with more "GPU" power a top performer in the 4770K lineup, because to be quite honest, nobody buying those chips is looking for the integrated GPU component. They'd probably sell better if they took that space and replaced it with 2 extra cores. People would have far less to bitch about and you'd see performance gains that would give Intel another 4 years of this 5% performance boost before people start bitching about monopoly.

June 2, 2013 | 08:58 PM - Posted by Anonymous (not verified)

AMD could name their chip SuperDuperIntelKiller and it still wouldn't be close.

June 3, 2013 | 02:33 AM - Posted by lol (not verified)

Cinebench 11.5, multithreaded:
Haswell -> 7.68

Here is my conclusion:
Ivy should be 2x faster than SB.
Haswell should be 4x faster than SB and 2x Ivy, not a 10% ><

Intel thinks we're all idiots or what?
It's all AMD's fault for not putting enough pressure.

June 3, 2013 | 04:59 AM - Posted by rrr (not verified)

LMAO LMAO LMAO, somebody is seriously underestimating how hard it is to double a chip's performance every year without adding more execution units.

June 3, 2013 | 03:28 PM - Posted by D1RTYD1Z619

Looks like my 2600k will live in my system for another few years. "YAY!" - MY WALLET

June 6, 2013 | 04:57 PM - Posted by PhoneyVirus

Not to sound like a dick, but the first page was just a waste of time. I'm not a design engineer; now if I had access to the equipment I would be more than glad to study the architecture.

I do know what you're talking about, but a newbie or first-timer wouldn't have a clue, because you're throwing out words with no meaning and no diagrams showing where things come in, where they go out, and what they're connected to. Long story short, I got bored very fast and just wanted to skip the first page altogether, but didn't.

In the future don't throw up shit like this unless you have some sort of diagram to follow. Tom's Hardware doesn't do this and neither does HardOCP; keep it simple but still enlightening, walking the reader through slowly rather than slideshow screenshots from IDF.

Second page, well, let's just say I didn't pay for a $400 graphics card to be reading about Intel's GT2 architecture and mobile crap. Then again, some people are probably interested in this stuff, but I doubt many readers of this website are.

Thanks for the Overkill Review PCPer.

June 6, 2013 | 04:52 PM - Posted by PhoneyVirus

Also, as a follow-up to the first page: there are NO Transactional Synchronization Extensions (TSX) in the Core i7-4770K processor.

June 10, 2013 | 02:07 AM - Posted by chefbenito (not verified)

Wow. Sort of cool, but barely evolutionary and nothing crazy new. So glad I bought a beefy 2600k and a sick GB Z68. I knew the rumors around Haswell were too good to be true. The bottom line of this review should be: "If you are a PC gamer with a fast GPU and an i7, ignore Haswell altogether." Honestly, did we hit a wall? Is 5 GHz on 8 cores good enough for anything? I will wait (probably for a long time) for the CPU that starts to crush my 2600k in gaming FPS. Glad to see my investment still giving me returns despite several new CPU releases.

Truth be told: Sandy Bridge was the big leap in gaming CPUs. Everything since then has been extremely underwhelming and incremental. Great review as always guys.

September 9, 2013 | 10:07 PM - Posted by trancefreak (not verified)

Yup, totally agree with you. I have had my 2600k for almost 2 1/2 years, and it will be 3 years come March 2014.

It overclocks like a beast, and although I moved away from P67 boards to a Z77, it is still rocking without worry.

Not to sound contradictory, but I am going to give my 2600k to my son and keep it in the family. I ordered a 4770k and a Z87 board and that will be it until there is a huge jump in microprocessors.

June 15, 2013 | 12:07 PM - Posted by Anonymous (not verified)

Idle power consumption was higher than the 3770's, most likely due to the FIVR.

June 18, 2013 | 04:53 PM - Posted by 3dfxrain (not verified)

The marketplace and people take care of themselves and others.

September 10, 2013 | 01:30 PM - Posted by Anonymous (not verified)

I have a new rig: an Asus Maximus VI Extreme board and a Core i7-4770K, but it wouldn't give me any display via HDMI. Please help!! The only way I am able to use my new desktop is that I have temporarily installed an HD 7770 and am using its HDMI output for display.

Thanks in advance. My retailer told me that since it's a K processor you need a graphics card for display!!?!

September 27, 2013 | 06:29 PM - Posted by drbaltazar (not verified)

@Ryan! Could you adjust the message signaled interrupt (MSI) value to one per core per device in future testing (if you aren't already), specifically for CPUs with a GPU on board? Drivers are limited to one interrupt per socket per device (yes, it is limited!), but MS suggests one MSI per physical CPU, and since each core is now effectively a CPU, I feel it isn't fair for CPUs that include a GPU to ignore this. Why do I ignore normal GPUs? Diminishing returns. I feel this has a more dramatic impact on an APU like Haswell or Jaguar than on a desktop with a GPU like a 7970. Ty Ryan

PS: Drivers can register a single InterruptMessageService routine that handles all possible messages, or individual InterruptService routines for each message.


June 29, 2014 | 06:59 PM - Posted by Earan (not verified)

i7 920 fanatics: the 920 is a great CPU if you are gaming, doing Photoshop and other light stuff. If you are doing 3D, video editing, compositing and other heavy stuff, the 4770 will wipe the floor with your 920, in both performance and power consumption.
