
Bitcoin Currency and GPU Mining Performance Comparison

Author: Ryan Shrout
Manufacturer: General

Analysis and Conclusions

Why are GPUs faster?

As more and more miners have jumped on the Bitcoin bandwagon, the ability of the CPU to effectively find coins has dropped considerably.  Sure, you can still technically mine with one, but the amount of time it takes makes it a moot point even today, and as the difficulty continues to increase in the days and months to come, the CPU will be rendered even less useful.  The only reasonable way to compute the SHA256 hashes is to harness the power of GPU computing.  But why are GPUs so much better at this process than CPUs?  

The answer is one that both AMD and NVIDIA have been pushing for some time: the parallel processing power of the GPU far exceeds that of the CPU, and for certain workloads (of which Bitcoin mining is definitely one) the performance delta between the two platforms can be staggering.  This Bitcoin wiki page has a great summation of the reasoning: it comes down to the GPU's ability to process far more ALU-based 32-bit instructions per clock than a CPU.  A CPU today can handle 4 or 8 (with Intel's new AVX instructions) 32-bit instructions per clock per core, giving us a maximum performance on a Core i7-2600K of 8 instructions x 4 cores x 3.4 GHz = 108.8 GigaInstructions per second. 


On the other hand, a Radeon HD 6850 has 960 stream processors that each run a 32-bit ALU instruction per clock.  Even at a slower 775 MHz clock rate, that gives us 960 instructions x 0.775 GHz = 744 GigaInstructions per second.  You can now see why a dual-GPU graphics card, and why overclocking, can so dramatically affect the performance of our Bitcoin mining operations.
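To put that arithmetic in one place, here is a minimal Python sketch reproducing the peak-throughput numbers above (the per-clock figures and clock speeds are the ones quoted in this article; real sustained throughput will be lower):

    # Peak 32-bit ALU throughput = ops/clock x execution units x clock (GHz)
    def peak_ginstr_per_s(ops_per_clock, units, clock_ghz):
        return ops_per_clock * units * clock_ghz

    # Core i7-2600K: 8 ops/clock (AVX) x 4 cores x 3.4 GHz
    cpu = peak_ginstr_per_s(8, 4, 3.4)      # 108.8
    # Radeon HD 6850: 1 op/clock x 960 stream processors x 0.775 GHz
    gpu = peak_ginstr_per_s(1, 960, 0.775)  # 744.0

    print("i7-2600K: %.1f GInstr/s" % cpu)
    print("HD 6850:  %.1f GInstr/s (%.1fx the CPU)" % (gpu, gpu / cpu))

Even this mid-range GPU comes out nearly seven times ahead of a high-end quad-core CPU on paper.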

Why is AMD so much faster than NVIDIA?

Obviously from our testing the AMD architecture is much better suited to this mining algorithm than NVIDIA's GeForce cards, but why?  It really comes down to AMD's use of smaller and simpler shader processing units compared to NVIDIA's design: AMD runs more SPs at lower frequencies (1600 SPs on an HD 5870 running at 850 MHz) while NVIDIA runs fewer SPs at higher frequencies (the GTX 480 has 480 SPs running at 1400 MHz).  This gives AMD much higher raw ALU throughput than NVIDIA: 

  • AMD Radeon HD 6990: 3072 ALUs x 830 MHz = 2550 billion 32-bit instructions per second
  • NVIDIA GTX 590: 1024 ALUs x 1214 MHz = 1243 billion 32-bit instructions per second
  • Source: Bitcoin Wiki

While in gaming this difference in raw compute power can be offset by other GPU technologies, for the raw mathematical throughput needed for mining Bitcoins (and for other ALU-bound work like password cracking) AMD definitely gets better utilization out of its processing power.

There is another difference between how AMD and NVIDIA operate on the SHA256 hashes that affects performance.  From the wiki:

...another difference favoring Bitcoin mining on AMD GPUs instead of Nvidia's is that the mining algorithm is based on SHA-256, which makes heavy use of the 32-bit integer right rotate operation. This operation can be implemented as a single hardware instruction on AMD GPUs, but requires three separate hardware instructions to be emulated on Nvidia GPUs (2 shifts + 1 add). This alone gives AMD another 1.7x performance advantage (~1900 instructions instead of ~3250 to execute the SHA-256 compression function).
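To make the quoted point concrete, here is an illustrative Python sketch of the two approaches (the function names are ours, purely for illustration).  A hardware rotate produces the result in one instruction; the emulated path needs two shifts plus one instruction to combine them, exactly as the wiki describes:

    MASK32 = 0xFFFFFFFF  # keep results in 32 bits

    def rotr_native(x, n):
        # What a single hardware "rotate right" instruction computes
        # (one ALU operation on AMD GPUs of this era).
        return ((x >> n) | (x << (32 - n))) & MASK32

    def rotr_emulated(x, n):
        # The three-instruction emulation on NVIDIA hardware:
        hi = (x << (32 - n)) & MASK32  # shift 1
        lo = x >> n                    # shift 2
        return hi + lo                 # add to combine (the halves never overlap)

    assert rotr_native(0xDEADBEEF, 7) == rotr_emulated(0xDEADBEEF, 7)

SHA-256's compression function leans on this rotate heavily (one of its mixing terms is rotr(x, 2) ^ rotr(x, 13) ^ rotr(x, 22), for example), which is why tripling the cost of each rotate hurts NVIDIA so much.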

How do you get started with your own mining?

So you are probably wondering how you can get started with this whole Bitcoin mining operation.  First, a warning: we are not responsible should the currency collapse and your GPU purchases be rendered useless!  With that out of the way, a great place to start is this tutorial that walks through setting up a GPU-based mining operation in Windows.  


One of many online tutorials about setting up a Bitcoin mining operation on your PC.

Just like other GPU computing tasks, Bitcoin offers you a way to utilize the power of your graphics card when you're not gaming.  While maybe not as humanity-driven as Folding@Home, everyone gets to make their own choice!
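If you are curious what the mining software is actually doing behind those GUIs, the inner loop is conceptually tiny: double-SHA256 a block header over and over, bumping a nonce each time, until the result falls below the network's target.  A toy Python illustration (a real miner works on the actual 80-byte block header and runs this on the GPU; the easy target here is our own choice so the demo finishes instantly):

    import hashlib
    import struct

    def mine(header_prefix, target, max_nonce=2**32):
        # Find a nonce so that SHA256(SHA256(header)), read as an
        # integer, falls below the target. Smaller target = harder.
        for nonce in range(max_nonce):
            header = header_prefix + struct.pack("<I", nonce)
            digest = hashlib.sha256(hashlib.sha256(header).digest()).digest()
            if int.from_bytes(digest, "little") < target:
                return nonce, digest
        return None, None

    nonce, digest = mine(b"toy block header", target=2**240)
    print("found nonce %d -> %s" % (nonce, digest[::-1].hex()))

The network retunes that target every 2016 blocks, which is exactly why the hash rates and profit figures in this article only hold for a snapshot in time.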

What is the "Best Card" for mining?

The answer to this question obviously depends on what you are going for, but the standard answer is that the Radeon HD 5830 is the best "value" in Bitcoin mining.  The mid-range, last-generation GPU from AMD actually bests the GeForce GTX 590 in overall performance and also takes the crown among all of our tested options in terms of performance per dollar.  It is not the most efficient in terms of power (the dual-GPU cards still hold the advantage there), but the potential profit far outweighs the cost of electricity.

If you want the best overall performance then you will definitely want to go the route of a dual-GPU beast like the Radeon HD 6990.  Though we didn't overclock that card (instead we pushed forward on the ASUS ARES, based on a pair of HD 5870 GPUs), the headroom provided by that design will allow users to really get the most mining out of a single PCI Express slot.  

One interesting configuration option would be to combine the new AMD A8-3850 APU with a motherboard capable of handling three graphics cards, of which only one currently exists.  


The ASRock A75 Extreme6 would allow a user to build a new system for Bitcoin mining that supports three dual-slot graphics cards as well as the A8-3850 APU for a pretty low price.  With the board + APU combination running you only $280, you could then choose to go the route of the Radeon HD 5830s or even the ultra-expensive Radeon HD 6990 while still getting a moderate 80 Mhash/s from the integrated graphics at the same time.  

If you wanted to build the most compact mining system, you would likely want a board with as many PCI Express slots as possible and fill them with the fastest single-slot cards you can find.  The MSI Big Bang Marshal motherboard, based on the P67 chipset, supports 8 (!!) PCIe slots that you could fill with single-slot Radeon HD 6850 cards, or you could go the dual-GPU route and fill it with a set of four Radeon HD 6990s.  You realize, of course, that this is a bad investment, right?

Our testing with "The Beast" on the previous page showed us that, while costly, buying the most expensive and fastest GPUs can be a good investment (if you are willing to take all the associated risks) in terms of performance per dollar, performance per watt, and even over the one-year cycle we discussed.  Running a system at 1000 watts twenty-four hours a day, seven days a week will add anywhere from $30-60 per month to your electric bill (depending on where you live), so there is that consideration as well.  Still, this was a fun experiment and I think the results will answer a lot of users' questions.
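For reference, the electricity arithmetic behind that estimate is simple; note that the $30-60 spread corresponds to power rates of roughly 4 to 8 cents per kWh, which is our reading of the range rather than a figure from the article's testing:

    def monthly_power_cost(watts, cents_per_kwh, hours=24 * 30):
        # kWh consumed in a 30-day month of 24/7 operation, times the rate.
        kwh = watts / 1000.0 * hours
        return kwh * cents_per_kwh / 100.0

    # A 1000 W rig burns 720 kWh per month:
    for rate in (4.2, 8.3):
        print("%.1f c/kWh -> $%.0f/month" % (rate, monthly_power_cost(1000, rate)))

At more typical US residential rates the bill lands higher still, which is what the power-cost follow-up below digs into.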

Power Consumption Costs

** Update 7/13/11 **  We recently wrote another piece on the cost of the power to run the Bitcoin mining configurations used in this performance article.  Based on the individual price of electricity in all 50 US states, we found that for some cards the cost of power exceeded the value of the Bitcoin currency at today's exchange rates.  I would highly recommend you check out that story after giving this performance-based article a thorough read; if the power numbers here interest or scare you at all, you'll definitely want to see it!


** End Update **

Final Thoughts

Bitcoin mining is definitely a fringe element of computing today, but there is a chance that in the future this technologically created currency could become more popular and be used by a larger share of the public.  As it stands now, mining can be a fun way to utilize the power of your GPU for more than just gaming, and it even gives users the chance to make a bit of side money in the process.  Just realize that there are a lot of risks here, so if you aren't comfortable spending money on NEW cards, simply utilize the ones you already have.  

Without a doubt the AMD Radeon architecture has a huge advantage in this type of computing, and I am honestly surprised we haven't seen the company at least mention this performance gap as a proof point for its GPUs.  Just as it doesn't officially promote overclocking, AMD could "unofficially" talk about the Radeon lineup's ability to sprint past NVIDIA's modern Fermi designs in this ALU-bound algorithm. 

I hope you found this compilation of data interesting, useful, and entertaining; I know we all had fun putting it together.  Jump down to the comments below if you have questions for us and we'll do our best to answer them and update as necessary.  Thanks for reading!

July 13, 2011 | 09:22 PM - Posted by Florian (not verified)

I think it is your responsibility to deter readers more actively from investing in hardware in order to conduct bitcoin mining and distance yourselves from those activities. It is easy for people to understand that they can make money from computing power, but it takes some very careful reading to understand that by design, this whole enterprise will become less and less profitable over time. So I think it would be better to put the emphasis of the article on parallel computing performance and to use bitcoin merely for illustrative purposes. At the very least, you should factor in the energy costs in your profitability analysis, but in my opinion, calculating projections is misleading and even deceptive, given the facts about Bitcoin (see below).

So my warning here:
!!! WARNING !!!
===================================================
Investing in hardware in order to engage in bitcoin mining is a highly risky and quite possibly loss-making idea!
The calculations of "Days to payoff" and "1 year profit" in this article are misleading: not only is the rate of bitcoin creation deliberately being slowed as the total number of bitcoins approaches 21 million, it is also getting more and more difficult to accumulate enough computing power as the number of participants in bitcoin mining increases (as people reading this article and others start setting up their own mining operations). The only effect countering this deterioration in profitability would be an increase in the dollar value of the bitcoin, which is uncertain and unpredictable.
====================================================

July 13, 2011 | 10:25 PM - Posted by Ken Addison

Oh hey look, that's already in the article...

"Please keep in mind that we understand that these values will change over time not only because of the exchange rate differences but because your ability to mine Bitcoins will slow down over time as the algorithm to find coins becomes more and more complex as the network hashing power increases. Read over the first two pages of the article again to understand WHY this happens but just know the results you will see below are based on an instance in time during this writing process!"

July 13, 2011 | 10:52 PM - Posted by Anonymous (not verified)

I hope this is a dumb question, but what prevents a virus from creating a mining botnet?

July 13, 2011 | 11:27 PM - Posted by Anonymous (not verified)

Nothing, really. Plus, a virus which specifically only attempted GPU mining would be a lot easier to hide in the Windows environment, since most users are unlikely to be monitoring GPU usage levels when simply web browsing, etc.

A virus which intelligently slowed its mining attack if the user was trying to do something GPU intensive (gaming), in order to hide the system use and keep the user from noticing massive in-game slowdown, could likely mine away unnoticed.

I do not fully understand the setup in regards to mining as a pool though, which is what you would ultimately want all your zombied systems to do. I guess it is probably not 'that' difficult to set up a pooling operation given how many continue popping up, plus presumably someone writing a virus specifically targeted at hijacking GPU cycles is probably a decent enough coder.

July 14, 2011 | 08:20 PM - Posted by Tim Verry

There have actually been some botnets (that have since been shut down) mining for pools; however, they were thousands of computers using their CPUs to mine, as access to the GPU hardware is more difficult and would require more end-user cooperation to get that botnet software installed and running, AFAIK.

July 13, 2011 | 11:27 PM - Posted by Anonymous (not verified)

This test didn't use DiabloMiner, so it's automatically invalid.

July 14, 2011 | 06:37 AM - Posted by Sturle (not verified)

What price did you use for power in your profit calculations? At 500W for a simple 6990 system, total power consumption in a year of 24/7 use will be 4380 kWh. Your profit after one year will be negative if your price for power is more than about 35 cents, assuming constant difficulty. All Nvidia cards will operate at a loss unless your power is very cheap or free.

Difficulty is about 1000 times larger now than half a year ago, btw. Power cost has become the most important factor in mining profitability.

July 14, 2011 | 09:49 AM - Posted by Ryan Shrout

You should check out the second article for a host of details on that topic:

http://www.pcper.com/reviews/Graphics-Cards/Bitcoin-Mining-Update-Power-...

July 14, 2011 | 08:32 AM - Posted by Europe (not verified)

For European readers, the power cost matters a bit more. 1 kWh of power costs on average around 0.25 euro, which means a system like The Beast (using 1 kW of power) will cost you 0.25 * 24 * 365 = 2190 euros per year in electricity.
The Beast yearly produces 3637 dollars' worth of bitcoins, which is about 2584 euros.

That means it will effectively only net 394 euros. And that is not counting the cost of buying the system.

July 14, 2011 | 01:25 PM - Posted by Acejam (not verified)

A 6990 in the default BIOS position should generate 330 MHash/s PER core. That's 660 MHash/s per card. I'm not sure why, but your card is showing a much slower speed on one of the cores. (~285.3 MHash/s)

Switch the BIOS switch to position 2 and you'll be at 360-375 MHash/s per core.

You also seem to be missing the most basic flags for GUIMiner running poclbm: -v -w128

July 14, 2011 | 02:13 PM - Posted by Anonymous (not verified)

Does anyone see this as an AMD ATI scam? I smell so O_o

July 15, 2011 | 03:22 AM - Posted by Anonymous (not verified)

Your CPU usage is silly!
It's prolly the GUIMiner interface or something.
My 'rig' runs 350 Mh/s on an i7-2600K and HD 6970 and rarely hits 4% CPU usage.
And that is while I run an active Minecraft server and use the rig to watch videos and stuff (gets it to about 8% for SD video).
I'm using the Phoenix miner btw.

July 15, 2011 | 03:30 PM - Posted by Sc00bz (not verified)

"... maximum performance on a Core i7-2600K of 8 instructions x 4 cores x 3.4 GHz = 108.8 GigaInstructions."

It's 4 integer operations/instruction x 3 instructions/clock x 4 cores x 3.4 GHz = 163.2 GigaInstructions/second. AVX is 4 integer operations/instruction or 8 floating point operations/instruction. Each clock can issue up to 3 instructions if they don't depend on the answer of the previous instructions. The nice thing with AVX over SSE2 is AVX has instructions like a = b *operation* c vs a = a *operation* b for SSE2.

July 18, 2011 | 03:10 AM - Posted by Anonymous (not verified)

Great articles, guys. Thanks.

July 19, 2011 | 11:36 AM - Posted by Anonymous (not verified)

You DO know that there are single-slot water-cooled Radeon HD 6990 monstrosities out there, don't you? 8 of these in a single system *head spins*...

July 19, 2011 | 04:52 PM - Posted by Anonymous (not verified)

This whole thing sounds kind of stupid.

July 21, 2011 | 09:06 PM - Posted by Bryan Clanton (not verified)

Who is this IDIOT slamming bitcoins? Moron, the US government has nothing to do with the Federal Reserve Bank. It is owned by private individuals. Do your homework before your next show-n-tell.

July 23, 2011 | 12:48 AM - Posted by Jimmy (not verified)

Now I know why I can't find another 5850 to run Crossfire with. The ones I do find are way overpriced now.

July 24, 2011 | 03:45 AM - Posted by Escalus (not verified)

I'm looking at testing out a very simple mining rig. If I get a Radeon 6XXX series GPU, would it make sense to use it on a Core 2 Duo system? Would the CPU be a bottleneck?

July 27, 2011 | 07:04 AM - Posted by Those Who Get it (not verified)

So you're telling me you put a virus on your computer that helps criminals launder money,
you let it operate through your GPU because there is no security there,
and you spend hundreds on hardware and power for an experiment in social engineering?
Then you think that because 3 places are taking the hype of the bitcoin as a COUPON to sell you shit at 3 times the normal cost, that the bitcoin is therein a currency?

What do you think your GPU is really processing?
Or does anyone think?

July 27, 2011 | 07:17 AM - Posted by Those Who Get it (not verified)

You just don't get it,
the GPU is processing YOU!
It is internally, cyclically, redundantly pre-processing your own non-transactions into a multilevel advertising, purchasing, and marketing scheme.
If they do not enable the user with a journey, then there is no game to be played. There is no correlation to alternative universal dimensional shifting of exchange goods in virtuality, when there still is nothing but virtuality in existence.
How do you perceive that something exists when one person tells you that it exists, and masses of people join that ONE person to confirm that it exists?
That is a singularity of the black hole variety.

October 12, 2011 | 08:54 AM - Posted by Anonymous (not verified)

Issue/problem with GUIMiner and a dual-GPU PowerColor HD 6870 X2 card. After creating a new worker for the second GPU, it still doesn't work: 0 Mhash/s, while the first GPU runs at 304 Mhash/s (clock at 970 MHz, 60% fan speed, temp 74 degrees Celsius, flags -v -w128 -a4). Does anybody know how to set this up correctly so that both GPUs work at the same time? Thank you for helping me out.

December 15, 2012 | 12:45 PM - Posted by Anonymous (not verified)

This is all far too complicated for me. Can't you have a simple link to click on that will look at one's machine and say "yes" or "no"?

February 21, 2013 | 10:46 AM - Posted by Anonymous (not verified)

How do I buy The Beast? And after I buy it, where would I find a better, more efficient system?

July 14, 2013 | 11:40 PM - Posted by whitesites (not verified)

You guys really need to rerun this test using BitMinter. I have the GTX 560 Ti and I am getting 138 Mhash/s with that card. I would also really like to see how the new ATI 7xxx series cards perform.

November 6, 2013 | 10:24 PM - Posted by Anonymous (not verified)

Is your 560 Ti a 448-core version or no? Also, are you only running 1 card, or do you have SLI, tri-SLI, or tri-SLI with a fourth dedicated PhysX card?

Kinda wonder what my ole X58 rig might pull off with its i7-980X CPU,
3x GTX 560 Ti 448-core GPUs,
and a lowly 9600GT that is just in there as a dedicated PhysX processor.

Of course, I built it with a 1600 watt PSU and the system doesn't pull enough juice to really even warm the power supply up LOL

September 6, 2013 | 09:41 PM - Posted by Anonymous (not verified)

Try this site, it may help you. Goodluck! http://btc.nubtektrx.us

November 22, 2013 | 04:27 PM - Posted by Ralph (not verified)

Has anyone considered solar power for the electricity? It would take longer to recoup costs, but has anyone done an analysis?
Ralph

December 26, 2013 | 06:47 AM - Posted by Anonymous (not verified)

hi
I want to have an account please.
download software
