
Bitcoin Currency and GPU Mining Performance Comparison

Author: Ryan Shrout
Manufacturer: General

Testing Configuration and Software Setup

Software Configuration

True ease of use is something the Bitcoin ecosystem doesn't really offer yet, though it is steadily improving.  You'll need a couple of different items up and running on one or more machines to really get started with mining.


The first thing you'll need is the Bitcoin client application, which acts as your wallet and accesses your wallet.dat file.  While it doesn't necessarily need to run on the same hardware that is doing the mining, you'll need it running to get the key information you share with the mining apps.


For your mining application there are several options, including both command-line and graphical apps.  For the quickest setup and configuration time we liked GUIMiner, seen above.  The interface you use does not necessarily determine the kernel that computes the Bitcoins, and which kernel you use can alter performance pretty dramatically.  In its infancy the Bitcoin community ran CPU-based kernels, until the difficulty rose to the point where they were incredibly inefficient, leading to the creation of several GPU-based designs.


For our testing we went with the poclbm kernel, which is built around OpenCL and works with AMD Radeon HD 4000-series and above and NVIDIA GeForce 8000-series and above graphics cards.  There are certainly other options out there for Bitcoin mining, and many enthusiasts argue that some perform better than others across different ranges of CPUs and GPUs, but in terms of popularity today poclbm seems to be the winner.
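Whichever kernel you pick, the work it performs is the same: hashing a candidate block header twice with SHA-256, over and over, incrementing a nonce until the result falls below the network's target.  A minimal CPU-side illustration in Python (a toy header and a deliberately easy target for demonstration; real mining uses the full 80-byte header and the network difficulty):

```python
import hashlib
import struct

def double_sha256(data: bytes) -> bytes:
    # Bitcoin hashes everything twice with SHA-256
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

def mine(header_base: bytes, target: int, max_nonce: int = 2**32):
    """Search for a nonce whose double-SHA256 hash falls below target."""
    for nonce in range(max_nonce):
        # Append the nonce as a 32-bit little-endian integer
        header = header_base + struct.pack('<I', nonce)
        # Bitcoin treats the digest as a little-endian 256-bit integer
        h = int.from_bytes(double_sha256(header), 'little')
        if h < target:
            return nonce, h
    return None

# Toy example: require the top 16 bits of the hash to be zero
easy_target = 1 << (256 - 16)
nonce, h = mine(b'example-block-header', easy_target)
print(f"found nonce {nonce}")
```

A GPU kernel like poclbm runs this same search massively in parallel, which is why the integer throughput of the shader cores dominates performance.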

The above image shows us running a pair of kernels, one for each GPU on a multi-GPU graphics card.  If you have more than one GPU in your system, whether on a single card or across multiple cards, you need only assign a kernel to each available processor to max out your mining performance.

(Side note: because poclbm is built on OpenCL, you can actually run it on CPUs with compliant OpenCL stacks.  It is far from efficient on that class of processor, however.)

When running a Bitcoin mining application be prepared for a lot of GPU utilization but not much on the CPU side of things.  


Here you can see our Core i7 Sandy Bridge based processor is not getting a heavy workout while running the GUIMiner application with our OpenCL-based client focused on the GPU.  Looking at the graphics card workload however...


The ASUS ARES card (dual Radeon HD 5870 GPUs) is working hard, with GPU load reaching 99% and the temperature slowly rising.  If you have a GPU with a loud fan, or are sensitive to the heat your graphics card creates, this mining process might not be for you!

Hardware Configurations

For our testing we ran the Bitcoin clients on our standard GPU testing bed built out of the following:

  • ASUS P6X58D Premium Motherboard
  • Intel Core i7-965 @ 3.33 GHz Processor
  • 3 x 2GB Corsair DDR3-1333 MHz Memory
  • Western Digital VelociRaptor 600GB HDD
  • Corsair Professional Series 1200w PSU
  • NVIDIA Driver: 275.33
  • AMD Driver: 11.6

Our graphics card selection was based on comparing some current options against previous-generation cards that are likely already in the hands of potential GPU Bitcoin miners.  Here is the lineup, with a few curveballs tossed in:

  • GeForce GTX 285 - $300
  • GeForce GTX 295 - $289
  • GeForce GTX 460 - $160
  • GeForce GTX 560 Ti - $319
  • GeForce GTX 580 - $469
  • GeForce GTX 590 - $749
  • Radeon HD 4890 - $240
  • Radeon HD 5750 - $115
  • Radeon HD 5830 - $129
  • Radeon HD 5970 - $620
  • Radeon HD 6850 - $159
  • Radeon HD 6990 - $750
  • Radeon HD 5870 x2 (ASUS ARES) - $1100
  • Radeon HD 5870 x2 (Overclocked) - $1100
  • AMD A8-3850 APU - $139
  • "The Beast" - $1710

We have covered the bases of the last several years by starting with the HD 4890 and GTX 285 cards of yesteryear.  We included a range of modern cards, including the very popular GeForce GTX 460 and the lower-end Radeon HD 5750.  Dual-GPU cards make a frequent showing with the GTX 295, GTX 590, HD 5970 and HD 6990, as well as the ASUS ARES in both standard and overclocked configurations.  The standard clock rate on the ASUS ARES is 850 MHz and our overclocked setting pushed that to 1005 MHz, an 18% increase.  Basically, we just wanted to see how high we could push that $1100 graphics card.

The big outlier is the new AMD A8-3850 APU released this month, which combines a quad-core CPU and a "discrete-class" GPU on a single processor.  The Radeon HD 6550D GPU on that die (as it is branded) has 400 stream processors and uses a DDR3 memory interface shared with the x86 cores.  Because the computation at work in Bitcoin mining is not memory-bound, we expected the APU to do well for its price and position.

The pricing listed here is used throughout our performance review to judge value and profitability.  Keep in mind that some of these numbers were hard to nail down, especially for cards like the GTX 285, GTX 295, HD 4890, HD 5970 and ASUS ARES that are hard to find anywhere but eBay and very small online stores.  The prices here are my best estimates of what you would have to pay (on average) to acquire such a card today.

You might also be wondering what "The Beast" is in our list above.  That is a mega-crunching machine we put together after all of our other card testing to see just how much we could push out of a single system.  Using the same base test bed, we installed the Radeon HD 6990 4GB and Radeon HD 5970 2GB (both dual-GPU cards) along with the single-GPU Radeon HD 6970 2GB.  While we wanted to include the ASUS ARES in this configuration, we couldn't: it requires three PCIe power connections and our Corsair AX1200 power supply only supplied us with six of them.  You will have to wait until later in the article to see the results of that setup, as I decided to leave it off the single-card result graphs since it tended to skew the scale quite a bit.

What to look for

The first thing you are going to notice is that the AMD graphics cards solidly outperform the NVIDIA GPUs, for reasons we are still digging into.  The VLIW architecture of the Radeon HD 4000/5000/6000 series sees very high utilization from the poclbm kernel; this is one of those rare applications that comes close to the theoretical TFLOPS figures AMD has claimed over the years.

What else is there to evaluate?

  • Pure Mhash/s rate - how fast is each GPU at computing the math required for Bitcoin mining?  The higher the Mhash/s rate, the faster the card and the quicker you will find the next coin.
  • Performance per dollar - Mhash/s/$ - This is probably the most important factor for users considering Bitcoin mining as a way to make money and pay for things they want to buy.  Which card brings the most "value" to the mining experience?
  • Performance per watt - Mhash/s/watt - If you value your air conditioning bill more than most, or want to cram as many cards as possible into an enclosure for a mining powerhouse, you'll want to know which cards and GPUs are the most power efficient.
  • Dollars per day - Step 1: Mine.  Step 2: ??  Step 3: Profit.  How much money can a given card make on a daily basis?  This metric fluctuates from day to day with the exchange rate of a Bitcoin against USD (or your own currency), but we evaluate it based on the numbers as of this writing.
  • Time to graphics card payoff - Based on the amount of money you can earn per day, how long will it take to pay off the card you purchased for this purpose and start making the aforementioned profits?  For cards that are either end-of-lifed or just plain hard to find this is a rough estimate (go ahead and find me an average price for a GTX 285 today), but it provides another useful data point for dedicated miners.
  • One year profit - If you took that daily earned amount, could apply it perfectly for one year (which we know you can't, because of rising difficulty), and subtracted the cost of the graphics card, how much could you possibly MAKE in a year?  The numbers might surprise you!
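All of these metrics reduce to simple arithmetic once you have a card's hash rate, price, and power draw.  A quick sketch of the calculations (the card numbers and the dollars-per-day-per-Mhash figure below are hypothetical placeholders, not our measured results):

```python
def mining_metrics(mhash_s, price_usd, watts, usd_per_day_per_mhash):
    """Compute the value metrics used in this review for one card.

    usd_per_day_per_mhash would come from the current network
    difficulty and exchange rate at the time of testing.
    """
    usd_per_day = mhash_s * usd_per_day_per_mhash
    return {
        'mhash_per_dollar': mhash_s / price_usd,   # performance per dollar
        'mhash_per_watt': mhash_s / watts,         # performance per watt
        'usd_per_day': usd_per_day,                # dollars per day
        'days_to_payoff': price_usd / usd_per_day, # time to card payoff
        'one_year_profit': usd_per_day * 365 - price_usd,
    }

# Hypothetical card: 300 Mhash/s, $250, 200 W, earning $0.01/day per Mhash/s
m = mining_metrics(300, 250.0, 200, 0.01)
for name, value in m.items():
    print(f"{name}: {value:.2f}")
```

Note that the one-year figure assumes a constant earning rate, which (as discussed earlier) the rising difficulty makes impossible in practice.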

So there you have it - let's jump into the results and see what our testing brought forth with more details and explanations along the way!

July 13, 2011 | 09:22 PM - Posted by Florian (not verified)

I think it is your responsibility to deter readers more actively from investing in hardware in order to conduct bitcoin mining and distance yourselves from those activities. It is easy for people to understand that they can make money from computing power, but it takes some very careful reading to understand that by design, this whole enterprise will become less and less profitable over time. So I think it would be better to put the emphasis of the article on parallel computing performance and to use bitcoin merely for illustrative purposes. At the very least, you should factor in the energy costs in your profitability analysis, but in my opinion, calculating projections is misleading and even deceptive, given the facts about Bitcoin (see below).

So my warning here:
!!! WARNING !!!
===================================================
Investing in hardware in order to engage in bitcoin mining is a highly risky and quite possibly loss-making idea!
The calculations of "Days to payoff" and "1 year profit" in this article are misleading: Not only is the rate of bitcoin creation deliberately being slowed as the total number of bitcoins approaches 21 million, it is also getting more and more difficult to accumulate enough computing power as the number of participants in bitcoin mining increases (as people reading this article and others start setting up their own mining operations). The only effect countering this deterioration in profitability would be an increase in the dollar value of the bitcoin, which is uncertain and unpredictable.
====================================================

July 13, 2011 | 10:25 PM - Posted by Ken Addison

Oh hey look, that's already in the article:

"Please keep in mind that we understand that these values will change over time not only because of the exchange rate differences but because your ability to mine Bitcoins will slow down over time as the algorithm to find coins becomes more and more complex as the network hashing power increases. Read over the first two pages of the article again to understand WHY this happens but just know the results you will see below are based on an instance in time during this writing process!"

July 13, 2011 | 10:52 PM - Posted by Anonymous (not verified)

I hope this is a dumb question, but what prevents a virus from creating a mining botnet?

July 13, 2011 | 11:27 PM - Posted by Anonymous (not verified)

Nothing really. Plus a virus which specifically only attempted GPU mining would be a lot easier to hide in the Windows environment, since most users are unlikely to be monitoring GPU usage levels when simply web browsing etc.

A virus which intelligently slowed its mining attack if the user was trying to do something GPU intensive (gaming), in order to hide the system use and keep the user from noticing massive in-game slowdown, could likely mine away unnoticed.

I do not fully understand the setup in regards to mining as a pool though, which is what you would ultimately want all your zombied systems to do. I guess it is probably not 'that' difficult to set up given how many pools continue popping up, plus presumably someone writing a virus specifically targeted at hijacking GPU cycles is probably a decent enough coder.

July 14, 2011 | 08:20 PM - Posted by Tim Verry

There have actually been some botnets (that have since been shut down) mining for pools; however, they were thousands of computers using the CPUs to mine as access to the GPU hardware is more difficult/would require more end user cooperation to get that botnet software installed and running, AFAIK.

July 13, 2011 | 11:27 PM - Posted by Anonymous (not verified)

This test didn't use DiabloMiner, so it's automatically invalid.

July 14, 2011 | 06:37 AM - Posted by Sturle (not verified)

What price did you use for power in your profit calculations? At 500W for a simple 6990 system, total power consumption in a year of 24/7 use will be 4380 kWh. Your profit after one year will be negative if your price for power is more than about 35 cents, assuming constant difficulty. All Nvidia cards will operate at a loss unless your power is very cheap or free.

Difficulty is about 1000 times larger now than half a year ago, btw. Power cost has become the most important factor in mining profitability.
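Sturle's arithmetic generalizes easily: annual energy use is watts × 8,760 hours, and the break-even electricity price is annual mining revenue divided by annual kWh. A quick sketch using his example figures (the $1,533/year revenue is back-derived from his roughly 35-cent break-even point, not a measured number):

```python
def breakeven_power_price(system_watts, annual_revenue_usd):
    """Electricity price (USD/kWh) above which mining runs at a loss,
    assuming 24/7 operation and constant difficulty."""
    annual_kwh = system_watts / 1000 * 24 * 365  # 8760 hours per year
    return annual_revenue_usd / annual_kwh

# 500 W system earning a hypothetical $1533/year in mined coins:
# 4380 kWh/year, so a loss above about $0.35/kWh
price = breakeven_power_price(500, 1533)
print(f"break-even at ${price:.2f}/kWh")
```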

July 14, 2011 | 09:49 AM - Posted by Ryan Shrout

You should check out the second article for a host of details on that topic:

http://www.pcper.com/reviews/Graphics-Cards/Bitcoin-Mining-Update-Power-...

July 14, 2011 | 08:32 AM - Posted by Europe (not verified)

For european readers, the power use is a bit more important. 1kwh of power costs on average around 0.25 euro.
Which means a system like the beast (using 1kw of power) will cost you 0.25*24*365 = 2190 euros per year in electricity.
The beast yearly produces 3637 dollar equivalent bit coins, which is about 2584 Euros.

That means it will effectively only produce 394 euros. And that is not counting the cost of buying the system.

July 14, 2011 | 01:25 PM - Posted by Acejam (not verified)

A 6990 in the default BIOS position should generate 330 MHash/s PER core. That's 660 MHash/s per card. I'm not sure why, but your card is showing a much slower speed on one of the cores. (~285.3 MHash/s)

Switch the BIOS switch to position 2 and you'll be at 360-375 MHash/s per core.

You also seem to be missing the most basic flags for GUIMiner running poclbm: -v -w128

July 14, 2011 | 02:13 PM - Posted by Anonymous (not verified)

Does anyone see this as an AMD ATI scam? I smell so O_o

July 15, 2011 | 03:22 AM - Posted by Anonymous (not verified)

Your cpu usage is silly!
It's prolly the guiminer interface or something.
My 'rig' runs 350Mh/s on i7 2600k and HD6970 and rarely hits 4% cpu usage.
And that is while i run an active minecraft server and use the rig to watch videos and stuff (gets it to about 8% for SD video).
I'm using the Phoenix miner btw.

July 15, 2011 | 03:30 PM - Posted by Sc00bz (not verified)

"... maximum performance on a Core i7-2600K of 8 instructions x 4 cores x 3.4 GHz = 108.8 GigaInstructions."

It's 4 integer operations/instruction x 3 instructions/clock x 4 cores x 3.4 GHz = 163.2 GigaInstructions/second. AVX is 4 integer operations/instruction or 8 floating point operations/instruction. Each clock can issue up to 3 instructions if they don't depend on the answer of the previous instructions. The nice thing with AVX over SSE2 is AVX has instructions like a = b *operation* c vs a = a *operation* b for SSE2.

July 18, 2011 | 03:10 AM - Posted by Anonymous (not verified)

Great articles guys. thanks.

July 19, 2011 | 11:36 AM - Posted by Anonymous (not verified)

You DO know that there are single-slot water-cooled Radeon HD 6990 monstrosities out there, don't you? 8 of these in a single system *head spins*...

July 19, 2011 | 04:52 PM - Posted by Anonymous (not verified)

This whole thing sounds kind of stupid.

July 21, 2011 | 09:06 PM - Posted by Bryan Clanton (not verified)

Who is this IDIOT slamming bit coins? Moron, the US government has nothing to do with the Federal Reserve Bank. It is owned by private individuals. Do you homework before your next show-n-tell.


July 23, 2011 | 12:48 AM - Posted by Jimmy (not verified)

Now i know why i can't find another 5850 to run crossfire with. The ones i do find are way overpriced now.

July 24, 2011 | 03:45 AM - Posted by Escalus (not verified)

I'm looking at testing out a very simple mining rig. If I get a Radeon 6XXX series GPU, would it make sense to use it on a Core 2 Duo system? Would the CPU be a bottleneck?

July 27, 2011 | 07:04 AM - Posted by Those Who Get it (not verified)

So your telling me you put a Virus on your computer that helps criminals launder money.
you let it operate through your GPU because there is no security there.
and you spend hundreds on hardware and power, for a experiment in social engeneering?
then you think that because 3 places are taking the hype of the bitcoin as a COUPON to sell you shit at 3 times the normal cost, that the bitcoin is therin a currency?

what do you think your GPU is really processing?
or does anyone think?

July 27, 2011 | 07:17 AM - Posted by Those Who Get it (not verified)

You just dont get it,
the GPU is processing YOU!
It is internally cyclicly redundant pre-processing your own non-trasnactions, into a multilevel advertising purchacing and marketing scheme.
If they do not enable the user with a journey, then there is no game to be played. There is no Corelation to alternative universal dimentional shifting of exchange goods in virtuality, when there still is nothing but virtuality in existance.
How do you perceive that something exist when one person tells you that it exist, and masses of people join that ONE person to confirm that it exists.
That is a singularity of the black hole variety.

October 12, 2011 | 08:54 AM - Posted by Anonymous (not verified)

Issue/problem: GUIMiner with a dual-GPU card (PowerColor HD 6870 X2). After creating a new worker for the second GPU, it still doesn't work: 0 Mhash/s, while the first GPU runs at 304 Mhash/s, clock at 970 MHz, 60% fan speed, temp 74 degrees Celsius, flags -v -w128 -a4. Does anybody know how to set this up correctly so that both GPUs work at the same time? Thank you for helping me out.

December 15, 2012 | 12:45 PM - Posted by Anonymous (not verified)

This is all far too complicated for me. Can't you have a simple link to click on that will look at one's machine and say "yes" or "no"?

February 21, 2013 | 10:46 AM - Posted by Anonymous (not verified)

How do I buy the beast? Then after I buy, I would probably find a better more efficient system where?

July 14, 2013 | 11:40 PM - Posted by whitesites (not verified)

You guys really need to rerun this test using bitminter. I have the GTX 560 Ti and I am getting 138 Mhash/s with that card. Also would really like to see how the new ATI 7xxx series cards perform.

November 6, 2013 | 10:24 PM - Posted by Anonymous (not verified)

is your 560ti a 448 core version or no? also are you only running 1 card or do you have a sli, tri-sli, or tri-sli with a fourth dedicated physics card?

kinda wonder what my ole x58 rig might pull off with its i7 980X cpu
3x GTX 560Ti 448 core GPU's
lowly 9600GT that is just in there as a dedicated PhysX processor

course I built it with a 1600 watt psu and the system doesn't pull enough juice to really even warm the power supply up LOL


November 22, 2013 | 04:27 PM - Posted by Ralph (not verified)

Has anyone considered solar power for the electricity? Take longer to recoup costs, but any analysis?
Ralph

