
Bitcoin Currency and GPU Mining Performance Comparison

Author: Ryan Shrout
Manufacturer: General

Dollars per Day and your Payoff Period

For our next metrics we have to define a few things.  First, in order to calculate a "dollars per day" figure we had to settle on an exchange rate.  As of July 11th at 3pm, the rate we found online was $14.30 per Bitcoin.  I found that rate at the Mt. Gox website, a 24/7 exchange for the currency that has a daily graph showing the value as it changes.

Please keep in mind that we understand that these values will change over time not only because of the exchange rate differences but because your ability to mine Bitcoins will slow down over time as the algorithm to find coins becomes more and more complex as the network hashing power increases.  Read over the first two pages of the article again to understand WHY this happens but just know the results you will see below are based on an instance in time during this writing process!

To find the daily rate we used the calculator at alloscomp.com, which combines the current "difficulty factor" with the Mhash/s rates we measured in our benchmarks on the previous page.  That factor determines how long it will take to find the next Bitcoin as of that moment in time - the more hashing power that joins the network, the higher that number climbs, so get in fast!  Given that factor plus our input of the exchange rate and Mhash/s for each card, the calculator estimates how much you'll make per day in USD.
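
The calculator's estimate boils down to the standard expected-value formula for mining. Here is a minimal sketch in Python; the difficulty, block reward, and exchange rate are hard-coded to illustrative mid-2011 values (the real calculator pulls the live difficulty, so treat these numbers as assumptions):

```python
def usd_per_day(mhash_per_s, difficulty, btc_per_block=50.0, usd_per_btc=14.30):
    """Expected daily mining income in USD.

    Each hash wins a block with probability 1 / (difficulty * 2**32),
    so expected BTC/day = hashes_per_day * block_reward / (difficulty * 2**32).
    """
    hashes_per_day = mhash_per_s * 1e6 * 86400
    btc_per_day = hashes_per_day * btc_per_block / (difficulty * 2 ** 32)
    return btc_per_day * usd_per_btc

# A ~300 MHash/s card at an assumed difficulty of 1,564,057:
print(round(usd_per_day(300, 1_564_057), 2))  # about $2.76/day
```

Note how the output scales linearly with hashrate but shrinks as difficulty climbs, which is exactly why these figures only hold for an instant in time.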


How did the results pan out for our various graphics cards?


Obviously this metric falls in line with the raw Mhash/s rates we found on the previous page, but this time we are looking at the performance in actual monetary value.  The overclocked ARES card earns us the most on a daily basis, with as much as $7.58 falling into our wallet.  That is a pretty staggering number considering the HD 5830 will only net us $1.83 per day while the GeForce GTX 460 gets us a meager $0.53!  Hard to feed a family on that...

 

Our next metric is really meant for those hardcore mining gurus that want to invest in GPUs to make money on the system.  Based on our estimated prices for each graphics card and the dollars per day each card can earn, how long does it take to pay off the investment in a specific graphics card before you start making that cold hard (virtual) cash?


In this case a lower number is better and indicates your quickest route to profit.  You would have to mine on the GTX 285 for 526+ days to pay off its current purchase price of $300, and in fact that figure will increase quite a bit as mining becomes more difficult over time! (That is obviously the case for ALL of our results.)  The quickest payoff comes with the Radeon HD 5830, which takes only 70 days to earn back its $129 price tag, followed by the HD 5750 and its $115 cost.  The ARES card takes as many as 201 days or as few as 145 depending on your overclocking settings, both of which are pretty substantial investments.

 

Finally, just for a fun experiment, I decided to graph the "one year profit" of mining Bitcoins on each of these GPUs (of course barring the changing difficulty of mining we keep bringing up).  The formula was simple: take the earned dollars per day of the graphics card, multiply by 365 days and subtract the cost of the graphics card itself.  Interesting results here:


Anything you see without a line and number associated with it is a net loss.  The GTX 580, as an example, would earn $405 over a year but costs $469 today, for a net loss of ~$63.  The rest of the cards would earn you something, ranging from $1.30 for the GTX 560 Ti up to $1666.70 for the overclocked ASUS ARES!!  For an investment of $1100 (when you can find it) and some elbow grease on the overclocking, you could make a pretty penny over the course of a year with this dual-GPU graphical beast!

The Radeon HD 5830 is still the best single-GPU card, with a $538 return on investment and a 400% profit margin on your $120 upfront.  Even the modest AMD A8-3850 would earn its keep with Bitcoin mining, making more than all of the profitable NVIDIA GPUs combined.
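
Both derived metrics above reduce to one-line formulas on the dollars-per-day figure. A quick sketch, using the HD 5830's $129 price and $1.83 per day from the charts:

```python
def days_to_payoff(card_price_usd, usd_per_day):
    """Days of mining needed to earn back the card's purchase price."""
    return card_price_usd / usd_per_day

def one_year_profit(card_price_usd, usd_per_day):
    """Dollars earned over 365 days of mining, minus the card's cost."""
    return usd_per_day * 365 - card_price_usd

# Radeon HD 5830: $129 card earning $1.83 per day
print(round(days_to_payoff(129, 1.83)))    # about 70 days
print(round(one_year_profit(129, 1.83)))   # about $539
```

Both outputs line up with the charts; remember they assume the daily rate stays constant, which the rising difficulty guarantees it will not.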

July 13, 2011 | 09:22 PM - Posted by Florian (not verified)

I think it is your responsibility to deter readers more actively from investing in hardware in order to conduct bitcoin mining and distance yourselves from those activities. It is easy for people to understand that they can make money from computing power, but it takes some very careful reading to understand that by design, this whole enterprise will become less and less profitable over time. So I think it would be better to put the emphasis of the article on parallel computing performance and to use bitcoin merely for illustrative purposes. At the very least, you should factor in the energy costs in your profitability analysis, but in my opinion, calculating projections is misleading and even deceptive, given the facts about Bitcoin (see below).

So my warning here:
!!! WARNING !!!
===================================================
Investing in hardware in order to engage in bitcoin mining is a highly risky and quite possibly loss-making idea!
The calculations of "Days to payoff" and "1 year profit" in this article are misleading: not only is the rate of bitcoin creation deliberately being slowed as the total number of bitcoins approaches 21 million, it is also getting more and more difficult to accumulate enough computing power as the number of participants in bitcoin mining increases (as people reading this article and others start setting up their own mining operations). The only effect countering this deterioration in profitability would be an increase in the dollar value of the bitcoin, which is uncertain and unpredictable.
====================================================

July 13, 2011 | 10:25 PM - Posted by Ken Addison

Oh hey look, that's already in the article...

"Please keep in mind that we understand that these values will change over time not only because of the exchange rate differences but because your ability to mine Bitcoins will slow down over time as the algorithm to find coins becomes more and more complex as the network hashing power increases. Read over the first two pages of the article again to understand WHY this happens but just know the results you will see below are based on an instance in time during this writing process!"

July 13, 2011 | 10:52 PM - Posted by Anonymous (not verified)

I hope this is a dumb question, but what prevents a virus from creating a mining botnet?

July 13, 2011 | 11:27 PM - Posted by Anonymous (not verified)

Nothing really. Plus a virus which specifically only attempted GPU mining would be a lot easier to hide in the Windows environment, since most users are unlikely to be monitoring GPU usage levels when simply web browsing etc.

A virus which intelligently slowed its mining attack if the user was trying to do something GPU intensive (gaming), in order to hide the system use and keep the user from noticing massive in-game slowdown, could likely mine away unnoticed.

I do not fully understand the setup in regards to mining as a pool though, which is what you would ultimately want all your zombied systems to do. I guess it is probably not 'that' difficult to set up a pooling operation given how many keep popping up, and presumably someone writing a virus specifically targeted at hijacking GPU cycles is probably a decent enough coder.

July 14, 2011 | 08:20 PM - Posted by Tim Verry

There have actually been some botnets (that have since been shut down) mining for pools; however, they were thousands of computers using the CPUs to mine as access to the GPU hardware is more difficult/would require more end user cooperation to get that botnet software installed and running, AFAIK.

July 13, 2011 | 11:27 PM - Posted by Anonymous (not verified)

This test didn't use DiabloMiner, so it's automatically invalid.

July 14, 2011 | 06:37 AM - Posted by Sturle (not verified)

What price did you use for power in your profit calculations? At 500W for a simple 6990 system, total power consumption in a year of 24/7 use will be 4380 kWh. Your profit after one year will be negative if your price for power is more than about 35 cents, assuming constant difficulty. All Nvidia cards will operate at a loss unless your power is very cheap or free.

Difficulty is about 1000 times larger now than half a year ago, btw. Power cost has become the most important factor in mining profitability.
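
The break-even arithmetic above is easy to verify. A minimal sketch, assuming a 500 W system running 24/7 and a yearly mining revenue of about $1533 (the figure implied by the ~35 cent break-even; both numbers are assumptions taken from this comment, not measurements from the article):

```python
def yearly_kwh(watts):
    """Energy consumed by a constant load running 24/7 for one year."""
    return watts / 1000 * 24 * 365

def breakeven_power_price(yearly_revenue_usd, watts):
    """Electricity price (USD per kWh) at which one year of mining nets zero."""
    return yearly_revenue_usd / yearly_kwh(watts)

print(yearly_kwh(500))                              # 4380.0 kWh, as stated above
print(round(breakeven_power_price(1533, 500), 2))   # 0.35 USD/kWh
```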

July 14, 2011 | 09:49 AM - Posted by Ryan Shrout

You should check out the second article for a host of details on that topic:

http://www.pcper.com/reviews/Graphics-Cards/Bitcoin-Mining-Update-Power-...

July 14, 2011 | 08:32 AM - Posted by Europe (not verified)

For European readers, the power use is a bit more important. 1 kWh of power costs on average around 0.25 euro.
Which means a system like the Beast (using 1 kW of power) will cost you 0.25*24*365 = 2190 euros per year in electricity.
The Beast yearly produces 3637 dollars' worth of bitcoins, which is about 2584 euros.

That means it will effectively only net 394 euros. And that is not counting the cost of buying the system.
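
The figures in this comment check out. A quick sketch (the 1 kW draw, the 0.25 EUR/kWh price, and the 2584 EUR yearly revenue are taken from the comment above, not from the article):

```python
eur_per_kwh = 0.25
power_kw = 1.0

# Electricity for a 1 kW system running 24/7 for a year
yearly_power_cost_eur = eur_per_kwh * power_kw * 24 * 365   # 2190.0 EUR

# 3637 USD of coins per year, ~2584 EUR at the commenter's exchange rate
yearly_revenue_eur = 2584

net_eur = yearly_revenue_eur - yearly_power_cost_eur
print(round(net_eur))   # 394 EUR, before hardware cost
```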

July 14, 2011 | 01:25 PM - Posted by Acejam (not verified)

A 6990 in the default BIOS position should generate 330 MHash/s PER core. That's 660 MHash/s per card. I'm not sure why, but your card is showing a much slower speed on one of the cores. (~285.3 MHash/s)

Switch the BIOS switch to position 2 and you'll be at 360-375 MHash/s per core.

You also seem to be missing the most basic flags for GUIMiner running poclbm: -v -w128

July 14, 2011 | 02:13 PM - Posted by Anonymous (not verified)

Does anyone see this as an AMD ATI scam? I smell so O_o

July 15, 2011 | 03:22 AM - Posted by Anonymous (not verified)

Your CPU usage is silly!
It's probably the GUIMiner interface or something.
My 'rig' runs 350 Mhash/s on an i7 2600K and HD 6970 and rarely hits 4% CPU usage.
And that is while I run an active Minecraft server and use the rig to watch videos and stuff (which gets it to about 8% for SD video).
I'm using the Phoenix miner, btw.

July 15, 2011 | 03:30 PM - Posted by Sc00bz (not verified)

"... maximum performance on a Core i7-2600K of 8 instructions x 4 cores x 3.4 GHz = 108.8 GigaInstructions."

It's 4 integer operations/instruction x 3 instructions/clock x 4 cores x 3.4 GHz = 163.2 GigaInstructions/second. AVX is 4 integer operations/instruction or 8 floating point operations/instruction. Each clock can issue up to 3 instructions if they don't depend on the answer of the previous instructions. The nice thing with AVX over SSE2 is AVX has instructions like a = b *operation* c vs a = a *operation* b for SSE2.
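
Both throughput figures are easy to reproduce; the formula just multiplies the stated factors together. A quick check of the article's number and the correction:

```python
def peak_gops(ops_per_instr, instrs_per_clock, cores, ghz):
    """Peak throughput in giga-operations per second for a multicore CPU."""
    return ops_per_instr * instrs_per_clock * cores * ghz

# The article's figure: 8 "instructions" x 4 cores x 3.4 GHz
print(round(peak_gops(8, 1, 4, 3.4), 1))   # 108.8
# The corrected figure: 4 int ops/instr x 3 instrs/clock x 4 cores x 3.4 GHz
print(round(peak_gops(4, 3, 4, 3.4), 1))   # 163.2
```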

July 18, 2011 | 03:10 AM - Posted by Anonymous (not verified)

Great articles guys. thanks.

July 19, 2011 | 11:36 AM - Posted by Anonymous (not verified)

You DO know that there are single-slot water-cooled Radeon HD 6990 monstrosities out there, don't you? 8 of these in a single system *head spins*...

July 19, 2011 | 04:52 PM - Posted by Anonymous (not verified)

This whole thing sounds kind of stupid.

July 21, 2011 | 09:06 PM - Posted by Bryan Clanton (not verified)

Who is this IDIOT slamming bit coins? Moron, the US government has nothing to do with the Federal Reserve Bank. It is owned by private individuals. Do your homework before your next show-n-tell.


July 23, 2011 | 12:48 AM - Posted by Jimmy (not verified)

Now I know why I can't find another 5850 to run Crossfire with. The ones I do find are way overpriced now.

July 24, 2011 | 03:45 AM - Posted by Escalus (not verified)

I'm looking at testing out a very simple mining rig. If I get a Radeon 6XXX series GPU, would it make sense to use it on a Core 2 Duo system? Would the CPU be a bottleneck?

July 27, 2011 | 07:04 AM - Posted by Those Who Get it (not verified)

So you're telling me you put a virus on your computer that helps criminals launder money,
you let it operate through your GPU because there is no security there,
and you spend hundreds on hardware and power for an experiment in social engineering?
Then you think that because 3 places are taking the hype of the bitcoin as a COUPON to sell you shit at 3 times the normal cost, the bitcoin is therefore a currency?

What do you think your GPU is really processing?
Or does anyone think?

July 27, 2011 | 07:17 AM - Posted by Those Who Get it (not verified)

You just don't get it,
the GPU is processing YOU!
It is internally, cyclically, redundantly pre-processing your own non-transactions into a multilevel advertising, purchasing and marketing scheme.
If they do not enable the user with a journey, then there is no game to be played. There is no correlation to alternative universal dimensional shifting of exchange goods in virtuality, when there still is nothing but virtuality in existence.
How do you perceive that something exists when one person tells you that it exists, and masses of people join that ONE person to confirm that it exists?
That is a singularity of the black hole variety.

October 12, 2011 | 08:54 AM - Posted by Anonymous (not verified)

Issue: a problem with GUIMiner and a dual-GPU card (PowerColor HD 6870 X2). After creating a new worker for the second GPU, it still doesn't work: 0 Mhash/s, while the first GPU runs at 304 Mhash/s with a 970 MHz clock, 60% fan speed, and a temperature of 74 degrees Celsius.
Flags: -v -w128 -a4. Does anybody know how to set this up correctly so that both GPUs work at the same time? Thank you for helping me out.

December 15, 2012 | 12:45 PM - Posted by Anonymous (not verified)

This is all far too complicated for me. Can't you have a simple link to click on that will look at one's machine and say "yes" or "no"?

February 21, 2013 | 10:46 AM - Posted by Anonymous (not verified)

How do I buy the beast? Then after I buy, I would probably find a better more efficient system where?

July 14, 2013 | 11:40 PM - Posted by whitesites (not verified)

You guys really need to rerun this test using BitMinter. I have the GTX 560 Ti and I am getting 138 Mhash/s with that card. I would also really like to see how the new ATI 7xxx series cards perform.

November 6, 2013 | 10:24 PM - Posted by Anonymous (not verified)

Is your 560 Ti a 448-core version or not? Also, are you running only one card, or do you have SLI, tri-SLI, or tri-SLI with a fourth dedicated PhysX card?

kinda wonder what my ole x58 rig might pull off with its i7 980X cpu
3x GTX 560Ti 448 core GPU's
lowly 9600GT that is just in there as a dedicated PhysX processor

course I built it with a 1600 watt psu and the system doesn't pull enough juice to really even warm the power supply up LOL


November 22, 2013 | 04:27 PM - Posted by Ralph (not verified)

Has anyone considered solar power for the electricity? It would take longer to recoup costs, but has anyone done the analysis?
Ralph

December 26, 2013 | 06:47 AM - Posted by Anonymous (not verified)

Hi,
I want to have an account please.
Where do I download the software?
