The replacement power supply -- the Z800's third one now -- arrived while we were on holiday, and so far seems to be running better. I decided to put it to the test by mining Dash and Ethereum in order to learn more about the mining process.
Running some numbers, one thing is clear: mining Dash at home with a GPU is a good way to lose money. In a little less than a week of mining I collected 0.39 DASH, which at current exchange rates is worth 93 cents. Given NYC electricity rates of 10 cents / kWh and an estimated 0.4 kW of power draw, running 24 hours a day for six days consumes $5.76 worth of electricity, for an operational loss of $4.83.
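Here's that arithmetic as a quick Python sketch; the 0.4 kW draw and the exchange-rate value are the estimates quoted above, not measured figures:

```python
# Back-of-envelope economics for the week of Dash mining described above.
dash_value_usd = 0.93    # value of the 0.39 DASH mined, at current exchange rates
power_kw = 0.4           # estimated average draw of the rig (an assumption)
rate_per_kwh = 0.10      # NYC electricity, dollars per kWh
hours = 24 * 6           # six days of round-the-clock mining

electricity = power_kw * hours * rate_per_kwh            # 0.4 * 144 * 0.10 = $5.76
print(f"electricity:      ${electricity:.2f}")
print(f"operational loss: ${electricity - dash_value_usd:.2f}")   # $4.83
```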
So why does the DASH mining pool have 2.75 GHash/second of capacity? Are all those miners losing money? Did they get a Brutalis and a small hydropower plant for Christmas?
A place to start is to ask whether things look any better if we slot in more GPUs. That could help, but we should be smart about how much we are paying for compute power.
Card | CUDA Cores | Price | Cost/CUDA Core | KH/sec | Cost/KH/sec |
---|---|---|---|---|---|
EVGA Titan X | 3072 | $999 | $0.33 | 12021 | $0.08 |
EVGA GTX 980 | 2048 | $503 | $0.25 | 9136 | $0.05 |
MSI GTX 980 Ti | 2816 | $649 | $0.23 | 11693 | $0.05 |
MSI GTX 750 Ti | 640 | $129 | $0.20 | 2993 | $0.04 |
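The per-unit columns are just the price divided through by cores and by hashrate; a small sketch that reproduces them from the listed specs:

```python
# Recompute the cost-per-core and cost-per-KH/sec columns from the table above.
# Prices and hashrates are the figures listed there.
cards = [
    # (name, CUDA cores, price in USD, KH/sec)
    ("EVGA Titan X",   3072, 999, 12021),
    ("EVGA GTX 980",   2048, 503,  9136),
    ("MSI GTX 980 Ti", 2816, 649, 11693),
    ("MSI GTX 750 Ti",  640, 129,  2993),
]

for name, cores, price, khs in cards:
    print(f"{name:15}  ${price / cores:.2f}/core  ${price / khs:.3f}/KH/sec")
```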
So there's an interesting result here. The best deal per KHash is the lowly 750 Ti. The problem is that to get the same total yield as a single 980 Ti you need four of them, which means multiple Z800s and more power. We probably want to make the most of our one box. For $1000 we can put in two GTX 980s and get 18272 KHash/sec. The 980 Ti would do more, but it's significantly more expensive, and the current models take up three slots: it's not going to fit into the Z800's case along with the RAID controller we installed previously.
Right now the rig mines $56.60 worth of DASH per year with 2993 KHash/sec.
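Projecting the upgrade is just a linear scaling, assuming difficulty and the DASH price hold still; a sketch:

```python
# Scale the current rig's annual yield up to the proposed dual GTX 980 setup,
# assuming DASH earnings scale linearly with hashrate at today's difficulty and price.
current_khs = 2993             # single 750 Ti
current_yield = 56.60          # dollars of DASH per year

upgraded_khs = 2 * 9136        # two GTX 980s: 18272 KH/sec
upgraded_yield = current_yield * upgraded_khs / current_khs
print(f"projected yield: ${upgraded_yield:.2f}/year")    # about $345
```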
The upgraded machine will yield about $345 per year, which seems like a much better deal, roughly a 30% return on our investment, until we remember that we haven't yet paid for electricity and that video cards are a depreciating asset: our $1000 is gone, and we're paying an additional $350 / year in electricity bills. Ouch.
For a benchmark, let's assume we just invested that $1000 in something yielding 2% / year with no appreciation. At the end of three years we'd have $1060. At the end of three years running our dual GTX 980 system, we'd have a $1015 loss, at least, assuming nothing has burned out and had to be replaced.
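Spelled out, under the simplifying assumptions that yield, difficulty, and the DASH price hold still and the cards are worth nothing after three years:

```python
# Three-year comparison: $1,000 parked at 2% vs. the dual GTX 980 rig.
principal = 1000
years = 3
savings = principal * (1 + 0.02 * years)          # $1,060, ignoring compounding

mined = 345 * years                               # $1,035 of DASH
extra_electricity = 350 * years                   # $1,050 of additional power
rig_net = mined - extra_electricity - principal   # cards treated as fully depreciated
print(f"savings account: {savings:.0f}")          # 1060
print(f"mining rig net:  {rig_net}")              # -1015, a $1,015 loss
```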
If you are already running a server 24x7 for something else that is going to make at least $1075, this might still make sense: the investment in the dual GTX 980s gets us just about to breakeven on an operational basis. But as an investment on its own it does not work, not unless you are speculating on cryptocurrency appreciation. I ran those numbers too, and we'd need to see 36.67% appreciation over three years before the investment starts paying off. That's not impossible -- we hit that level twice in 2015 -- but so far Bitcoin hasn't sustained it.