The State of Mining Today


#1

@Tracy encouraged me to start a thread and share some mining knowledge.

About me: I’m an ex-sysadmin turned full-time miner/crypto trader. Crypto mining has been a passion project of mine for years, and I’ve been doing it full time for about six months now. I’ve done my homework on mining and am more than happy to be corrected/updated.

Ok, so, mining in 2017.

In my opinion, CPU mining is done. While some currencies (namely Monero) can be mined at a profit with a CPU, the power/cost ratio doesn’t make sense to me, and the heat output and wear on your equipment don’t seem worth it at all.

GPU mining. AMD has (for now) been dethroned as the best. NVIDIA 1070s and 1080 Tis are the fastest mining cards on the market for many currencies (always note that each card performs a little differently per currency; unique “profit niches” are not uncommon, so do your homework). The new AMD Vega 64 is rumored to be the new king of all cards, but I haven’t seen proof yet. NVIDIA is releasing the P106, which is a mining-specific 1060. The ROI on these things is going to be INSANE.

ASICs
These have killed the mining markets for many currencies, while for others (I think Bitcoin is the only example of this) they have guaranteed long-term viability and success. What is an ASIC? It’s purpose-built hardware that mines a specific algorithm, or set of algorithms, far faster than anything else, but it cannot be reconfigured. A Bitcoin ASIC can only mine coins that use Bitcoin’s algorithm; it is completely useless for mining, say, Ethereum.

ASICs are very, very powerful, but the market is also quite saturated. In most cases you need absolutely amazing power costs and low rent to significantly profit from them.

Mining in general has been on the decline of late, but the market is still going strong and is very profitable. There are numerous hardware setups you can buy that still pay for themselves in less than six months. Feel free to reply with any questions or to tell me how wrong I am. Sorry if this is incredibly general; reply if you have a specific question!


#2

Thanks for your post, it’s really helpful!

Noob here with some questions! :joy:

It seems that ASICs are much faster than GPUs, am I right? How do they compare?

Also, if the market is quite saturated for ASICs, why is it not saturated for GPUs, since the objective is the same?

Lastly, how does mining compare to trading? If it’s not possible to answer generally, how have the two compared historically?

Thanks in advance!


#3

Thank you for sharing your knowledge. Nice post👍🏻


#4

Personally, I think the P106 will be useless.

Compared to a 1060: the price is the same and the hashrate is the same, but the warranty is only 90 days! And you cannot resell it to gamers, unlike regular GPUs.

So stay away from it unless NVIDIA drastically changes its model.


#5

The L3 ASIC miner is a beast… even at 800W it’s giving you something like a 450-500 MH/s hash rate.

The problem is, it’s algorithm-specific, so if the difficulty increases and the profits go away, it becomes a doorstop… but it would have made you tons of money before that, so it’s still a good investment at this time, IMO.

With graphics cards, you’re not locked into one algorithm. The hashrates aren’t as good, even with four in one rig… BUT once the difficulty gets too high, you can switch to mining another coin, or sell your card to a gamer and dump that money directly into the coin.

If I were able to get an L3 miner today, I’d buy one. I couldn’t, so I went with the Vega and some 480s.


#6

Something regarding the Vega hype:

Personally, I’d invest in 1070s if you’re getting into mining right now.


#7

Thanks so much for your knowledge. This is priceless!!! Keep posting please!!!


#8

That’s what we are doing.


#9

Can anyone post real stats on a Vega 64? I haven’t seen anything I trust yet, but I’m kind of expecting them to be a dud on ROI. After the Claymore 9.7 update, 1070s have been the dominant beast on the market for ROI.


#10

@peter

When you say “that’s what we are doing,” are you referring to the L3 ASIC miner, or to using GPUs and reselling them to gamers once the difficulty spikes?


#12

Based on a past video of his, I’m pretty sure Peter is running ASUS 1070 ROG Strix cards (hopefully he got the O8G model and not the plain ol’ 8G).

Not sure if that’s changed since then, just what I remember.


#13

So, with respect to the state of mining today: I’m both a software and hardware engineer, so I know both sides of the computing coin very well. I’ve done enough research to conclude that CPU-based mining is pretty much done. It’s either GPU or ASIC hardware.

Long story short: I almost dropped $5k into hardware purchases about 3 weeks ago (mid July 2017). I went so far as to add all the components to a shopping cart and get to the step of keying in my credit card info.

However, two things caused me to simply buy and hold bitcoins with that $5k instead:

  1. So-called profitability calculators. Before I hit that send button, I thought to myself, “surely someone has figured out the full cost of owning and running this equipment and reported their P/L online somewhere.” That led me to 5 or 6 different websites offering profitability calculators. I cannot determine which ones are trustworthy, as I have no prior hands-on hardware experience myself. One calculator says I’ll make about $400 or $500 a month. Another calculator with the same parameters filled in says I’ll lose $150 a month. The only ones to show profitability were for Scrypt-based coins. SHA-256 (i.e. BTC) has been effectively declared non-profitable by almost every article I could find, and the calculators backed that up. (A rough sketch of what these calculators are presumably computing is below this list.)

  2. Having decided I wanted to mine Scrypt coins and not Bitcoin (SHA-256), I could not decide at the time whether rigging up a GPU array was the way to go or an ASIC was the way to go. If I understand the way the algo works correctly, Scrypt mining is much less optimizeable (yup, made that word up) via GPU than SHA-256, and it’s more a factor of how much RAM you can throw at it. At least, that’s based on everything I read about the algo itself; yet actual miners didn’t really seem to talk much about the RAM they’re throwing at the problem, nor whether they were also throwing high-end CPUs plus high-end GPUs at the algo. Logic dictates, at least to me, that I should throw all three at the algo in equal amounts.
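Here’s the bare-bones arithmetic I’d expect any honest calculator to be doing. A minimal sketch only, and every number in it is a made-up placeholder for an imaginary rig, not a quote for real hardware:

```python
# Bare-bones monthly P/L estimate. Every number here is a placeholder
# assumption for an imaginary rig, not a quote for real hardware.

coins_per_month = 0.5     # whatever the rig is expected to mine (assumed)
coin_price_usd = 250.0    # exchange rate at payout time (assumed)
rig_watts = 1000.0        # wall draw of the whole rig (assumed)
usd_per_kwh = 0.12        # your electricity price (assumed)

hours_per_month = 24 * 30
revenue = coins_per_month * coin_price_usd
power_cost = (rig_watts / 1000.0) * hours_per_month * usd_per_kwh

print(f"revenue  ${revenue:,.2f}")
print(f"power    ${power_cost:,.2f}")
# Ignores rent, cooling, pool fees, and hardware depreciation.
print(f"profit   ${revenue - power_cost:,.2f}")
```

If two calculators disagree that badly with the same parameters, they must be burying different assumptions in those last few inputs.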

So… standing at an impasse, I looked and saw BTC had climbed $300 in the two days I’d been heavily researching all this, and said: screw this, I’m buying in now and I’ll figure the mining part out later.

Now, let’s figure the mining profitability thing out. First things first: are any of the profitability calculators floating around online any good? As near as I can tell, the most hyped ones are written and backed by the very sellers building and selling ASIC hardware, which automatically makes me suspicious of the integrity of the numbers they report.

Has anyone published a set of formulas anyone can load up in an Excel spreadsheet and fully follow the logic and math that leads to the reported bottom line?


#14

Ok, back of the napkin here, but I was really surprised how accurately this matches my payouts:

Firstly, this is one of the only trusted numbers you need, the total network hashrate: https://etherscan.io/chart/hashrate

Today that’s 82,477 Gh/s

Ok so my miners are running at about 1.1 Gh/s https://eth.nanopool.org/account/0xF00eE7A27c593B2b977ff00734D06c8c0E014a7d

The average Ethereum block time right now is 22 seconds (I’m using 20 for easier math).

5 Ether mined per block plus uncle payouts (not included)

5 Ether every 20 seconds

5 (Ether per block) x 3 (blocks per minute, at 20 seconds each) x 60 (minutes) x 24 (hours) x 30 (days) = 1 month of Ethereum payouts to the whole network
648,000 Ether produced a month

Ok, so back to that network number. My miners (1.1 Gh/s) divided by 82,477 Gh/s is my share of the network hashrate.

Ratio * 648,000 = your Ether per month

Sorry if that’s unclear, I’ve been up all night dealing with a frozen condenser on an AC that overheated the whole house, I’m sooooo sleepy.
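If it’s easier to follow as code, here’s the exact same back-of-the-napkin math as a few lines of Python, using the numbers above:

```python
# Same back-of-the-napkin Ethereum estimate as above, in Python.
# Uncle rewards and pool fees are ignored.

my_hashrate = 1.1          # GH/s (my miners, from the nanopool link)
network_hashrate = 82_477  # GH/s (from https://etherscan.io/chart/hashrate today)
block_time = 20            # seconds (rounded down from ~22 for easier math)
block_reward = 5           # ETH per block

blocks_per_month = (60 / block_time) * 60 * 24 * 30      # ~129,600 blocks
network_eth_per_month = blocks_per_month * block_reward  # ~648,000 ETH

my_share = my_hashrate / network_hashrate
print(my_share * network_eth_per_month)  # ~8.6 ETH/month for these rigs
```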


#15

Here’s a quick formula for you to use in a spreadsheet:

Your mining speed / total network mining power (https://etherscan.io/chart/hashrate) = your share of the network.

Calculate the total Ethereum issued per month by checking the average block time.

5 ETH per block, plus occasional uncles. At a 20-second block time, that’s 648,000 ETH a month.

Multiply your share by the total Ethereum issued in a month to get your monthly payout.


#16

Yes, ASICs are far superior to GPUs in every way. They don’t tend to compete with GPUs, however. A given currency is mined with either ASICs or GPUs, not both; once ASICs show up, GPUs go out the window within weeks for that algorithm.

When you hear the conversation about competitive ASIC mining, that pertains to pretty much just BTC. In that case, it’s a matter of power cost more than anything. Those with the best power costs keep increasing the difficulty (buying more hardware, even at a loss) to drown out the competition. We haven’t seen the same arms race in other currencies yet.

GPUs haven’t saturated the market yet because the hardware isn’t dense enough to be massively profitable at enterprise scale, so you don’t see the same level of investment as with ASICs. Also, GPU manufacturers are constantly running out of supply, HOLY CRAAAAAP.

All my opinion, I don’t have much data to back any of that up off hand.

Historically speaking, mining is better in the long run; however, it’s increasingly becoming a gamble to get into that game now, as the returns are diminishing. I should say that daily swing trading is the single most profitable thing you can do, but risk is a bitch. Trading less, diversifying, and going in for the long haul when necessary is a far better approach, I believe, because you really reduce your overall risk.

Mining on any kind of large scale, though, is difficult and requires knowledge, know-how, and dedication (and a loving girlfriend who tolerates the power cords… oh, the power cords…).


#17

Speaking to ASIC vs GPU, is my understanding correct that the scrypt algo was designed to put the balance of the work onto RAM so it isn’t easily implemented in ASIC hardware? Meaning there’s a good chance it’ll be a long while before an ASIC shows up that blows away current commodity rigs built from motherboards, risers, and multiple GPUs per motherboard.


#18

You’re referring to ASIC resistance, and yes, there are a few algorithms designed to shut out ASICs. ETH is actually one of them as well (Dagger-Hashimoto). I’m not too well versed on that subject, to be honest.

The reason I see GPU mining having a pretty limited shelf life is that, as markets adopt currencies more and more, they’re really going to look at energy usage. The green initiative going on globally will hammer down on crypto mining as it gets more popular and more commonly understood. The amount of power that goes into mining a single block is ludicrous. When we get to the point that every American under 30 has some form of crypto wallet (maybe 10 years from now, hell, maybe 5), GPU mining will be gone or on the fringes, I think. Once a few cases of PoS are successfully demonstrated, mining as a whole may have a target on its back.


#19

Vega 64 hashrates from two different people
https://postimg.org/image/lzm14hg8r/
https://postimg.org/image/5xrrzxw2h/


#20

So in a spreadsheet…your numbers come out something like this…

[spreadsheet screenshot]

I get 604,800 ETH per month simply because I did 4 x weekly instead of the 30 days your path took.

To me, the “magic number” seems to be the block discovery frequency. Is that number computable ahead of time, based on hash difficulty and the expected processing power of the hardware I’m purchasing?

Secondly, I wasn’t sure how 604k ETH mined translated into a payout of 8.23 ETH each month. That seems to basically say that all the coins from all the miners are thrown into a pot, and it’s scaled down by some sort of payout/discovery ratio.
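Here’s my mental model of that scaling in code form, using the numbers above, in case someone can confirm whether I have it right:

```python
# My mental model of the payout math, using the spreadsheet numbers above.
# Please correct me if I have this wrong.

my_hashrate = 1.1          # GH/s
network_hashrate = 82_477  # GH/s
block_time = 20            # seconds
network_eth_per_month = 604_800  # my 4-weeks-per-month figure

# Solo mining: on average you'd find one block every
# block_time * (network_hashrate / my_hashrate) seconds.
solo_seconds_per_block = block_time * network_hashrate / my_hashrate
print(solo_seconds_per_block / 86_400)  # ~17 days between blocks, on average

# Pool mining: everyone's blocks go into the pot and the pot is paid out
# in proportion to hashrate, so the expected payout is the same either way:
print(my_hashrate / network_hashrate * network_eth_per_month)  # ~8.1 ETH/month
```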


#21

That’s hilarious:

CALLLLLLEEEED ITTTTTT