Ever since Bitcoin’s inception, its trust-minimizing consensus has been enabled by its proof-of-work algorithm. The work is performed by so-called miners: power-hungry machines that produce as many hashes per second as possible, hoping to create a valid new block of transactions for the blockchain. In return, miners that produce a valid block are rewarded with a fixed amount of coins, along with the transaction fees belonging to the processed transactions.
Nowadays, the miners in the Bitcoin network are producing a whopping 3.6 exahashes per second (one exahash equals a quintillion hashes). This has raised many concerns over the sustainability of the protocol, as Bitcoin can only handle three or four transactions per second at best. By comparison, VISA has a peak capacity of 56,000 transactions per second, but requires nowhere near the same computational power. It’s thus quite obvious that Bitcoin must be many times more energy intensive per transaction than VISA.
The question “how much more?” is a lot harder to answer. The total network hashrate doesn’t easily translate to a certain electricity consumption, as there is no central register of all active machines (and their exact power consumption). In the past, electricity consumption estimates typically included an assumption on which machines were still active and how they were distributed, in order to arrive at a certain number of watts consumed per gigahash per second (GH/s). This figure could then be used to arrive at an electricity consumption estimate, based on the observed hashrate. This arbitrary approach has led to a wide set of electricity consumption estimates that strongly deviate from one another, sometimes with disregard for the economic consequences of the chosen parameters.
Update: The methods discussed in this article hereafter have been anchored in peer-reviewed academic literature. The full paper can be found here.
One of the first papers that tried a different approach was titled “A Cost of Production Model for Bitcoin” (February 2015) written by Adam Hayes (The New School for Social Research). In his paper Hayes assesses that Bitcoin production “seems to resemble a competitive market” and therefore “miners will [in theory] produce until their marginal costs equal their marginal product”.
These marginal costs are almost entirely due to electricity costs (plus some negligible variable costs). For the observable hashrate, at least, other costs (such as the original purchase of the mining equipment) can be ignored. Hayes also briefly explains this as “each unit of mining effort has a fixed sunk [unrecoverable] cost involved in the purchase, transportation and installation of the mining hardware. It also has a variable, or ongoing cost which is the direct expense of electricity consumption”. In other words, a miner’s decision to mine doesn’t depend on how much was paid for the equipment, just on the future (electricity) costs. (Note that this model is different when it comes to deciding whether to build/buy new mining hardware.)
Since the marginal product of mining is equal to the number of Bitcoins received per unit of mining effort, it would thus be expected that miners will either add more hashrate if the resulting revenue exceeds the associated electricity costs, or reduce the hashrate once electricity costs start exceeding the revenue per hash. This also means that the total network of Bitcoin miners is expected to always be mining at the calculable break-even efficiency. The break-even efficiency for Bitcoin mining can simply be calculated as:
W per GH/s = 1,000 ∙ (BTC price ∙ BTC mined per day per GH/s) / (price per kWh ∙ 24 hr per day)
The resulting efficiency provides a very useful and objective indicator for miners on whether they should undertake or give up mining.
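The break-even calculation above can be sketched in a few lines of code. The numbers plugged in at the end are purely hypothetical illustrations (a $1,200 Bitcoin price, 10⁻⁸ BTC earned per day per GH/s, and 5 cents per kWh), not figures taken from the article:

```python
def break_even_efficiency(btc_price, btc_per_day_per_ghs, usd_per_kwh):
    """Break-even mining efficiency in W per GH/s.

    Daily revenue per GH/s (in USD) divided by the cost of running
    one kW for a full day, then converted from kW to W.
    """
    daily_revenue = btc_price * btc_per_day_per_ghs   # USD per GH/s per day
    kw_per_ghs = daily_revenue / (usd_per_kwh * 24)   # kW per GH/s at break-even
    return kw_per_ghs * 1000                          # W per GH/s

# Hypothetical inputs: $1,200/BTC, 1e-8 BTC/day per GH/s, $0.05/kWh
print(break_even_efficiency(1200, 1e-8, 0.05))  # → 0.01 W per GH/s
```

Any miner running at a worse efficiency (more W per GH/s) than this break-even value is losing money on electricity alone and should, in theory, shut down.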
Unfortunately, the world of Bitcoin mining may be less competitive than it initially seems, as a result of some serious entry barriers. For example, costs per kWh may differ strongly per region, and large-scale mining operations (requiring significant startup capital) will typically outperform smaller ones due to economies of scale. Additionally, variation in the actual average costs of mining may occur due to price behavior. When the price of Bitcoin falls, mining revenues are squeezed while the associated costs remain the same, causing miners’ average costs to go up. Likewise, a (strongly) increasing price might mean mining requires some time to “catch up”. This, however, doesn’t change the fact that the calculable break-even efficiency still provides an objective upper bound on Bitcoin’s total electricity consumption. Using this number alone can already provide some powerful insights.
To get an overview of the historic break-even efficiency, it is first required to know past revenues available to miners. Since the block rewards are known, we only need to add information about the price level. Doing so reveals that Bitcoin miners are earning far less (in total) today than they were during the peak near the end of 2013, as shown in the image below. This is partially due to a drop in the Bitcoin price, as well as the block rewards being halved in July 2016 (hence the sharp decline at this point in time).
The loss of revenue certainly hasn’t prevented Bitcoin’s hashrate from growing exponentially to its current all-time high. This is the first sign that the efficiency of Bitcoin mining must have increased dramatically over time.
If the average price of electricity were a fixed number, it would be easy to derive the break-even efficiency based on the total revenue numbers. It is, however, most likely that the average price paid by Bitcoin miners per kilowatt-hour (kWh) has decreased over time. In 2013 it was still possible to mine with a GPU at home. With the start of the age of application-specific integrated circuit (ASIC) mining, this started to change. Increasing startup capital was required to benefit from economies of scale, in order to be competitive in an increasingly centralized environment. These industrial-scale mining operations are unlikely to be paying more than 5 cents per kWh, while residential rates may easily exceed 12 cents per kWh. For this article, it is assumed that the average price paid per kWh decreased from 12 to 5 cents during the transition period in 2014. This causes a graph of the historic break-even efficiency (up until the end of 2014) to look as follows:
This graph shows that Bitcoin mining was once perfectly profitable with a GPU miner at home, as the break-even efficiency exceeded 1,000 J/GH. The total network hashrate has increased by more than a factor of 100,000 since, as shown in the first graph. It also shows how the introduction of ASIC miners has rapidly pushed the break-even mining efficiency ever closer to zero watts per GH/s. This trend has continued over recent years (shown hereafter), albeit at a slower pace.
Increasing total electricity consumption
Combining the historic break-even mining efficiency with the historic network hashrate also gives us an overview of the historic upper bound on the total electricity consumption of the Bitcoin network.
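This combination is straightforward to compute. In the sketch below, the 0.5 W per GH/s break-even value is an assumed illustration (chosen to be consistent with the roughly 16 TWh upper bound mentioned later for a 3.6 EH/s network), not a figure stated in the article:

```python
def upper_bound_twh_per_year(break_even_w_per_ghs, hashrate_ghs):
    """Upper bound on annual consumption: break-even efficiency × hashrate."""
    watts = break_even_w_per_ghs * hashrate_ghs  # total network draw in W
    return watts * 8760 / 1e12                   # W × hours per year → TWh/year

# Assumed 0.5 W/GH/s break-even at a hashrate of 3.6e9 GH/s (3.6 EH/s)
print(round(upper_bound_twh_per_year(0.5, 3.6e9), 1))  # → 15.8 TWh/year
```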
This graph shows that the derived upper bound for Bitcoin’s electricity consumption is trending up despite the halving event in 2016. This is remarkable considering mining revenues are far from their all-time high, as shown earlier. The trend can be explained by the drop in the average price paid per kWh. As a result of mining happening on an industrial rather than a residential scale, the revenues from mining can now pay for more kilowatt-hours than ever before.
Economic theory is extremely useful for finding an upper bound on Bitcoin’s total electricity consumption, but a lower bound is more difficult to estimate. This number is frequently determined by taking the efficiency of the best available miner and applying this to the network hashrate. Since the total network hashrate was about 3.6 exahashes per second by the end of March 2017, we would have to multiply this by 0.10 J/GH (required for the Antminer S9) to find a lower bound. We would then find an electricity consumption of almost 3 TWh per year.
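Using the numbers from the article (3.6 EH/s and the Antminer S9’s 0.10 J/GH), the lower-bound arithmetic works out as follows:

```python
HASHRATE_GHS = 3.6e9   # 3.6 EH/s, end of March 2017
S9_EFFICIENCY = 0.10   # J/GH (equivalently W per GH/s) for the Antminer S9

watts = HASHRATE_GHS * S9_EFFICIENCY  # 360 MW total network draw
twh_per_year = watts * 8760 / 1e12    # × hours per year, Wh → TWh
print(round(twh_per_year, 2))  # → 3.15
```

This reproduces the “almost 3 TWh per year” figure quoted above.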
This approach is obviously flawed, as it ignores the many older machines that are still profitable. In reality, new machines will slowly make older ones obsolete, starting with the least efficient ones. After all, based on economic theory, they can be expected to run for as long as their electricity costs remain below the revenues they produce.
Even if this weren’t the case, and we could safely assume all machines in the network were the most efficient ones, the minimum electricity consumption would remain a purely theoretical number. A detailed examination of a real-world mine, running only the most efficient mining machines, shows us that the actual electricity consumption can still be 70% above the theoretical optimum in this case. This is due to the fact that relevant factors such as machine reliability, climate and cooling costs aren’t part of the approach. But at least the number guarantees that actual electricity consumption won’t be any lower.
This leaves the question of how much electricity the total Bitcoin network is consuming exactly. By the end of March 2017, it could be anywhere between 3 and 16 terawatt-hours per year. The latter would mean Bitcoin’s electricity consumption equals that of Croatia, while the former would place Bitcoin closer to Jamaica. Objectively, electricity consumption should be closer to the upper bound (based on the break-even mining efficiency), as any less would indicate miners could still profit from adding more hashrate. At the very least the network should therefore be trending towards the break-even point.
Regardless of the exact electricity consumption, we have observed that this number is on the rise due to miners squeezing more kilowatt-hours out of the same (and even decreasing) amount of revenue. At the same time, the price of Bitcoin is back at record highs. As a result, the current lower bound on Bitcoin’s electricity consumption now exceeds the upper bound from the first half of 2015, despite the block reward halving in 2016. With the Bitcoin network processing 300,000 transactions per day, this means electricity consumption per single transaction equals 55 kilowatt-hours even in the most optimistic case. This is enough to power a single U.S. household for two whole days. And since a new block reward halving event is years away from now, Bitcoin isn’t likely to become more sustainable any time soon.
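The per-transaction arithmetic can be sketched as follows. Note that the ~6 TWh/year starting point below is an assumption chosen to be consistent with the 55 kWh figure quoted above (the current lower bound at the time of writing), not a number stated explicitly in the article:

```python
# Assumed current lower bound of ~6 TWh/year (annual consumption in Wh)
ANNUAL_WH = 6e12
TX_PER_DAY = 300_000  # daily transaction throughput from the article

# Energy per transaction: annual consumption over annual transactions
kwh_per_tx = ANNUAL_WH / (TX_PER_DAY * 365) / 1000
print(round(kwh_per_tx))  # → 55
```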
Live Energy Consumption Index
A daily estimate of Bitcoin’s energy consumption is provided by the Bitcoin Energy Consumption Index. This index is built on the economic theory discussed previously, but has a few specific assumptions that are put in place to produce the “most likely” day-to-day estimate.
The total costs for miners can be decomposed into operational costs (mainly electricity) and capital equipment costs. The first assumption of the index is that miners will ultimately spend a maximum of 60% of these total costs on their operational costs. (The effective percentage will be lower than this number when the price is rising, due to the lag introduced in the next section.) The chosen percentage ensures that the production costs of ASIC miners are always covered, as can be shown with the help of Bitmain’s Antminer S9.
The Antminer S9 is estimated to cost about $500 to produce. Based on the performance of both the older Antminer S5 and Antminer S7 it can be expected that the average lifetime (from the production phase up until the moment profitability falls below the break-even point) of the Antminer S9 will be close to two years (exceeding at least 450 days).
Given an expected lifetime for the Antminer S9 it’s easy to determine how much electricity the machine will consume during that lifetime (knowing each Antminer S9 runs at about 1380W). At a rate of 5 cents per kilowatt-hour this yields the following estimate for the lifetime costs of an Antminer S9:
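The lifetime cost estimate can be reproduced directly from the numbers given ($500 production cost, 1,380 W, 5 cents per kWh). The 700-day input below is an assumption that reproduces the article’s roughly-two-year figure; 450 days is the pessimistic lifetime mentioned earlier:

```python
PRODUCTION_COST = 500  # USD, estimated production cost of an Antminer S9
POWER_KW = 1.38        # the S9 runs at about 1380 W
USD_PER_KWH = 0.05     # assumed industrial electricity rate

def lifetime_cost(days):
    """Total cost over a machine's lifetime: production plus electricity."""
    electricity = POWER_KW * 24 * days * USD_PER_KWH  # kW × hours × rate
    return PRODUCTION_COST + electricity

print(round(lifetime_cost(450)))  # → 1245 (pessimistic 450-day lifetime)
print(round(lifetime_cost(700)))  # → 1659 (close to two years)
```

Note that in the pessimistic 450-day case, electricity ($745) makes up almost exactly 60% of the $1,245 total, which is where the index’s 60% operational cost assumption comes from.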
A decision to add more hashrate (via newly built Antminer S9 machines) will thus be based on an expected marginal cost of $1,659 per machine (or at least $1,245 in a pessimistic scenario). Similar to before, it is expected that more of these machines will be added until the marginal costs equal the marginal revenue. Once the network reaches an equilibrium, the average electricity costs of the network would then have to equal at least 60 percent of the total costs. This finding is in line with Croman et al. (2016), who find the following:
“Our calculation suggests that, at the maximum throughput, the cost per confirmed transaction is $1.4 – $2.9, where 57% is electricity consumption for mining.”
After finding the operational costs of mining, it is required to know how much is being spent per kilowatt-hour (kWh) in order to arrive at an energy consumption estimate. For industrial-scale miners this price is limited, although it still differs significantly per country. It can be as low as 4 cents per kWh in some Chinese regions, as confirmed by several professional miners. For example, a visit to Bitmain’s mining facility in the city of Ordos in August 2017 helped reveal that the facility was paying just 4 cents per kWh. Another example is ViaBTC, which offers cloud mining contracts that include electricity costs of 0.35 CNY per kWh (per November 2016). This translates to roughly 5 cents per kWh in USD, and likely includes some other expenses since it is a cloud mining platform. This would be in line with a general rule of thumb to add 1-2 cents per kWh to cover other costs such as labor (in China, that is). It is well known that the majority of Bitcoin mining takes place in China, so the index also runs on the assumption that 1 kWh is paid for with every 5 cents spent on operational costs.
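Putting the two assumptions together (60% of total miner income goes to electricity, paid at 5 cents per kWh) yields the index’s core day-to-day estimate. The $7,000,000 daily revenue in the example is a hypothetical input, not a figure from the article:

```python
OPEX_SHARE = 0.60    # share of total miner costs assumed spent on electricity
USD_PER_KWH = 0.05   # assumed average price paid per kWh

def daily_kwh_estimate(daily_miner_revenue_usd):
    """Estimated daily electricity consumption under the index's assumptions."""
    electricity_budget = daily_miner_revenue_usd * OPEX_SHARE  # USD/day
    return electricity_budget / USD_PER_KWH                    # kWh/day

# Hypothetical: $7,000,000 of daily miner revenue
print(round(daily_kwh_estimate(7_000_000)))  # → 84000000 (kWh per day)
```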
By design, the Bitcoin Energy Consumption Index is strongly linked to the Bitcoin price (after correcting for fees and average time between blocks), but under certain circumstances this may cause a discrepancy with the actual mining hardware in the network. Intuitively, it is to be expected that an increase in available miner income is followed by more miners joining the network to take advantage of it. It might, however, take some time before these machines are up and running. This typically happens when price increases exceed the expectations miners may already have had regarding the future price of Bitcoin, or simply when production lines are already running at full capacity.
The time it takes for miners to fully respond to a changing situation is therefore dynamic as well. This is handled via an approach heavily inspired by the mean-reverting generalized autoregressive conditional heteroskedasticity (GARCH) models commonly used in finance (the actual method itself is too complicated to be handled by the tools used in making the index), linking the resulting volatility to the index’s time window. In the end, responding slowly to an increasing price represents an opportunity cost for miners, so the gap is expected to persist for only the minimum possible amount of time.
Lastly, one may always reach different conclusions when applying different assumptions. The assumptions here have been chosen in such a way that they can be considered both intuitive and conservative, based on information from actual mining operations. In the end, the goal of the index is not to produce a perfect estimate, but to produce an economically credible day-to-day estimate that is more accurate and robust than an estimate based on the efficiency of a selection of mining machines.