Since its introduction in 2008, rising Bitcoin prices and a booming number of other cryptocurrencies have led to a growing discussion of how much energy the production of these currencies consumes. As Bitcoin is the most valuable and most popular cryptocurrency, both the business world and the research community have begun to question the energy intensity of Bitcoin mining. This paper focuses solely on the computational power demand of the proof-of-work process rather than estimating the full energy intensity of mining. We make use of 160 GB of Bitcoin blockchain data to estimate the energy consumption and power demand of Bitcoin mining, considering the performance of 269 different hardware models (CPU, GPU, FPGA, and ASIC). For the estimations, we define two metrics: minimum consumption and maximum consumption. The analysis covers the period from 3 January 2009 to 5 June 2018. We show that the historical peak in the power consumption of Bitcoin mining occurred during the bi-weekly period commencing on 18 December 2017, with a demand of between 1.3 and 14.8 GW. This maximum demand figure lies between the installed capacities of Finland (~16 GW) and Denmark (~14 GW). We also show that, during June 2018, the energy consumption of Bitcoin mining, estimated from difficulty recalculation, was between 15.47 and 50.24 TWh per year.
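The estimation approach summarized above can be sketched as follows: Bitcoin's difficulty implies a network hash rate (a miner needs on average difficulty × 2^32 hashes per block, with one block targeted every 600 seconds), and multiplying that hash rate by the energy efficiency of the most and least efficient hardware in the considered set yields lower and upper power bounds. The snippet below is a minimal illustration of this idea, not the paper's full method; the difficulty value and the two efficiency figures (in joules per hash) are hypothetical stand-ins for the 269 hardware models actually analyzed.

```python
# Sketch of difficulty-based power estimation (illustrative values only).

TARGET_BLOCK_TIME = 600  # seconds between Bitcoin blocks, by protocol design


def network_hashrate(difficulty, block_time=TARGET_BLOCK_TIME):
    """Implied network hash rate in hashes per second.

    Finding a block requires on average difficulty * 2**32 hashes,
    so the whole network performs that many hashes per block_time.
    """
    return difficulty * 2**32 / block_time


def power_bounds(difficulty, best_j_per_hash, worst_j_per_hash):
    """Lower/upper power demand in watts, assuming all miners run the
    most (resp. least) efficient hardware in the considered set."""
    h = network_hashrate(difficulty)
    return h * best_j_per_hash, h * worst_j_per_hash


# Hypothetical inputs: a difficulty near the late-2017 peak and assumed
# hardware efficiencies of 0.1 and 1.0 nJ per hash.
lo, hi = power_bounds(1.9e12, 0.1e-9, 1.0e-9)
print(f"{lo / 1e9:.2f} GW to {hi / 1e9:.2f} GW")  # prints "1.36 GW to 13.60 GW"
```

Multiplying such power bounds by the hours in a year converts them to annual energy consumption in watt-hours, which is how figures like the 15.47–50.24 TWh/year range are expressed.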