If you care about the environment even a little bit (like turning off lights in rooms you're not occupying), then you will reject Web3. Even the most efficient blockchains unnecessarily use more energy than the status quo.
This is also to say nothing of the fact that it's more expensive per KB transferred, slower, and more complicated.
I think what Web3 should be is a way to use your laptop or any commodity computer as infrastructure for your data, and there should be APIs for websites so that they use your computer as the source rather than their own servers.
For example, this comment could be saved on my computer but still be accessible to everyone viewing it, even if my computer is off, via caching; ultimately, though, I could invalidate and delete it.
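To make that concrete, here's a rough sketch in Python. Every name here is hypothetical, just to illustrate the shape of an owner-controlled origin with a site-side cache in front; it's not any real Web3 API:

    # Hypothetical sketch: the website reads my comment from my machine (the
    # "origin") and falls back to its cache when my machine is offline.
    # Invalidation is owner-initiated, which is what lets me delete the comment.

    class CommentStore:
        def __init__(self):
            self.cache = {}  # site-side cache: comment_id -> text

        def read(self, comment_id, origin_fetch):
            """origin_fetch(comment_id) reaches the owner's computer; may raise."""
            try:
                text = origin_fetch(comment_id)    # owner's machine is online
                self.cache[comment_id] = text      # refresh the cached copy
                return text
            except ConnectionError:
                return self.cache.get(comment_id)  # serve the stale copy, if any

        def invalidate(self, comment_id):
            """Owner asks the site to drop its cached copy (i.e. delete)."""
            self.cache.pop(comment_id, None)

Whether sites would actually honour the invalidation is a separate question; this is just the shape of the API I have in mind.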
> they use your computer as the source rather than their own servers
> this comment could be saved on my computer but still be accessible to everyone viewing it, even if my computer is off, via caching
It sounds to me like you're just renaming datacenters from "origin" to "cache", without any meaningful difference in how the data is stored and retrieved in practice.
What about proof-of-stake chains, do you reject those too? How many companies with huge carbon footprints do you reject? What about cows? Do you reject them too?
The public dismay at the carbon footprint of crypto is always fascinating to me. The network rewards are set up in such a way that the most profitable miners are the ones with the cheapest electricity, since that is their biggest overhead. This pushes miners to the cheapest forms of electricity, i.e. renewables.
> This pushes miners to the cheapest forms of electricity, i.e. renewables
I would love for this to be true (and that's why I used to believe it). But there are two problems with this:
• Renewables aren't the cheapest form of electricity; low-value (dirty-burning) or subsidised fossil fuels are cheaper in many places. You've heard of people buying and re-commissioning old coal power stations for crypto mining, I'm sure?
• Using any grid electricity drives up the price of other electricity, by market forces. The effect is local, but when cryptomining is happening globally, that's a global effect. That means that otherwise-infeasible inefficient (and polluting) electricity generation is now viable.
Greenest ≠ cheapest. If this were a universal truth, we wouldn't have a climate problem in the first place.
From https://www.globalpetrolprices.com/electricity_prices/ - the countries with the cheapest energy prices (and which show up in the hash-rate statistics) are those using fossil fuels, with governments likely subsidizing those prices to avoid civil unrest.
> If you care about the environment even a little bit (like turning off lights in rooms you're not occupying), then you will reject Web3. Even the most efficient blockchains unnecessarily use more energy than the status quo.
On an Intel NUC (Core i3, low power mode) I'm running a non-mining Ethereum 1 full node[1] plus a staking Ethereum 2 node[2] (comprising two active validators) on mainnet. Measured with a Kill A Watt[3] since genesis of the beacon chain, it's using approximately 140 kWh of electricity per year (about USD 15/year where I live), and it makes use of the Internet connection that I use for everything else, personal and work related. The Ethereum 1 node also acts as my personal gateway to Ethereum, rather than, say, my needing to connect through Infura.
There are today 279,235 active validators[4] on Ethereum's mainnet beacon chain. Now, I know that Ethereum hasn't made the switch to Proof of Stake yet (that's what Eth 2 is all about), but it's coming this year. Let's ignore the kWh usage of my non-mining full Eth 1 node and assume the 140 kWh is split evenly between the validators (it's not even close, the Eth 1 node is a pig in comparison, but for the sake of argument), then round each one up to 100 kWh per year and assume that's the average per validator going forward, and let's grow the beacon chain to 1 million active validators. That works out to 100k MWh per year. Amazon reported[5] that they consumed 24 million MWh in 2020.
I'm not sure how many combined MWh are consumed by the data centers for VISA, traditional banks, etc., but I'm guessing it's nothing to sneeze at. According to Statista[6], it costs about 150 kWh for VISA to process 100k transactions. According to VISA[7], they processed about 206 billion transactions over 12 months. So that's about 309k MWh per year.
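For anyone who wants to check my arithmetic, here it is as a quick Python back-of-the-envelope (the 1 million validators and 100 kWh/year per validator are the assumptions from above, not measurements):

    # Assumed future Eth 2 beacon chain
    validators = 1_000_000            # assumed validator count (today: ~279k)
    kwh_per_validator = 100           # rounded-up kWh/year per validator
    eth2_mwh = validators * kwh_per_validator / 1000
    print(f"Eth 2: {eth2_mwh:,.0f} MWh/year")     # 100,000 MWh/year

    # VISA, from the Statista and VISA figures above
    visa_tx = 206e9                   # transactions over 12 months
    kwh_per_100k_tx = 150
    visa_mwh = (visa_tx / 100_000) * kwh_per_100k_tx / 1000
    print(f"VISA: {visa_mwh:,.0f} MWh/year")      # ~309,000 MWh/year

    # Amazon, for scale (reported 2020 consumption)
    amazon_mwh = 24_000_000
    print(f"Amazon 2020: {amazon_mwh:,} MWh")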
A couple of other things to consider: Ethereum devs are concerned about energy consumption, and there are active efforts to drive down the energy cost per validator across the various client projects (Nimbus, Teku, etc.). Also, my Core i3 Intel NUC is pretty heavy-duty compared to the lower-end hardware that can run a validator node, so I expect the energy cost per year of Eth 2 to improve in the coming years.
Via carbon offsets, so they emit all the CO2 up front and then hope the forests their partners plant are both real and will be properly managed for the next few decades.