Hacker News
Taking a closer look at AI's supposed energy apocalypse (arstechnica.com)
58 points by ctoth on June 29, 2024 | hide | past | favorite | 86 comments


It's not about an energy apocalypse at all. It's about cost versus benefit when the principal cost of large-scale deployments is energy. AI takes a lot of energy to run, so it costs a lot of money to run, so it requires revenue that matches the energy used.

Now the problem is that if someone puts a $10/month price on the AI functionality, users will stop paying for it because it doesn't really have a $10/month ROI.

Literally every AI thing I've seen people use has seen interest taper off way before the free trial even ends.

The constraining factor that will prevent an energy apocalypse: it doesn't materially improve most people's lives at all.


> It doesn't materially improve most people's lives at all.

What about all the people here who were claiming GPT-4 had become an essential part of their developer job?

They seem to have gone quiet recently, but they must still be here.


Two reasons they went quiet:

1. At some point it generated something that cost the business money. Happened to us. A root-cause analysis led directly to ChatGPT-generated code that didn't handle transactions correctly. First it took systems out by leaving transactions open. Then, after that was fixed by someone who didn't know what they were doing, it turned out the scope of the transaction was wrong, which caused invisible data loss whenever a rollback occurred.

2. The free trial ended and no one wanted to pay for it.
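The transaction-scope failure described in point 1 can be sketched in a few lines. This is a hypothetical minimal reconstruction (table name, failure step, and SQLite usage are all invented for illustration, not the actual incident code):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.isolation_level = None  # autocommit; we manage transactions explicitly
conn.execute("CREATE TABLE audit (id INTEGER PRIMARY KEY, note TEXT)")

# Wrong scope: the audit write sits inside the same transaction as the
# risky work, so the rollback silently discards it -- invisible data loss.
conn.execute("BEGIN")
conn.execute("INSERT INTO audit (note) VALUES ('attempting update')")
try:
    raise RuntimeError("order update failed")  # stand-in for the real failure
except RuntimeError:
    conn.execute("ROLLBACK")
lost = conn.execute("SELECT COUNT(*) FROM audit").fetchone()[0]  # 0 rows left

# Correct scope: commit the audit record on its own, then keep the
# transaction tight around only the statements that must be atomic.
conn.execute("INSERT INTO audit (note) VALUES ('attempting update')")
conn.execute("BEGIN")
try:
    raise RuntimeError("order update failed")
except RuntimeError:
    conn.execute("ROLLBACK")
kept = conn.execute("SELECT COUNT(*) FROM audit").fetchone()[0]  # 1 row survives
```

The point is that "where the transaction begins and ends" is a design decision, which is exactly the kind of context an LLM autocompleting one function at a time doesn't have.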


I'm still here. GPT-4 (and now Claude 3.5 Sonnet) are still tools I use every day to help me write better code more productively.

https://simonwillison.net/series/using-llms/


These are the same "developers" who claimed their application was secure because they used multiple blockchains.


The energy equation can be improved dramatically with tailor-made hardware. Bitcoin has shown how much is possible: these days even the fastest GPU is totally unviable for mining. Of course LLM inference is much more complex, but there's still much optimization to be had.

And $10 isn't an awful lot of money to justify. It's like half an hour's pay a month (and like a smoke break's worth of salary in Silicon Valley, lol).

I think AI has its uses. I just think they are overblown and applied too widely. There's a few things it's really good at. And in those cases it will shine. The rest is hype.


$10 a month isn't a lot. Until you have 10,000 staff like us. That's $1.2m a year.
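The quoted figure checks out as simple arithmetic:

```python
# Sanity check of the $1.2m/year figure quoted above.
staff = 10_000
per_seat_per_month = 10  # dollars
annual_cost = staff * per_seat_per_month * 12
print(annual_cost)  # 1200000, i.e. $1.2m/year
```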


I expect that silicon valley developers will indeed pay for some AI products long term. Some other professionals will too.

But the companies are spending shocking sums building these products. They need to make a vast sum back to cover what they've spent, and that sort of subscription income can't cover it.


I dunno. ChatGPT is still ridiculously good as a search engine. Learning a new language now is a completely different experience for me. I've recently found myself putting something into Google, realising there is nothing of value on the first page of results, and taking my question to ChatGPT instead. Its answers might not be correct, and man is it confident about things it just made up, but it is still a much better experience than clicking through ad-laden blogspam, forums, and question boards to find the same information.


Yep, same. It's a different world for using LaTeX because I can get it to spit out example code for all sorts of things.


> Literally every AI thing I've seen people use has seen interest taper off way before the free trial even ends.

Midjourney, Suno, Copilot, ChatGPT, Jasper, Luma, etc would all like a word with you.

I think the real problem here is that the barrier to entry for starting a new "AI thing" is so low that you're often flooded with subpar services, which feeds the recency bias behind "every AI thing I've seen people use has seen interest taper off way before the free trial even ends".


This is an economics problem: if the cost of energy is lower than or equal to the benefit of consuming it, the energy will be consumed. As more and more energy is consumed to run these datacenters, the price of energy will go up, and datacenters will either learn to be more efficient or pass the costs on to their users, who will learn to be more efficient. Of course, there might be inefficiencies like subsidies and monopolies that distort the energy market for a while, but in the long run this is a self-fixing problem.


Are environmental externalities accounted for? And how do we accurately measure the 'benefit' of something when it's still largely speculative (i.e. AI is not providing that much value right now, but we think it will in the future)?

Economic problems are rarely just economic problems, and saying that it will resolve itself in the long term is dismissive and offers no value to the discussion.


That's why high-carbon sources like coal and oil should be subject to carbon taxes: to price in externalities.


Not a carbon tax as such, but the Supreme Court in the UK recently ruled in favour of an activist-led challenge to stop Surrey County Council from ignoring the environmental cost when granting permission to dig four new oil wells, in what is seen as a landmark case.

Emissions from burning the fuel had been ignored but will now need to be taken into account.

https://www.theguardian.com/commentisfree/article/2024/jun/2...


We're too late for this to be the main model; we need to ramp down fossil fuel usage far faster than carbon taxes can realistically be enacted in most of the world. (E.g., when would the USA realistically stop or decimate oil and gas production?)


I 100% agree. A carbon tax is a purely capitalist thing to do, and by not having one you incentivize bad outcomes.


The environmental externality of new energy in the US is likely ~0, potentially even net positive, because approximately all new energy in the US is renewable, and displacement effects are likely to dominate the small-single-digit percentage of nonrenewables and the marginal environmental cost of the renewables themselves.


Even if that's true, why would we only price in the externalities of new energy?

If we have a coal plant and add one wind farm we still have one coal plant.


Because the article is in context of new demand.


I'm just not sure that's a helpful way to look at it, though, because we don't know the counterfactual. If we didn't generate new demand, would we still have created more renewable energy and shut down non-renewable energy sources?

I suppose it's fair to say energy demand will always go up, and if all new generation is renewable, then as you extrapolate to the limit, the share of energy from non-renewable sources becomes a vanishingly small percentage of all production. I don't know if that's a fair way to model the future, though.
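That extrapolation can be made concrete with a toy model: hold nonrenewable output flat while all demand growth is met by renewables (the 20%/year growth rate is an assumption for illustration):

```python
# Toy model: nonrenewable generation held constant, renewables compound.
nonrenewable = 100.0   # arbitrary units, flat over time
renewable = 10.0
for year in range(50):
    renewable *= 1.2   # assumed 20%/year renewable buildout

dirty_share = nonrenewable / (nonrenewable + renewable)
print(dirty_share < 0.01)  # True: the dirty *share* shrinks toward zero
```

Note what the model also shows: the dirty share goes to zero but the absolute amount of dirty generation stays at 100, which is exactly the objection raised upthread about still having one coal plant.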


The price of renewables scales down as demand rises, and our capacity to build them is growing at an exponential rate, at least somewhat capped by demand. It follows that if you want to displace existing power sources, you're probably better off driving demand up, because that lowers renewables' price and makes replacement more economically attractive.

As long as you're not driving demand up faster than the industry is able to increase its rate of scaling (which doesn't yet seem the case with AI, though that's not to say it couldn't be at some point), this should help a renewable transition.

Another effect is that you push issues that arise in transition earlier. For example, the sooner you have a lot of solar on the grid, the sooner grid-scale batteries are incentivized, and the sooner battery production scales. This feeds back into the feasibility and profitability of replacing existing nonrenewable sources.

A simple example where this effect is very clear is synthetic fuels. Synthetics don't compete on a dollar-for-dollar basis yet, but from an energy basis they're getting close. There's an obvious demand phase change once you hit that price point, and the sooner you hit it the better. If you believe the scaling trends, and I think you broadly should, more solar brings that point forward and so is clearly environmentally beneficial.


Appreciate the insight here. It's not an obvious way to think about this to me, but it does make a lot of sense, especially as you think about the trend over time.


Lithium extraction to build millions of solar panels, batteries etc is not a 0 impact operation.


Solar panel production doesn't use lithium. They're made from sand and a lot of energy. A bit like silicon wafers.

You might mean the storage needed to keep those data centers running through the night, and yeah, that uses batteries. But that problem is small in the scope of other battery challenges like transportation, because for data centers you don't need the best energy per weight or volume, which you do need in a car.

And there's other sources of renewable energy too.


Hence why I said “displacement effects are likely to dominate [...] the marginal environmental cost of the renewables themselves.”...


Well good news you don't need lithium to build solar panels.


This is a straw man usually used by anti-electric-car people.

There’s always an impact. If you had a sandwich for lunch, the wheat in the bread destroyed a grassland somewhere.

Usually in context of energy, we’re talking about carbon impact.


Yes, but the economics won't necessarily develop to most people's advantage.

A (hypothetical) AI that can perform any intellectual labour for an energy cost of 1 kWh, producing output at the quality and quantity of a median human working for an hour, is at $0.10/kWh a better economic choice than 50% of humans on earth, even if those humans cut their wage demands to the UN abject poverty threshold.

But at the same time, there's not (currently) enough electricity for 50% of the world's people to be replaced by such AI regardless of the price.

Thus energy prices rise until the machine labour is as expensive as the human labour.

But that likely results in a lot of people even above that threshold no longer being able to afford to keep the lights on.

If we only get this AI quality/cost/generality level around 2031-ish, that may be fine, because renewables are on an exponential growth curve and around then exceed current global demand all by themselves.
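The comparison above is easy to run back-of-envelope. The poverty-line figure here is an assumption (roughly the World Bank extreme-poverty line), not a number from the thread:

```python
# Hypothetical AI from the comment: 1 kWh per human-equivalent hour.
price_per_kwh = 0.10                       # dollars, from the comment
ai_cost_per_hour = 1.0 * price_per_kwh     # $0.10 per hour of output

poverty_line_per_day = 2.15                # dollars/day, assumed figure
human_cost_per_hour = poverty_line_per_day / 8  # ~$0.27 for an 8-hour day

print(ai_cost_per_hour < human_cost_per_hour)  # True: the AI undercuts it
```

So even a wage at the abject-poverty floor loses to the hypothetical machine, which is why the pressure then shifts entirely onto electricity supply and price.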


The issue is an environmental one. The cost here is environmental damage. The solution will need to be policy decisions that take that damage into account.


That's why much of the world is gradually shifting over to renewable energy usage. In a world where energy is renewable, does not generate greenhouse gas and (hopefully) does not produce other externalities, our best way of dealing with these resource allocation problems is the free market.

The alternative is regulation, and I'm not even sure how regulation could address this particular issue. Ban AI or crypto? Humans will find new ways to waste vast amounts of energy with computers. Require that computers have a certain efficiency rating in megaflops/joule? People will use more of this now far more efficient computing, resulting in more overall usage. Require corporations to be audited and certified for energy efficiency? Goodbye to all the small players that cannot handle the overhead of such legislation.


The trouble I see is that new consumption matching new green generation (or, in some locations, proposed DCs exceeding it) may not let us displace dirty generation such as coal, which we should be looking to eliminate.

The other issue that comes up with green generation is how variable it is: sometimes it'll be cloudy with still air, other times windy and sunny, and the grid needs to be capable of supplying everyone in both scenarios without overbuilding generation it can't turn off. Having industries available to soak up an appropriate amount of surplus energy within their ecosystem would be great, but it's a hard sell to ask anyone to invest billions in sites with no guarantees on utilization if they get bad weather for a few months. If there is ever any "plays nicely with others" regulation on consumption of a common resource, it's going to be reactive to problems rather than ahead of them.


The solution to all of these problems is an aggressive buildout of nuclear fission plants to provide baseload power and accelerate the closure of all remaining plants which burn fossil fuel. Subsidized if necessary, as essential infrastructure.


Which is better -- subsidize what you perceive to be the solution or tax what is known to be the problem (carbon emissions)?


Infrastructure is routinely subsidized because of the benefits it brings. Taxing the problem can help pay for that subsidy, though that has some hazards because government revenue then depends on carbon emissions; but it can be part of the solution, sure. Tax revenue all goes in the same bucket no matter what the government claims.

It helps that the actual electrical generators don't care where the steam is coming from, so we could be leveraging the gensets from decommissioned coal plants in fission reactors. All that's missing is the political will.


Tax data center usage


And watch the entire compute infrastructure and industry of vital importance (state of the art compute capabilities in AI) move to other nations and then later end up being dependent on them.


State of the art compute capabilities, aka hardware, is already built in other nations.

Mandating domestically owned servers is commonplace.

And this was in response to someone whose best idea was “ban ai”


Is the problem the data center or is the problem carbon emissions?

How about tax the carbon emissions.


It would be great if more externalities were priced into the cost of energy, but there doesn't seem to be anything special about AI datacenters vs. any other use of energy. Data centers actually seem much more conscientious about the impacts of their energy use. I don't think I've read any headlines about cement plants trying to switch to 100% renewable.



Everyone else has to pay the price that data centers are willing to pay as well. And they can afford quite high prices before having margin troubles - likely much higher than what we as residents want to pay for energy.


This is on the assumption that supply will remain constrained. Excess energy usage might as well drive the investments needed to shift to more solar panels and renewable energy storage. Compute-intensive workloads can be executed anywhere, so higher energy usage might as well incentivise shifting AI research and other compute-intensive training workloads to the daytime, when solar power peaks, and processing everything then. This unlocks a lot, driving more investment into advanced battery tech, solar tech, etc.

Higher consumption also drives investments that generate higher supply, and hence future surplus and improvements in the standard of living.


Yes, you are right in that. However, we are not yet seeing much uptake of very flexible compute that works only during high energy production. And while I would love to see it, I am not sure we will anytime soon. One reason is the capital cost of the compute equipment: if it only runs 25% of the time (daily solar peaks), then it takes 4x as long to pay back. And we are still improving the energy efficiency of compute a lot over time, so upgrading equipment often is very beneficial in terms of TCO. Longer payback times are in conflict with that.
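The payback arithmetic is straightforward; the capex and per-hour margin below are illustrative placeholders, not real figures:

```python
# Hardware is amortized only over the hours it actually runs.
capex = 1_000_000          # equipment cost, dollars (assumed)
margin_per_hour = 50       # value generated per running hour, dollars (assumed)
hours_per_year = 24 * 365

payback_full = capex / (margin_per_hour * hours_per_year)          # run 24/7
payback_solar = capex / (margin_per_hour * hours_per_year * 0.25)  # 25% duty cycle

print(payback_solar / payback_full)  # 4.0 -- four times as long to pay back
```

And because the hardware is obsoleting fast, stretching the payback window four-fold can push it past the equipment's useful life entirely.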


In theory, there's no difference between theory and practice. In practice, there is.


(people sing a different tune when the power consumer is bitcoin miners)


The Bitcoin network produces exactly the same number of bitcoins over time regardless of how much power it consumes; AI, even if you are unimpressed by the quality, does actually make more stuff when more energy is consumed.


> The important thing to remember, though, is that there are economic limits involved in the total energy use for this kind of technology. With bitcoin mining, for instance, the total energy usage has jumped up and down over time, tracking pretty closely with the price of bitcoin. When bitcoin is less valuable, miners are less willing to spend a lot of electricity chasing lower potential profits. That's why alarmist headlines about bitcoin using all the world's energy by 2020 never came to pass (to say nothing of efficiency gains in mining hardware).

> A similar trend will likely guide the use of generative AI as a whole, with the energy invested in AI servers tracking the economic utility society as a whole sees from the technology.

This comparison is inappropriate. Cryptocurrency was and is mainly an object of speculation, not of intrinsic economic value. In contrast, AI clearly has a lot of economic value, and that value increases with time, since the technology gets better with time.

Moreover, the rate of improvement over the last few years has been staggering. Things that seemed like far-out-of-reach science fiction ten years ago are now unremarkable reality. So there are solid empirical grounds to expect that progress over the coming 10 years will be transformative as well. Nothing like that was or is the case for cryptocurrency.


> In contrast, AI clearly has a lot of economic value

It's clear it has some economic value. It's not yet clear if it has a lot of it.

So far the only economic value it has had has been inflating the valuation and stock price of any company with "AI" in its name.


This is just plain wrong. Autonomous taxis are operating in several cities, I know of several companies that use AI to detect and fix bugs in code, and at least one that is currently experimenting with completely automating the process of updating dependencies, and using AI to do any necessary changes when new dependency versions contain breaking changes.

Those are just a few things off the top of my head, and we're already talking millions if not billions of dollars in actual value, actual services rendered to society with zero speculation involved.


> Autonomous taxis are operating in several cities

So, some economic value. BTW are those companies profitable at all? Or are they like Uber: losing billions of dollars every year with no chance of ever turning a profit?

Though I will agree that decades-old and decades-long research into computer vision and traffic modelling is finally paying off a bit in the automotive industry.

> AI to detect and fix bugs in code, and at least one that is currently experimenting

So, unverifiable claims and experiments.

> Those are just a few things off the top of my head

Those "few things" are exactly two things, and are evidence of some highly disputable economic value. Not evidence of a lot of economic value.


> > Autonomous taxis are operating in several cities

> So, some economic value. BTW are those companies profitable at all? Or are they like Uber: losing billions of dollars every year with no chance of ever turning a profit?

AI taxis mean that there are no wages for the driver. The AI driver is basically a slave. The cost of developing the AI may be quite high, or the increased cost for the cars (more sensors), but those things probably scale better than wages for us humans.


Detect and fix bugs? Sounds like snake oil. Even compiler static analyzers are full of false-positive warnings; LLMs are even worse.


I've personally verified and approved an actual bugfix made by an LLM. It wasn't the trickiest or most severe bug I've ever seen, but it had slipped through the cracks in our testing and review process, and the LLM caught it.

Nothing snake oily here.

Also: What compiler are you using that's giving you false positive warnings?

I have never heard of a false-positive warning or error from a compiler, but perhaps you're using some weird tooling I'm not familiar with?


I mean if we count autonomous taxis why not the entirety of the recommendation-engine-driven internet, including ad networks, social media feeds, streaming services? Or like object detection and tracking on cctvs? Does AI mean just LLMs and maybe image generators when we're talking about power usage, but anything with a neural network if we're talking about economic impact? Where do we stand on other ML, like clustering algorithms?


> Cryptocurrency was and is mainly an object of speculation, not of intrinsic economic value.

i say this is not correct.

What is "intrinsic" value? I argue that if people want it, then it has intrinsic value.


> I argue that if people want it, then it has intrinsic value.

In other words, what you’re arguing is “if it has extrinsic value, then it has intrinsic value”. Which would break the point of the distinction.

Intrinsic value is the exact opposite of what you’re saying, it’s something having value by itself.


That intrinsic value recently enabled me to spend some Bitcoin on a new tablet device. Works for me!


This just means you passed something intrinsically worthless on to another person in exchange for a useful good. There is a name for this in economic theory, but it's a bit of an insult.


> There is a name for this in economic theory, but it's a bit of an insult.

calling it money is no insult.


What some people seem to be describing or expecting is that everything comes to a screeching years-long halt. What will actually happen is we will be on a slightly slower exponential growth curve than we counterfactually would have been in a world with much more energy infrastructure. The subjective experience will still be one of being on an exponential trend. The constraints are just different.


This ai energy stuff makes zero sense..... Unless you have an existing metric for the energy cost of fiat currency. So what is the cost of fiat? And I don't mean merely the production of paper, but also the cost of moving it, transferring it online, etc. The whole idea of an energy cost for currency is nonsense imo. But it does make sense if you want to usher in the idea of a carbon tax.


What I’d love to see is a rundown of new efficiencies in generative AI. With things like quantization and specialized transformer hardware, the costs will hopefully be less shocking in the future. And fwiw, I don’t find the article’s stats very shocking anyway. At least it’s a net win type of game, whereas the previous environmental bugbear, PoW crypto, was entirely premised on net-energy loss.


All optimizations inevitably end up being funneled into larger and larger models. The issue with AI is its seemingly endless ability to scale in size for marginal gains in model performance (essentially log(n) scaling).
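The "log(n) scaling" claim above can be illustrated with a toy quality curve (a stand-in for a fitted scaling law, not a real one):

```python
import math

def quality(n_params):
    # Toy stand-in: quality grows with the log of model size.
    return math.log10(n_params)

small_doubling = quality(2e9) - quality(1e9)     # 1B -> 2B params
large_doubling = quality(2e12) - quality(1e12)   # 1T -> 2T params

# Each doubling buys the same absolute gain, so the marginal return
# per unit of extra compute shrinks as models grow.
print(abs(small_doubling - large_doubling) < 1e-9)  # True
```

Under a curve like this, any efficiency win just gets reinvested in the next doubling, which is the dynamic the comment is describing.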


>All optimizations inevitably end up just being funneled into larger and larger models.

Well of course if you're trying to beat the SOTA, bigger sizes allow for better models, but not everyone is trying to use or train the latest SOTA model, maybe llama-3 8b is perfect for what you need to do and having better optimizations to run it locally is gold.


Defining "good enough" is something that rarely seems to happen.


Wasn't crypto supposed to cause this same energy apocalypse?

Does AI even use more energy than crypto ever did? Doubtful.

Besides, we're projected to ingest all of humanity's written knowledge by 2026, and after that there will be a cliff in power consumption as we mostly switch to inference on purpose-built hardware, which is vastly more efficient.


> Besides, we're projected to ingest all of humanity's written knowledge by 2026, and after that there will be a cliff in power consumption as we mostly switch to inference on purpose-built hardware, which is vastly more efficient.

Does this claim presuppose that people will forever be content to use a circa 2026 ML model? If so, that seems unlikely to me.


Well AI is only just getting started of course. Many people don't even know what it is.

All the products and services based on it like Microsoft's many copilots are totally beta quality and not really ready for the mainstream. In a couple years they probably will be and I'm sure the usage will have increased tenfold.

On the other hand I'm sure many optimisations in hardware have been accomplished. But I don't think it can be compared to bitcoin which is a more mature technology (as much as one can call a disorganised macro-economic experiment mature)


AI and bitcoin consume similar amounts of energy, but bitcoin uses more green energy than AI:

https://wired.me/science/energy/ai-vs-bitcoin-mining-energy


Is crypto gone? Otherwise it's now crypto plus AI plus the already existing rest.

More is more.


not sure but my money is on AI using wayyyyyy more.


> And for those opposed to generative AI on principled or functional grounds, putting similar energy into millions of Nvidia AI servers probably seems like just as big of an energy waste.

While there is value and money-grifting involved in both crypto and genAI, there is a major differentiating factor in the wastage argument: artificially-adjusted proof of work, which is/was used by major cryptocurrencies.

People spend time and energy (figuratively and literally) on the internet in many ways one could define as wasteful (take scrolling down the "for you" feed, for example). But in the majority of cases, the inefficiencies, or lack thereof, are not intentional and are often improved on (e.g. new codecs like AV1 needing less bandwidth).

But to make crypto tokens work, the original decision to use PoW has led to a massive surge in energy consumption to achieve the same results. While there has been some work on countering its negatives (e.g. with the eth network), it is still a major driving force, despite an arms race to make more efficient hardware.

While a lot of current AI has major inefficiencies (too many to list here!), the root cause is often fast-moving innovation and everyone building on someone else's imperfect work.

Coming back to the original argument: you should expect more dedicated hardware for the underlying matrix multiplications that make AI work. For most people, who only care about inference, it is already becoming a reality at the consumer level. But expect people to push the new stuff to its limits to stay competitive, whether you find all of this ridiculous or not.


This article is a great example of a really frustrating thing people are doing, which is pretending that all this AI business started two years ago.

I get it, GenAI became popular as a consumer-facing product and tech industry PR blitz in that year. But people in the tech world should know better. AI as massive industry-wide GPGPU workload took off in the 2010s, including widespread usage of smaller models like CNNs throughout both FAANG's stack and in scattered startups, as well as supermodels used as recommendation engines by everyone and their grandma. Arguably the entire business models of every irresponsibly large tech company ran on this shit. All the telemetry. Ad targeting. Social Media feeds. Hell, GPT-3 came out before 2020. It was a scandal in this world when OpenAI exclusively licensed it to Microsoft, who was already probably using it in search at that point. None of this was actually that long ago, this is way too soon to have cultural amnesia. I get that I'm in somewhat of a bubble as an AI researcher but surely tech publications should at least know these basic facts, right? Is this yet another reason to be annoyed at the 2010s lingo for calling all these pervasive neural networks "the algorithms"?

From the perspective of energy expenditure from AI workloads, the statement that it's a major driving force of the rising energy demands of datacenters is a perfectly reasonable conclusion given a graph where the TWH more than triples between 2012 and 2024. The article sometimes specifies "generative AI" (which did exist in 2012, but was in a way less interesting state for most people and businesses until 2022), but often just says "AI", which is a big umbrella term people have at least consistently been using for most neural networks for that entire span of time (and longer, and for lots of other things, and it's hopelessly overloaded to the point of being nonsensical sometimes, but regardless this is an incredibly uncontroversial usage). So someone at a data center with a graph that basically tracks the rise of GPGPU neural networks and shows a big jump in energy expenditure over that period attributing this to "AI" is very reasonable!


The graph starts taking off in 2008, in fact, and I'm willing to bet that coincides with companies like Amazon starting to seriously scale up.

For instance:

Visits to amazon.com grew from 615 million annual visitors in 2008,[46] to more than 2 billion per month in 2022.[47] The e-commerce platform is the 14th most visited website in the world.[48]

https://en.wikipedia.org/wiki/Amazon_(company)#Products_and_...

I'm not fingering Amazon specifically as the culprit but when you say "algorithm" in that sense, as you point out, I'm thinking of recommender systems and Amazon's must be one of the er scaliest; probably together with Meta, and I guess Netflix. Doesn't have to be on GPUs either. CPUs need power too.


I mean, while I agree that CPUs can be hungry, a lot of traditional beefy-core processor tech in this period has gone into energy efficiency and miniaturization, whereas a lot of GPU tech has been all about more and more cores. Research and enterprise workstations I've worked with have increasingly required industrial power supplies. Like yea, raw scale's gonna be a factor, but I'd also think per-datacenter power would grow less quickly with scale if per-server consumption weren't drastically rising, because company scale also predicts building more datacenters.


I don’t take environmentalists seriously because they primarily act to accelerate climate change. So I’m just going to dismiss this out of hand. They’re a poor source of information.


First of all the summary of the article is that things aren't as bad as presented.

Second, how do environmentalists accelerate climate change? I honestly don't get that.

I'm sure some environmentalists have a high footprint (thinking of Al Gore private-jetting all over the world). But I'm also sure my footprint is a lot smaller than my peers'.


Environmentalists have fought nuclear, wind, solar, and geothermal. They have fought infill housing and push for sprawl. The Sierra Club pushes for policies that increase water use and sprawl. I’m obviously right to ignore them and history will show me as correct.

One or two private jets are irrelevant but pushing for bad policy and opposing good policy places them firmly on the other side. It’s all right. Their time is maybe another decade. Then we roll the whole thing back and get to fixing this world without the opinions of these people who want to ruin the only planet we have so far.


I'm not familiar with the Sierra Club. I assume it's an American-specific organisation. Here in Europe we have "green parties" in most parliaments.

But as most wide interest groups, environmentalists are not a homogeneous community. There is no general 'agenda' even though subgroups might have one. I'm sure there's subgroups that use environmentalism as a false flag to further their own goals.

I'm personally against nuclear as I feel it externalises the problems into the future (waste management) which is exactly how we got into this situation we are in now. But I'm very much in favour of really renewable energy and public transport instead of private cars.


Yes, the German Green Party has many pro-wind people who nonetheless oppose wind when it’s in their backyard. Germany’s transition from nuclear to coal under their influence is well known. Tragic that they have any power at all. But it won’t last.


That nuclear take is bad though. We can manage the waste, but even if we fail the consequences are dramaaaaaatically less severe than those of extra warming.


> these people who want to ruin the only planet we have so far.

You’re free to disagree with the policies and results of environmentalist actions and even argue that they’re having an undesired effect, but saying environmentalists want to ruin the planet is nonsensical. Anyone who desires that is by definition not an environmentalist.


I judge people by actions not by stated claims. For instance, I don’t consider the Democratic People’s Republic of Korea to be democratic or a republic.


You are conflating the word used for the name of an entity with the word for a concept. When you say the Democratic People’s Republic of Korea is neither democratic nor a republic, you’re comparing the names “democratic” and “republic” to the concepts of what a democracy and a republic are. In other words, you're not (I hope) taking issue with either idea but with the fact that a government is misappropriating those words to make false claims.

“Environmentalism” is a concept. An organisation called The Environmentalist Society for the Betterment of Puppies and Rainbows could be a nefarious river polluter, while another called The Very Bad Toxic Wastelands could be a philanthropic conservationist which plants forests. An environmentalist is someone who acts in the best interest of the environment, regardless of what they call themselves.


Sure, find and replace “environmentalists” with “those who call themselves environmentalists”.



