archive.is link to article from allabout.ai at https://www.allaboutai.com/resources/ai-statistics/ai-environment/
A lot of the studies they list are years out of date and no longer relevant. The models are much more efficient now, and it’s mainly the Musk-owned AI data centers that are heavy polluters. Most of the pollution from the majority of data centers comes from other uses, not AI.
The old room-sized ENIAC computer used 150-200 kW of power and couldn’t do even a fraction of what your smartphone can. The anti-AI people are taking advantage of most people’s ignorance, intentionally using outdated studies and implying that the power usage will continue to grow, when in fact it has already shrunk dramatically.
I did some research and according to some AI’s this is true. According to some other AI’s this is false.
OP, this statement is bullshit. You can do about 5 million requests for the emissions of ONE flight.
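A quick back-of-envelope sketch of that “millions of requests per flight” claim. All of the inputs here are rough public estimates I’m assuming for illustration (per-request energy, grid carbon intensity, per-passenger flight emissions), not measured values, so treat the output as an order of magnitude only:

```python
# Back-of-envelope: how many chatbot requests equal one flight's CO2?
# Every constant below is a rough assumed estimate, not a measurement.

WH_PER_REQUEST = 3.0        # upper-bound energy estimate for a large chatbot model
GRID_KG_CO2_PER_KWH = 0.4   # rough world-average grid carbon intensity
FLIGHT_KG_CO2 = 1000.0      # ~1 tonne CO2 per passenger for a long-haul flight

kg_co2_per_request = (WH_PER_REQUEST / 1000) * GRID_KG_CO2_PER_KWH
requests_per_flight = FLIGHT_KG_CO2 / kg_co2_per_request
print(f"{requests_per_flight:,.0f} requests ≈ one long-haul flight")
```

With the 3 Wh upper bound this lands in the high hundreds of thousands; with the 0.3 Wh figure cited further down the thread it’s closer to 8 million, so “about 5 million” is plausible as an order of magnitude.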
i’m gonna quote my old post:
I had the discussion regarding generated CO2 a while ago here, and with the numbers my discussion partner gave me, the calculation said that the yearly usage of ChatGPT is approx. 0.0017% of our CO2 reduction during the covid lockdowns - chatbots are not what is killing the climate. What IS killing the climate has not changed since the green movement started: cars, planes, construction (mainly concrete production) and meat.
The exact energy costs are not published, but 3 Wh/request for ChatGPT-4 is the upper limit from what we know (and that’s in line with the approximate power consumption of my graphics card when running an LLM). Since Google uses it for every search, they have probably optimized for their use case, and some sources cite 0.3 Wh/request for chatbots - it depends on what model you use. The training is a one-time cost, and for ChatGPT-4 it raises the maximum cost to 4 Wh/request. That’s nothing. The combined worldwide energy usage of ChatGPT is equivalent to about 20k American households. This is for one of the most downloaded apps on iPhone and Android - setting this against the massive usage makes clear that saving energy here is not effective for anyone interested in reducing climate impact, or you’d have to start scolding everyone who runs their microwave 10 seconds too long.
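As a sanity check on the “20k American households” figure: we can ask what daily request volume it would imply at the post’s 4 Wh/request upper bound. The household consumption number is an assumed approximation of the US average, so this is a rough consistency sketch, not a derivation:

```python
# Consistency check: what request volume does "20k US households" imply?
# Both constants are assumed approximations from the post, not measurements.

KWH_PER_US_HOUSEHOLD_YEAR = 10_700   # approximate average US household usage
HOUSEHOLDS = 20_000
WH_PER_REQUEST = 4.0                 # 3 Wh inference + amortized training cost

annual_wh = HOUSEHOLDS * KWH_PER_US_HOUSEHOLD_YEAR * 1000
requests_per_day = annual_wh / WH_PER_REQUEST / 365
print(f"~{requests_per_day / 1e6:.0f} million requests per day")
```

That works out to roughly 150 million requests per day, which is at least the right ballpark for a top-downloaded app, so the two figures in the post are mutually consistent.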
Even compared to other online activities that use data centers, ChatGPT’s power usage is small change. If you use ChatGPT instead of watching Netflix, you actually save energy!
Water is about the same, although the positioning of data centers in the US sucks. The used water doesn’t disappear, though - it’s mostly returned to the rivers or is evaporated. The water usage in the US is 58,000,000,000,000 gallons (220 trillion liters) of water per year. A ChatGPT request uses between 10-25 ml of water for cooling. A hamburger uses about 600 gallons of water. 2 trillion liters are lost due to aging infrastructure. If you want to reduce water usage, go vegan or fix water pipes.
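The hamburger comparison above, written out as arithmetic. The figures are the post’s own rough estimates (10-25 ml per request, ~600 gallons per burger), so the result is a rough ratio, not a precise one:

```python
# How many chatbot requests equal one hamburger's water footprint?
# Figures are the post's rough estimates, not measured values.

ML_PER_REQUEST = 25          # upper end of the 10-25 ml cooling estimate
GALLONS_PER_BURGER = 600
ML_PER_GALLON = 3785.4

requests_per_burger = GALLONS_PER_BURGER * ML_PER_GALLON / ML_PER_REQUEST
print(f"One hamburger's water ≈ {requests_per_burger:,.0f} requests")
```

Even at the upper end of the per-request estimate, one burger’s water budget covers on the order of 90,000 requests.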
Read up here!
I have started using Copilot more lately, but I’ve also switched from plastic straws to paper, so I’m good, right?
Your article doesn’t even claim that. Do you have any idea just how carbon intensive a flight is?
I imagine people making that claim accept air travel as useful and “AI” - really, all data centers - as not useful. I’ve had people tell me, oh, air travel is more efficient per mile than road travel. But this ignores that people wouldn’t travel thousands of miles if it weren’t as easy as booking a flight.
Or a LLM query?
Bitcoin or crypto?
It also pollutes the minds of ignorant people with misinformation. Not that that is anything new. But I do think objective truth is very important in a democratic society. It reminds me of that video that used to go around showing Sinclair Broadcasting anchors in some 20 different ‘local’ broadcast news segments all repeating the same words verbatim. It ended with ‘This is extremely dangerous to our democracy’. With AI being added to all the search engines, it is really easy to look something up and unknowingly get bombarded with false info pulled out of the dregs of the internet. 90% of people don’t verify the answer to see if it is based in reality.
deleted by creator
Which is why I threw up in my mouth a little when my boss said we all need to be more bullish on AI this morning.
Replace your boss with it.
Same. And they basically jizz their pants when they see a practical use for AI, but 9 out of 10 times there’s already a cheaper and more reliable solution they won’t even entertain.
There’s practical use for AI?
My boss is also a fuckwit
I’ve mentioned it before but my boss’s boss said only 86% of employees in his department use AI daily and it’s one of his annual goals to get that to 100%. He is obsessed.
They’re salivating at the chance to reduce head count and still make money. Employees are by far the largest cost for any company. They hate paying it out when that money could go to them instead.
You should correct their spelling of “bullshit”
What is this masterpiece? Pro-pornography subliminal propaganda?
The emoji usage, heading & bold text pattern makes me certain the article was written using AI.
Makes me wonder what they are doing to reach these figures.
Because I can run many models at home, and it doesn’t require me to pour bottles of water on my PC, nor does it show on my electricity bill.

Most of these figures are guesses along a spectrum of “educated,” since many models, like ChatGPT, are effectively opaque to everyone and we have no idea what the current iteration’s architecture actually looks like. But MIT did do a very solid study not too long ago that looked at the energy cost of various queries across various architectures. Text queries to very large GPT models actually had a higher energy cost than image generation with a normal number of iterations on Stable Diffusion models, which is pretty crazy. Anyhow, you’re looking at per-query energy usage of anywhere from 15 seconds of microwaving at full power to riding a bike a few blocks. When tallied over the immense number of queries being serviced, it does add up.
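To make the microwave comparison concrete: here’s the conversion from a per-query energy figure to seconds of microwaving. Both the microwave wattage and the per-query energy are illustrative assumptions (the 3 Wh value is the upper-bound estimate from earlier in the thread, not the MIT study’s number):

```python
# Convert an assumed per-query energy figure into "seconds of microwaving".
# Both constants are illustrative assumptions, not values from the study.

MICROWAVE_WATTS = 1100       # typical full-power household microwave
query_wh = 3.0               # assumed upper-bound energy for a large text model

seconds_of_microwaving = query_wh * 3600 / MICROWAVE_WATTS
print(f"~{seconds_of_microwaving:.0f} s of microwaving per query")
```

That gives about 10 seconds per query, consistent with the “15 seconds of microwaving” end of the range quoted above.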
That all said, I think energy consumption is a silly thing to attack AI over. Modernize, modularize, and decentralize the grids and convert to non-GHG sources and it doesn’t matter–there are other concerns with AI that are far more pressing (like deskilling effects and inability to control mis- and disinformation).
Well, most of the carbon footprint for models is in training, which you probably don’t need to do at home.
That said, even with training they are not nearly our leading cause of pollution.
The article says that training o4 required an amount of energy equivalent to powering San Francisco for 3 days.
Basically every tech company is using it… It’s millions of people, not just us…
Billions. Practically every Google search runs through Gemini now, and Google handles more search queries per day than there are humans on Earth.
What does it mean to consume water? Like it’s used to cool something and then put back in a river? Or it evaporates? It’s not like it can be used in some irrecoverable way right?
If they take the water and don’t return it to the source, there will be less available water in the water body, which can lead to scarcity. If they take it and return it, but at a higher temperature or along with pollutants, it can harm the life in the water body. If they treat the water before returning it, to restore it to something close to its original properties, there will be little impact, but that means using more energy and resources for the treatment.
I think the point is that it evaporates and may return as rain, which is overwhelmingly acid rain or filled with microplastics or otherwise just gets dirty and needs to be cleaned or purified again.
They need to use very pure water, and it evaporates completely, so it must be continually replenished.
Need is a strong word. There are much more efficient ways to cool data centers. They’ve just chosen the most wasteful way because it’s the cheapest (for them).
This is my main issue with it. I think it’s useful enough, but only if it uses about the same energy as you would use doing the task without it. Most conversations I’ve had with someone trying to convince me it doesn’t use too much power end up very much like the crypto ones, where it keeps being apples to oranges and the energy consumption still seems too high. I’m hoping hardware can be made to lower the power use the way graphics cards did. I want to see querying an LLM use about the same energy as searching for the answer, or less.
It’s using energy, we need more renewables. That’s not a problem with AI. Direct your opprobrium where it belongs