Limiting environmental costs of AI?
January 22, 2025 9:09 AM
I am looking for best practices, resources or your thoughts about how users in an organization, or the organization as a whole, might address concerns about the environmental costs of artificial intelligence. I am part of a group working on a policy.
Best answer: I'd suggest reading this blog entry with a skeptical eye for some context. The author is pro-LLM in the sense that he finds them useful. I also personally believe that it's worth reducing energy consumption, period; yes, China's effect on global warming is bigger than anything I can personally account for, but so what? Chipping in with a small impact is still chipping in.
However, it's still useful to contextualize the real impact. And maybe your working group should be addressing Zoom calls, hard for me to say.
posted by Bryant at 10:25 AM on January 22 [2 favorites]
One way to approach it is to recognize that the pricing for each model tends to be linked directly to the environmental impact of using that model. The biggest marginal costs for the model providers are the electricity used to perform a model's computation and the additional electricity used to remove the heat created by that computation. Those, in turn, are directly linked to the biggest environmental costs of using AI: generating the needed energy via whatever methods are used on that power grid. As gregr suggests, smaller models processing less data use less energy than bigger models / more data.
There's an additional substantial environmental cost (again, via electricity usage/generation) in the initial training of each model. That will usually be directly correlated with the size of the model, too.
Therefore, while absolute numbers will be difficult to come by, you can get a decent estimate of the relative environmental impact of any two systems or approaches by comparing the amounts an AI provider would charge for each. And then if your goal is to minimize the environmental impact of your usage, that corresponds to minimizing the cost of your usage.
As an additional bonus, the time taken by any given model will scale with the size of the model as well. So lots of things are in alignment: environmental impact, cost, and time taken to produce a result. Use the cheapest model that does what you need in any situation, and you both minimize environmental cost and maximize speed at the same time.
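To make that concrete, here's a rough back-of-the-envelope sketch (the per-token prices are made-up placeholders, not any provider's real rates, and price is only a proxy for energy use):

```python
# Back-of-the-envelope comparison: treat provider price as a rough proxy
# for relative energy use. All prices here are made-up placeholders.
PRICE_PER_MILLION_TOKENS = {
    "small-model": 0.30,   # hypothetical USD per million tokens
    "medium-model": 3.00,
    "large-model": 15.00,
}

def relative_impact(model_a: str, model_b: str) -> float:
    """Approximate ratio of environmental impact of model_a to model_b,
    assuming impact scales roughly with what the provider charges."""
    return PRICE_PER_MILLION_TOKENS[model_a] / PRICE_PER_MILLION_TOKENS[model_b]

def cheapest_adequate(adequate_models: list[str]) -> str:
    """Of the models that can do the job, pick the cheapest -- by the
    price-as-proxy argument, also the lowest-impact and usually the fastest."""
    return min(adequate_models, key=PRICE_PER_MILLION_TOKENS.get)

print(relative_impact("large-model", "small-model"))        # 50.0
print(cheapest_adequate(["medium-model", "large-model"]))   # medium-model
```

The same comparison works with whatever real price sheet your provider publishes.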
Beyond that, of course, simply reducing usage obviously decreases cost and environmental impact, too. It's worth considering whether generative AI is the best or necessary tool in every case.
posted by whatnotever at 8:38 PM on January 22 [1 favorite]
Beyond that, of course, simply reducing usage obviously decreases cost and environmental impact, too. It's worth considering whether generative AI is the best or necessary tool in every case.
This. The talk about how much electricity it costs to generate these data sets completely misses that the genuinely valuable use cases of generative AI amount to little more than 'search replacement', but since that's all it's currently good for (setting aside the nonsense features like making up [royalty free] stories, music, and images), that's what it's being used for.
posted by The_Vegetables at 8:16 AM on January 23
It's worth thinking about whether the goal here is to reduce environmental impact, with genAI as a lever in that, or to discourage genAI for other reasons, with environmental impact as a lever in that.
For instance, if it's the former, you might find yourself writing a sentence like "company policy is to discourage use of environmentally costly tools such as Photoshop, in favour of less electricity-hungry tools such as generating images via ChatGPT's DALL-E interface". Which, obviously, you wouldn't do in the latter case.
The assumption that AI is always the most environmentally costly way to do things isn't true, so it's (imo) worth thinking through what you want to do with that.
posted by wattle at 12:29 AM on January 24 [1 favorite]
Minimize the total amount of input and output. Longer input/output again increases electricity/cooling use.
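For example, if you're calling an OpenAI-style API, a rough sketch like this (tiktoken and the cl100k_base encoding are assumptions about your setup, not a requirement) can keep prompt sizes in check:

```python
# Rough sketch: measure and cap the size of a prompt before sending it.
# Assumes an OpenAI-style tokenizer via the tiktoken package; adjust for
# whatever provider/tokenizer you actually use.
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")

def count_tokens(text: str) -> int:
    return len(enc.encode(text))

def trim_to_budget(text: str, max_tokens: int = 2000) -> str:
    """Keep only the last max_tokens worth of text (e.g. the most recent
    part of a long document or chat history) to bound input size."""
    tokens = enc.encode(text)
    return enc.decode(tokens[-max_tokens:])

long_prompt = "Summarize the following report. " + "lorem ipsum " * 5000
print(count_tokens(long_prompt))                  # well over the budget
print(count_tokens(trim_to_budget(long_prompt)))  # roughly capped at 2000
```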
I think a lot of the information you might want to base your policies on if you're using OpenAI or Anthropic models is going to be hard to come by: model size, training energy expenditure, power per request, etc.
The Llama 2 paper and model card provide some presumably hard data on carbon. They say that they've offset the carbon (for whatever that's worth), so maybe small open models that have had their carbon offset are better than the alternatives?
posted by gregr at 9:48 AM on January 22 [1 favorite]