#PSA Please, when designing new tech, consider that not everyone has a reliable 24/7 broadband internet connection, or a large screen, or a keyboard, or a mouse, or is able to get a domain name, or is able to run services from home, or is able to power their home server for the whole day, or can rent a VPS, or can have their IP addresses linked to an identity, or has one legal identity, or has one name, or has a recent computer, or has easy access to a debit or credit card, or has a bank account with either, or has a PayPal account, or has bitcoin, or one of many other things some, and often many, people just don't have, etc…
When I wrote this, I still thought it was mostly bluff from the AI companies to keep the hype alive and talk up their share price. Now, it looks like they are really hell-bent on making my worst-case scenario a reality.
A 5 GW data centre means 10x the current largest ones, which are themselves already 10x the largest from a few years ago. This is pure madness. For reference, all nuclear in the UK produces 6 GW; all wind, 30 GW.
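A quick sanity check of those ratios in Python. The figures are from the post; the ~0.5 GW size of today's largest data centres is inferred from the 10x claim, not a measured number:

```python
# Putting a 5 GW data centre in context, using the figures from the post.
dc_planned = 5.0   # GW, planned data centre
dc_largest = 0.5   # GW, today's largest (inferred from the 10x claim)
uk_nuclear = 6.0   # GW, all UK nuclear output
uk_wind = 30.0     # GW, all UK wind

print(f"{dc_planned / dc_largest:.0f}x today's largest data centre")  # 10x
print(f"{dc_planned / uk_nuclear:.0%} of all UK nuclear")             # 83%
print(f"{dc_planned / uk_wind:.0%} of all UK wind")                   # 17%
```

So one such site would draw close to the output of the entire UK nuclear fleet.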
https://wimvanderbauwhede.codeberg.page/articles/the-real-problem-with-AI/
@macrumors Sadly, these five "reasons not to" don't even include #sustainability, #degrowth, #frugalcomputing, #DigitalAutonomy. What a miss.
This is in reaction to that article by Tim Bray.
"The real problem with the AI hype"
https://wimvanderbauwhede.codeberg.page/articles/the-real-problem-with-AI/
I wrote that 6 months ago.
@maswan and I wrote a paper in which we develop a detailed model for the life-cycle assessment (LCA) of HPC centres, including embodied carbon, server replacement and expansion. It is also applicable to other data centres. We also share the source code.
It shows how important embodied carbon becomes when the grid has more renewables.
"in the long run economic considerations are partly a function of culture, as preferences and social concerns come to be expressed in the form of market conditions."
It is naive to think that fusion will provide the world with limitless energy.
According to the paper "Can fusion energy be cost-competitive and commercially viable? An analysis of magnetically confined reactors" (from 2023), the cost of a 1 GW fusion plant is of the same order as a comparable fission plant: on the order of £10 billion per plant. Construction times are on the order of a decade, also comparable.
https://www.sciencedirect.com/science/article/abs/pii/S0301421523000964
Let's assume this data centre actually draws 300 MW all the time; then it consumes 2.628 TWh/year. At a typical Water Usage Effectiveness (WUE) of 0.3 l/kWh, it would need 788 thousand m³ of water per year for cooling.
The site for the data centre, Ravenscraig, is in Motherwell (near Glasgow). That is a town of 33,000 people. The households of such a town consume 2 million m³ of water per year. So that data centre will consume 40% of that. (3/3)
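The arithmetic behind those figures, as a quick check (all inputs are from the posts above):

```python
# Verify the 2.628 TWh/year and ~788,000 m³ cooling-water figures.
power_mw = 300                       # assumed constant draw
hours = 24 * 365                     # 8760 h/year
energy_kwh = power_mw * 1_000 * hours
wue = 0.3                            # Water Usage Effectiveness, l/kWh
water_m3 = energy_kwh * wue / 1_000  # litres to m³
town_use_m3 = 2_000_000              # Motherwell household water use, m³/year

print(f"{energy_kwh / 1e9:.3f} TWh/year")           # 2.628 TWh/year
print(f"{water_m3:,.0f} m³/year of cooling water")  # 788,400 m³/year
print(f"{water_m3 / town_use_m3:.1%} of the town's household use")
```

The last line gives 39.4%, which rounds to the 40% quoted above.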
The article calls the developing company, Apatura, a "renewable energy developer", but the reality is that they specialise in the land acquisition, design, planning, and operation of large-scale Battery Energy Storage Systems for hyperscale data centres. (2/3)
The Whitelee wind farm near Glasgow, the largest on-shore wind farm in the UK and one of the largest in Europe, has a maximum generating capacity of 539 MW and covers an area of 55 km² (about the size of Manhattan).
Plans have been announced for an AI data centre of 550 MW, and this is one of five such sites planned in central Scotland. (1/3)
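For scale, a rough comparison of the planned sites against Whitelee. The 30% capacity factor for UK onshore wind is my assumption, not a figure from the post:

```python
# Compare the planned 550 MW AI data centres with the Whitelee wind farm.
dc_mw = 550             # MW, one planned data centre
n_sites = 5
whitelee_peak_mw = 539  # MW, Whitelee's maximum generating capacity
capacity_factor = 0.30  # assumed average for UK onshore wind (my assumption)

total_gw = n_sites * dc_mw / 1_000
farms_per_dc = dc_mw / (whitelee_peak_mw * capacity_factor)

print(f"{total_gw:.2f} GW across the five planned sites")  # 2.75 GW
print(f"~{farms_per_dc:.1f} Whitelee-sized wind farms to power one site continuously")
```

Under that assumption, each of these data centres would need more than three Whitelees of wind, and Manhattan-sized Whitelee took years to build.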
From @ana_valdi's paper on "Data Ecofeminism":
Principles of Data Ecofeminism
Principle 1: Examine power structures within the climate crisis
Principle 2: Consider digital materiality and its supply chains
Principle 3: Make visible and accountable AI environmental impacts
Principle 4: Prioritise frugal AI computing
Principle 5: Reclaim digital sovereignty
Principle 6: Foster the commons through mutual aid
Principle 7: Weave the pluriverse
@ana_valdi Just read your paper on Data Ecofeminism, I liked it a lot. These are principles I can very much stand behind.
(And wow, 145 references! I don't think I ever wrote anything that thorough)
Thanks a lot for mentioning #FrugalComputing !
And now, our software boots up on our hardware prototype of the #SmolPhone. The keyboard kinda works, too. More work in the future, but OK for now.
The substrate is able to display a "modern" UI with buttons, text areas, labels, checkboxes and such under the RP2040 constraints (about 200 kB of RAM but rather OK compute power).
The goal is to allow users to build apps with Lua scripts, as in #Scrappy https://jrcpl.us/contrib/2025/Scrappy Maybe before the end of the year, if we're lucky.
IBM's new processor-in-memory (which is, besides, not a new idea, and for AI it is mostly multiply-accumulate-in-memory) will reduce the energy consumption per computation for LLMs.
But if energy efficiency gains reduced emissions, we would not have climate change. The entire history of the industrial revolution, starting with the steam engine, is one of energy efficiency gains.
#FrugalComputing
New blog post about the #SmolPhone (our take on #FrugalComputing): I gave a short talk about it and decided to write down the things I usually say. It's here: https://people.irisa.fr/Martin.Quinson/blog/250528/Smolphone-Magellan/
I think that the result is a nice introduction to the project. Please comment and tell us what you think of it!
Now, that 10x growth is not what OpenAI, Dell, etc. want. No, they want 100x growth.
Here is what that would mean:
Global GHG emissions are 57.1 GtCO2e/y (2023 figure; probably 1-2% more now).
So there are two ways to look at this:
(1) The apologist: even if AI results in extra emissions of 4 GtCO2e/y (*) by 2035, that is less than 7% of the total; surely that is not an issue.
(2) The climate reality: to stay below 1.5 °C, the global emissions budget for 2035 is 25 GtCO2e/y. Sacrificing 16% of that for AI growth is madness.
https://www.unep.org/resources/emissions-gap-report-2024
(*) my estimate based on 10x AI growth in 10y
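The two percentages check out against the numbers quoted above:

```python
# Verify the 7% and 16% figures from views (1) and (2).
global_emissions = 57.1  # GtCO2e/y, 2023 (UNEP Emissions Gap Report 2024)
ai_extra = 4.0           # GtCO2e/y by 2035, the author's 10x-in-10y estimate
budget_2035 = 25.0       # GtCO2e/y to stay below 1.5 °C

print(f"{ai_extra / global_emissions:.1%} of current global emissions")  # 7.0%
print(f"{ai_extra / budget_2035:.0%} of the 2035 emissions budget")      # 16%
```

Same 4 GtCO2e/y, two very different denominators: against today's emissions it looks small, against the budget we actually have left it does not.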