Don't Let the Asphalt Bury the Garden
I’ve spent 30 years watching tech cycles come and go, from the first dial-up modems in rural Austria to the mesh networks I’m currently stringing across the Australian bush. Each time a “next big thing” arrives, we see the same pattern: a frantic rush to centralise, followed by a slow, painful enclosure of what should have been a common resource.
The current noise around AI in open source feels different. It feels heavier. There’s a justified fear that AI-generated code is hollowing out our commons. Maintainers are being buried under a drift of unvetted, mediocre pull requests, while a handful of platform monopolies strip-mine decades of community work to feed their proprietary black boxes.
Being an ostrich isn’t a way forward. The tool is here. The challenge is not to reject it, but to master it responsibly and teach others to do the same.

The Asphalt Trap
To see where the corporate AI path leads, look at the concrete graveyards of cities like Houston or Los Angeles.
When roads get congested, the reflex is to throw more concrete and asphalt at the problem. Houston’s Katy Freeway has ballooned to 26 lanes in places, yet it remains one of the most congested stretches of road in North America. This is induced demand: more lanes just encourage more people to drive, leading to the same gridlock on a more expensive, resource-heavy scale. Buses don’t help much either — they just end up stuck in the same traffic as everyone else.
True resilience comes from decentralising: light rail, bike lanes, walking paths, and — most importantly — reducing the need to drive by building local hubs. We need to stop building more lanes and start building better tracks.
The corporate AI giants are stuck in the same asphalt trap. OpenAI’s “Stargate” initiative promises $500 billion in data centre infrastructure. Microsoft, Google, and Meta are running a parallel arms race. More hardware, more energy, more water. The scale is staggering — and it generates the same induced demand. The more centralised compute they build, the more dependent we all become, and the higher the toll climbs.
Enshittification as a Business Model
Cory Doctorow calls what comes next “enshittification”: centralise the service, make everyone dependent, then squeeze them for every cent of rent. We’ve watched this exact pattern with social media, cloud storage, and SaaS tools that started free and now charge you to breathe. There’s no reason to believe AI will be different — in fact, the infrastructure investment makes the rent-seeking mandatory. Someone has to pay for those data centres.
The energy and water consumed by this centralised infrastructure is a direct drain on the planet’s future. NVIDIA and SK Hynix are doing very well, thank you. The rest of us are building a world where we pay a toll just to think.
As I explored in Are You Buying a Future Brick?, this is the same pattern we’ve already lived through with IoT devices: corporations build dependency, then exercise it. The lesson applies here with even higher stakes. If the API goes down, if the pricing changes, if the model is deprecated — your tools stop working. You don’t own them; you rent them.
The Efficiency Lesson (and Who It Belongs To)
When US export controls blocked Chinese researchers from accessing the latest NVIDIA hardware, something instructive happened. Teams working under hardware constraints — including those behind the DeepSeek models — demonstrated that careful architectural work can match brute-force compute. And to their credit, they released the result: DeepSeek-R1 is MIT-licensed, with open weights. You can run it locally, modify it, and build on it. That is a genuine contribution to the commons, and it deserves acknowledgement.
But open weights are not the same as transparent training. The data, the reinforcement learning choices, the values baked into the outputs — none of that is open. Probe politically sensitive territory — Taiwan’s status, Tiananmen Square — and the model goes carefully quiet. These are design choices that reflect the interests of the entity that built it. The same is true, in different ways, of every closed model from Silicon Valley: different censors, same lack of transparency.
The lesson is worth separating clearly: open weights let you run a capable model locally and keep your data sovereign. That matters enormously. But running it locally does not tell you what assumptions are embedded in the model’s worldview, or what it has been trained not to say. That requires a different kind of literacy entirely — one I’ll explore in a forthcoming post.
Those lessons have also been absorbed by projects with a longer track record of openness. Meta’s Llama family runs locally on open weights. Mistral — a French company committed to open approaches — publishes models you can download and run on your own hardware today. The Ollama project wraps this into a tool so straightforward that you can pull a capable model to your laptop in minutes, with no account required and no data leaving your machine.
For too long, especially here in Australia, we’ve looked to Silicon Valley as the blueprint for progress. We adopt their tools, replicate their centralised models, and act as a branch office for their corporate interests. The arms-race model of hyper-centralised, hardware-heavy AI is a warning, not an aspiration. It is what happens when you prioritise platform rent-seeking over community sovereignty.
The future of the digital commons isn’t in a Californian cloud. It’s in efficient local models that run on everyday hardware.
The Commons Already Exists — We Just Need to Use It
This isn’t theoretical. Right now, on a reasonably modern laptop, you can:
- Install Ollama and pull a capable open-weights model in under ten minutes
- Add Open WebUI for a familiar chat interface that runs entirely on your own machine
- Use Mistral 7B or Meta’s Llama 3 for most practical tasks — summarising, drafting, coding assistance — without a single byte leaving your network
No subscription. No API key. No data processed on servers in Virginia. Your queries, your data, your hardware.
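The steps above fit in a short terminal session. A sketch for Linux or macOS — the install script is Ollama’s official one, but model names, download sizes, and the Open WebUI Docker image tag are current as of writing and may change, so check the projects’ own docs before running:

```shell
# Install Ollama (official install script -- review it before piping to sh)
curl -fsSL https://ollama.com/install.sh | sh

# Pull an open-weights model (roughly a 4 GB download for the default quantisation)
ollama pull mistral

# Chat from the terminal: no account, no API key, nothing leaves your machine
ollama run mistral "Summarise the idea of induced demand in two sentences."

# Optional: Open WebUI gives you a browser chat interface at http://localhost:3000,
# still talking only to the Ollama server on your own machine
docker run -d -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data \
  --name open-webui ghcr.io/open-webui/open-webui:main
```

Everything here runs against `localhost`; unplug the network cable after the downloads finish and the chat still works.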
This isn’t about rejecting AI. As I wrote in Opti-Morons and the Death of Critical Thought, the neo-Luddite reflex is just another way to stop thinking. The question is whether we engage critically — choosing tools that serve the community rather than extract from it.
Reclaiming the Commons
Using local models means more than saving electricity, though that matters too. Sovereignty is the point — keeping power and data in local hands rather than outsourcing our thinking to a server on another continent.
For Australian institutions — councils, schools, health services, agricultural co-ops — this is a concrete governance question. Patient records should not be processed through US cloud APIs subject to foreign data access laws. Farming intelligence built from Australian soils should not feed corporate training sets without consent. Community knowledge belongs to the communities that generated it.
This is the 2026 version of what I was doing in those Austrian tele-working centres in the early 1990s: building local capability so communities don’t have to depend on distant, extractive infrastructure. The technology has changed dramatically. The principle hasn’t.
We need a Digital Commons built in a distributed fashion — not a mine for platform profit, but a regenerative resource. Open-weights models, local inference tools, and community AI literacy. These already exist. The gap is not the technology; it’s the will to build habits around it.
What You Can Do This Week
Don’t wait for policy or for institutions to lead. Start local:
- Run a local model today. Install Ollama, pull `mistral` or `llama3`, add Open WebUI. Takes twenty minutes. Costs nothing.
- Choose open-weights where you can. When you have a choice between a closed API and an open model, choose the open model. Vote with your toolchain.
- Push back on institutional lock-in. If your workplace or organisation is signing AI contracts, ask: can we run this locally? Who owns the queries? What happens if pricing changes?
- Support the maintainers. The open-weights model community runs on volunteer labour and thin funding, just like the open-source projects I wrote about in Open Source Is the Hope, But It Needs Our Help. Find a project you rely on and contribute — code, documentation, or a few dollars a month.
The choice is not between AI and no AI. It’s between being passengers on a corporate toll-road and being the builders of our own infrastructure. We’ve done this before with mesh networks, with home automation, with community telecoms. We know how to build.
It’s time to get some dirt under our fingernails and start.