The Invisible Hand Is a Server Farm
Artificial intelligence isn’t magic — it’s infrastructure, money, and an experiment we’re all tangled in. At She Zine, we’re not chasing the hype, but we also can’t ignore it. AI creeps into the defaults of our devices and workflows whether we invite it or not. We use it cautiously — the occasional brainstorm, outline, or quick tool when nothing else fits — but never as a stand-in for human voices.
Our approach is deliberate: treat AI like tape or scissors. Useful sometimes, but never the whole project. Ignoring it won’t make it disappear; blind optimism won’t make it ethical. What matters is seeing it clearly for what it is — not magic, but metal racks, server farms, and cooling systems burning through water and power to keep up with our clicks.
We’re experimenting with AI in public. We want to know what it can and can’t do, where it pushes us creatively, and where it absolutely crosses the line. That choice makes some readers cheer, others roll their eyes, and a few send strongly worded DMs about “AI slop.” Fair. But this magazine has always been about making things out loud, even when it’s messy.
So here we are: me, you, the bots — and a very real question about what happens when the invisible hand of capitalism looks less like a metaphor and more like a server farm drinking rivers dry in the desert.
Part I: The Birth of the Hungry Machine
Old-school neural networks were clunky, shallow, and easily bored. They couldn’t recognize a cat unless you basically told them “here is a cat.” Then came deep learning: layered networks that could inhale billions of examples and start seeing patterns with uncanny fluency. Suddenly cats, cancerous cells, climate data, Kanye lyrics — all grist for the algorithm’s mill.
That progress came with a cost: GPUs burning through electricity, data centres roaring to life, and training runs that required as much energy as transatlantic flights. The bigger the model — GPT-3, GPT-4, Gemini, Claude — the bigger the footprint.
And yet: those leaps also opened doors. AI began diagnosing diseases faster, modeling climate data more accurately, and supporting disabled users with new accessibility tools. The same GPUs driving meme generators also powered breakthroughs in science and medicine. It’s not one story — it’s two, unfolding at the same time.
Part II: The Earth Becomes a Dataset
Mining the Planet for Data
AI doesn’t just eat energy. It eats culture. Books, songs, images, conversations — all scraped, stripped, and spun back into predictions. Consent often skipped. Copyright dismissed as “old-fashioned.”
And behind the datasets are workers, often in the Global South, paid pennies to label toxic content or moderate the worst of the internet. This is labour, not magic.
But here’s where hope creeps in: artists are suing, unions are forming, Indigenous nations are asserting data sovereignty. New frameworks like “consent-based datasets” and “ethical licensing” are gaining traction. We’re starting to ask not just what gets scraped, but whose voices are included — and under what terms. That shift matters.
Part III: Energy, Water, and the Heat of Machines
Cooling the Beast
Data centres are heat monsters. Imagine warehouses full of servers, running day and night, generating staggering heat. To stop them from melting down, companies pump in incredible amounts of water.
In 2022, Google alone used more than 5 billion gallons of water — much of it sucked from drought-stricken regions. That’s the hidden footprint behind your AI-generated playlist.
The Greenwashing Playbook
Tech giants love to advertise “AI for climate” while ignoring AI’s climate cost. They talk about optimizing wind farms and modelling weather while building server farms that strain local water supplies and belch carbon into the sky. It’s a shell game: move the burden somewhere else, preferably out of sight. You can read She Zine’s own climate commitment on our site.
But solutions exist. Some centres are switching to closed-loop cooling, reusing heat to warm nearby buildings, or building in cold climates where water isn’t scarce. Toronto’s deep lake water cooling system is already a global model. The problem isn’t that AI has to be destructive — it’s that corporations choose the easiest, cheapest option unless we demand better.
Part IV: The Politics of Scale
The question isn’t whether AI is “good” or “bad.” It’s who controls it, and for what purpose. Right now, Big Tech pours billions into systems that optimize logistics, maximize profits, or surveil borders. Less money flows into tools that could protect water, strengthen communities, or amplify Indigenous knowledge.
But alternative visions exist. Community-owned AI collectives are experimenting with smaller, localized models. Feminist researchers are building “slow AI” that prioritizes consent, transparency, and efficiency over brute force. The future doesn’t have to belong to the monopolies — unless we accept that as inevitable.
Part V: The Cultural Smog
Yes, this piece is about infrastructure. But culture doesn’t escape. AI is sold as neutral, but its priorities are anything but. What gets funded are systems that maximize profit, optimize logistics, surveil workers, and police borders. What doesn’t get funded are the tools that centre equity, justice, or Indigenous knowledge. Intelligence is being defined by whoever owns the servers.
The Algorithm as Boss
In workplaces, AI is less muse than manager. From call-centre monitoring to warehouse scheduling, algorithms increasingly dictate pace, productivity, and even who gets hired or fired. Less liberation, more surveillance.
Whose Intelligence?
Artists, writers, musicians — all watching their work scraped into datasets, spat back out as cheap knock-offs. It’s not just plagiarism; it’s cultural pollution. A haze of derivative content that makes authenticity harder to breathe.
Creativity is where the fight feels most personal. AI doesn’t just “inspire” — it mimics. Writers, musicians, and illustrators have seen their styles cloned without credit. The flood of AI-generated work risks drowning out the weird, raw, imperfect things only humans make.
Part VI: Planetary Feedback Loops
The Climate Crunch
AI thrives on energy. Energy still mostly comes from fossil fuels. That means every new model nudges us closer to climate collapse, which in turn creates demand for AI models to help us adapt to disasters.
Ecological Blind Spots
While AI models can predict floods or optimize crop yields, they remain tools built in a narrow worldview. They don’t account for traditional ecological knowledge, community-based resilience, or the messy complexity of ecosystems. They promise certainty but deliver bias. A machine eating its own tail.
So yes, AI contributes to climate collapse. Yes, it can reinforce bias. Yes, it destabilizes creative work. But it can also be bent: toward renewable grids, toward collective ownership, toward accessibility breakthroughs. The problem isn’t the existence of the tool — it’s the extractive logic currently running it.
The future is not fixed. We can still write it.
Part VII: Resistance and Alternatives
Not everyone’s rolling over.
Data Strikes and Dataset Boycotts
Writers, artists, and journalists are beginning to resist, filing lawsuits, launching data strikes, and demanding opt-outs. The message: our work is not raw material for your model.
Indigenous Protocols
Indigenous communities worldwide are developing frameworks for data sovereignty, insisting that AI systems respect collective rights and cultural knowledge. These protocols offer a radically different approach: one that treats information as relational, not extractive.
Slow Tech and Degrowth Computing
Some researchers are pushing for “small AI” — models that are efficient, localized, and transparent. Instead of planetary-scale behemoths, imagine community-owned systems that serve specific needs without devouring the grid.
Resistance doesn’t have to mean no tech. It means different tech.
Part VIII: What Comes Next
AI is not destiny. It’s infrastructure, policy, and profit disguised as inevitability. The question is not whether the algorithm will eat the Earth, but whether we’ll keep feeding it.
The Fork in the Circuit
We can double down on corporate AI, handing over ever more power to a handful of companies. Or we can build alternative systems: feminist, ecological, community-driven.
The tools aren’t neutral. But they’re still tools.
Conclusion: Refusing to Be Consumed
AI’s story is the Anthropocene in fast-forward: extract, exploit, expand, regret. But unlike climate change, where the feedback loops are decades old, AI is still young. The future isn’t set.
We can refuse the myth that bigger is better.
We can resist the enclosure of culture into datasets.
We can demand systems that serve life instead of stripping it for parts.
This is why She Zine uses AI — carefully, publicly, and with boundaries. We don’t use it to fabricate trauma, mimic identities, or erase human voices. We do use it to brainstorm, to stress-test ideas, to stretch creativity into new shapes. We believe ignoring AI cedes the ground to corporations; experimenting with it lets us drag the machine into feminist, messy, transparent light.
We’re not naïve. The contradictions are real. But so is the possibility of re-engineering how these tools are used, who gets paid, and what values shape the future.
None of this makes AI ethical. But it does make the work transparent. And maybe that’s the most radical move we have right now: admit the contradictions, hold ourselves accountable, and refuse to give up our creative agency to corporate inevitability.
The algorithm may be hungry, but it doesn’t have to eat the Earth. With intention, accountability, and imagination, we can teach it new tastes. And it doesn’t get to eat our weird, handmade, feminist corner of the internet without a fight.
🔥 Want In On The Experiment?
AI is messy, polarizing, and unavoidable — which is exactly why we’re testing its limits out loud, in public, and with you watching.
Join the #NewGirlArmy. We’re building a magazine from the margins — and we mean it when we say it’s yours, too.
→ Subscribe to The Edit — our weekly(ish) dispatch of rebellion, DIY media, and creative fire.
→ Submit Your Work — tell us how you’re experimenting, resisting, or remixing AI in your own practice.
→ Share Your Opinion — should AI have a place in indie publishing at all? We want to know.
→ Follow @shezinemagazine — join the New Girl Army.

Sources & Further Reading
MIT News – Explained: Generative AI’s environmental impact (2025)
Wikipedia – Environmental impact of artificial intelligence
Nature – The hidden labor behind AI
Time – The Secret Cost of Google’s Data Centers: Billions of Gallons of Water
EESI – Data Centers and Water Consumption
Latitude Media – Where does the AI boom leave Google’s cooling strategy?
Quartz – Google’s data centers consumed 6.1 billion gallons of water in 2023
Rutkowski Interview – AI art and the grief of being copied

AXO (she/her) is a multidisciplinary creator, editor, and builder of feminist media ecosystems based in Toronto. She is the founder of She Zine Mag, Side Project Distro, BBLGM Club, and several other projects under the AXO&Co umbrella — each rooted in DIY culture, creative rebellion, and community care. Her work explores the intersection of craft, technology, and consciousness, with an emphasis on handmade ethics, neurodivergent creativity, and the politics of making. She is an advocate for accessible creativity and the power of small-scale cultural production to spark social change. Her practice merges punk, print, and digital media while refusing to separate the emotional from the practical. Above all, her work invites others to build creative lives that are thoughtful, defiant, and deeply handmade.