This Berkeley professor is exposing the hidden physical toll of our digital world
Alex Saum-Pascual proposes that new artistic representations could help bridge the gap between knowing a technology is harmful and actually changing our behavior.
Brandon Sánchez Mejia/UC Berkeley
January 21, 2026
It’s easy to forget that the cloud isn’t an amorphous ball of fluff, says UC Berkeley Professor Alex Saum-Pascual — that it is, in fact, physical internet infrastructure that takes many forms in many places across the world.
In her forthcoming book, Earthy Algorithms: A Materialist Reading of Digital Literature, Saum-Pascual argues that digital tools like generative AI mask the messy reality of the internet — the massive energy, hardware and human labor it requires — to trick us into thinking we are separate from nature.
This vast footprint stretches across every corner of the globe. Hundreds of fiber-optic cables, buried underground and crisscrossing the seabed, connect to millions of servers housed in nondescript data centers located from California and Florida to Ireland and Indonesia. With them come significant environmental and social costs, from surging carbon emissions and local water scarcity to habitat fragmentation and community displacement.
“I’ve always been interested in that exercise around visibility, because there’s something so perverse in digital technology,” says Saum-Pascual, an associate professor of contemporary Spanish literature and culture and of new media. “Isn’t it surprising that the biggest thing that humans have made — the internet — is the most hidden?”
In this UC Berkeley News interview, Saum-Pascual discusses how she aims to pull the reality of the internet to the surface, and why we should treat AI like a “pharmakon — a kind of drug where a little bit can cure you and too much can kill you.”
UC Berkeley News: It’s easy for many of us to overlook the massive amount of energy the internet requires to continuously function. When we do a Zoom interview or stream a movie, we’re using significant resources. Why do you think it’s important to address the materiality of the internet?
Alex Saum-Pascual: It’s so true that we forget or don’t know everything that goes into making the internet and wireless technologies work. For this interview, I think I’m talking to you, right? And I am, but I’m also talking into this mic and then all these screens made by all these companies. I also don’t know where the recording platform is storing this data. We don’t know. There’s a lot of opacity in that.
And it does matter because different companies and their data centers are located in different parts of the world that have different policies, that have different approaches to sustainability, to labor — and all these things seem to be fading into the background. We either don’t know or don’t want to know.
In digital tech, we have this object that is virtual — and we use this word “virtual,” which points to immateriality, even ephemerality. And the software part of it is soft — it’s like a cloud. It sounds harmless.
But then we have the massive hardware and infrastructure systems that require space and resources and changes to local policy, and they move people from one place to another, they use their energy resources, they produce emissions. And it seems that the proliferation of these technologies requires the invisibilization of this massive land infrastructure for us to keep engaging with them.
How is the recent explosion of artificial intelligence and its ever-growing data needs contributing to this eating up of natural and human resources?
The environmental impact of digital tech has always been there. It’s always been bad. And people don’t usually think about it. I do think that it’s become a little bit more mainstream in the past couple of years, perhaps thanks to the big resource push from AI generation.
Artificial intelligence requires a huge amount of energy to run and develop in the way that we are doing right now. And it will require an even bigger push in the future, because as models become more sophisticated, they will be deployed in more and more areas of our life. So it’s that sort of perverse and weird Jevons Paradox: as something becomes more efficient, demand for it increases, and with that increased demand comes a demand for more resources, so in the end the efficiency gains cancel out.
In a way, though, this growing awareness around Gen AI is sort of hiding the environmental impact of other technologies that we use all the time.
Like what?
Streaming requires massive data centers to be on all the time just for you to engage in whatever show you’re watching, and that’s very polluting. Or the thousands of unread emails in your inbox: they’re stored somewhere. Maybe just delete them. Or the push for people to have bigger and bigger televisions in their house that stream in 4K. That requires more data, more consumption.
We never think about those things because they’re so commonplace. When something becomes automatic, it becomes invisible — we stop noticing it. It becomes such a common feature in our lives that we just don’t pay attention to it.
How might the arts and humanities help us see the impact of these digital technologies when they’re already so woven into our everyday lives?
In his 1917 essay “Art as Device,” the Russian Formalist critic Viktor Shklovsky proposed that the purpose of art is defamiliarization, that estrangement of the everyday could allow it to be seen anew. It gives a sort of ethical and even an activist component to the roles of literature and the arts — that by making things strange, they can be noticeable again.
But I’m finding that argument less and less convincing, because we know. We know a lot of the things we’re doing, we know they’re bad. I mean, I just told you streaming was terrible, but I’m probably going to go watch a show tonight. There’s this kind of terrible possibility that understanding something doesn’t lead to a change in behavior. That’s a problem for me.
What do you think would move people from knowing something is harmful, and doing it anyway, to changing their behavior, even a little bit?
Maybe we need to look at different ways of knowing something, even if they’re not intellectually understood or they’re not captured by the current representational modes that we have. That’s a question that I am exploring, both in my own creative practice and with my students at Berkeley.
How so?
I recently published a digital poem, “Resistance to” — it was commissioned by the Los Angeles Review, and they asked me to reflect on the concept of joy or resistance through joy.
I tried to step away from using automatic generation because everybody is writing with bots now, and I think the challenge is not to make the bot sound more human, but for humans to sound less like bots. And I wanted to bring on a digital object that presented very clearly my intervention, my body, my presence.
The poem starts on the website, and you can read the first part of it, but continuing requires a download to your own personal computer, to show this journey of information between the website and your own computer and to highlight that infrastructure. And I map the distance between my computer at home and the server where the website is hosted, and I map a possible journey. And that made me reflect not just on the materiality of the object, but on my own journey as an immigrant body in the United States.
So anybody experiencing this work has to do some work. They have to click, then download, then unzip and open, and then it opens a website that is interactive. You have to navigate through the different screens and find the hyperlinks, or the hidden hyperlinks. As they do that, they see that the work itself is moving — it’s being copied, it’s occupying space, it’s using resources.
And that led me to teach my students how to look at a work and say, “How was that made? Let’s see if we could make it,” and to engage in that more hands-on practice around digital work and the literary.
You teach a course where your students create works of electronic literature — pieces that rely on digital technologies to exist. What sorts of pieces do they create and what do you hope they gain from doing this?
In the class, we look at different modes of electronic literature. They might experiment with geolocated narratives that shift based on a reader’s physical location, or with creating algorithmic poetry generated by code in real time. I also invite students to create bots. In the end, the point is for students to realize that it’s not really the technical skill that is going to make them successful in this class, but the critical use of those technologies with the right poetic or narrative intent.
How do you think about teaching your other literature and cultural studies classes that don’t directly interface with digital technologies? Do you have an AI policy in these classes?
It’s difficult because I’ve become sort of the AI person and yet I’m always in my classes being like, “Don’t use it, don’t use it.”
But it’s not exactly that. I would say, like with everything, use it understanding that you are participating in something that has this history, that has these origins, that is having this impact on the world. Just like when I decide to eat a burger: I know it’s unsustainable, so then maybe I try to eat less meat, let’s say.
The thing I think is important is to be very intentional in our use of Gen AI and, in particular, of the more commercial products. Berkeley partners with Google, and so we use Gemini. And whenever I open the platform, I’m invited to use some kind of image generation, and I’m like, “No, I don’t need this.” But there might be a case where you really could benefit from this use. So use it lightly.
I like the idea of digital technology being sort of like a pharmakon: a kind of drug where a little bit can cure you and too much can kill you.
This interview has been condensed and edited.