Here are a few choice tidbits from the interview:
“There are specific technologies—automation is maybe the most obvious—which will kick a lot of things loose, though they won’t actually define what the next stage looks like; that’ll be determined by how we respond. So we could end up aiming for a Basic Income pseudo-plenty society (we’re not really post-scarcity without some pretty dramatic biotech or physics advances), or we could end up in a kind of hypercapitalist wasteland with no social security net and chrome city-states scraping the cloud layer. Or whatever.”
“The ultimate end point of being under that control—and surveillance is first and always about control of the environment and of people as an environment—can be startling. That end point can be a fruit-seller setting himself on fire in a marketplace. It can be collaboration in a fugue state — which turns out to have been weirdly common in East Germany. As the Stasi files were released, it turned out that basically one-third of the population was spying on the rest, and a lot of those people were horrified. They didn’t remember doing it. Now, you can be skeptical about that, but it seems possible that a given percentage of them legitimately had no recollection of having been part of a surveillance machine. There was a massive psychological twisting to deal with an intolerable pressure. The other outcome can be revolution in the real sense. Bentham called the Panopticon a mill for grinding rogues honest. He was half-right. It’s a millstone, and it can force compliance, but it also creates explosive resistance.”
“Reality is literally not what it seems. Your memory is wrong. Almost everything you think you know about yesterday probably didn’t happen quite the way you think it did, and every time you go over it, you’ll change it. And that’s before we talk about the real difference between the Newtonian world we experience of billiard balls zinging around, and the world Carlo Rovelli explores in Reality Is Not What It Seems, which will really bake your noodle.”
“Why are we trying to build AI? It’s not because we want to have something that makes coffee properly and walks the dog. It’s because we want a perfect, wise friend to stop us from doing stupid shit. We’re trying to build the angels we were promised who never show up.”
“Our societies are defined by the technologies that enable them. Humans without tools are not magically pure; they’re just unvaccinated, cold, and wet.”
“Science fiction is how we get to know ourselves, either who we are or who we might be. In terms of what is authentically human, science fiction has a claim to be vastly more honest and important than a literary fiction that refuses to admit the existence of the modern and goes in search of a kind of essential humanness which exists by itself, rather than in the intersection of people, economics, culture, and science which is where we all inevitably live. It’s like saying you can only really understand a flame if you get rid of the candle. Good luck with that.”
“It was infuriating. I loved it, but it was horrible, like climbing a cliff face and then someone comes along in the night and just moves you back to the bottom. I had to write, go back, rewrite another character, then rewrite what I’d just done in line with the new text, then rewrite something else in line with both… and so on. The book is iterative in a very literal sense, or accreted, or laminated, or… I don’t know… 3D-printed rather than just formed or extruded.”
“Narratives are compressed expressions of identity, cross-sectional slices. They can tell you things you need to know, but like any section or map, they do not tell you everything.”
Complement these excerpts with my conversations with other authors about craft and big ideas, this New Books in Science Fiction podcast interview, and Cyrus Farivar on why Cumulus should be your favorite surveillance-fueled dystopian novel.