KIM STANLEY ROBINSON:

And this is a science fiction thought: that we are experiencing the future as impinging on the present in ways that perhaps didn’t happen in the past, where you had stable technologies for quite a long time, and then changes. And it’s pretty clear that whatever’s going on in these large language models, they are certainly doing a lot of work, trillions upon trillions of operations, to get to what Marx would have called “the general intellect.” And then, maybe under-acknowledged here is that everybody’s going, “Oh my God, these AIs are amazing.” What I think is, “Oh my God, these programmers are amazing.” Not that the AI isn’t doing interesting things. It is. But it’s doing them because the programmers have thought up inputs and instructions that are actionable at the level of computation.

REID:

Hi, I’m Reid Hoffman.

ARIA:

And I’m Aria Finger.

REID:

We want to know what happens if, in the future, everything breaks humanity’s way.

ARIA:

We’re speaking with visionaries in many fields, from art to geopolitics, and from healthcare to education.

REID:

These conversations showcase another kind of guest. Whether it’s Inflection’s Pi or OpenAI’s GPT-4, in each episode we use AI to enhance and advance our discussion.

ARIA:

In each episode, we seek out the brightest version of the future and learn what it’ll take to get there.

REID:

This is Possible.

REID:

This season on Possible, some of my longtime favorite builders and thinkers are joining us. And our guest today is certainly one of them. In my opinion, our guest is among the most qualified in the world to help us think big and speculate about the best possible future.

ARIA:

Today we are talking to none other than Kim Stanley Robinson. For decades, he’s been a respected and prolific author of science fiction novels, short stories, and essays. The Ministry for the Future is currently making its way through my household, but many of you know him for the Mars Trilogy, which has won numerous awards. The trilogy spans the 200-year journey of colonizing Mars and, eventually, the rest of the solar system.

REID:

Stan’s work is thought-provoking, prescient, and fearless, especially as he delves into social and political commentary. His books address issues like economic inequality, governance, and the ethical dilemmas posed by technological advancements. So, I’m sure you get why we’re so excited to speak with him. He nimbly operates where science fiction becomes science fact, an area that we all inhabit as technology and society continue to interweave.

ARIA:

Stan once remarked, “By writing about the future, you’re really writing about now in a way that has poetic and symbolic meaning.” So we sat down for a wide-ranging conversation about technology, civilization, and the future. What technological advancements can we expect, and how might society and humanity change?

REID:

And, top of mind for many of us: How might AI change us? Perhaps an incisive and seasoned futurist like Stan can light the path ahead.

ARIA:

Here’s our conversation with Kim Stanley Robinson.

REID:

Much has been said about the technology or future you’re imagining coming true, but as a serious fan of your writing, I’d like to ask you about the characters in your stories. If you could bring one character from your writing into real life, who would it be? And you know, would you go on a mountaineering hike? Would you have a dinner conversation?

KIM STANLEY ROBINSON:

Yeah, that’s interesting, Reid. Thank you for that. My characters mean a lot to me. I’m a novelist, and, you know, despite the content that is visionary or futuristic, the way science fiction portrays different societies—still, as a lover of novels, I feel like they’re always based on characters in a plot. And so, my characters matter to me. And it occurs to me that my characters are based on people that I know. And so I don’t really need to have a character brought to life to have them around me. I don’t have superheroes or omnicompetent characters. For the most part, they’re pretty ordinary. And so that leads me to this: I once wrote a novel with a very great lead character, because it was Galileo. And so I wrote Galileo’s Dream to give a sense of what I called in that novel “the first scientist.”

KIM STANLEY ROBINSON:

In the novel, Galileo is dragged off by time travelers to the year 3020, so that he can be on the moons of Jupiter. And the people there want Galileo’s judgment on what they’re doing as scientists. And it’s a comedy, really. It’s a time-traveling comedy. But it occurred to me, thinking about this question, that I could make Galileo into a time-traveling detective. They could take him hither and yon in order to have his mind cast onto various topics. And it struck me funny. And for me, what I would like to do is just follow him around in his life, the way the character Cartophilus, the time traveler who narrates that novel, follows him around, lives with him, watches him at work. So, if I could bring one of my characters to life—since he’s not really mine, he’s the world’s—I would like to see Galileo in action.

ARIA:

That is awesome. So, Stan, I’m going to zoom out from characters. You know, through your writing, you’ve been envisioning our future for more than 40 years. And that takes deep imagination, but it also requires, you know, a deep understanding and conviction about, like, what are humans capable of? Like, both good and bad. And so we’d love to hear from you: How do you develop these prophecies about humans and our universe? What is your process like?

KIM STANLEY ROBINSON:

Well, thank you for that. Especially asking about what humans are capable of in the future. I’m interested in that. That’s what a science fiction writer’s certainly pursuing as a question. So that makes you into a historian. I’m very interested in how we got to what we are now, because then you can run trajectory lines off into the future and say, “Well then, maybe we could become that.” So sociobiology has been of great interest to me. How did we become human? What were we as pre-humans? What caused the brain to expand by a factor of three in just a couple million years, in evolutionary terms? And of course, language is implicated there, but other things as well. And then, when I have stories that are set in the future, I’m working off of that basis of understanding us as social primates with a certain particular history.

KIM STANLEY ROBINSON:

And then, the one thing I want to say here is that I have avoided doing what certain science fiction writers have done, quite famous ones, in establishing a “future history” that all of my works conform to and fit into a timeline. So some greats have done that. Heinlein, famously. Asimov, to a certain extent. There’s some retroactive making-it-all-fit-when-it-didn’t-fit-in-the-first-place type work. I’ve always abjured that—that’s not the right way. Each novel should have its own history and past leading back to now, each science fiction novel. So that’s what I’ve done many times. The assignment has been weird. I’ve said to myself, like in 2312: let’s see what humans are like 300 years from now. Well, for me, that’s an enormous stretch. For some science fiction writers, that’s like a snap of the fingers.

KIM STANLEY ROBINSON:

For me, that is just as long out as I can possibly imagine. And then you have to do the historian thing again. Well, if things are so different in 300 years, what happened in the first half of the 21st century? Then the second half? Then the 22nd century? Oh my God. Can you even think that? And so for each novel, I do something that we all do all the time: Cast your mind forward. And it’s sort of an if-then: If we do this, then we’ll get to that. It’s extremely basic to human cognition. We are all science fiction writers. We all do this all the time.

REID:

So, you know, part of the thing that is an interesting parallel between what you do and kind of what I do with my tech investor hat is: you’re making predictions based off what humanity currently is, and the patterns of humanity, where it’s going, and then also technology, right? But what would you say—if you think about the timeline of humanity—where is humanity? One way of asking this question might be: What’s your theory of human nature? Where are we on the narrative arc of humanity?

KIM STANLEY ROBINSON:

I’ll pursue it kind of one strand at a time, ’cause there are several interesting strands. One of the things that has gotten stronger for me as the last 40 years have passed is a conviction that humanity coevolved with Earth. It will never prosper off of Earth. Our trips off of Earth are trips into the death zone, which is what climbers call the elevations above about 26,000 feet, where the body starts to break down no matter what you do, and you have to get back down out of the death zone. I think all space above 26,000 feet is the death zone. And while we’re up there, we are in the process of breaking down. And this has been caused by many factors, but I would point out one main one, which is the gut microbiome. Not something we talked about when I was a young science fiction writer.

KIM STANLEY ROBINSON:

And so, the thing about that one is this information that now is common—50% of the DNA inside your body is not human DNA. Well, this is news, and it changes everything. Because you can’t take Earth with you when you go off, even to the moon or Mars, much less off into interstellar space, where I have actually written an entire novel suggesting we’re not going to the stars, because we coevolved with Earth and we’re only healthy here. And in my solar-system novels—some from the mid-eighties and some from the last 10 years—you’ll see a shift: in the ones from the last 10 years, people always have to go back to Earth for their sabbatical. They have to go back and eat some dirt. They have to get connected with the biosphere of Earth, which is their extended body, in order to be healthy again.

KIM STANLEY ROBINSON:

And then when they go back out into the moons of Jupiter or the various solar-system spaces that I’ve invented over the years, they are, in fact, getting unhealthy. So that’s one consideration that has grown in my mind. This geocentricity of humanity—that we are coevolved with this planet and biosphere, that the Earth is our extended body, and that we begin to die as we leave it. Maybe slowly, you know, maybe it will just cut years off your life, like smoking cigarettes or something. But, in any case, it’s bad for you not to be on this planet. And then you mentioned human nature. I’m very interested, as I said, in sociobiology. We evolved by certain activities with each other—we’re a social species. So human nature is intensely social, and therefore there’s both competition and altruism.

KIM STANLEY ROBINSON:

And I love this phrase by Raymond Williams, an English literary critic—structures of feeling. In any historical moment we have basic biological feelings for sure, as animals—fear, love, hope, et cetera—but culture and language together shape a much more articulated array of emotions that strike us as basic and natural. But in fact, they’re cultural and learned. And so the big, basic biological emotions, strong as they are, are split up—I mean, in English, you can quickly come up with 300 or 400 words for emotions. And so this huge, articulated system of emotions changes over time in what Williams called a structure of feeling, where what feels normal at any one given time might not have felt normal at all at an earlier time. And take the emotion that we could, in English, tag as love.

KIM STANLEY ROBINSON:

I mean, famously, the ancient Greeks would split that up into eros and agape and a couple more. But it’s so clear that in different cultures that means different things to people, depending on what they’ve learned in their cultural system. And so when you’re writing science fiction, you have to think to yourself: Is this future society going to have a different “structure of feeling,” and can I possibly think my way into that, or suggest it in a way that alienates the reader? The reader is going, “Oh my gosh. These people, they’re different than they are now. They’re different than me.” And that’s an accomplishment, both for historical fiction and for science fiction.

ARIA:

So it’s interesting. Reid and I, last week, were with Ariel Ekblaw, who is a previous Possible guest from season one. If anyone hasn’t listened to her episode, she’s an expert on going into space, and she dreams of going to the moon and going to Mars, et cetera. But what she was positing last week, interestingly enough, was: perhaps Earth is for us. Earth is for humans. And actually, what is going to happen in space is offshoring, off-planeting. We do chemical processes up in space, and then we come down and live on our sort of beautiful green Earth, where we don’t besmirch it with some of the bad stuff we’re doing right now. So, lots of science-facters agree with that science fiction take. It’s also really interesting to hear you talk about the difference between, you know, some of your books taking place in plus-15 years and some of your books taking place in plus-300 years. There’s such a different experience in creating that. And I think sometimes people underestimate how different the world’s going to be in 25 or 300 years. So I would love to ask you about progress. Progress of humanity. Progress of the human race. Are there, like, epicenters of progress? How are we doing right now?

KIM STANLEY ROBINSON:

Oh, I do think there are moments of intense acceleration that are mostly accidental. Sometimes they’re pushed, as in wartime development. World War II very famously began what social scientists called “the Great Acceleration.” Everything that we measure since World War II has been accelerating pretty consistently, with jumps of intense acceleration along the way. I’m going to say one of them was very definitely 2020 and the pandemic, an acceleration in our awareness of our dependence on the biosphere, and the ramifications of that are still unfolding for us. And sometimes you can think to yourself—this is something for everyone to ponder—that, after a period of acceleration, things slow down. But all these things are relative to each other. And it could be that, after a pace of intense change like these last three years, what we now think of as being slower—if we do—is actually just adjustment to that more rapid rate of change, and that we’re on a perpetual speed-up of change.

KIM STANLEY ROBINSON:

And this is a science fiction thought: that we are experiencing the future as impinging on the present in ways that perhaps didn’t happen in the past, where you had stable technologies for quite a long time and then changes. I just went to the bicycle shop yesterday because I was irritated: things I knew about the bicycle that I learned 50 years ago were no longer true, and I couldn’t get the tire off, to my chagrin. And that’s because there was a new technology. Well, it was 50 years since I last looked at how a bicycle works, and of course there are going to be some changes. Bicycles are very stable compared to other technologies. But this speed of acceleration is of huge interest to me, and indeed it’s one of the things I’ve retained in all my science fiction. And let me quickly put in an aside: There’s near-future science fiction that is really day-after-tomorrow or now. There’s far-future science fiction: space opera—Star Wars, Star Trek—a kind of fantasy space where people are zipping around the galaxy.

KIM STANLEY ROBINSON:

And then in between those, there’s what I call future history, ’cause there’s not a good name for it—a couple hundred years out. So in my future histories, the Mars trilogy, my earlier work set in the solar system, 2312, I keep coming back to that zone that’s a couple hundred years out as being extremely interesting, hard to write about, and somewhat depopulated in the science fiction landscape. In other words, people usually do near future or far future. They don’t usually go to this zone that I’m interested in, a couple hundred years out. So right from the Mars trilogy, and even before, I had what I called the Accelerando—a period that changed in different novels, but call it, say, the 22nd century, where we solved certain problems about our relationship with the Earth and got a solid platform underneath ourselves in terms of our bio-infrastructural life support system.

KIM STANLEY ROBINSON:

And then all of the other advances we were making kind of wound together into what I call the Accelerando. Amongst science fiction writers, I’m kind of a slowpoke in terms of belief in what kind of progress we’re going to make. And I’ve been more interested in the socioeconomic features than in the high-tech features. So for me, rather modest changes seem like a big acceleration. [Laugh] An acceleration of justice. These issues have consumed me my whole career. Utopian fiction is about making a better social system: more just, more equitable, more successful for all rather than just a few. And so that has put me into a slightly different zone of what kind of acceleration I’m talking about.

REID:

This actually goes to a pattern that I’ve been curious to talk to you about for a number of years, which is this interplay between the future of humanity and the future of technology. Because part of what opens up new possibilities of human organization, civilization, societies, et cetera, is technology. You have media technologies, which are the constitution of the state. You’ve got questions around financial technologies. One thinks about technology as a central thing in that reorganization, and you tend to focus very intensely on the social and political side of it. And, you know, my reflex tends to be that the best way to change the politics is to change the landscape of technology, of possibilities, of incentives and so forth. A classic one for climate is either geoengineering or nuclear power, you know, clean energy for doing stuff. So say a little bit about why you focus where you do, why you’re, to some degree, more optimistic than most that we can make social progress without these other things. Shine a light there for how you’re thinking about it.

KIM STANLEY ROBINSON:

It is an interest of mine, and I think that all art has a political aspect to it, but science fiction is inherently political because, as you postulate your various future histories, you’re revealing a theory of history that has a political element to it. What’s driving change? Et cetera. And so, technology: there are entire books called Does Technology Drive History? But what I always want to suggest is that software matters as much as hardware. And so, coming from the field as you do, from the angle you’ve come from, you must agree with this. The software is crucial. It isn’t just the metal, the plastic, the bits. It’s the software. Okay. Software is a technology. Therefore law is a technology, and justice is a technology. They’re software systems. And so at that point, I’m talking about technology as much as anyone else, but I’m talking about the software of how we run our affairs with each other.

KIM STANLEY ROBINSON:

There’s feudalism. Then there’s capitalism. Then there’s whatever comes next, because it isn’t gonna end. And again, Raymond Williams: there’s residual and emergent. At any moment in culture, there are parts that come from the past, the system right before you, or even the deep past. There are emergent features of the next system that are already here. They’re harder to see, and it’s harder to tell whether they’ll come to dominate or not. But they’re there. And so, residual and emergent. What Williams would say is that feudalism is a strange combination of the ancient empires and emerging capitalism. Then capitalism is a strange combination of residual feudalism, where there are kings, there are serfs, there is a managerial class that runs the thing for the kings, and then something emerging that’s different. What might come next? And since we come from a system—which is to say America—that values, or at least states as a value, equality and justice, you can imagine increases in those factors.

KIM STANLEY ROBINSON:

That would mean that the intense power relation and inequality of capitalism is going to give way to something more just. What would it be? How would it work? Well, these are interesting questions for stories to tell. So that’s my technology. I mean, that’s my focus, in the way some people might focus on AI, or on geoengineering, or on engineering in general, or on rocketry. These are technologies that have driven a lot of science fiction stories in interesting ways. But for me it’s been this software system that we call, I don’t know, government.

REID:

I think we would be remiss, getting into this question of how the future is being affected by software and humanity, not to think about AI and civilization. So how do you think AI is going to affect how we organize ourselves?

KIM STANLEY ROBINSON:

Well, it’s a good question, Reid. And first, I want to confess this: I am confused and even stunned in my thinking about AI. I come to it as an English major and an outsider, but also as a science fiction writer. So I have been thinking about it for many a year, but the recent rapid developments have left me in a state of confusion as to what might come next. That’s how I want to start. And then I want to say that the Turing test was always a very low bar. We maybe didn’t know that. But it was. And also, artificial intelligence is a hype name. It is not an accurate descriptor for clarifying our thinking. So if you were to call it extremely rapid computation, you would have a completely different vibe coming off of it than when you say artificial intelligence. ‘Cause what’s intelligence? And okay, it’s artificial.

KIM STANLEY ROBINSON:

Yes, indeed. We’ve concocted it. But what is intelligence? And if it’s only extremely rapid computation, and if humans are doing something different in their brains and bodies, in their embodied minds—if they’re doing something different than simply extremely rapid computation, which is a big if—then we’re talking about different things and different capabilities. So, okay, with that said—and I want to say that I’m still blown away by the most recent developments—the course that we’re on now, with the systems that we’re using now, as amazing as they are, is not going to lead to minds thinking like ours do. It simply isn’t. So sometimes they call it AGI—artificial general intelligence—which might involve, what would it be? There you begin to drop into philosophy. Questions of agency, will, judgment, desire. As I say, you almost immediately plunge off into the deep ends of philosophy and human thinking, which is already murky enough, to the point where, well, here we come to software again.

KIM STANLEY ROBINSON:

The algorithms that we ourselves have written, we can ask these algorithms to recursively improve themselves, but they don’t know what that means, because we don’t know what that means. We can’t expect what we’re doing now in AI to result in artificial general intelligence, because we don’t know what that is, and we don’t have algorithms that will make it happen. So let’s go back to just extremely rapid computation. What can it do for us? Well, it can do a lot. For one thing, it can now imitate ordinary human discourse and even human visual imagery. And so, in the hands of good actors or bad actors, it can be a really powerful tool. And that makes it interesting. Powerful tools are always interesting. And since we are talking about software systems for how we run society, there are people who are saying, “Look, with our current levels of AI, we could have direct democracy. The entire world could be one eight-billion-person congress running the world, and AI would facilitate that.” Probably true. And interesting as hell.

KIM STANLEY ROBINSON:

So many different social and philosophical questions are being mushed into one topic—as if it were one topic, when it isn’t. And the last thing I want to add to this, about these notions of AI and rapid progress towards artificial general intelligence, is this: We don’t know how brains work, and we can never know how our brains work. Whether we can develop better, more exact tools to see how we’re thinking while we’re thinking, well, this is one of those science fiction questions. I think there are going to be limits on how well we can understand how we think. And since we don’t understand how we think, and maybe can’t ever—it may be one of those barriers that we can’t get past—the idea that we could then concoct something that thinks like we do seems to be more remote than ever.

KIM STANLEY ROBINSON:

But I have my reasons for thinking about this in terms of AI, and I have written entire novels in which the AI is the narrator. So I’ve had a lot of laughs, I must say, pondering what artificial intelligence might be like when it’s tasked with, say, “write me a novel.” Because famously, right now, AI cannot summarize very well. It can’t pick out the points of importance and then summarize them. If you give it the task and it just tries to recombine in the ways that it’s been told, well, you get results, and they’re not uninteresting.

REID:

Just to go into some of the AI stuff a little bit deeper—I do think it actually is doing a pretty good job on summaries. Part of what you’re getting is a kind of social, collaborative intelligence from the training on large corpuses of language, which allows it to do linguistic tricks easily. It allows it to do translation, code generation, a bunch of stuff. And one of the things I think will be a common part of our toolset in the next year or two is, you know, you get a document, and it says, “For you, here’s the summary.” Or, “For you, here are the three pieces that are most important to look at.” And it will be in a fitness-function loop by which it will be getting much better at that.

REID:

And I do think one of the things that’s interesting about this, if you go kind of steam-engine-of-the-mind, is that we are going to be changing our thinking, communication, and analysis patterns, because it’ll change where our cognition most leverages the tool. So, for example, when we got writing, memory wasn’t as important [laugh], right? When we have Google, memory is not as important. So it changes the shape of which things we need to do. And AI is going to do the same thing, I think, in this kind of symbiotic relationship with us. And so I think that’s some of what we’ll see. But I think it plays very well into your kind of social-technology, post-capitalism science fiction, and some of the thinking you’re doing. And that’s the reason I’m nudging you a little bit more in that direction, ’cause I would love to see what you will make of it in your future writings.

KIM STANLEY ROBINSON:

Well, thank you for that. And you did indeed send me a personalized narrative where Kim Stanley Robinson was the prompt for the AI to personalize, and it was pretty amazing. And it’s pretty clear that whatever’s going on in these large language models, they are certainly doing a lot of work—trillions upon trillions of operations—to get to what Marx would have called “the general intellect.” This is what everybody has said about this before. And you’re right, although I don’t know if it’s summarized so much as collated, if you see the distinction I’m making: enough matching sentences have been searched that when you put it together, it makes a grammatical—and sometimes a logical—whole that is new. And that I find stunning, because the programming, the algorithms, and the inputs from humans have to have been powerful, suggestive, and generative in ways that are not trivial.

KIM STANLEY ROBINSON:

There are some people you can see just scoffing at large language models as being nothing but recombinations of previous sentences that have been sorted by probability. I don’t think that’s true. I can see that there’s something a little more generative going on. And so that implies to me that the programmers are doing interesting things. Not that the AI isn’t doing interesting things. It is, but it’s doing them because the programmers have thought up inputs and instructions that are actionable at the level of computation. This is pretty great programming. And maybe, you know, under-acknowledged here is that everybody’s going, “Oh my God, these AIs are amazing.” What I think is, “Oh my God, these programmers are amazing.” And the ingenuity, the creativity—the “let’s try this”—and the iterative aspect of working on these problems from the point of view of the programmers, not just the AIs: it’s a hidden intellectual effort.

KIM STANLEY ROBINSON:

And some of it is ’cause it’s proprietary, and people are hoping to make money off of it. And some of it’s because it’s so much human work that you couldn’t describe it to an outsider even if you tried. So this is where it gets interesting. And at the level of commentary and journalism, you don’t have a proper acknowledgement of the human work behind these new large language models, the ingenuity, the creativity. What it’ll lead to is where I have to just confess: I’m too flabbergasted at the current progress to be able to think very well about what could come next.

ARIA:

I also love that. Everyone’s like, “The amazing ingenuity of AI.” And it’s like, well, who do you think created it? [laugh] We did. We’re the ones who are creating these tools. And, you know, I think it is so important that you talk about nomenclature. The journalistic headline “Extremely rapid computation is going to take over the world and kill you” just doesn’t hit the same way; people might not get as scared, and so when you use different words, that really matters. Which brings me to the words that you use to try to influence policymakers and public discourse. My husband and I both read Ministry for the Future this past summer. It was just incredible what we were learning while we were, you know, postulating about the future of our planet. So how do you think about that? Are you writing these books and your more recent stories to save our planet, to influence policymakers? Is there a call to action with them?

KIM STANLEY ROBINSON:

Yes. I would say yes, especially for The Ministry for the Future. But the whole line of books going back has also been a way of thinking about how we decide what we do, and what we should decide to do from where we are now. And just briefly, my novel Aurora suggests that this notion that humanity’s going to go to the stars—and that that’s a measure of humanity’s success—is a gigantic category error, a story we told ourselves before we knew certain things about the size of the universe and the nature of our biological being. It turns out that story is wrong, and we should toss it aside and really focus on Earth. So I’ve been trying to impress ideas, notions, philosophical positions, political positions through my novels all along. With Ministry for the Future, I was thinking, let’s start in the present.

KIM STANLEY ROBINSON:

Let’s do a best-case scenario for the next 30 years, which means we dodge a mass-extinction event and we begin to decarbonize. And if we do that in the next 30 years, that’s a utopian story at this point, ’cause we are in terrible, terrible danger. So I did all that on purpose, and indeed it was a kind of an everything principle. Even the things I wasn’t going to cover, I needed to list in the novel as things that should be covered in some other novel. I wanted to be comprehensive. I wanted to throw in the kitchen sink. And I did. And it seems to have worked, partly because of the juggling act and the formal games of the novel, partly because of the characters: Mary and Frank and Badim and all the rest of them. And so it functions as an ordinary novel that you can come back to, like, every third chapter.

KIM STANLEY ROBINSON:

But in between you’re getting all of the cards on the table—the reasons why this future could happen and that kind of thing. The ideological understructure is kind of like looking at the backside of the carpet. You have to look at the threads, and you have to appreciate threads as threads, which some people don’t. They’re like, please just give me the picture. But Ministry is a particular kind of a novel. It’s not unique, and it’s not new. There have always been some novels like it. And I read those, I drew on those, I loved those. Just for one: John Dos Passos’s U.S.A. trilogy, which is the great description of the 1920s in America. That kind of multi-stranded, many-charactered, comprehensive novel. They exist. And I did one there. And indeed, I was hoping—but I have to say immediately that my hopes have been exceeded so hugely that, again, I’m stunned, and I’m at a loss to explain what’s happened in these last three years since Ministry came out.

KIM STANLEY ROBINSON:

I’m wondering, and more and more I’m convinced: A, people are scared. They want a story that tells them that the ordinary workings of civilization could actually succeed, even without a plan. Yeah. And then, B, the pandemic was a slap in the face, such that during and after the pandemic, reading Ministry gave you an extra spur in the butt or in the brain to make you think, ‘Okay, this isn’t just a science fiction story. This kind of stuff does happen, ’cause I’m living the pandemic right now.’ And there was an urgency there that has been, yeah, amazing and stunning. I’m still in the midst of it, still trying to cope with it.

REID:

Two related questions: What have you seen as feedback from Ministry for the Future that gives some lens on the use of science fiction—you know, the Accelerando, 300 future years of history—to try to help us navigate and shape? And then, similarly, what would you call on scientists, government leaders, business people, and so forth to do? As in, “Hey, look, part of what I’m trying to do with this kind of leadership is to get you to expand your thinking, your capabilities, your action in the following kinds of ways.” And obviously, the two questions are related.

KIM STANLEY ROBINSON:

I’ve seen, I think, that people want to put their own particular efforts and their own disciplines into the context of the totality, of the larger historical picture. So if you get a novel like Ministry in particular, it allows you to think for a while about the big picture and then see your own work as a brick in a wall. And you can see the wall. And you need to see the wall, partly just to keep from losing hope because your brick isn’t big enough. ‘Cause nobody’s brick is big enough. So that I’ve seen, and that’s one explanation for the rather incredible response that that book has gotten. But then also, what I’ve been learning is that everybody was already working on all this stuff, and they maybe didn’t know about the other efforts. And I myself didn’t know about the efforts. So I wrote about the carbon coin based on this one paper by Delton Chen. It’s a good idea. It’s a good concept.

REID:

Pi, how would you explain this?

Pi:

Delton Chen’s carbon coin is a concept that aims to address the issue of climate change by creating a new currency backed by the removal of carbon from the atmosphere. The idea is that this coin would be used to reward companies and individuals for their efforts in reducing carbon emissions and removing carbon from the atmosphere. Chen argues that this could provide a financial incentive for climate action and help to reduce the amount of carbon in the atmosphere. The concept has gained some traction in the scientific and financial communities, with some seeing it as a potential solution to the challenge of climate change. However, it’s important to note that implementing such a currency would require significant political and economic support.

KIM STANLEY ROBINSON:

I didn’t know the Network for Greening the Financial System already existed. It’s 90 of the biggest central banks. They’ve been trying to figure out how to green money right at its source. Fiat money, as it’s created by the central banks of the world, will be tilted green right from the get-go. That’s the carbon coin. It was being worked on before my novel was published. But since my novel came out, people have been pointing out to me things like that. That one I think is crucial, ’cause we need to pay for all this good work. But there are so many other areas where what I was postulating as futuristic, something that might happen in the 2040s after horrible disasters, was already happening in the twenty-teens. And it’s efflorescing and coming to the fore in the 2020s, because it’s such a scary moment, and because of all these things that are making the 2020s a crucial decade.

KIM STANLEY ROBINSON:

And because of the response to my book and people inviting me out, and also just telling me, “Oh, you need to look again at chapter 67,” or, “Oh, you know, I can see that you know about landscape restoration, but come out and see these 3,000 acres in the Central Valley where we are working our butts off just to change one cow field back into a tule marsh,” et cetera—it’s an ongoing, living process in the world. And of course, I found that immensely reassuring. So Ministry for the Future is not prophetic; it’s not calling for things that didn’t exist. It turns out to be a kind of individual, scattershot bit of reporting on things that were already happening, and sometimes I didn’t even know about them. So, I would say that here we are in late 2023, and the education I’ve gotten in these last three years has led me to believe that although it’s going to be a wicked political fight, a financial fight, and possibly a physical fight with weapons, the possibility of success is truly there. And, you know, I was a more worried person in 2019 than I am now. And that’s kind of amazing.

REID:

And with that, I think we shift to rapid fire. So to open, other than obviously your own work, is there a movie, a song, or book that fills you with optimism for the future?

KIM STANLEY ROBINSON:

There are so many: Galaxy Quest, Blast from the Past—but here’s the one: Groundhog Day. Because it’s one of the greatest movies of all time, and it’s existential. How do you live? How do you live? And you’ve got to keep working at it. [laugh] So, let’s say Groundhog Day. Yeah.

ARIA:

Oh my God. Stan, that was the best answer yet. So, obviously, you’ve been interviewed hundreds of times. Is there a question that you wish people would ask you more often?

KIM STANLEY ROBINSON:

Not really, because so many have been asked. Although it seems to me that it’s kind of fun that I don’t get asked this much: you know, “Okay, now we’re done. What are you going to do with the rest of your day?” Because everybody’s got the rest of their day—either those listening to this podcast or, you know, us making it. And so, I never get asked that. And I can say, “I’m going to go garden. I’m going to go weed nutsedge, my bête noire in my garden, the unstoppable nutsedge, and plant cabbages. And then I’m going to go run the Frisbee-golf course with my running friend Neil and hang out with the family here at home.” So that’s the rest of my day, and I hope everybody else has one as pleasant.

REID:

Going and climbing a mountain was one of the things I would’ve expected as a possible answer from you.

KIM STANLEY ROBINSON:

Yeah, I like that. But not today. Yeah.

ARIA:

So leave us with a final thought on what you think is possible to achieve if everything goes the right way in the next 15 years. And what’s our, sort of, first step in that direction?

KIM STANLEY ROBINSON:

Well, the potential is huge, which I would just say is general prosperity and every human living their full life. That’s possible. So the first step: we take it every day. It’s a grind, and millions of different developments all have to happen. And everybody’s got that brick that they’re working on in the larger wall. I think that climate dread is real. I think despair is real and justified by the situation that we’re in, and by the kind of negativity that we face from some humans. It is going to be a wicked political fight to get to this best-case scenario. But the potential at the end of it is truly remarkable if everybody gets to be fully themselves. We will still have all the natural tragedies of death and loss, love, and all that. It will not be a world of universal happiness. Not at all. But we won’t have the unnecessary tragedies or the blighted potentialities. It’s quite possible to get to eight billion people, and then the human population will slide downwards over time to even more comfortable levels in terms of the biosphere supporting us. But adequacy and prosperity for all—quite possible, and, what can I say, the reason for the fight.

REID:

Possible is produced by Wonder Media Network, hosted by Aria Finger and me, Reid Hoffman. Our showrunner is Shaun Young. Possible is produced by Edie Allard, Sara Schleede, and Paloma Moreno Jiménez. Jenny Kaplan is our executive producer and editor.

ARIA:

Special thanks to Katie Sanders, Surya Yalamanchili, Saida Sapieva, Ian Alas, Greg Beato, Ben Relles, and Little Monster Media Company.