PHAEDRA ELLIS-LAMKINS:

We are trying to better understand how to make sure it doesn’t reinforce the things that aren’t great about society. And we were talking to some engineer, and I was like, “Oh, we need to take money out in the morning.” And he was like, “Why does it matter?” And I was like, “No, but their money’s gone by the morning.” And he was like, “How could someone’s paycheck be gone the day they get paid?” And I was like, “Ohhh.” If we build a system built on your assumptions, you don’t get what’s going on. Efficiency isn’t necessarily liberation, right? Especially if it’s just, you know, institutionalizing the same stuff. And so we’re also thinking about how do you build access, instead of efficiency, so that it changes the way someone gets to the model, right? And so that’s a lot of how we think about the technology we build: How do we make sure that our team is really making data-driven decisions based on the inherent experiences of the people that we serve? Which is why our payment plans have a 90% repayment rate. Because we’re like oh, people shockingly don’t want their water off. They don’t want to lose their electric. And we want to accelerate their access into the system that we believe should fundamentally work for them.

REID:

Hi, I’m Reid Hoffman.

ARIA:

And I’m Aria Finger.

REID:

We want to know what happens if, in the future, everything breaks humanity’s way.

ARIA:

We’re speaking with visionaries in many fields—from art to geopolitics, and from healthcare to education.

REID:

These conversations showcase another kind of guest. Whether it’s Inflection’s Pi or OpenAI’s GPT-4, each episode we use AI to enhance and advance our discussion.

ARIA:

In each episode, we seek out the brightest version of the future and learn what it’ll take to get there.

REID:

This is Possible.

REID:

There are billions of dollars of unclaimed social welfare money in this country right now. That means millions of Americans in need of aid—people living near or under the poverty line, single parents, unemployed people—there is money in the system that could help them, but it hasn’t been accessed. One point of friction: Many of our systems are confusing and outdated.

ARIA:

In addition to that, they take so much time. And talking about reforming our social welfare system—which basically is a ton of paperwork—it might not be as glamorous as going to space or creating a fusion energy reactor, but millions of lives could change for the better if we made these systems smoother and faster. We have the technology, but it’s not built and deployed to the benefit of everyone at scale quite yet.

REID:

How can we make social aid accessible to everyone? And as we think about developing technology like artificial intelligence, how do we keep it equitable? That’s our topic for today. If we get into the numbers here, fewer than one in five eligible families actually receive the food stamps they’re entitled to. For childcare assistance, that drops to one in 10 eligible families.

ARIA:

And ProPublica reported in December 2021 that the federal TANF program—Temporary Assistance for Needy Families—had $5.2 billion in unspent funds, while applications for those funds are actually plummeting. And I guarantee you that they’re not plummeting because people don’t need the money. There are other barriers.

REID:

But there’s technology out right now that, if it were applied correctly, could make these welfare programs more accessible. What would happen if AI-powered chatbots could guide applicants through the process—or even fill out the paperwork for them? How could AI data analysis help connect people to the aid they need? And I even wonder: What would an ideal social welfare system designed by an AI look like?

ARIA:

We’re going to dig into some of these questions. Our guest today is one of the major actors out there who’s figuring out why our current aid systems are broken and how to connect people who need help to the help that they need.

REID:

Phaedra Ellis-Lamkins is a social justice advocate and the founder and CEO of Promise. That’s a payment technology platform that simplifies government debt and helps people navigate these archaic social welfare systems. We sat down and talked to Phaedra about the many ways that technology could serve all of humanity—about the important questions to ask and the important ways to engage, so that you build for all communities, and especially the communities where that elevation is life-changing, the thing that enables the world we want to see.

ARIA:

It’s so clear that Phaedra is comfortable in different worlds. Like I could imagine her being a labor organizer and being comfortable in that world and giving dignity and empathy and respect to low-income communities. But she also talks in the language of capitalism. She talks about total addressable market. And she talks about how, you know, the government is a multi-trillion-dollar addressable market that could be, you know, a source of clients, you know, customers, et cetera for so many businesses that could be built that are profitable and make money, but that also, you know, serve these markets that are often, you know, overlooked and underserved.

REID:

And part of the thing on the conversation that I think is so useful is, it’s not just her amazing company—you know, Promise getting the scale, showing how you can do this—but also sharing the principles by which all entrepreneurs, all technologists, all inventors could also participate in this very important elevation of humanity. Phaedra is not just focusing on what are the blind spots, but also what to do and how to do it. And it’s not rocket science. It was just simply spectacular.

ARIA:

So here is our conversation with Phaedra Ellis-Lamkins.

REID:

Phaedra, welcome to Possible. It’s awesome to have you here with us.

PHAEDRA ELLIS-LAMKINS:

I’m so excited to be here. A little nervous, but really excited.

REID:

You know, before we dive into the really important scale work you’re doing with Promise, I also happen to know that you spent some important time as Prince’s manager. And I wanted to know if you had any fun stories that could kick us off.

PHAEDRA ELLIS-LAMKINS:

One that probably speaks to his total smarts and expectation of everyone’s brilliance—and sometimes my, and everyone else in the world’s—utter failure to be as brilliant as him was: We were, he was performing in Baltimore, and he’d gotten somewhat frustrated with the sound crew, so he fired everyone and kicked them out of the arena. And he said, “Phaedra, go do soundcheck.” And I am not a musician. I have not done soundcheck. So I said, “I haven’t done soundcheck.” And he’s like, “It’s really easy. You can’t mess it up.” So I go to the microphone, and I’m like, “Check. Check. Mic check,” in this huge arena with all these instruments which I do not play. And I go to each instrument as you kind of could imagine a child. Kind of clicking it, hitting it.

PHAEDRA ELLIS-LAMKINS:

And then, at the end of the concert, he was like, “You did a horrible job at soundcheck.” [Laugh] He was like, “You didn’t check”—there are these things that look like sponges and rocks that are somehow connected to the whole sound system. And he was like, “You didn’t do that. You didn’t do this.” And I was shocked; you just realized his pure brilliance and his frustration with the world. And he would tell that story in the most loving way, which was like, “Can you believe? Like, she’s so smart, but she just can’t do soundcheck.” As though it was a huge failure in my life that I was unable to be effective at soundchecking pianos and drums and the equipment. So it’s one of my favorite stories, because he was just shocked at how horrible I was at soundcheck.

REID:

Well, maybe we’re going to have to end this podcast with showing soundcheck on the podcast. Yes. That is, that is amazing.

PHAEDRA ELLIS-LAMKINS:

Check, check. Mic check. I was like, I don’t know what you’re supposed to do.

ARIA:

Shocking. Something tells me that’s not going to hold you back in your career, but we will see. We will see. [Laugh] Let’s get straight into it.

PHAEDRA ELLIS-LAMKINS:

Great.

ARIA:

So I’m obsessed with Promise’s mission. I love that you are a for-profit, doing good for the world, and you have obviously gained the support of so many investors. So I would love to hear from you: What do you think was, why were you a successful fundraiser? What was it about your mission and what you’re doing that resonated with people?

PHAEDRA ELLIS-LAMKINS:

I was lucky about a couple of things. One is I’d been in a company called Honor, which I’m on the board of now. And I ran revenue—everything that wasn’t engineering. And so I think when you have made money before in a company that’s a unicorn—and I’d been there from the very beginning—that people have a sense you know how to make money. Right? So I think one thing I always tell folks when I meet people is like, if you understand how to make money, people are more likely to invest and feel compelled by you. You understand your total addressable market. You understand how you’re going to hire a team. And so I think I was, I had raised money also at Honor. I was part of the fundraising team, and so I knew investors. And so I’d raised money. I’d ran revenue. And then I think I deeply understood, and had subject matter expertise, on the topic that I was interested in. And so I think that—I talked to one of our investors, and they were like, “It was great because you knew more than we did.” So I came in, and I was like, “Nope, this is the market.” And I had a strong opinion, and you know, I just had a very deep sense of purpose. And last is: I think I felt like I was going to win. And so I think you operate differently when you aren’t kind of asking permission.

ARIA:

And do you ever feel—sometimes I feel like when you go in with like a, “and this is going to be awesome for the world,” people like pat you on the head and say like, “Oh, that’s nice, go get some NGO funding”—but they don’t believe that you’re like, “I’m here to crush the financial goals.” Did you see that as a barrier? Or no? Do you see it differently?

PHAEDRA ELLIS-LAMKINS:

I definitely felt that with a person I met with at a large VC firm. There were two things I think were different. One is: People often see technology as part of the policing infrastructure—facial recognition, or ways to more effectively arrest people and incarcerate them. And so people aren’t used to technology on behalf of people. So one thing is just trying to convince people that this is a business, right? That there are investments to be made outside of policing was definitely a disconnect. And then I just think, quite frankly, as a Black woman—I hadn’t gone to an Ivy League school. I’d been in one tech company before; it happened to have done well, and I’d been successful, and they’d been successful. But I could just have a sense when people would ask me questions, as though Black and Brown people didn’t deserve products built for them.

PHAEDRA ELLIS-LAMKINS:

Because the people who are designing them don’t design them for folks that I grew up with. So definitely, I think, people had questions like: How do you build a product for poor people that doesn’t charge them high interest? How do you build a product for poor people that serves them in a way that has dignity and grace? So yeah, I definitely think there were some. But most of our investors I knew from before. And so I didn’t have, I think, the same tension that other folks do.

REID:

You know, with Promise you help people navigate the kind of archaic systems to access social welfare. And you’ve mentioned that New York and California are actually some of the worst offenders. And as a former California resident [laugh], that was an unfortunate surprise. What makes those programs so challenging to navigate, and why do you think the system is so confusing?

PHAEDRA ELLIS-LAMKINS:

It’s interesting, because we just launched in the state of Florida a couple weeks ago, and we’re having a beautiful experience. The challenge for us as a company in California and in New York or other places is that there’s a lot of infrastructure that already exists. An example might be this program we’re working on to provide water assistance. The way we work with the state of Virginia, as an example, is we get data from the state that says, “Here’s people that are already receiving benefits.” And then we get information from a utility company about their bills. So we don’t need to ask someone to prove they’re poor. We don’t need to ask for a copy of their bills. We can just reach out to you and say, “You qualify for this program.” In these other states, the way it works is you go into an organization, you bring a copy, and you have to prove your income.

PHAEDRA ELLIS-LAMKINS:

You have to bring a copy of your utility bill. And so to me that’s where software works beautifully, where we can say, “Let’s aggregate this data, let’s identify folks.” And I think, you know, there are a lot of reasons. There are institutional interests that are part of the systems and don’t want to change. And there are concerns about privatization—is it, you know, privatizing things? There’s a role for the labor movement; I came out of the labor movement. It’s also scary, I think, to think through the idea that people deserve a certain level of treatment. And in some of the other states, where the programs are new and huge for them, I think they’re actively trying to figure out: How do I get this money out quickly? So they’re open to, kind of, the newer systems, I think. California—we don’t have a contract with the state of California. And I’m a Black woman running a company in the state of California, one of the folks who makes the case for staying here.

ARIA:

And so you talk about, you know, you get the water bill information; you get the information about who’s low-income. I think all of us have had the experience of: You walk into a doctor’s office, you fill out the form, and then you go into the next room, you fill out the same form, and then the third form, and the fourth form. And it feels like some of your magic is your use of data. So can you talk about, like, how do you use data to help people access aid? Are all of your clients states? Like, how does that interplay?

PHAEDRA ELLIS-LAMKINS:

Our clients are usually states, on the aid side. We have a basic premise—which the federal government allows—which is: If you’ve already certified someone, for example, for SNAP, which is food stamp assistance, you don’t actually need to re-certify them for every program. So our fundamental belief is: If someone’s already brought you all their documentation and you’ve certified that they qualify for this benefit, they shouldn’t have to keep proving their income to you in that same calendar year. And so what we do is we get that information that they’ve turned in—household income. And the easiest way to think about it is: We apply for someone. We apply on their behalf. We insert their income, we insert the number of people in the household, we get whatever else is owed, and we basically fill out the application. And so we get a lot of data, and we get it from multiple parties.

PHAEDRA ELLIS-LAMKINS:

And I think one thing that’s really critical is that we protect that data. We don’t share it. So even though we get information from a state, and we also get information from their utilities, we don’t share any of that information. One thing we are doing now actively is saying, “How could we think about applying for other programs? What else exists in the rest of the country? Rebates are coming, based on the climate investment. So should you think about, okay, what else do you qualify for?” And I think that’s really going to be the future: how you think about almost a wallet that goes beyond the program we’re doing, to think about: What should everyone get? And the basic thing I think about is: the only people we work with are people that are struggling. I always joke we’re the only company working with government that gets thank-you notes. A woman in Virginia just sent us a card; people find our address. Because people just aren’t used to it being easy.
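The cross-certification premise described here—if a household is already certified for one benefit this calendar year, reuse that record to prefill an application for another program instead of re-collecting documents—can be sketched in a few lines. This is purely illustrative: the `Certification` fields, the same-calendar-year staleness rule, and the program names are assumptions for the sake of the example, not Promise’s actual data model.

```python
# Hypothetical sketch of cross-certification: reuse an existing in-year
# benefit certification to prefill an application for another program.
# All names and fields are illustrative assumptions.

from dataclasses import dataclass


@dataclass
class Certification:
    household_id: str
    program: str        # program the household is already certified for
    year: int           # calendar year of certification
    income: int         # verified annual household income, in dollars
    household_size: int


def prefill_application(cert, target_program, current_year):
    """Build a prefilled application from an existing certification,
    or return None if the certification is from a prior year and
    would need re-verification."""
    if cert.year != current_year:
        return None
    return {
        "program": target_program,
        "household_id": cert.household_id,
        "income": cert.income,
        "household_size": cert.household_size,
        "verified_via": cert.program,  # e.g. piggyback on a SNAP certification
    }


cert = Certification("h-001", "SNAP", 2023, 18_000, 3)
app = prefill_application(cert, "water-assistance", 2023)
```

The design point is that the applicant supplies nothing: income and household size flow from the earlier certification, so the “application” is just a join over data the state already holds.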

REID:

You know, speaking of people at very challenging moments, another major issue that Promise focuses on is bail. And you know, when we talk about criminal justice reform, the bail system is often left out of those conversations. But as you know a thousand X better than most people, why should we be talking about it? And what are some of the ways that bail systems fail those inside it? And what should we be doing?

PHAEDRA ELLIS-LAMKINS:

Actually, bail is what drove us to focus on finances, because we realized that so many people are incarcerated because of poverty. What we saw when we were looking at the data is, we asked, “Who are the people most likely to fail to appear?” And it was someone with a suspended driver’s license, mostly lost over a parking ticket or a traffic ticket. The stated goal of bail is to make sure you show up for court—people believe there’s some level of public safety that they’re protecting—but we saw mostly poor people who couldn’t get out. And the thing is, when you lose your driver’s license, it’s a trigger for losing so many other things in your life. And so the opportunity, I think, with bail reform is to think through: How do we make sure that this is really solving a problem?

ARIA:

What I love about what you’re doing—because I feel like we have knock-down, drag-out fights in this country over politics and policies and which way to do it, and then we get to implementation, and everyone’s like, “Oh, no big deal. Whatever. Good luck.” Like, it’s just not sexy. No one wants—sorry, Promise is very sexy—but you know, no one wants to talk about it. No one wants to talk about how it’s done. But it does seem like in this country, there’s so much more money in law enforcement and in prisons than in welfare. When in reality we all just want public safety. Democrats, Republicans, conservatives, liberals—we all want public safety. Why do you think there is so much more money in enforcement than in welfare? And are there levers that we can use to change that? Are there things that you’re seeing in your work?

PHAEDRA ELLIS-LAMKINS:

One of the hardest things is there’s a narrative of the deserving and the not deserving. And I think we saw this during COVID with unemployment. It was like: Wait, who’s unemployed? Suddenly you’re like: We should have benefits. And that’s not enough money; that doesn’t really give you enough to survive. Part of what we have to be really honest with ourselves about is race and class. It feels icky, and it’s sometimes uncomfortable for people to talk about, but these institutions—policing and the criminal justice system—are excellent at incarcerating, specifically, Black and Brown poor people. I don’t think there’s a lot of innovation on behalf of technologists, because it’s not the issue that impacts them. I think that a lot of people have a story like, oh, someone got robbed, or something happened. And people don’t think about the market being so clear. It was shocking to people when I said, “This state spends five billion dollars on this aid alone.”

PHAEDRA ELLIS-LAMKINS:

And I think both investors don’t understand the scale and depth of the public welfare system, and I think a lot of the model for how entrepreneurs work is problems that personally impact them. And I think it also is one of the places that government outsources a lot. Right? What’s different in the public sector and the public welfare system is that it’s much more run by people who work for government, whereas in policing, prisons are outsourced. Technology is outsourced. I think the innovation that we see in some markets, it’s like larger companies…there’s just not that same level of investment. So I think it is all those things.

ARIA:

I mean, what I like about what you’re saying is that we often talk about these problems as intractable problems. And I’m hearing from you: No, these problems aren’t unsolvable. We actually just perhaps don’t have the entrepreneurs going out and solving them, because perhaps the entrepreneurs haven’t been in those situations. Or we need people who come from low-income backgrounds, because they’re the ones who are closest to the problem. And so, I know this is like an impossible question—and so you don’t have to say ideal—but I wanted to ask: What does the ideal welfare system look like? Or is there even a first step that could get us closer? It sounds like we could use technology to make our current welfare system better. Like, is there a first step we could take, a direction we could head?

PHAEDRA ELLIS-LAMKINS:

Yeah. There was a piece that The Atlantic did maybe a year or two ago about time—the time tax. And I think unemployment was a good example: the amount of time you need just to be able to receive some of these benefits. We had a state talk to us and say, “We’d like to get this application down to 20 pages. Wouldn’t that be amazing?” And we were like, “Or we could not have pages at all, unless someone needs them.” And so I think the first step is actually trying to use data that governments already have to answer the questions that they have for individuals. Right? The basic premise that we have is: Oh, if I already have this information—which I think is what technology does—then instead of data being a resource for revenue generation, it can actually be a resource for making the system work better. Because government has such deep levels of data. Right? You have tax information, employment information through payroll taxes—it’s incredible how much information exists. So to me the first level is just: What do you not need to ask people for?

REID:

I mean, one of the things I think is really important about the work you do—the reason why we’re doing this podcast—is for people to understand that there are super important markets here that can be really economically favorable, but also enormously great for society: your installment plans used by older people on fixed pensions, working-class people, people impacted by COVID. Now let’s turn the focus a little more to the technology of it. So, you know, how have you created technology that is accessible to such a diverse group of people? And then, how are you thinking about AI in this context?

PHAEDRA ELLIS-LAMKINS:

I’m deeply interested in this, and I so appreciate Aria and others talking to me about the topic. We are not sure AI is ready to be public-facing for us as a company. We think it’s a tool to make our systems more effective. I’ll give you some examples of how we’re experimenting with AI. One is document processing. Government actually has a lot of written information, so we can use language models to capture and process that information. The other piece, for us internally, is we’re trying to better understand how to make sure it doesn’t reinforce the things that aren’t great about society. So that looks like: if decisions are made in a certain way, how do we not use the data to reinforce that? I saw this for us where we hired an engineer—an amazing human and super smart.

PHAEDRA ELLIS-LAMKINS:

But we tried to use a payment processor—this software payment processor—and we were talking to some engineer, and I was like, “Oh, we need to take money out in the morning.” And he was like, “Well, why does it matter? It makes more sense for the software: if you took the money at 10:00 PM on the 30th, you should take it again at 10:00 PM on the 30th.” And I was like, “No, but their money’s gone by the morning.” And he was like, “How could someone’s paycheck be gone the day they get paid?” And I was like, “Ohhh.” If we build a system built on your assumptions, you don’t get what’s going on. Efficiency isn’t necessarily liberation. Right? Especially if it’s just, you know, institutionalizing the same stuff. And so we’re also thinking about how do you build access, instead of efficiency, so that it changes the way someone gets to the model, right?

PHAEDRA ELLIS-LAMKINS:

The real opportunity for us would be not if it just became easier, but if it meant that people, like in those rural areas, all of a sudden got access to something. Rural folks tend to get the least amount of resources, because it’s hardest to get somewhere to get things. So how do we not just use what already exists, and the assumptions about what percentage of aid goes out to whom? And so that’s a lot of how we think about the technology we build: How do we make sure that our team is really making data-driven decisions based on the inherent experiences of the people that we serve? Which is why our payment plans have a 90% repayment rate. Because we’re like, oh, people shockingly don’t want their water off. They don’t want to lose their electric. And we want to accelerate their access into the system that we believe should fundamentally work for them.
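The payment-timing anecdote has a concrete scheduling takeaway. On one reading (an assumption on our part, since the transcript is ambiguous about exact times): schedule the debit for early morning on payday, right as the deposit posts, rather than at a fixed late-night slot chosen for software symmetry. The 8:00 AM hour and both function names below are hypothetical, for illustration only.

```python
# A minimal sketch of the timing choice: debit on the morning of payday,
# when the deposit has just posted, instead of a software-convenient
# night slot. MORNING_HOUR and the function names are assumptions.

from datetime import date, datetime

MORNING_HOUR = 8  # assumed "morning" debit hour


def debit_time(payday: date) -> datetime:
    """Schedule the debit for the morning of payday, before the
    balance can be spent."""
    return datetime(payday.year, payday.month, payday.day, MORNING_HOUR)


def night_debit_time(payday: date) -> datetime:
    """The engineer's software-symmetric slot, for contrast:
    a fixed 10:00 PM debit."""
    return datetime(payday.year, payday.month, payday.day, 22)


payday = date(2023, 6, 30)
morning = debit_time(payday)       # 8:00 AM on payday
night = night_debit_time(payday)   # 10:00 PM the same day
```

The point of the contrast is the design lesson in the anecdote: the "efficient" schedule encodes the engineer's assumption that the money is still there at night, while the morning schedule encodes the lived reality that it often is not.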

REID:

You know, one of the things that, you know, I’m kind of doing with Inflection and Mustafa is kind of creating a, you know, personal artificial intelligence assistant to help kind of navigate through life. And one of the things that this is suggesting to me is kind of like, okay, how do I make sure this set of concerns can also be reflected? What would be your suggestions to us thinking about a, a personal AI, kind of conversational navigator? Like, “Make sure you do this. Avoid this.” Like a couple of reflexes on it. And I ask this in this public environment because I’m hoping all the AI people are listening to this too.

PHAEDRA ELLIS-LAMKINS:

One, I appreciate it, because I actually have been tracking what you’re doing, and I’m very interested in it. It seems part of what you’re doing is trying to make it more humane, which is deeply interesting to me. One thing I think is important is to speak to people in their native tongues, not through translation. We don’t use any type of machine translation, because it’s often wrong, and different words mean different things. So one thing that we just think is really important is, if you’re building in English, not to use some translation software. Because it doesn’t mean the same thing, and it works differently. And I think AI is so specific that the vernacular being different does really matter. The second is trying to think through the cultural stuff—the experience of, say, assuming everyone has the same amount of money, or that their paychecks aren’t gone on that day. A lot of the underlying assumptions I think we make are based on the experience of the people who work in those places.

PHAEDRA ELLIS-LAMKINS:

And so I think being deeply committed to understanding what people’s experiences are outside of that. Look, I have a bunch of engineers on my team, and it’s hard work to find folks outside of the pattern that is, you know, Ivy League school, maybe you work at a consulting company. [Laugh] You go back, you get your MBA, or you’re an engineer. Being able to build something that doesn’t just work for the folks who work at the company—who do not represent the experience of most humans in this world. You know, I think about the number of Black women who face issues in childbirth, what it’s like to go through pregnancy—I think even Serena Williams wrote a piece like, “People wouldn’t believe me.”

REID:

Pi, what would you say about this?

Pi:

The statistics surrounding Black women and pregnancy and childbirth are alarming. Black women in the United States have the highest maternal mortality rate: almost three times that of White women. In 2021, the maternal mortality rate for Black women was 69.9 deaths per 100,000 live births. This disparity in maternal mortality is a significant public health concern. A number of factors contribute to it, including unconscious bias in healthcare settings, structural racism, and lack of access to quality healthcare.

PHAEDRA ELLIS-LAMKINS:

So even at the best, when you have access and resources—every Black woman and Black person I know has spent time advocating for themselves in the healthcare system to be heard through pain. So, like, I would like AI so I don’t have to deal with a doctor. How do I make the case for myself? So I think it’s almost speaking to the experience that people are having and being able to offer real resources that actually make their quality of life better. You get to be at the forefront of thinking through this next wave of technology. What does it do for democracy? What does it do for how people get to participate in a different way?

ARIA:

So you talk a lot about empathy. And empathy and design. And obviously when you’re a technologist, you’re designing for people, you should know your audience. But one of the things that you flipped on its head is, people always talk about, “Oh you know, dogfooding, you should be using your own product, you know, you’re the person.” And to your point you’re like, well, we can’t, we can’t do that. We’re building for a different population. How do you do that? How do you design with empathy for a situation that you know nothing about? You’ve never had a zero-dollar bank account, so you just couldn’t even possibly imagine.

PHAEDRA ELLIS-LAMKINS:

One thing is to think about not knowing anything. We don’t teach people that it’s okay to be an idiot. And I think one of the most important things when you’re working with a community that you haven’t worked with before is just to assume you don’t know anything. The other thing I think is important is: Sometimes technologists have a way of talking about scale that assumes everyone but them is an idiot. Right? Like, “Oh, that’s anecdotal. This is scale. You don’t get scale.” And so one thing is: There are actually learnings in anecdotes and experience. I think sometimes it’s black-and-white instead of learning, and hearing people. Sometimes we don’t hear people because we’re like, “Oh, I haven’t seen it a thousand times, so it doesn’t count. It’s not even representative.” So I think that’s important. Last is not to assume systems should be punitive. Because most systems—if you look at payments as an example, for poor people versus rich people—if you have a lot of money, it’s all about incentivizing your behavior. It’s like, we’ll make it as easy as possible. Zero interest. Do these things. Like—

ARIA:

You’re giving me airline points. It’s like, “How can I rack up the money?” [Laugh]

PHAEDRA ELLIS-LAMKINS:

Right? It’s like, the more you spend, the kinder we are, right? It’s like, “Oh, you get a personal shopper now because you spend a lot. What? You don’t have money? We’re going to make it as difficult as possible, and you gotta go somewhere to return it, and you gotta have a receipt.” And so I just think, if you assume that everyone does better in a system that is incentive-based versus punitive, then you just design systems differently. I have a strong personal faith. For me, it’s a higher power; it’s a clear this-is-what-you’re-supposed-to-do. For some people, it’s not that, but there’s just a basic humanity. And so, I just think that technology is such an incredible force, and if we do it right, we get to be part of redesigning and redoing things. What a rich opportunity to get to think about what systems we are building for the future of this world.

PHAEDRA ELLIS-LAMKINS:

Sometimes we make mistakes as a team, and one thing we do is like, you need to call and apologize to the person. Which is hard. Like, it’s weird for people who come from the company, like, “I sent out the wrong text.” It means something if you think you owe money to the criminal justice system, and someone sends out a wrong text. So we’re like, “Call the people, and tell them you’re sorry.” And an engineer does not make that mistake again when you call someone. And one, they don’t want to talk to anyone, let’s be clear, right? And two, if you, if you call and say, “Oh,” and the person says, “Gosh, here’s what it is.” And then they fix it so quickly. We have folks answer phone calls. And so I think also giving people connection, which sometimes we don’t think about in technology, is how do you actually feel the work? Like feel it in your soul. I think it, it changes the way that you work.

ARIA:

I mean, I think the assumption piece is so right. It’s like, the person didn’t show up for their hearing because they’re a deadbeat and because they have no sense of responsibility and because they’re a bad person. Or because they don’t have a driver’s license [laugh] and it would take three hours and $20 and whatever. So to your point, like, you need the color. The data is like an amazing start, and then let’s have the color. Two things that I’m really excited about: One, obviously there’s a lot of talk about how AI encodes the bias that we already have. And obviously that’s something we have to guard against. But what I would argue is that the discrimination and the bias is all around us already. It’s like, when you get the résumé and, you know, the person played lacrosse, you’re already like, “Oh, okay great, we’ll hire this person. Fantastic.” You’ve already, you know, let go the other person who didn’t play lacrosse, didn’t go to an Ivy League school, whatever it is. And so all of that sort of bias is hidden. And so AI, if we do it right, can make that bias plain. And we can make sure to train those algorithms so we’re actually not having that hidden bias that everyone has already. And so again, this is not where we are today, but we’ve, we’ve seen it a lot. And if we can get rid of that, I think economic opportunity, of course, is one of those things that we need to see in our society.

PHAEDRA ELLIS-LAMKINS:

It’s a good point because at the last company I was at, we had this idea that we thought was great. It was home care, and so we put everyone’s photo up. And we’re like, “Oh, then people can pick, and they’ll know who they can work with.” And it will not be shocking to you that it went from light to dark in terms of how people picked people. And the same things happened on ratings. We could almost anticipate it. Before, we had thought, oh you know, all this access to information is such a good thing—people are going to be able to see. But it was about the design process, where we could use technology in a positive way. One, we took all the pictures down. We’re like, nope, no one’s going to get to pick ahead of time because we’re not going to let people pick from a biased perspective. And then we could figure out how to assess people’s skills and score them so that matching was done based on language or other things. And so it was a way that we could intervene on, on bad human behavior. Right? Or communal behavior that was bad to us.

REID:

Yeah. And one of the things that, you know, I’d say is kind of a polestar on the AI side—although it’s challenging—is you’d like to be able to tell an AI, “Audit the system and make sure it’s not racist,” and have it do that in a good way. Now obviously what that means is one of the things that you’re kind of learning. I tend to think that the other thing AI can be used for here is an attempt to get us clear on what are the things we should be doing as a community and how to do that. You know, part of it is to think about, you know, kind of how we can take what are some of the current limitations—current data, current human feedback training, you know, kind of the monoculture of engineers broadly who are building it—and kind of shift that to an advantage of being part of the solution.

REID:

Now some of it that you referred to earlier, I don’t think we can get to—which is, or we can’t, we can’t navigate exactly—which is we’re going to have to learn this iteratively, and so there’ll be mistakes. And so we have to be careful about what those mistakes are and how to make them. But let’s not make the, the mistakes we don’t have to. And let’s also be very careful about which ones, you know, kind of are more damaging to some communities than to others. So I think all of this is, is extremely important on the AI side. If AI companies were to be better, like kind of built dashboards or transparencies or things that they are kind of saying, “Hey, we’re holding ourselves accountable to being good stewards of humanity and all communities,” what kind of things would you like to see them paying attention to themselves? What avenues of communication or transparency might be really good?

PHAEDRA ELLIS-LAMKINS:

The way I would think about it is I think a lot about access. So I think about who’s using the tools. So one thing that’s just interesting is: How do we think about early adopters—not as just people who have cell phones, who have this—like, how do we actually make other communities early adopters? And what would technology have to look like if we said, for example, “We want young Black men to be the largest users of this technology”? And so one is, I think in most cases when we talk to other companies about early adopters, it’s often someone technologically sound or so, you know, like it’s a, it’s a certain profile. And so I think about it as like Uber Black to Uber, right? It’s like Uber Black is, you know, for the folks who pay for a car, and then eventually like all things, you know, Reaganomics, it trickles down.

PHAEDRA ELLIS-LAMKINS:

And so one thing would be to think about: How do these tools become strategies for folks, so that the early adopters are actually folks of color or folks with different economic status? And so that’s, that’s one thing to think about. And then I think about the reporting. I would love to understand how the AI systems were better than the current systems. Do we create more time for working moms? For different people, elevation is different. But it’s like, if AI were saying, “We actually are going to elevate X people or think differently, and here’s how we might measure that”—it may not be that there’s a perfect answer across all AI. It may be that each company gets to say, to your question: How do we elevate? How do we improve? What are the things we seek to improve? You know, in my last company it was like, oh, we know now people pick folks who work in their homes based on color.

PHAEDRA ELLIS-LAMKINS:

So what we’re going to do is build systems that make it difficult to do that. So if I were thinking about AI, I’d be like, what are, what systems do people get excluded from? How would we think about it differently if we think about the fact you’re more likely to be stopped because you’re Black? Okay, great. Would we tell people to drive different streets? Would we advise them that here’s where people are most like—you know, what are the ways in which the technology might perform better? And so, if you thought about AI as an opportunity to be able to be exposed to all the things that we are not inherently exposed to, it just, it could be opportunity. And I wish that, and I hope that.

REID:

And I think part of the thing that the–the emphasis is that it’s not rocket science to do this. It’s making sure you’ve got the right questions. You make sure you’ve got the right, you know, kind of focal points on what the issues are, so you don’t overly generalize from yourself, where frequently you will not see the thing that is a landmine or something else that’s, that’s super important. And to make sure you talk to people, you know, as people.

PHAEDRA ELLIS-LAMKINS:

We’re just so lucky that we get to have our lives be about like, how do we make the world better? And I think that’s what technology at its best should do, is to think about how do you scale the best of us? How do you scale the things that we, that we think are worthy of being scaled? And it’s the thing I learned when I worked for Prince. I’d been in the labor movement, and we used to try to convince people to come to meetings. We’d be like, “Free food!” You know, like whatever we could do. We begged, we’d call over and over. “Please come volunteer for some candidate.” And then I learned this very basic principle working with Prince, which is: People like to come to a Prince concert. And we never had people be like, “Oh, I don’t want to come.” [Laugh].

PHAEDRA ELLIS-LAMKINS:

So I was like, oh, it seems like when you make it easy and joyful, people are more likely to come. And I was like—it doesn’t seem like a big life principle—but I was like, I spent my whole life trying to make things seem like someone was called to do it, it was the right thing to do, kind of a little judgy. And I finally realized if you like, “Don’t you want to save the country? This person is the worst!” every election, “This person’s going to be the worst. Country’s going to fall with this person.” And I realized like when you invite people to something that’s enjoyable, they actually say yes. So I was like, oh, we should be inviting people to something that’s enjoyable. And I think that’s what technology has the opportunity to do.

ARIA:

I mean, Phaedra, we would talk about that every day at DoSomething trying to get young people to vote. “Hey, young people, you want to come to a party where it’s just old people, they’re just talking about welfare, there’s no good snacks, no good music, it’s real boring? Also, none of your friends are going because none of your friends like voting.” And it’s like, oh, I’m so shocked that that wasn’t like the most effective tactic. So, I hear you. [Laugh]

PHAEDRA ELLIS-LAMKINS:

Yeah. Yeah. It’s like, what is something that—and then we tell everyone, “This is corrupted, broken, and hard. Come make it better.” And you’re like, “Oh.”

ARIA:

Totally. No, forget it. Fun. That’s fun.

PHAEDRA ELLIS-LAMKINS:

It’s like, can you imagine if you’re dating: This guy is broken. Everything is wrong with him. Do you want to go out with him? No. You’d never—it’s like, yeah. So.

ARIA:

Right, but you at your job, you get thank-you notes. I mean that’s like—

PHAEDRA ELLIS-LAMKINS:

Oh my gosh.

ARIA:

Who wouldn’t want to go, you know, work somewhere where you’re getting thank-you notes? Everyone in tech talks about like, you’re going to succeed as a company if you have customers who love you. So it’s like that’s, that’s what you’re doing. You’re creating like, the white-glove service. You’re, you’re treating your clients like the, you know, AmEx Black Card holders and sort of giving them that. What will it take to build more companies like Promise? So that everyone both gets the thank-you notes, and we’re serving all sorts of populations—like there’s so much promise with tech. Why can’t we have a hundred X more of you?

PHAEDRA ELLIS-LAMKINS:

It’s funny because I’m a nerd on customer service because I just want to make sure people are having an amazing experience, and I still answer calls, which drives everyone crazy. But the exact quote was, “I’m shocked. I’m pinching myself. I’m elated, happy, thankful, grateful, and all of the good stuff. Thank you so much.” And so those always root me. And I think, I feel like my obligation is for us to win in the way that capitalism sees winning, right? And so the best thing I feel like that I can do is I can make sure our company is worth a lot of money. My best service is to build a company that never compromises on the way we treat people and uses technology in a way that is authentic. My personal thing I have to do is to be able to demonstrate that this works. That you can do both without compromising.

PHAEDRA ELLIS-LAMKINS:

That is my greatest use, is to do that. For other companies, I think we have to do a good job of showing TAM—which is to say the market for government is trillions of dollars, right? And so just to say—and there’s not a lot of innovation—so we get to win contracts like sole source—which is very unusual in government, because it means that there’s not someone else doing the same thing. And so one thing that I would just say is: There are very few markets where you’re not creating the market, where it already exists, where there’s already opportunity, where there are already clients, where there is this much opportunity and such a lack of, of innovation. Like if we could get people to think through, you can make money at scale without being a lender who charges interest. We have to convince people that innovation is about liberation, innovation is about moving people forward, and that technology at its best does that.

REID:

We are going to move to our rapid-fire questions. Is there a movie, song, or book that fills you with optimism for the future?

PHAEDRA ELLIS-LAMKINS:

There’s a book by Martha Beck about integrity. What it basically says is integrity is not about the way people think about it, honesty or other things. It’s about fulfilling your greatest purpose. And that you’re out of integrity whenever you are not doing what you are most designed to do as a human. And so like—and that people innately know when they’re out of integrity, because they’re doing something that is not what they’re supposed to do. So that just makes me believe, like, if we can tell people like to go towards their greatest use, as a society, it makes me very, very hopeful.

ARIA:

What is a question that you wish people would ask you more often? And this could be about your juggling skills or a professional question. So it could be broad.

PHAEDRA ELLIS-LAMKINS:

How did you get so fit? Because I’ve worked really hard [laugh] to be healthy.

ARIA:

Oh my God, Phaedra, this is another pod. I want to know all about this. This is a separate time.

PHAEDRA ELLIS-LAMKINS:

Because I feel—like I grew up in the labor movement, so I grew up differently than most people grew up. I grew up with truck drivers and teamsters and firefighters, and so I worked really hard to be healthy. And I think people are like not, you know, you would never say anything. And I’m like, all I want to talk about is “I worked out this morning” with you. So I wish people would be like [laugh], which I’m sure is the worst answer—

ARIA:

Oh my God. I can’t wait to hang out.

PHAEDRA ELLIS-LAMKINS:

…you’ve gotten, because I’m sure everybody else is like, “I want to talk about this.” I’m like, “I want to talk about my fitness.” So anyway, that’s what I want to talk about.

REID:

It’s super important for the world’s health. That’s another leadership thing.

PHAEDRA ELLIS-LAMKINS:

Right?

REID:

So where do you see progress or momentum outside of your industry that inspires you?

PHAEDRA ELLIS-LAMKINS:

I am so inspired by young people. I feel so hopeful. Watching the radical globalism that I think people are experiencing in what they, what they learn and then the concepts—I feel so hopeful about the way they consume and the way that they think about like what society is responsible for. And so I just want to rush them to the front of the class. Because I think, even as we think about the issues that we talked about, like around AI and stuff, it’s like they’re thinking about like who participates in the world. It’s like you can almost trace their loss of like globalism [laugh] as they get older and older. And so I feel very hopeful about young people.

ARIA:

Yesterday, my eight-year-old told me that all adults should go to school so that we can learn all the things that he learns. Because we don’t know any of them. So—

PHAEDRA ELLIS-LAMKINS:

Right?

ARIA:

…we can learn so much.

PHAEDRA ELLIS-LAMKINS:

My daughter just started Montessori school, and she took a picture of the schedule and then sent it back to them in a way that she thought made sense, because she went online, and she discovered that Montessori is about choices. So she didn’t want to take some of the classes that they had identified. And so she sent back her schedule. And I was like, this is her first day.

ARIA:

Love it. Love it. Love it. [Laugh] Alright. Alright. Back to the slightly more serious—although fitness and Montessori school, very important. Because this podcast is called Possible, can you leave us with a final thought of what is possible to achieve if everything breaks humanity’s way in the next 15 years? And what is our first step to go off in that direction?

PHAEDRA ELLIS-LAMKINS:

I believe that it is possible for the world to get better. I see breaks in the fact that we’re having this conversation when I don’t feel like this is the conversation that was happening around the internet or other things. And I think the fact that we’re saying, “How do we deal with bias?” Not as a function of it, of course there’s already inherent bias and there’s already things that have happened, but I think the idea that these discussions are happening at the, the formation of some of these companies is very exciting. It’s not lost on me, like, I’m sitting here with you; I went to Cal State Northridge. I went to public school. I failed more times than I succeeded. And I haven’t had one job consistently. Like I’ve hopped around different industries. And that we’re sitting here having a discussion about how to make the world better.

PHAEDRA ELLIS-LAMKINS:

Like that’s not lost on me. That’s not what’s supposed to happen. And so there’s a bit of luck. And so that luck continues to sustain me in trying to, you know, make justice happen. You want to bend the arc of justice, right, in the service of love. And so, if we’re able to do that, that we can talk like that—I don’t know that 10 years ago we could have had a discussion where I could say, “I want to bend justice, and I want to serve love, and I want to be fit and kind and good.” Like that that gets to be how we talk about these things feels very right to me.

REID:

We are delighted that you have bent your normal focus to come share a little bit of time with us, to also share it with everyone else as well.

PHAEDRA ELLIS-LAMKINS:

I’m so grateful to have been here. Thank you both.

REID:

Possible is produced by Wonder Media Network, hosted by Aria Finger and me, Reid Hoffman. Our showrunner is Shaun Young. Possible is produced by Edie Allard, Sara Schleede, and Paloma Moreno Jiménez. Jenny Kaplan is our executive producer and editor.

ARIA:

Special thanks to Katie Sanders, Surya Yalamanchili, Saida Sapieva, Ian Alas, Greg Beato, and Ben Relles. And a big thanks to Pure Dalisay, Melanie Jones, and Little Monster Media Company.