The internet needs new systems that aren’t centered solely on efficiency

When Stanford political scientist Rob Reich set out to work on his book, System Error: Where Big Tech Went Wrong and How We Can Reboot, he thought he would have to master complex technical concepts like machine learning and differential privacy. Eventually, he came to the conclusion that what he really needed was to understand the distinctive mindset of technologists. “And that mindset is one of incessant optimization,” Reich said during a conversation at Unfinished Live.

The problem is that improving efficiency alone isn’t enough to make the world a better place. If a startup is optimizing for a bad objective, it’s actually making things worse. Take Google, for example, which helped make advertising ubiquitous on the internet in the early aughts. That turned attention into “the currency of the internet,” said Sridhar Ramaswamy, a former Google executive and the co-founder of the alternative search engine Neeva. “And so if you ask what the ad teams optimized for, they optimized for attention, because attention equals money.”

The internet needs new systems that start with a commitment to democratic ideals and put the best interests of humanity first. Incremental reforms to our current models won’t be enough, because “the thing that you choose to optimize for builds the entire set of processes and systems that follows,” said Nabiha Syed, president of The Markup, a journalism startup that investigates the technology industry.

Watch the full conversation below, and scroll for a written transcript. The transcript has been lightly edited for clarity. Statements by these speakers have not been fact-checked by Unfinished and represent their individual opinions.

Nabiha Syed

Thank you to Unfinished and thank you to all of you for showing up in person, which is a thing that we used to do. Sometimes when I see these headlines, often in places like The Markup, about algorithmic discrimination or polarization or other digital harms, I just sit back and think it didn’t have to be this way, right? There are so many of us in this audience, and on stage here, who love technology, who’ve built technology, who know what it could be and know how it’s falling short. So I keep coming back to this James Baldwin quote, which is, “We made the world that we live in and we will have to make it over.” But I think before we do that, we have to understand exactly how we got here. And Rob, I’m going to ask you to help us understand how we got here in your book System Error, which you should all go buy right now from the online bookseller of your choice. Make good choices. You talk about a particular mindset that’s in the DNA of Silicon Valley. Tell us more.

Rob Reich

I’m a philosopher by training, as you heard in the introduction, and I wrote this book together with one of the most popular computer science professors on campus, who also happened to be an early Google employee, and with a policy expert. Having been at Stanford University, that kind of ground zero of Silicon Valley, for 20-plus years, I was initially just a user of all of these technological wonders that came out of the university and of Silicon Valley. And then I wanted to try to understand what I thought was a huge power grab by a small number of companies, where the decision-making was determined by a very small number of people inside of those companies. And initially I thought, “Oh my gosh, I’m going to have to go figure out, like, what’s machine learning? What does differential privacy mean? How do I understand algorithmic models?” And I came to the conclusion eventually that you don’t need to understand, as a user or a citizen, every single technology, but you need to understand the distinctive mindset of the technologist. And that mindset is one of incessant optimization.

Technologists are obsessed with optimizing, and optimizing on the face of it sounds like it might be a good thing. Who would be against doing something more efficiently, with fewer inputs, to get the same or better outputs? But optimization is a really limited mindset, especially if, as often happens with technologists, it’s not just applied to some technical domain but becomes a mindset for your life. You try to optimize everything. It’s the life-hacking approach, or, as a colleague of ours at Stanford sometimes says to students, “Everything in life is an optimization problem.”

Alright, so what’s wrong with optimization? It’s a means to another end. To optimize a bad objective is to make the world worse, not better. And beyond that simple thought, you have to keep in mind that what computer science does all the time is put a problem into some computationally tractable form. You have to reduce the objective that you care about to some proxy that you’re going to measure, and then you try to optimize that measure. And most of the things we actually care about in life are not reducible to easily measurable proxies. So in the optimizing mindset, the technologist focuses on these imperfect proxies for the things we actually care about. At the end of the day, I think optimization is partly what’s broken a bit of Silicon Valley. It’s that incessant drive to optimize, where at Google, perhaps, there’s been a shift from assembling the world’s information and making it available to us to optimizing for ad clicks, or, at other companies, time on platform.

Facebook says its mission statement is to connect the world. And yet what it optimizes for, we’ve seen through all of the reporting that’s happened, is engagement on the platform. There’s a famous line about the Stanford engineers who suddenly woke up and found themselves thinking, “Why is this extraordinary talent devoting itself to maximizing ad clicks?” 

And finally, I feel the optimization mindset is most broken where it colors the way technologists think about democracy. We heard about lifting up the spirit of citizens and democratic institutions. And when I think about Silicon Valley, I sometimes reduce it to a little formula. The founders tend to be libertarian. The programmers who work for the founders tend to be optimizing engineers. And when you combine an optimization mindset with a libertarian mission, what you’re optimizing for is the minimization or avoidance of government itself. So I’ll just end with a quick story, because this brought it all home to me, and it isn’t in the book. There are a whole bunch of other stories in the book that I think you’ll find arresting, maybe surprising.

I got invited to a dinner with a bunch of prominent names in Silicon Valley. People whose names, if I told you, you’d recognize. And the dinner was about what it would mean if we could find some plot of land on earth where the governance structure of the place was devoted to the maximal progress of science and technology. And people went around the table and said, “Oh my gosh, we’ve already spec’d this all out over at this R&D lab within a tech company.”

People went on for a while about how this would work. Someone at this dinner actually asked, well, once we get the plot of land, once we get the thing in motion, how do we decide who gets to be a citizen there? And the answer was that the Google hiring test should be the citizenship test.

Finally, I raised my hand in this conversation and said, “I’m the political scientist, the philosopher here. What’s the governance? How is this being arranged? Is this a democracy we’re talking about?” And almost unanimously around the table, people said, absolutely not. Democracy holds back progress. We need to put a beneficent technologist in charge. That’s the optimization mindset applied to governance: democracy is broken, it’s slow, it’s inefficient. The technologists can generate better outcomes.

There’s something lovely about the thought, but if you don’t even have an initial commitment to democracy as such, that’s where the power of big tech gets even deeper and more concentrated. And that, I think, has to do with the power grab we’ve seen over the past decades.

Nabiha Syed

Well, that was a horrifying story. I’ll trade you another one. When I was reading about the optimization mindset, it reminded me of a story from my childhood, when my dad, who is an immigrant and an engineer, was called for jury duty. He was very excited. It took a week. And after all was said and done, the judge said, “Thank you, jurors, for your time. Do you have any questions for me?”

And my dad goes, “Yeah, I do have a question for you, your honor. Why did it take five days to do all of that? That’s very inefficient. We saw the video, we know what happened. And you just spent all this time authenticating evidence and doing this nonsense. It seems like a waste of time when you could have done it in one day.”

And the judge says, “You know, Mr. Syed, if your freedom were on the line, would you want to optimize for efficiency? Or would you optimize for getting it right?” The thing that you choose to optimize for builds the entire set of processes and systems that follows. And so something like due process is illegible in some of the examples that you gave, right? So things that are bedrock to democracy just don’t show up. They don’t make sense. They’re incoherent if you have the wrong mindset.

Now, Sridhar, you have been in Silicon Valley and in some of the most interesting rooms anyone can imagine. You led the Google ads business. This question of what we are optimizing for: how have you seen the answer to it evolve over time?

Sridhar Ramaswamy

Early Google, you’ll be shocked to know, was kind of embarrassed about ads. The ads team, these were the people who made the money, but [they were like], “Let’s not really talk about it a whole lot, because we would rather work on Google search and Gmail and change the world.” But then there’s the reality of being in a company that wants to keep growing. Remember, if you’re not a 20% year-on-year growth company, you’re not a growth company. And so ads just ended up taking up more and more space.

None of the individual decisions were wrong. I was right in the room. I made many of these decisions. But string them together over 10 years, and you end up in a situation in which ads dominate the platform. The other incredibly surprising thing that happened early on is that Google and a bunch of tech companies convinced everyone else who dealt in information, the newspapers, the bloggers, everybody else, that ads were the future.

I mean, think about it. Everybody bought into this idea that you no longer needed to know your customers. They would just come to your site. You didn’t care who they were. You’d show some ads. Google would show ads. The New York Times didn’t even show the ads; Google showed the ads, and you’d get a check, and that was it. That was the extent of the relationship that we were all convinced was enough to sustain us for the indefinite future. And so all of this created a situation in which, very naturally, the currency of the internet became attention. So if you ask what the ad teams optimized for, they optimized for attention, because attention equals money.

There’s a reason why I tell people, if you look up how to mountain bike on YouTube or any other platform, and you watch a video and you’re like, oh, that’s helpful, it’s kind of useful, the next video is going to be: watch the top mountain biker on the entire planet cross seven mountains. Why? Because you’re going to watch that, and the company is going to make money. Ads eventually come down to attention. And so here we have an internet that is all about attention, and what you and I need to fundamentally realize is that if you give up that attention, you’re giving up agency. You’re no longer in charge of your life. To me, that’s the more profound consequence of an ad-supported model taken to its logical extreme. And a lot of tech companies are really good at this.

Nabiha Syed

You’re trying something new now where you’re giving users agency; you want to take a user-first approach. And that’s something that feels very familiar to us at The Markup. We decided to leave the ad-supported model of media behind. We’re trying something different. We are making all kinds of bets that a privacy-forward approach to reading the news is something that users want. But as a year-two organization, that’s a terrifying bet. What makes you optimistic, in this space, that people actually care?

Sridhar Ramaswamy

It’s still year two. It’s still terrifying. But search is something that’s deeply personal to all of us. People don’t realize it; we search without thinking. You have a headache, you’re going to look it up. You’re worried about your child crying, “oh, is he crying too much?” You’re going to go search for that. You want a new pair of headphones. It’s become reflexive. And again, here is a product that is optimized for advertising. It’s not really optimized for what’s right for you. It’s optimized for keeping you on the platform. More and more of the results, not just web results or ads, come directly from Google. And so the nightmare scenario, which made me decide I wanted to create Neeva, was a world in which any query that you put into Google resulted in one of two outcomes. If it was a commercial query, you saw a page full of ads. If it was a non-commercial, information-seeking query, Google told you what the answer was.

I said, that’s a horrible outcome for all of us. There need to be alternative models. And Neeva was born of the belief that something people use a dozen times a day, instinctively, as I said, is something people would see value in. This is why we decided to completely invert the model. Start from the user, and make strong commitments early on, because you can’t make these promises halfway into a company: we were never going to show ads, we were never going to show affiliate links, we would never sell data. We fundamentally think that this is going to let us create a superior product. It’s early. Neeva has worked, but that’s the mission that drives us. And our take is that a product set up like that, one that hopefully becomes self-sustaining with people paying small fees, is a lot better than a world in which there is apparent freedom at the beginning but the product just gets worse and worse over time. That’s the early thesis. We need your support to actually make this into a reality.

Nabiha Syed

Well, it does feel like centering the user is an excellent pathway to building trust in a moment where people increasingly distrust their devices. There are all kinds of conspiracy theories, like, “Oh, my phone is listening to me in real time.” And you’re like, no, it’s actually worse than that, way worse than that. But as our resident philosopher, Rob, what do you think the path to increasing social trust in technology looks like over the next five years?

Rob Reich

I think that the tech companies would like you to think that they can somehow tweak and tune the platform or the system to increase trust. And maybe there are ways to do that. But the more fundamental thing, in my head, is that this creates a framework for thinking about trust in which there’s the company, with the power that already resides in it, and then there’s every single individual user. And we don’t ever hear about people as citizens. We hear about them as consumers, and what that leads to is a kind of discourse in which you say, well, if you don’t like Facebook, then just delete Facebook. If you don’t like Google, then move to Neeva. Which is fine if you think of yourself as just a user. But I think that so undersells the general orientation. We have so many other aspects of our lives in which we seek to bring power and decision-making outside of companies, so that we get some guardrails to avoid the worst outcomes, and we get to harness the benefits that these different products or sectors provide.

Imagine that, with the roadway, we invent the automobile or the motorcycle, whatever it is, and we set up a whole system of interstate highways and roads. We then invite people to buy a car of their own choosing and say, without any rules or regulations, you’re free to go drive, good luck, and be careful. And if you don’t like the system, then just don’t drive. You’d say that’s a ridiculous way to think about organizing collective behavior. Rules and regulations get put in place to coordinate human activity and to put some basic guardrails in place.

So to increase social trust, I don’t want to hear any more apology tours from the big names at tech companies saying their smart people inside are working on the problem. What I’d like to hear is the raising of other voices outside of tech companies, and the raising of the voices of employees within tech companies so that they have more power vis-à-vis their executives. And then, of course, our democratic institutions, which involve all of us as citizens, can increase social trust in the collective project of harnessing technology for our benefit, rather than the power grab that we’ve experienced for the past decade.

Nabiha Syed

I want to pick up on that theme for what is, shockingly, our last question of this segment. We focus a lot on, and you hear a lot of conversation about, “What can individual users do?” And I certainly want to hear more about that, but I also think it’s kind of like climate change, right? Progress in this space is not just users making different choices, although it is also that; there is more structural or systemic change that we need. And I’d love to hear from both of you what you think that needed systemic change looks like. As you think about your answers, I’ll offer that I’ve been thinking a lot about how, 80 years ago, in the shadow of the New Deal, we saw the rise of a different type of power. We saw government bureaucracy growing that was unaccountable to democratic processes in a different way.

And what we saw grow up around that was more transparency, and regimes for accountability and audit. And I wonder what examples we can take from history to say, okay, we have a new form of power that has been unchecked, and we’re not going to have this sort of hands-off, drive-your-own-way-on-the-road approach. So should we mandate algorithmic audits? What does a Freedom of Information Act for corporations and platforms look like, right? How do we borrow from some of these systems we have, because we’ve faced this problem before. It’s not totally novel. And I’m curious about how you’re thinking about those structures for the future, too.

Sridhar Ramaswamy

To me, the last two tech decades, in which a few companies have risen to become these behemoths with so much power, have also coincided with an unfortunate period where essentially we’ve said, oh, there’s no such thing as competition law. We don’t really need it. The markets will take care of everything. I think it’s only now that we’re realizing that’s absolutely not the case for companies optimized to grow. And they are ruthless about competition.

The biggest worry that I have with Neeva is less about building the product. We have an amazing team; the product will get better every single day. But all the roads that put us in front of users are shut. All the default positions roughly fall into this bucket of, oh, we’ll make you one of the optional defaults in a browser once you have succeeded. But you know what, to succeed, you need to be an optional default in all of the browsers. So to me, rethinking competition is foundational, and the new administration has made several moves there. You have talked a lot about what we can do as individuals, but I think we are, hopefully, at the end of an unfortunate 50-year cycle in which we stupidly believed that free markets would solve literally every problem we had.

Rob Reich

If I can pick up on that very point, my own view is very similar: we’re happily, finally exiting a 30- or 40-year period, partly of thinking the market is the solution to all of our problems, but also of a kind of early tech optimism, or even utopianism, in which these digital tools and platforms would increase human capabilities and spread freedom around the world. Those were the early heady days, when the Davids programming in their pajamas overnight would defeat the Goliaths of the world. And now they’ve become the Goliaths. And so we now have five years of an extraordinary tech backlash, in which big tech is rotting democracy from within, addicting us to our smartphones, and making our lives worse. And all of that time, we basically just left decision-making to people inside the companies, and not even everyone inside the company, but a small, unrepresentative set of people at the top of the companies.

We’re entering a new era in which it’s not just the user’s choices but our collective choices, and other kinds of power and decision-making. Algorithmic auditing; potentially federal privacy legislation; ways in which, because of automation and the displacement of work, we’ll find new ways to provide a social safety net and job re-skilling. And we can keep going with a variety of different legislative or policy mechanisms. But the essential thing, from my point of view, is not to point to the blueprint for how to fix it all, because it’s not going to be one fell swoop and then it’s all done. It’s going to be an ongoing series of challenges, but ones where we now all know that it’s not working for us just to have big tech in charge. And my own view, just to bring this home:

Three of us are here from California. We just lived through another fire season, another smoke season. And one of the things you often hear is that, well, you should do some brush clearing around your property. You should ensure that you don’t have a bunch of fire-starter material sitting close to your house. Which of course is a wise thing to do. But if you thought that by doing that alone you would prevent the next fire season from coming, you’d be completely deluded. You have to approach it as a more systematic, collective challenge. So sure, take individual actions, but never think that that on its own is going to be enough. We have to engage in this broader collective project of wresting power from inside the companies and handing it to a broader set of actors outside.

Nabiha Syed

And as we think of other poles of power, we can think of the elites who can swoop in and save us, like regulators or legislators, or even companies. But we can also think about not just users but the movements and the advocates, to really build the framework for how we should push forward. Thank you both.

Rob Reich

Thank you.

Sridhar Ramaswamy

Thank you.