It’s a never-ending process, because it’s impossible to predict how humanity will adapt to innovation

On its own, technology is morally and politically neutral. The same encryption apps that allow dissidents to organize against authoritarian regimes can help human traffickers evade police. What matters most is how new technologies are governed, and getting it right is a never-ending process. 

“When you design a governance system, you always have to know that they have to be edited,” said Perry Chen, an artist and the founder of Kickstarter, during a conversation at Unfinished Live. “How do we run experiments with governance in real time, learn from each other and then learn how to edit?”

In the cryptocurrency and blockchain space, thousands of new projects and companies are experimenting with a wide array of governance structures. To be effective, the most important thing they need is a core set of ethics, because it’s impossible to anticipate every scenario that might arise.

“You can’t just engineer it away,” said Maggie Little, a senior research scholar at the Kennedy Institute of Ethics and a philosophy professor at Georgetown University. “You also absolutely need to have north stars of values.”

Watch the full conversation below, and scroll for a written transcript. The transcript has been lightly edited for clarity. Statements by these speakers have not been fact-checked by Unfinished and represent their individual opinions.

Andrew McLaughlin

One of the conceits or delusions of Silicon Valley is that in order to innovate, you can’t be burdened by the things that have come before. But the nice thing about the actual title of this panel is that it’s not just governance, it’s governance 3.0; it acknowledges that there has been a 1.0 and a 2.0. And actually the numbering of this whole thing is sort of off, because if you think about human governance, we’ve had clans, we’ve had monarchy and hereditary succession to power. We’ve had fascism and communism. We’ve had republic and democracy in various forms. So maybe governance 3.0 is what you map onto Web 3.0, so let’s take it that way.

And in that spirit, I thought, as the setup to this panel, I would just acknowledge that what characterized the version of the internet that we now call the Web, or Web 2.0, was a kind of sunny Silicon Valley optimism: a belief that the technologies we were building (and I consider myself guilty of having had this perspective) had intrinsic properties so powerful, so positive and so constructive that they would essentially govern themselves. Because what’s not to like about freedom of speech? What’s not to like about the distribution of the power to speak globally? What’s not to like about social connectivity? And among the many things that we messed up, I think, was not understanding that every bad thing that could happen would happen, every misuse and abuse of the technology that could happen would happen. And we didn’t really think in terms of either process or normative values; we didn’t think in terms of how you could govern an institution like a global internet and all the things that could be built on it.

We didn’t really think through the normative values that we were trying to vindicate, other than at a very high level of generality. And so now we find ourselves looking at a new set of technologies: distributed social networks are one instance of them, but so are all of the underlying technologies, from blockchains and advanced forms of cryptography to quantum computing and the advanced machine learning models that we can call artificial intelligence. And so what I’d love to talk about today in this panel is the process of how we might govern them, and the normative values that we might try to bake into them at this early stage of their development, deployment and adoption. So Perry, can I start with you?

Perry Chen

Oh, sure. I’m going to try to take a little slice of that, but thank you for setting it up, Andrew. And yeah, I think Andrew’s right. I think a lot of the talk about governance misses that, in some way, governance is governance, meaning that there is nothing specific about decentralized crypto network governance that exempts it from the laws that people organizing together under sets of rules have had since time immemorial. So I think it’s realizing that, because I don’t think everybody always does. As you said, with the Web 2 internet, you drink the Kool-Aid and you’re like, well, this is different this time. There are different characteristics, like the speed of it.

I think a lot of it is almost like a preview for when countries and cities adopt more digital voting. I think the crypto network voting systems that exist now are almost like a lab, a preview of what that might be like. So I think it’s thinking about what’s worked. The technology of the last several decades has developed under kind of one branch of governance, a pure executive branch, right? And I think as you’re creating new forms of governance, you might consider whether there are other branches in there. We talk about stakeholder governance, things like that. It’s easy to say, but it doesn’t tell you what you’re going to do. It just means you want to have a larger group of participants; but how are they participating, right? How is power distributed? But the point you made backstage too, and I’ll leave it here, is: how is it edited?

Meaning, a lot of what we think about with governance today maybe applies to the US government. And we may be seeing certain things about the way the rules work in our systems where we’re hitting walls. And so when you design a governance system, you always have to know that they have to be edited. And so you have to build in this kind of meta-governance, the governance of governance, right? Where you have to think about, how do we run experiments with governance in real time, learn from each other and then learn how to edit?

And the last point is that I think they’re going to be competitive in a way, meaning that the networks that govern the best hopefully will become the largest. And that’s also how they might usurp some of the entrenched Web 2 companies: their governance will outperform the typical corporate governance structures that they may be trying to replace.

Andrew McLaughlin

Maggie, how does that sound to you? You run a lab that attempts to apply ethical principles in a very practical context. How does that sound?

Maggie Little

Yeah. Yeah, the lab tries to… well, we’re ethicists and designers, that’s an interesting combination, who work with external partners to weave ethics into their R&D. As opposed to retrofitting ethics, which works about as well as you might expect: how do you build the values in from the very beginning? So I love the question you’re asking: how do we build the values into Web 3.0, but also, what are the structures and processes of governance, not just internalized values? Because you have to have both, right? So one of the things that I think about most is how to avoid the fallacies that just keep on tempting us. There’s the engineering fallacy of thinking I can build the tech specs right, get the logic of what humans might do, so that we will never have any problems. I’ll engineer out the bad actors, for instance.

And I don’t know if my colleague Jonathan Healey’s in the audience, but he talks about how, however much you think about human beings, the logic you build into your system is never going to capture the emergent behavior. Complex, emergent behavior; we were talking about that backstage. So you can’t just engineer it away. You also absolutely need to have north stars of values. What are the values we want to be loyal to? Which includes decisions about what we will not do. Right? I want to do this, but here’s where my limit is, and here’s my moral boundary. We need to have conversations about that. And then we also need to have structures within Web 3.0 and the applications for it… But also, as you and I talked about, you’re not just building a moon colony. That structure is still going to live within other nation-states, and on planet Earth, so far. And so you’re going to need to have governance of your platform, however well you govern yourself.

Andrew McLaughlin

Yeah. So Braxton, you’re building a system, right? You and your team are building a system that is decentralized. Everything that we just heard sounds totally right to me at a conceptual level. Practically, I can’t even really begin to imagine what that would look like. And I’m curious if you’ve started to think about that. And actually, if I can just say one thing, it’s very hard to encode values in a technological system, right? So, the obvious example of this is every single one of us in this room benefits from cryptography, right? We can send credit card numbers out without them getting captured midstream. We can communicate securely with family members. Dissidents can express themselves and LGBTQ youth in Uzbekistan can find a global community without being surveilled. Cryptography is great. Cryptography also supports financial transactions to pay for child sex abuse images. It supports money laundering. It supports weapons sales into conflict zones.

So the technology itself has deep limits to what kinds of normative values it can embed. And therefore a nerd-centric governance structure, right, like the classic IETF governance of TCP/IP that says, all right, the nerds are going to encode some technical specs into the protocol (or, in a given blockchain or cryptocurrency community these days, into the chain protocols), in no way allows you to limit the use of the technology. That really does require something approximating laws: societal structures that are backed up by a monopoly on force and a government that can jail you if you don’t comply. So, anyway, I’m curious, now that you’re building this thing and governance is front and center as one of the critical components: how do you think about it, and how are you proceeding to tackle it?

Braxton Woodham

Well, along the lines of what you’re saying, we definitely think that tech-centric governance, defined by techies for everyone else, is a dangerous notion. We’ve seen how that’s played out. So really the first thing we’re doing is trying to reach out and build intersectional skills. A lot of the activities in this conference have been tied to that. And through the institute, partnerships with Sciences Po, with Georgetown, we’re trying to figure out a way to create these intersectional skills: to bring in people who have real expertise with these long, centuries-old governance systems. And as you said, we’re not at governance 3.0, it’s governance 1025 or whatever it is. So we’re trying to bring these experts in, but at the pace at which technology develops, and solve that velocity mismatch. That’s a big part of what we’re doing right now as a process. Some of that’s values; Maggie and I talked previously about inserting ethical processes and thinking into the process at that velocity. Some of it’s the actual rules, and then some of it’s using new tools that are available because of blockchain and these kinds of new technologies.

Andrew McLaughlin

All right. So let me ask, whoever wants to tackle this one first. There was a really great moment earlier today when someone, I forget who it was, was recounting a dinner in Silicon Valley where it turned out that the Silicon Valley moguls were all proceeding from the assumption that the better governance structure for the world would be nerds and not democracies, right? Technocratic governance by, I don’t know, Ayn Randian superhumans. And so for the governance of a technological protocol, is it essential that we be participatory? In other words, does it have to have some democratic element in order to reach good normative outcomes, to be a legitimate process, or to otherwise achieve some kind of good or goal that we are attempting to reach?

And I’ll just say, as an example, what I remember so clearly the first time I walked into a big Internet Engineering Task Force meeting. It happened years earlier, but the kind of canonical statement of process values was: we reject kings, presidents and voting; we believe in rough consensus and running code. In other words, we will reach an expert consensus that doesn’t have to include everybody, and we will measure our success against the objective metric of running code: code running on machines, achieving numerical outcomes that we can measure. Perry.

Perry Chen

Yeah. I think the exciting part is that we’re going to find out. What I mean is that even if we designed a system right now, and we tried to, the likelihood that we would nail it is low. Right? And I think what’s happening now, because of the rise of blockchain and crypto networks and DAOs and all these other things, which essentially are governance systems, is that this is the biggest R&D lab for governance in history. Right? Maybe going back to when there were a similar number of tribes as there are blockchain projects now. So what I think is exciting is that you’re going to see all these different variations: super command-and-control small central committees, a la China; one person, one vote; one token, one vote; every single way of doing it. And we’ll be able to observe and learn what works and how. And then, hopefully, the networks that govern well will gain constituencies, right?

It’s different from having the border determine it: you were born here, you’re a citizen here. In the cloud space, more people will want to participate in joining the networks that govern well. So there’s a potential for the competitive environment to actually lead to better and better governance, because governance can be the differentiating factor between two hyper-similar projects, especially when it’s all open source. You’re like, well, the difference between us and them is that we govern this differently.

Andrew McLaughlin

Hasn’t the competitive dynamic, though, produced oligopolistic, money fueled, nightmare companies?

Perry Chen

No, but to your point, it’s been both, right? Like you say with technology, it’s been both. It’s produced things that I think we’d all be happy with, and it’s produced things that we would all think are terrible tragedies. But for so long America has been a place where many people all over the world would like to come, right? Regardless of how Americans may feel about America at times. And the reason, ultimately, a lot of it, is the governance system, right? They find it superior, or the results of it have been superior to the conditions—

Andrew McLaughlin

Until very recently.

Perry Chen

Right. And so all I’m saying is that also exists. But this is easier: you don’t have to find your way to America. You just join the network.

Braxton Woodham

Also, the dynamic could be different. It’s not just shareholder-value competition in this world; it could be different types of value with these tokenized systems. Right? So that might change the dynamic of competition. And we’re early in that. So we don’t—

Perry Chen

Yeah, I don’t mean competition purely in a financial sense.

Andrew McLaughlin

I want to let Maggie speak here. I’m unpersuaded that raw competitive dynamics are going to produce the best result here.

Maggie Little

I’m with you. I’m skeptical that the marketplace will determine the best governance. I think we’ve run that experiment for 20,000 years and it’s just proven to be false. So I don’t know what the right answer is. We certainly are going to learn a lot from the marketplace of governance structures, no doubt. And we might even get innovations in governance systems, right? There are going to be new ways of doing it that are super exciting, that we didn’t think about. So we should absolutely, I think, be open to governance innovation, but I don’t want to trust the marketplace to yield the winner.

So I want to go back to the way you posed the question, which I think was really fascinating: does tech need to have some kind of participatory element to its governance to be legitimate? And I don’t know. To me, small tech, an app for picking pizza slices, right? I actually have one of those on my phone. I don’t care if the coding was done with participatory values, just don’t care. But when you get tech that has power, when it’s not an app about pizza slices but strong powers, that’s where we start to want to know about legitimacy and accountability. And that’s when I think we’re going to want governance that lives up to some ideals. And one of the best versions of that is something with some participatory element.

Andrew McLaughlin

Yeah, that’s our fundamental premise.

Perry Chen

Can I just clarify? Because we’re saying the same thing. When I say “market,” democracy is a market, right? People vote; that’s how they express themselves within that market. I don’t mean market as in Wall Street. What I’m saying is that people will be able to determine which networks, which services they use. You’ll have a lot of services with almost identical features and functions, but the differences will be the rules applied and the way you can participate in those things. And so people will vote with their feet—

Maggie Little

You’ll vote with your feet, but people often choose horrific communities that are self-servingly good for them and have dreadful externalities for others. So if you allow everybody to vote for what they want in their self-interest, we get disasters and climate crisis.

Perry Chen

Yes, but it also goes back to: democracy is the worst system except for every other one. I’m not saying we’ve perfected anything. What I’m saying is that this will be a new stage, where there will be more ability for differentiation in governance, a high contrast, whereas before they all seemed very similar: for-profit, nonprofit.

Andrew McLaughlin

Yeah, so I have just one note about the importance of intentionality here, which goes something like this. To Maggie’s point, I don’t want the autopilot… It doesn’t matter to me whether the autopilot software on the plane has had broad constituency participation from randoms or whatever. I want it to function against objective criteria. I was involved in the coordination of the domain name system. That was an interesting hybrid, because it has performed super well and evolved super well as a technological infrastructure. We’ve added DNSSEC, IPsec, all kinds of cool, necessary things. But human identifiers, names, embed political, cultural, social conflicts. And so we had to somehow figure out how to accommodate those in some ways.

The natural tendency of Silicon Valley has been to try to punt policy issues to somebody else. And there’s this great riff in a Douglas Adams book about spaceships coming to Earth that are so bizarre that the human brain doesn’t ever see them, because it assumes they must be somebody else’s problem. It’s called an SEP field, a somebody-else’s-problem field, and you cloak these ships in an SEP field. And what Silicon Valley has tried to do over and over again, like Mark Zuckerberg’s oversight board, is say: let me make this somebody else’s problem. My contention for this panel, and this is particularly for you and your colleagues, Braxton, is that as you build out a new technology, you can’t make it somebody else’s problem—

Braxton Woodham

That’s right.

Andrew McLaughlin

—to figure out the governance mechanism and its limits and the normative values that go into it and their limits.

Braxton Woodham

And that’s why we’re building an organization for that, as opposed to just doing it within the tech. We don’t have all the answers, but step one was: let’s actually build a governance design organization that’s intersectional. That’s not the answer, but it’s the start of a solution, versus tech solutionism, which we think has been… You’ve got to get people like myself to admit that that’s been problematic.

Andrew McLaughlin

What do you think, Maggie? Does that sound convincing?

Maggie Little

Totally agree. And we have 32 seconds left.

Andrew McLaughlin

Yeah. It’s kind of hard to know how to spend that, right?

Maggie Little

Can I just underscore one thing you said? If you build it, you’re responsible for what it does. Can we just go back to that?

Braxton Woodham

Yeah, that’s good, that’s good.

Andrew McLaughlin

Alright. Well, on that note, we have done this, obviously the best panel of the day, in less than the allotted 25 minutes. So to the audience, thank you so much for your attention.

Braxton Woodham

Thanks.