Podcasts

The Master Algorithm


Tim O’Reilly, the founder of O’Reilly Media and author of the book What’s the Future?, talks about how new technology can be treated either as a scapegoat or as a mirror, and what that means for our future.

Subscribe and Listen on: Apple Podcasts | Spotify | Stitcher | Google Podcasts | YouTube

Abridged Video Interview

Transcript

Rob Johnson:

I’m here today with my friend Tim O’Reilly from O’Reilly Media. He’s the founder and president, he runs a series of conferences, he’s been to INET conferences many times. Tim, thanks for joining.

Tim O’Reilly:

Hi, glad to be with you.

Rob Johnson:

Let’s start with your Friends of O’Reilly Foo Camp, social science edition, just a week or two ago. What, in 2021, did people bring to the table?

Tim O’Reilly:

Well, first of all, our whole idea with Foo Camp is to bring together people from different disciplines. It was originally just people from our technology community; O’Reilly is an online learning provider, publisher, and conference producer. It was the mixing of the communities. Then we started doing one on the natural sciences with Nature and Google. This thing has been going on for 20 years. Four years ago, we started one on big data and the social sciences. We thought, where better to do it than Facebook? It’s actually sponsored by Sage, the social science publisher, O’Reilly, and Facebook. We try to bring together, in normal times, a couple hundred people for a face-to-face gathering at Facebook’s campus. This one was done with a combination of Discord and Zoom. Some of the talks that I thought were really fascinating: first off, we always have a set of what we call lightning talks, which are short, five-minute talks where people advertise what they’re working on. For one of the highlights there, we had the famous pickpocket, I forget his name, it just went out of my head, I’m sorry, talking about the neuroscience of pickpocketing. It’s a really interesting perspective because it reminds you that we are not actually as aware of what drives us as we think we are. I thought that was a real high point; it always is.

When I think back on the sessions I attended: we had Paul Romer, who’s gotten a real focus on how, in today’s networked world, you can actually have ideas that lead inevitably to monopoly. Are we really in a phase change in our economy that we need to be taking much more seriously? Bill Janeway did a marvelous talk about lessons from the New Deal, which I think are very relevant for climate change. Alison Gopnik and Margaret Levi did a really interesting talk on the caregiver economy and how we need to think about that. That, it seems to me, goes back … Actually, Mariana Mazzucato was there also, talking about her new book, Mission Economy. But I think back on her previous book, The Value of Everything. One of the big, big gaps in that book, I thought, was that she didn’t talk about the caregiving economy as one of these things that’s still outside what she calls the value boundary. COVID has really brought home, in some sense, that these things we tend not to value turn out to be quite valuable, and that’s a real challenge to our economy. We spent a lot of time talking about that.

We had a great session with Nick Klagge, just a frank conversation about Facebook. I find Facebook endlessly fascinating because it’s such a metaphor for the things that we don’t see about other aspects of our society. Every time people point to Facebook and say, “Oh, how horrible,” it makes me think of the Gary Larson cartoon that was on the cover of one of his books: two bears in the crosshairs, and one of them is pointing to the bear next to him, like, “Him, get him instead of me.” Everybody is like, “Yeah, we need a scapegoat.” I go, “No, when you look at tech, it’s this incredible mirror. Look in the mirror.” That’s really been one of the biggest ideas that I’ve been wrestling with for the last four or five years. I talked about it some in my book, but I’ve continued to think about it and look at it. When we see these algorithmic systems and we question whether they’re giving us what we want, we have to recognize that we built an economy, which is an algorithmic system, a natural creation of a set of rules. It’s very analogous to what Facebook built, and it can go right in some ways and wrong in other ways. People can have really good ideas that don’t work out over time. There’s so much to drill down on there.

Rob Johnson:

A lot of the side effects. By the way, just for our listeners, your book is called What’s The Future, big yellow cover, WTF. I remember you presenting that at the INET conference in Edinburgh and people were really delighted. Many went home and read it and sent you nice notes about it. I hope people will start with that and follow your work, which is online. But the book is really a nice basis for understanding where you do work.

Tim O’Reilly:

From that point on, I spent a lot of time starting to look at the economics of these platforms and the economics of content on the web. The whole idea of platform economics is a fascinating subject. It’s very relevant to my own business because we run this online learning platform. What we’re trying to do is balance the incentives for content creators to create with the interests of consumers. We’re very clear that we actually have to satisfy both sides of our market, partly because we’re not a monopoly; there are a lot of people providing this kind of content. It really struck me that the reason why Google and Facebook and Amazon and Apple are able to say, “Well, all we care about is the consumer,” and to squeeze their suppliers so hard, is that they are a monopoly. If you’re not a monopoly, you have to care about your suppliers, because they’ll go somewhere else if you squeeze them too hard. I guess that sent me down a path of really trying to understand how you build a balanced platform. We really try to do that. It’s actually funny, because a lot of the most powerful innovations in our online learning platform came from us trying to find new ways to make money for our suppliers, not from trying to come up with new features for our users.

Anyway, I just started studying this whole field of platform economics, understanding there’s so much that’s powerful and right there. There’s this great quote from Paul Cohen, who used to be the DARPA program manager for AI and other advanced technologies and is now a professor of computer science at the University of Pittsburgh. We were at the National Academy of Sciences and he said something that I immediately wrote down and have been quoting ever since. He said, “The opportunity of AI is to help humans model and manage complex interacting systems.” There’s a way that I look at these big platforms, and actually, quite frankly, financial markets as well: they are the farthest along at using these AI and algorithmic systems to manage economies. In an odd way, I find this particularly fascinating with Google, because in its early years, and unfortunately I don’t think they’ve kept to this insight, in their first 15 years, they really had a very clear separation between the money market, which was their advertising, and the natural, organic search market. It’s so fascinating because we have this idea in economics, I think, that in some ways price signaling, money, is the coordinating function of the invisible hand.

What Google showed us in organic search was that you could actually build an incredibly powerful invisible hand, centrally managed, that did not use price at all as a factor. They have hundreds of factors, modeling and managing a complex, interacting system: looking at who links to what, what people click on. Hundreds of factors, and none of them was price. They proved to us that you can build a global-scale marketplace of suppliers and consumers, billions of users, a matching marketplace, without money. They had the ads off to the side. Now, what’s happened in the last 10 years is that they have actually made search worse. They basically have broken down the wall that they used to have between the two. Dina Srinivasan has a wonderful paper about Google’s ad markets and how they have a monopoly at every level of the ad stack. What I’ve been looking at is the other side, which is that as a result, they’ve had to produce more inventory, the same way that during the housing crisis the banks were producing more and more tranches of bad mortgages because they didn’t have enough for the demand they had created.
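The price-free ranking described here can be sketched as a weighted scoring function over behavioral signals. This is only a toy illustration, not Google's actual algorithm; the signal names, weights, and data are invented for the example.

```python
# Toy sketch of a price-free ranking function: each result is scored by a
# weighted combination of behavioral signals (links, clicks, freshness),
# with price deliberately absent. All names and numbers are invented.

def score(result, weights):
    """Weighted sum of non-price signals for one search result."""
    return sum(weights[name] * result.get(name, 0.0) for name in weights)

def rank(results, weights):
    """Order results by descending score: a matching market with no money."""
    return sorted(results, key=lambda r: score(r, weights), reverse=True)

weights = {"inbound_links": 0.5, "click_through": 0.3, "freshness": 0.2}
results = [
    {"url": "a.example", "inbound_links": 0.2, "click_through": 0.9, "freshness": 0.1},
    {"url": "b.example", "inbound_links": 0.8, "click_through": 0.4, "freshness": 0.5},
]
print([r["url"] for r in rank(results, weights)])  # ['b.example', 'a.example']
```

The point of the sketch is simply that a centrally managed matching function can coordinate suppliers and consumers while price appears nowhere among the inputs.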

In a similar way, Google has basically said, “Well, we’re going to get rid of those pesky organic search results. We can make more ad space.” You’ll find this many times now on a Google page for a financially valuable search. It’s kind of interesting, because non-commercial search looks a lot like Google did 10 years ago. You’ll have ten links, you may have a few extra things, like here’s a Wikipedia entry, but it’s basically … Being a literary person, I always use my example: I’ll do a search for, say, Anthony Trollope, who nobody reads anymore, and it looks like old Google. But search for anything that might be commercial, like a place to stay or something to buy, and it’s amazing how much is now Google content. You might get one organic result. It’s all ads and Google’s own content. Google’s own content tends to be a roach motel, where you click on the link and it looks like you’re going to Yellowstone National Park, but no, you’re going to another Google page about Yellowstone National Park. They’ve taken more and more of the content of the web. Of course, this is the other side of what Dina’s talked about.

I guess that leads me back to, well, what is the master algorithm? Because one of the things that’s fascinating, despite what Paul said about modeling and managing complex interacting systems, is that algorithmic systems typically have a master objective function. There’s this thing, and actually there can be one for each sub-factor because, again, a lot of these things are additive: you have an optimization function. Sometimes it’s called a loss function, where you’re trying to minimize some value. But in other cases it really is the maximization of something, even if the function itself is expressed as a loss function. The master algorithm of our society is: grow your corporate profits. At some point, there was this idea Google had of “don’t be evil.” When Larry and Sergey wrote their original search paper, they said an advertising-based search engine will always be biased against its users; there’s this appendix called Advertising and Mixed Motives. For the first 10 or 15 years, they really figured out how to manage that, with pay-per-click advertising off to the side. But at some point, and again, I’m rambling through my half-witted absorption of economics concepts here-
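The distinction drawn here, minimizing a loss versus maximizing an objective, is just a sign flip: maximizing a function is the same as minimizing its negation. A minimal sketch, with a made-up toy "profit" curve:

```python
# Minimal sketch of the loss/objective duality: maximizing an objective
# function is the same as minimizing its negation (a loss function).

def minimize(f, candidates):
    """Pick the candidate with the smallest value of f (a loss function)."""
    return min(candidates, key=f)

def maximize(f, candidates):
    """Maximizing f, e.g. profit or engagement, is minimizing -f."""
    return minimize(lambda x: -f(x), candidates)

# Invented toy profit curve with its peak at x = 3.
profit = lambda x: -(x - 3) ** 2 + 10
candidates = range(7)
print(maximize(profit, candidates))  # 3
```

Whether the system reports a loss to minimize or an objective to maximize, it is the same machinery; what matters is which quantity was chosen as the master objective.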

Rob Johnson:

I like going down the sewer with you, keep it up.

Tim O’Reilly:

I’ve become fascinated by the concept of rents. I started working with Mariana Mazzucato and Josh Ryan-Collins at UCL on a project funded by Omidyar to actually go a little deeper on this work, tracking changes in Google and Amazon’s content ecosystems and how that is really a form of rent extraction. In the course of that I read Josh’s little book, Why Can’t You Afford a Home?. It’s basically about land rents. It kind of helped me realize there are two kinds of rents. My naïve idea of economic rents was always the bandits controlling the paths, or the feudal landlord who says, “Dude, give me your grain or I’m not going to protect you.” Or the mafia, or whatever. It’s that kind of extractive rent. But Josh really talks about how there’s a fundamental part of land rent which is the rent that is caused by growth. More people move into your town, there are improvements to your town that you didn’t have anything to do with, and you get this free lift.

What struck me is that in a lot of ways, what we’re trying to measure with these companies, like Amazon or Google or any really innovative company, is that they have that first type of free rent from the growth of the market, which carries them along up to a certain point. Then, because the master algorithm says you have to keep growing, they go, “What do we do?” In some sense it’s a new question I haven’t actually worked through: what happened when the number of searches stopped growing? Is that the point at which they started using the extractive form of rents to make up the difference?

Rob Johnson:

I’m going to recommend an old economics book about urbanization and real estate rents by a man named Fred Hirsch. He talks all about what are called positional goods. There’s an analogy to search, and to positional placement in search extracting that extra money. I’ll find it for you.

Tim O’Reilly:

It’s super interesting. Anyway, the point is I’m just trying to figure out what … This is part of a broader set of reflections that I’m trying to have on what we can learn from Silicon Valley. In the early years, for example when I started pushing the idea of government as a platform, I was like, “What did we learn from the way that Amazon or Apple became a platform? Could we do something different in government from that?” Those are very positive ideas. I still think there’s a lot-

Rob Johnson:

Your wife founded Code for America, as an embodiment of that.

Tim O’Reilly:

That’s exactly right. She’s gone on to do a lot more: United States Digital Response, and she set up the United States Digital Service at the White House. We really tried to push some of those ideas forward. But now I’m really focused on what some of the negative lessons are. I do think there’s a really important point for us to face up to, which is that we get the economy we ask for. We didn’t know that we were asking for it, but once we see it, we have an obligation to change it. There’s this notion that you can see so clearly in companies like Facebook and Google: they have an optimization function. In the case of Google, in their best years, and Larry actually said this in an interview they attached to the IPO documents in 2004, a Playboy interview, actually, he said, “Our goal is to have users come to Google, find what they want, and go away.” If you look at Google today, you would never think that was the case. They had this optimization idea that we will give people what they want.

In fact, one of the search factors that I love the most, in some ways, is called the long click. People in the industry know about this, but others may not. They realized that, in the days of the 10 blue links, people would typically start with the first link, and if they came right back and clicked on the second link, and then came right back and clicked on the third link, and then went away, that’s a statistical signal, when enough people do it, that maybe the third link is the one that’s actually the best link, despite all the other factors saying otherwise. They call that the long click: when somebody clicks and goes away. Here was Google with this very clear, wonderful, very effective master objective, which is to have people find what they want and go away. In the case of Facebook, they had this idea that they would show people things that would make them want to spend more time with their friends. Again, pretty wonderful. Then we end up with the [inaudible 00:20:26], we end up with the various electoral politics in the U.S. It’s a combination of bad actors and misaligned incentives. In each case, the companies are doing a lot to try to change, but I think they’re doing it without threatening their real master objective, which is: we have to keep growing.
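The long-click heuristic described above can be sketched as a simple aggregate over sessions: the result a user clicks last before leaving, rather than bouncing back to the results page, gets the credit. A toy version, with made-up session data:

```python
from collections import Counter

# Toy sketch of the "long click" signal: in each session, the link the
# user clicks *last* before leaving (instead of bouncing back to the
# results page) is treated as the one that satisfied them.

def long_click_counts(sessions):
    """Count, per result position, how often it was the final click."""
    return Counter(session[-1] for session in sessions if session)

# Each session is the sequence of result positions clicked, in order.
# Data is invented for illustration.
sessions = [
    [1, 2, 3],  # bounced off results 1 and 2, settled on 3
    [1, 3],
    [3],
    [1],
]
counts = long_click_counts(sessions)
print(counts.most_common(1))  # [(3, 3)]: position 3 earned the most long clicks
```

With enough sessions, that aggregate becomes exactly the kind of statistical signal described: evidence that the third link may be the best one, whatever the other factors say.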

I think in a similar way, if you look at our tax system and our politics, there’s this massive avoidance of the fundamental realization that we are in charge of the economy. We do these little tweaks. Basically we’re going to do everything we can to reduce inequality except do the things that would really do it, because there’s too many people that have too much at stake.

Rob Johnson:

I was going to say, it’s a very interesting question, because the scale of what I’d call natural monopoly, like Paul Romer talks about, is so large, and yet we value individual freedom: we’re defending the right of those technologists to operate on a very large scale without us intervening to change what they do for the collective good. It’s really a tension between protection of individual liberty, whether commercial or social or what have you, on the one hand, and protection of other people from intrusion on their lives by these products on the other. I have a good friend, Rohinton Medhora, who runs the Centre for International Governance Innovation in Waterloo, Canada, near Toronto. He asks why we don’t have something like a Food and Drug Administration, where these things get a trial, then they are explored for their social ramifications, then they are evolved, then they are released. We’re not very good at incorporating the public dimensions of the innovation for the social good once it takes off on a path that’s quite profitable.

Tim O’Reilly:

I hear that. I do think, being a Silicon Valley type, I don’t actually like that approach because prior restraint of innovation is always a bad idea or almost always a bad idea.

Rob Johnson:

Competitors can also engage in stopping you from developing something that would have eroded their profit margin. There are lots of complexities in this.

Tim O’Reilly:

What I would start with, more than anything else, is a modified set of disclosures. If you think about stock compensation, for a long time it was just buried. Then they basically said, “No, you actually have to report your GAAP and non-GAAP measures of profitability. You can’t hide this anymore.” They should have just said, “You can’t hide it at all,” instead of letting companies have two measures and pick which one they’re going to talk about. But they had disclosure. I think, for example, in the case of, say, Google and Amazon, just in general, I love the phrase, I’m not sure who said it originally, but Al Roth uses it as the title of his book, Who Gets What and Why. If you basically said, “Okay, we are going to work very hard to have a set of disclosures about who gets what and why, so it’s really transparent,” that would be a real revolution in the kinds of documents we have. This is going in a totally different area, but think of the work that Saul Griffith and crew did to map the energy economy with a Sankey diagram. What if you had a Sankey diagram of the money flows within Google and Amazon?

It would give you a whole other perspective on what’s broken and what’s not, and how it’s changing over time, because right now we don’t really know. For example, part of the work that I’m doing now is to try to figure out how Amazon’s fees to their suppliers have changed over time, and what the introduction of advertising, which is basically a fee on their suppliers, has done. They call it an advertising business, but it’s really a placement fee. Again, it’s very much like what we talked about with Google. In the glory days of Amazon, it was totally customer focused. Jeff said, “We want to be the most customer-focused company on earth.” It’s hard to understand how that can still be the case when you do a search and all that you get are promoted products. Maybe if we dug in they would say, “Well, we have a system for evaluating placement in which the willingness and ability to pay is only one of many factors, and we’ve tested and made sure it’s balanced out by the other factors,” and I would believe that’s possible. But they should be thinking about that. Is there a fundamental conflict between their advertising business and their idea of giving the best product to customers?

Rob Johnson:

I’ve been looking for used books recently, I get all kinds of sponsored products and the thing I search for is usually number nine on the list they show me.

Tim O’Reilly:

Yeah. It’s pretty interesting these days just how little you see. When I first started talking about Web 2.0 and collective intelligence back in 2004, 2005, I used to use Amazon as an example, just like Google, because they did use collective intelligence. I used to show: here’s a search on BarnesandNoble.com, and here’s a search on Amazon.com; Amazon was just a bookstore back then. You could see Barnes and Noble pushing their promoted product, or a product that they had created themselves. Over on Amazon, the default search was what they used to call all the flow around a product: the number of reviews, the review ratings, how many people had links to it. All the same kinds of things that Google did. Today, it’s kind of like they’ve got the Barnes and Noble system, where it’s Amazon’s own products and the things that people are willing to pay them to promote. Now, I think the thing that Amazon probably does differently, and I’m just guessing, is that they don’t actually … It’s just a tax, a rent they extract. They go, “You would show up first in search using all these other factors, but you’re not going to unless you’re also a member of … We will charge you for it.” That’s a clever hack. It’s a classic bandits-in-the-positional-advantage type of rent.

Rob Johnson:

I’m interested in things like that film that came out, I guess several months ago, The Social Dilemma. They talked about how these systems can watch the things you like, the things you click on, and then send you a subset of stimulants to excite you, to affirm you, to keep you on longer. The side effect, or the intent, of that is that once they’ve shown advertisers that they get this long attention and recurring returns and everything, they can then raise their advertising rates.

Tim O’Reilly:

Again, I think that we need a lot more study of this. I don’t know if you’ve seen Tim Hwang’s book, Subprime Attention Crisis, where he makes the point that some of this hyper-targeted advertising is not actually as effective as promised. In some sense, it’s a little bit like the subprime crisis, where you’re basically making a bogus product that doesn’t actually hold up to scrutiny, and eventually the bubble will pop. I don’t know if he’s right about that, but in any event, the thing that I find super interesting, again coming back to my thoughts about the master algorithm: I found it fascinating that in 2014 Facebook got taken out to the woodshed for the study where they were researching whether they could make people happier, or whatever, by what they showed them in their newsfeed. That was considered a breach of research ethics. But studying whether you can get them to spend more time on your site so that you can make more money, which is the research they do every day, isn’t even considered research. That’s just doing business.

Rob Johnson:

That’s marketing.

Tim O’Reilly:

How can it be a breach of research ethics to experiment whether you’re influencing people’s moods when your business is influencing people’s moods? It’s okay as long as it’s for profit. That’s our society in a nutshell. I think that our media that descended on Facebook like wolves on a crippled deer, it’s just like, we have a lot to answer for there because they should have been doing more of that kind of experimentation. If that had become a legitimate subject of discussion, “Wow, we really are influencing how people feel. What do we as a society want to do about that?” That would have been a really damn good question. Instead, we made them stop.

Rob Johnson:

They might have discovered that they were doing things that were harmful to mental health, and then probably had the American Psychiatric Association take out an ad so they wouldn’t publish that, so that they would generate more business. All these interactions, I’m goofing, but this is a lot of complex stuff that really does matter, too.

Tim O’Reilly:

It is. It reminds me, of course, of the wonderful book, Phishing for Phools by George Akerlof and Bob Shiller which, there’s an efficient market for everything including fraud. There’s an efficient market for manipulation. So much of what we take for granted as the way our economy works is that efficient market for manipulation, not for people’s benefit.

Rob Johnson:

That’s right. At one level, the naïve economic theory assumes that one’s so-called preferences, which are the basis of the utility function, are [inaudible 00:32:08] frozen and independent. They’re not malleable, they’re not influenceable. They’re yours, and your demand is yours; it’s not the product of a social interaction-

Tim O’Reilly:

That’s right.

Rob Johnson:

… or inspiration or manipulation.

Tim O’Reilly:

Absolutely.

Rob Johnson:

That changes the normative implications of economics quite powerfully. I used to laugh because I used to live in Murray Hill in Manhattan. I’d walk down Madison Avenue, past all of the various advertising agencies, and I used to laugh and say, “Huh, you guys don’t exist. Economics says you don’t exist. I don’t have to look in these store windows, no matter how much you’re doing.” Then you go look at a business school curriculum, and what do you have? A third of the time or more is spent on teaching marketing. Marketing, accounting, finance, but marketing plays a big role. The subjective psychology, and the fluidity of that psychology, are really elements in play in the contexts and technologies that you study.

Tim O’Reilly:

There’s so much to think about. I do think that the idea in economics of saying, “In this idealized situation, this would be true; let’s leave out all the other factors,” is no longer necessary, because we actually have, and again this comes back to this theme, the technology to start to tease apart many more of the factors and to run experiments at global scale. I think we’re in, in a lot of ways, a beautiful time for economics, where we can actually observe many things. Before, you were doing these equations and you had to do this radical simplification. Now you’re literally in a world where you can build a machine learning system that will take thousands or even millions of factors into account. There’s a great description by Yann LeCun of what an actual deep learning model is. He said, “Think of a machine with a million little slider switches that you can push up and down until you get to the right answer. That’s kind of what it is.”

It’s like, there are all these functions being calculated: when you change this, does it get you more correct results? If you change that, does it get you more correct results? We can do this because, in today’s machines, we can run millions of these little experiments until we actually fit the curve. You can actually get a lot out of that. I think it’s a very different approach. Of course, it still doesn’t account for everything; it’s usually within a particular, constrained problem set. But we have a totally interesting new tool set.
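LeCun's slider-switch picture can be made literal in a few lines: nudge each parameter up and down, and keep whichever nudge reduces the error. Real deep learning computes gradients rather than trying nudges one at a time; this toy coordinate search, with an invented two-parameter loss, is just the analogy written out.

```python
# Literal version of the "slider switches" analogy: nudge each parameter
# up and down, keep the nudge that lowers the loss. Real training computes
# gradients instead of trying nudges one by one.

def tune_sliders(loss, params, step=0.1, sweeps=200):
    params = list(params)
    for _ in range(sweeps):
        for i in range(len(params)):
            for delta in (step, -step):
                trial = params[:]
                trial[i] += delta
                if loss(trial) < loss(params):
                    params = trial  # keep the improving nudge
    return params

# Invented toy loss: fit two "sliders" to the targets (1.0, -2.0).
loss = lambda p: (p[0] - 1.0) ** 2 + (p[1] + 2.0) ** 2
fitted = tune_sliders(loss, [0.0, 0.0])
print([round(x, 1) for x in fitted])  # [1.0, -2.0]
```

Scaled up to millions of sliders and done with calculus instead of trial nudges, this is the curve-fitting described above.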

Rob Johnson:

I want to draw an analogy to the history of economic thought, because the original ideal of the market was that it pulled everything together and created the balance, and the price was a reflection of desire, cost, et cetera. It was then a little bit superseded by people like Friedrich von Hayek, who said, “No, what the market is, is the information aggregator. Through the prices, you get the signals for the social response to do the right thing.” What you’re telling me, I believe, is that now the technology is such that, at essentially zero marginal cost, the information aggregator is pulling it all together. You can see the outcomes without all of the iterative process. It’s almost like a planned market, to use the old Communist term, that can be run at the central office, because the computer can reach so many places so inexpensively.

Tim O’Reilly:

You know, it’s interesting, I’ve actually thought a little bit about that. There’s a novel called Red Plenty, about the Soviet central planners, and there was a fabulous essay about it, a book review by a computer scientist, about why even today central planning is still very, very hard. But when I read it, I thought, “Oh, I think we’re thinking wrong about that.” The way we’re thinking wrong about it is: yes, these companies are central planners, but in the old model of central planning, you were planning supply. Today’s central planners are planning demand. They are manipulating demand and then letting the supply rise to meet it, as opposed to the Soviet model, where you planned the supply and you were always wrong. They don’t plan supply. They just plan demand, and they shape demand.

Rob Johnson:

In the Soviet model, they told you what you needed and provided the supply; they determined your preferences, what you would get, what you would eat, et cetera. Now, they’re detecting what people want, influencing what people want, and then the suppliers can react to that information.

Tim O’Reilly:

That’s right. It’s the combination of detection and influence that, again, we need to understand. Anyway, all this again comes back to this idea that if I had a magic wand about how to deal with the antitrust problem, it would actually not be first to say, “We’ll break this up, we’ll do that.” It would be to vastly increase the amount of disclosure because I don’t think we understand these systems enough yet. You said earlier that you’re able to do this at zero marginal cost and I don’t think that’s actually correct. The cost is quite high. If it were zero … It’s high enough that you have a relatively small number of players who are able to do it. Yes, certainly, it’s got a low marginal cost because once you built the infrastructure, you can do it for so many things that you would never have measured before, would never have influenced before. But it’s not quite-

Rob Johnson:

Once you play at scale, your fixed costs are prorated over a large volume.

Tim O’Reilly:

Absolutely. There’s something that’s a little wrong to me. Again, I am totally naïve, I feel like I learned no economics before the last five or six years and I’m just kind of floundering around but it does seem to me that the idea from manufacturing of marginal costs going down is not the same in the digital realm. When we say there’s zero marginal cost, a lot of it has to do … I remember back in the early days of Google when I was first formulating, this would have been in the early part of this century. It was maybe even before Google had gone public. I was thinking about what was different about these online platforms from the previous generation of software, like Microsoft would have the gold master of Windows and it would come out every two years or whatever. I was working on this idea that software is becoming a process, something that you do every day. I made that comment to somebody who was a senior executive at Google. I said, “If you guys didn’t keep working on Google, it would stop working within a couple of days.” He said, “Oh, no, you’re wrong. It would be a couple of minutes.”

There’s that whole point, that ideal of marginal cost: you make it and you’re done. The point is, if Google or Amazon or whoever didn’t keep doing what they’re doing, there is an incremental cost which is ongoing. It’s not like you have a fixed cost that you’re done with once. In fact, because the world is changing so fast, you have to actually keep doing more and more. You look at that with Facebook: billions of people are pushing and pulling. There’s actually quite a high ongoing cost, driven by changes that are extrinsic to the system. Unlike, say, the decreasing marginal cost of a manufacturing business. Again, I haven’t thought about that much before, but I’m quite convinced that somebody needs to write an economics paper about why the economics of these businesses doesn’t fit the old model.

Rob Johnson:

There may be marginal costs that are incremental but increasing returns, because the fixed costs of the platform are prorated over a larger and larger volume. I may have confused the two notions when we talked. Let me ask you-
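The proration Rob describes can be sketched numerically. A minimal illustration, with a purely hypothetical fixed cost and per-unit marginal cost:

```python
# Illustrative only: the numbers below are hypothetical, not figures from the conversation.
FIXED_COST = 1_000_000.0   # up-front platform cost (hypothetical)
MARGINAL_COST = 0.01       # ongoing cost per unit served (hypothetical)

def average_cost(volume: int) -> float:
    """Average cost per unit: fixed cost prorated over volume, plus marginal cost."""
    return FIXED_COST / volume + MARGINAL_COST

# As volume grows, the prorated fixed cost shrinks toward the marginal cost,
# which is the sense in which returns increase with scale.
for q in (1_000, 100_000, 10_000_000):
    print(f"volume={q:>10,}  average cost per unit={average_cost(q):,.4f}")
```

The declining average cost here is the classic fixed-cost story; Tim's point above is that for platforms the "fixed" cost is itself an ongoing, growing expense, which this simple sketch does not capture.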

Tim O’Reilly:

Yeah, exactly.

Rob Johnson:

I’m really interested in how you are exploring these technologies and their social ramifications and their business models, but there’s one other overlay here. We live in a world now in which different regions have very different philosophical systems. The Chinese Confucian / Daoist world is very different from the Cartesian Enlightenment world. We see the difference between the United States on the one hand with lots of suppliers, Europe using these services but not having as many of the big monopolies that originate there, and Asia with a centralized government structure, people fearing the centralized control. How do we build a global system where these kinds of platforms interface across nations and across philosophical boundaries?

Tim O’Reilly:

That’s a really, really interesting question. I guess I would say that is one of the challenges that we get to face in the 21st century. I would say that, in fact, is going to be a lot of the history of the 21st century. People are going to be pushing for advantage and trying to push their system over another’s. But is that really that different from history? Probably not. In the sense that-

Rob Johnson:

It’s different manifestation with this technology system-

Tim O’Reilly:

Of course it is.

Rob Johnson:

… there’s different kinds of challenges in the micro sense but the struggle is ever present.

Tim O’Reilly:

If you think about colonialism, for example, this was somebody who had a better, when I say better it’s in quotes because it wasn’t necessarily-

Rob Johnson:

More profitable.

Tim O’Reilly:

… better, but it was more powerful. They had a more powerful methodology. We may well see, for example, let’s imagine that China has a more powerful methodology. They will outperform. The rest of the world will be forced to adapt. Hey, that’s what happened with American capitalism. I guess I would say that figuring this out really matters, but it isn’t necessarily going to be, “Oh, we’re going to figure out how to interoperate and everybody will play by the same rules.” It may be that somebody wins and we all have to play by their rules. Now, of course, there’s the whole discussion about whether we get a central bank digital currency that becomes a reserve currency, and who controls that. There’s been a great debate going on around this question of how China’s ideas about using a digital currency as a tool of social control and social tracking internally run against their goal of having a digital currency that becomes a world reserve currency. Those two things fight with each other because other countries … You look at things like that, and we’re going to have a lot of different factors that influence the future. I guess I would also say that I think we’re entering a very interesting world where, with the kind of challenges that we face, our old solution becomes a new problem.

Tim O’Reilly:

Again, I talked a little bit about this in my book; I was influenced by Mark Blyth, some things I read of his, about what we dealt with after World War II. Which was, you look at okay, after World War I, they were continuing to optimize for capital and so on, so that the winners bankrupted the losers. They had this terrible outcome, the Great Depression, hyperinflation in Germany leading to World War II. After World War II they’re like, “Dude, we’re not doing that again.” You have the Marshall Plan, you have all the social insurance in the U.S., you have all those things that we did where we were optimizing for full employment. Then you get to the ’70s and it’s turned into cost-push inflation. Oh my God, we’ve got to change. Of course, Blyth talks about Goodhart’s Law, which is once you start optimizing for something, it stops working. I think that what we got to with our current version of shareholder capitalism has also had a 40-year run. We tamed inflation, it looked like it was working. Now it’s not working. Goodhart’s Law again.

Tim O’Reilly:

The question is, what’s going to replace our consumerist, capital-focused, capital-appreciation-focused system? You’re already seeing all these things around the edges. This is why, when I talk about the next economy, I’m putting together this giant basket of factors. On the one hand, it’s like the caring economy; we’re starting to include that in the value boundary when rich companies are saying, “Well, we’re going to give you three months. No, we’re going to give you six months of parental leave and it’s going to be for both parents.” That’s basically valuing the caring economy. When we’re starting to go, “Oh my God, pandemic, unemployment insurance,” that’s the caring economy. The question is, will it just be, “Well, it’s government’s job to build the social safety net”? Or is it companies’ job? Or do we start saying, “We just have to optimize differently”? All these discussions. Universal basic income. I love Kai-Fu Lee’s idea of, “No, we should tackle the caring economy thing with a social investment stipend where we pay people to raise their kids, we pay people to look after their elders, we pay people to work in their communities.”

Tim O’Reilly:

We have all these fresh opportunities to think about do we still want to do the same kind of economy or is it time for one of these generational shifts? In the back of my mind we always have climate change as a giant driver of this. When you have hundreds of millions of climate refugees and all those pressures, are we going to learn a lesson? Are we going to go, “How are we going to turn this into an opportunity?” When I think of great challenges of the 21st century, one of the ones I like to say is, “How do we turn refugees into settlers?”

Rob Johnson:

There you go.

Tim O’Reilly:

Instead of treating them like a temporary problem, no, they’re not going to be a temporary problem. We have to figure out … I’ve often thought there are lessons from history. Unfortunately they’re bad lessons in some ways. Like if you think about Israel. But the good side is, all these people were not refugees, they’re settlers. The bad side is they dispossessed the people that were there before. But could you imagine a 21st century regime that said, “Dudes, let’s buy …” You look at the great depopulation of vast parts of the United States and you say, “What if we said, ‘No, let’s build new settlements there. Let’s figure out how we would do that.’” There’s so many interesting challenges. Same thing, smart cities. Do we really need to build a smart city for tech people on the outskirts of Toronto? No, Google, figure out how to make a refugee camp that’s the new Hong Kong or the new Singapore. That would be a freaking challenge.

Rob Johnson:

One of the things that is in the forefront of my mind right now is the challenges, the disruptions in the continent of Africa.

Tim O’Reilly:

Yeah, absolutely.

Rob Johnson:

On one level there are things like what Jack Ma and the Luohan Academy are working on in relation to creating new networks with the kind of technology you study to facilitate development, because what I will call the East Asian model, manufacturing and industry protection, is being obliterated by global supply chains, machine learning, and automation. Secondly, it’s an equatorial region. Climate change will affect subsistence farming and undermine social stability in an underdeveloped region. Finally, according to the International Organization for Migration, by the year 2070, absent a major war, the population of the continent of Africa will be 5.1 billion people. There are all kinds of the dimensions you just took me through: technology playing a positive role, technology meaning past isn’t prologue, climate change inducing massive potential migration. It’s a fascinating laboratory for the kind of work that you explore.

Tim O’Reilly:

It is. I have to say it’s not going to be pretty. This kind of comes back, in the beginning we talked a little bit about the social science Foo Camp. I didn’t mention one of my favorite sessions, which was from Ada Palmer, who is a science fiction writer. She’s also a Renaissance historian. Or I should say she’s a Renaissance historian who is also a science fiction writer. The two spring from the same deep sense. She has a blog called Ex Urbe, which means “from the city” in Latin, in which she writes these popular essays about how history happens. They’re fascinating because one of her themes is that we tend to always interpret history through our own lens. She said, “People like to say history is written by the victors. That’s wrong. Among historians, what we say is, history is written by the people who write history. They have their own motivations.” She gives the example of one of her peers who, as a grad student, had written a thesis about a period in Mexico and other parts of Central America where there was a whole movement in which the educated Spanish were rewriting Roman history to justify the way they were treating the Native Americans.

Tim O’Reilly:

You could see it there, but could we see, “Well, don’t you think Gibbon was doing the same thing?” Justifying the British Empire. She goes back through many, many different examples. Actually it’s kind of funny because I’m just reading, this is not history but it’s making very much the same point, Emily Wilson’s translation from a couple years ago of the Odyssey. She talks about the fashions in translation and the values that are expressed. Both Ada and Emily make this same point, which is that the people in the past were not like us. Ada has a great series of essays on our myth of the Renaissance and all these people who were supposedly proto-moderns, that they were atheists. She’s like, “Nobody got burned at the stake for being an atheist. They were burned at the stake for believing some crap; we would not be able to believe that Giordano Bruno would be so stupid as to think that.” When you really go back, she says, “Look, I looked at all the people who were burned at the stake.” She kind of goes off on this Neil deGrasse Tyson thing about Giordano Bruno being burned at the stake for espousing Lucretius. She’s like, “No. Nobody got burned at the stake for that. They thought Lucretius was so ridiculous because nobody could believe in atheism. It was a great foil. They got burned because they believed some variant of crazy things that now we go, ‘Why would you kill somebody over that?’”

I guess the reason I say this, the same thing with Emily Wilson’s Odyssey, it’s just like, “Here’s Pope turning in this interpretation of Odysseus” and so on. I guess I think that we have to have humility as we think about the 21st century. It’s not going to look like us. We have to look back and go, “Oh yeah, the way we are today is not how we were 50 years ago.” There is a continuity, but there’s also massive discontinuity. For example, that post-World War II economy that I was talking about earlier was a different economy in which we valued labor. The idea that somehow the market doesn’t value labor anymore, no, no. We told the market not to value labor anymore. There’s a conversation I had with Brad DeLong once where he said, “Look, I have a toy model. I haven’t really developed it. A toy model where you can actually get the same output with more technology and less labor or less technology and more labor. But you can get the equivalent results.” We made a choice. We made that choice because it was to the advantage of some people. This idea keeps cropping up. I’m thinking about Ibram Kendi’s book, How to Be an Antiracist.

He comes to this wonderful conclusion which is very similar. He says, “Everybody has this idea that being an antiracist is about having certain values and feelings and eradicating other values and feelings.” He says, “No, racism is not a set of feelings. Racism is a set of policies which are put in place by people in power and that are sold by influencing and creating those feelings.” I think that is a really powerful notion. The idea that our economy is shaped by the ideas of people who are shaping what we believe for their advantage. We can shape it differently.

Rob Johnson:

The great black writer James Baldwin-

Tim O’Reilly:

Baldwin’s good.

Rob Johnson:

… saw right into this. He really saw into these areas.

Tim O’Reilly:

He did.

Rob Johnson:

There’s a modern poet who I often quote who goes by the name IN-Q which is short for In Question. He’s got a poem called Evidence. As I was listening to you talk about these different aspects of history and how they’re projected onto understanding now that comes from now, his poem says, “You can always find evidence to support what you choose to believe.”

Tim O’Reilly:

Absolutely.

Rob Johnson:

He repeats that two or three times. [crosstalk 00:58:41] This veil of science, as though these things are there teaching us about the future, belittles the notion of our feelings and interactions in the present, which influence our lens in looking back at that past evidence and what we draw from it. It’s fascinating.

Tim O’Reilly:

We have this fabulous opportunity to make it new.

Rob Johnson:

I always laugh when people talk to me about climate change. At the time of Adam Smith’s The Wealth of Nations, there was a man named the Earl of Lauderdale. He said, “What are you talking about, that price is equal to value? If anybody shut off the water or the oxygen, you would die. It obviously has great value.” But at that time, the size of mankind in relation to the planetary resources was small enough that you could have a price of zero because people had plentiful water and plentiful oxygen. We may be learning a different lesson in the current context. What the Earl of Lauderdale said was that price didn’t reflect value. The question is, why do we presume it does now?

Tim O’Reilly:

Absolutely. That’s a great place to end, I think.

Rob Johnson:

I think you’ve taken us on a wonderful tour today. I want to thank you. I’m not surprised because I’ve been following you and learning from your work and your explorations. I love how refreshing your mind is. I love, in this dire time, how much enthusiasm and curiosity you project. My young scholars can learn a lot from your way of being.

Tim O’Reilly:

Thank you.

Rob Johnson:

Thanks for joining me. Hopefully we can turn a corner a little bit and maybe come back in a few months and take another look.

Tim O’Reilly:

That sounds great. I’ve learned so much from you, thank you.

Rob Johnson:

The feelings are mutual. We’ll talk again soon. Bye bye.
