Abolish Venture Capital

Edward Ongweso Jr.

Notes

Paris Marx is joined by Edward Ongweso Jr. to discuss how the venture capital industry works, why the technologies it funds don’t deliver on their marketing promises, and how that’s once again being shown in the hype around AI.

Guest

Edward Ongweso Jr. is a freelance journalist, co-host of This Machine Kills, and guest columnist at The Nation. You can follow Ed on Twitter at @bigblackjacobin.

Support the show

Venture capitalists aren’t funding critical analysis of the tech industry — that’s why the show relies on listener support.

Become a supporter on Patreon to ensure the show can keep promoting critical tech perspectives. That will also get you access to the Discord chat, a shoutout on the show, some stickers, and more!


Transcript

Paris Marx: Ed, welcome back to Tech Won’t Save Us!

Edward Ongweso Jr.: Thanks for having me on. Happy to be here again.

PM: Absolutely. It’s always great to chat. I think last time you were on, we were talking about the financialization of everything, and how these mechanisms of financialization work their way into so many different aspects of the economy and society and everything else that’s going on. You’ve been doing a lot of writing recently about the tech economy, but venture capital in particular, as this important thing that shapes not just the technologies that are created, and the types of tech companies that are able to thrive and get a shot at taking off, but also the types of technologies that then make their way into the rest of the world, and that we have to interact with. And so to start us off getting into that conversation, can you give us a general idea of what venture capital actually is for someone who wouldn’t be so familiar? And who are some of the key players and companies in that space that people might be familiar with, or might not be familiar with?

EO: So, I think one way to really understand it is just to think about the problem of technological development inside of our system. We have a system where technical innovations are pushed through the market in one way or another, or substantively through the market — and we can get into the ways that actually bears out. And so the idea is that new ideas, new innovations, new ways of finding things out, new ways of doing things and solving various problems are going to be presented by people who come together, have an idea, or figure out a way to provide that solution or that product to a bunch of people, and they will do that by seeking financing. But because they’re new businesses, they can’t get traditional financing from a bank, since they don’t have established financials and records.

And so they get financing from venture capitalists — capitalists who are going to invest in a venture and ostensibly gamble on something and say: Hey, I will front you this amount of money. Or: I’ll give you this amount of money as an investment to help you expand your operations, do research and development, scale up, get more customers. And in return, I get a piece of your company. And we can use that to arrive at a private valuation. And maybe we can get together groups of investors and get into rounds together and value you at a certain rate and keep buying chunks and chunks of the company until you eventually go public, and I cash out. Or I hold the shares, and maybe I have some role in decision making. And the venture capitalists provide, ostensibly, the capital, networking connections, advice, and experience that they’ve garnered from investing in other people.

And so VCs are essentially the financial lifeline for firms inside of a private technological ecosystem. And as a result, there are a lot of really interesting dynamics at play. A lot of funds, a lot of the venture capital industry, really hinges on a few key players or a few key networks. So you have places like a16z, with Ben Horowitz and Marc Andreessen, two longtime investors who have thrown money into various startups and sectors that they believe will either get them a lot of money, or will revolutionize commerce or industry in one way or another. You have more traditional firms that have been in the business for a while — Sequoia is one example, or Benchmark — places where they ostensibly do really intensive due diligence, they spot talent, they spot teams, they spot business models and industries ripe for disruption, and invest in founders who have a bold idea that might get huge market share, and thus give them a return. So with venture capitalists, they have these networks that they fall back on, and people that they throw the money to. They have these huge funds that they pool capital into and then allocate it.

The funds that they get are usually from an array of sources. They get them from other wealthy capitalists and investors, other corporations, other venture capital funds, as well as from institutions that need returns on capital because they’re providing them for retirees — so a pension fund for teachers or for firefighters. You might have universities putting endowments inside of funds, because they want to keep growing their capital and they want to keep investing it — ostensibly in the university and in education, but more realistically just to keep earning a return on it. And venture capitalists earn a fee for managing the money, and they earn a percentage of any profits that are made. So there are a lot of places for venture capitalists to skim off the top, right? So one way to really think of them, in that simplistic model: they’re well-connected middlemen who have the money and can use that to shape what gets invested in, who gets a heads up on what’s hot right now. And they stand to benefit whether or not the things that they’re investing in, hyping up, or incentivizing other people to invest in are worth anything to the society at large.

PM: I think that gives us a really good picture of how this actually works and what is going on there. You have these firms that are essentially using all this cash and investing it into the economy, into society, in order to place bets on which technologies or which companies are going to take off in the future. But that also gives them an important decision-making role in terms of who is going to benefit from this, who is going to be able to take a chance and grow and whatnot. And in one of the articles that you wrote, you said that venture capitalists present themselves like truffle pigs who are rooting around for these great companies that they’re going to find, and taking these risks and stuff. But you write that that is how they want us to think about them — as these risk takers who are searching through this tech ecosystem for the bright spots or the lucky companies that are really potentially going to do something. But the reality of what these firms do is actually quite different from that. So can you talk to us about how they present themselves, and then the real impact of what they’re doing?

EO: I think venture capitalists believe that we have an ecosystem right now that’s the greatest wealth creation engine that’s ever existed, and that we owe that largely to venture capitalists finding valuable enterprises inside of the tech ecosystem and expanding their ability, their scale, or their value — providing jobs, providing good consumer goods and products, optimizing the economy and the efficiency of production, so on and so forth. And in that sense, they view themselves as truffle pigs. But I think a better way to understand them is either — if we want to be nice — to call them herd animals, or, more realistically, parasites. When you step back and you look at some of the more spectacular examples, where they failed to catch charlatans such as Elizabeth Holmes, it’s easy to paint those as exceptions. But when you dig into the actual reasons why venture capitalists dug into these people, you find a few commonalities. You find the fact that these are people who were charismatic enough to get the money and went into business models and sectors where there wasn’t any real chance of them creating the product they wanted.

But the product that they wanted was a monopoly. And this gave venture capitalists a lot of excitement, because to achieve a monopoly would be to achieve total control over price setting, total control over labor conditions, total control over all aspects of the good or the service or the sector, and that would yield dazzling returns. And even if you weren’t able to realistically achieve that, you’d be able to convince investors that it was somewhere down the line, and that would continue to inflate the valuation. So on the first count, I think there’s the fact that they are liars, deceptive and manipulative, and that they’re mainly interested in enriching themselves. And they will do that at any cost. And they will externalize most of those costs to the public. They will mismanage pensions; they will mismanage public funds; they will mislead investors. They’re really self-centered and self-interested in getting as much of a return as possible. And as a result, they misallocate resources, especially public funds, because they get a subsidy through tax loopholes and regulatory loopholes that allow them to use public funds and not get taxed on the gains that they have.

They are herd animals, in that they go where there seems to be another stampede happening. And they are heavily reliant on insular networks of insiders and friends who are passing around opportunities to get into this hot new fundraising round, or this hot new startup, or this hot new sector. So what you end up seeing is a picture of something where it’s not actually a bunch of value-seeking, risk-taking investors, but a lot of risk-averse, lazy, parasitic, self-interested, and really superficial investors who aren’t really interested in or capable of doing the sort of due diligence necessary to find things that are of value, right? And then on top of that, there are structural problems in venture capital, where there’s not really any evidence that these are people who are able to adequately anticipate where value is going to be.

There’s also the fact that because they’re so focused on short-term returns, they’re not going to take up the long-term investment horizons that would be necessary for things that have social utility — technologies that may not pay off for 10, 15, 20 years, but that would be transformational in terms of the energy grid, or in terms of pharmaceutical innovation, or in terms of logistics. Instead, they’re more interested in what they can do in the short term, and in the short term the most promising things are app-based digital labor platforms, or surveillance platforms, or the commodification of daily life — these are the things that are gonna attract a lot of the capital, with some sprinkling in of clean or green tech. As a result, venture capital ends up prioritizing acquiring market share, crowding out competitors, lowering labor costs, privatizing everything inside of a city or inside of someone’s daily life, and inserting as many checkpoints as possible to suck out dollars while skimming off the top from other investors, whilst doling out lottery tickets to one another to make each other richer and richer and richer, so that they can do it easier and easier next time.

PM: Definitely. And I think what you’ve described there gives us a number of things to drill down into to understand this a little bit better, because one of the things that I’ve been concerned about — you’re talking about these kinds of long-term investment horizons — is that that is typically a role that would be served by government. And certainly government still does a bit of that. But I feel like as the venture capital model has taken hold, what we’ve seen is governments, as they’ve been stepping back from public investment and expecting the private market to do more and more things, relying on the investors or the venture capitalists to make the initial investments, and then saying to a company: if you’re getting investments from whatever firm or whatever, then we’ll give you a top-up or something. So it’s still the venture capitalists who are deciding where these resources are being allocated, even when it’s public funds that come from that. At least that’s something that we’ve seen up here in Canada. I don’t know if it works the same way in the United States.

But I also wanted to pick up on what you said about the herd mentality, because I feel like this is something that we see a lot — whether recently it was crypto, and everyone was running into crypto and throwing money at crypto, or, to a lesser degree, the metaverse. Meta made the big push on that, and then a lot of companies were going for it. Now we see AI, and that’s something that we can discuss a bit more later. But I feel like it’s not just in investments. One of the things that really stands out, with the Silicon Valley Bank collapse earlier this year, is that it was another example of how there are these really insular networks where the information travels very quickly, and that affects what these investors, these venture capitalists, are doing. And that can have massive impacts on the wider economy.

EO: I think SVB is a really instructive example here, because as far back as, what, 2014, 2015, SVB was already servicing a majority of the industry. Most, if not all, startups in the region placed funds there, and most venture capital firms and funds, and the investors involved in them, were parking money there. They were parking money there probably because of sweetheart deals, where the firm would give preferential mortgage rates or loan rates to investors who were getting portfolio companies to go out there. And because of this low-interest-rate bubble that we had, where the idea was: we have so much money that we don’t really know what to do with it — the startups keep throwing it at us — we need to put it somewhere, so let’s put it in the heart of Silicon Valley, the Silicon Valley Bank. And the collapse happened because — and it’s ironic, on a level — you have low interest rates driving the tech sector to inflated valuations and giving these people enough money to place in Silicon Valley Bank, and then Silicon Valley Bank, in this low-interest-rate environment, miscalculating the risk of interest rate hikes and making bets on bonds. And when the hikes began, it had to try to sell those bonds, and that sparked the panic.

But SVB also points to concerns that we should have about venture capitalists in general. Because if they are not able to manage something that is as important to them as the heart of their financial ecosystem, if they were as prone to risk mismanagement, if they were as prone to blindness about potential ways to navigate the crisis — because the panic was set off even though they more than likely would have been made whole no matter what, whether they left the money there or took it out — and the fact that a lot of these people didn’t understand that, and tried instead to advocate for an overhaul of banking regulation so that they would be made whole again: all of this suggests that these are people who have a pretty poor understanding of risk and are not risk takers; they’re risk-averse. And they’re willing to push the cost onto the public, because they think that what they’re doing is far more important and integral to the state of the economy, even though it’s a destabilizing factor. Because there was a threat of contagion, and it was made real once a bunch of them — like Jason Calacanis, another friend of the show — decided to start screaming on Twitter and insisting there would be another bank run. You had them and their network going out and insisting that what we need to do is guarantee all of the deposits and make everybody whole, or ensure that everybody would be made whole, or else we’d have a recession or, god forbid, a depression.

And this sort of gross negligence, this externalization of cost, this risk aversion — this is how they deal with something as important to them as their bank. How are they going to treat something as important to all of us as the development and the design of our technology? And the answer is: they’re not really interested in technology as such, and not interested in finding things that are socially useful and productive. They’re not interested in things that genuinely help people. They’re interested in things that generate profits, and specifically things that generate profits in ways that are sustainable. So this ends up being platforms that you can erect walls around; this ends up being cultivating social relations that can be transacted, that can be quantified, that can be replicated one way or another in market conditions and contexts. And this means flattening and eroding the really rich lives that we all have with one another outside of markets and bringing them all into the market. And so I think that is the reason why these VCs are best viewed as parasites — really dangerous, destabilizing parasites: one, they’re making the host body weaker and weaker and weaker, but two, they’re also destabilizing it and trying to change its nature and behavior — trying to train people, trying to convince people, trying to introduce platforms and logics and structures that get people to act in ways that are more profitable. And I think that is the real threat, the real danger, the real concern with venture capital, and with privately driven and financed technological development.

PM: I love your approach of positioning them as parasites, especially when you describe what is happening to the host body. And of course, the host is the society that the rest of us live in and the economy that we rely on. But I think that the example that you give of Silicon Valley Bank also shows us something else that has been happening, in particular over the past number of years, where you have these venture capitalists who have been at the heart of the boom in the tech industry for the past couple of decades, if not longer than that. And they have created a self-conception of themselves as these really important people who are doing this really important job that is benefiting the rest of society, while they themselves are getting rich. And we need to hold them up on a pedestal because they are doing such important work.

And then when things like Silicon Valley Bank happen, and when you see how poor an understanding of the financial system, and of how this whole economy works, so many of them have — as we’ve been getting a lesson in over the past couple of years, not just with Silicon Valley Bank, but throughout the pandemic period and the economic issues that we had during that period — it shows us that these people who see themselves as these incredibly intelligent beings, these incredibly important people in the economy who are helping a bunch of people, actually serve a very different role. And when we start to push back on that, and when we start to state the reality of what they’re actually doing, then there’s not only this divide that opens up in their self-image, but they react really negatively to that. And that has serious consequences as well, where we see them shifting to the right, and things like that. I wonder what you make of that — the effect of them thinking of themselves one way, but actually acting in a way that’s very different than that.

EO: I think there are a few things that happen. One of the things that’s going to be hard to parse out — but maybe in 10 years we’ll be able to, or we’re already starting to get a good idea of it — is the role that commentary and criticism of these people, and honestly the lack thereof, for the first decade or two had in not just allowing them to act with little to no pushback, but in infecting the public with the same sort of ideas and ideology and framing and conception of technology: what its role should be, what kind of technologies we should pursue, how cities should look, how we should relate to each other, what sort of spaces we should share. These are all poisoned by the vision a lot of these venture capitalists and investors and founders have of a society that is digitally mediated, and surveilled, and legible, and deliciously profitable. And I think that the dissonance between how they talk and how they act has given a lot of room for some commentators to focus on how they talk, and be surprised about how they act, and get people to also share that surprise, when we really shouldn’t be surprised. If you do step back and think about it, of course the types of technologies that you would want in your daily life are going to be different from what a billionaire who’s looking to achieve a certain return would want, and the degree to which they converge a lot of the time is a function of how successful their propaganda has been.

Just take the recent example with Twitter — what kind of platform would you or I or almost any other person who uses this website want, versus what kind of platform does it make sense for the owner of Twitter, Elon Musk, and his yes-men and his sycophants to push onto people? Well, they would want the platform he’s wanted, which sits in some wider network that includes communication and payments and microtransactions — things that will juice up engagement, things that might result in the return of advertiser revenue, but also create another profit center independent of advertiser revenue. And why can we say or guess that? Well, because the other major social media network, Facebook, tried that and failed. They tried to create an alternative profit center to advertiser revenue, they tried to create a wider super app and ecosystem that integrated payments, and they failed, and retreated from that, and then tried again to go about it through the back door by introducing the metaverse, and that failed.

So it’s really transparent to see: okay, what kind of world do these people want? And how will they paint it up? They’ll talk about a world where we’re all connected, where you can have instantaneous and deep digital relations with one another, and whatever rhetoric they need to. But in reality, what they’re acting towards is a world that is much more depressing and alienating, and expensive and draining to be in. But there are a good deal of people who are really taken in by the rhetoric of the former, who will feign surprise when the scorpion stings them. And I think that as long as you keep in mind what these people’s interests are, and what desires they have — why they’re going to propose this technology instead of another technology — it is not that hard to see or to understand where our interests diverge, and how and where they’re going to deploy rhetoric that beautifies it.

PM: You’re talking about all the downsides here. If you think about it, if we let Elon Musk transform our technology and social media, then we get a lot more of the letter X. That’s gotta be a plus.

EO: My favorite letter, I think — and his too, apparently. It’s also really funny, this idea of the X rollout; it baffles my mind, because I feel like it’s gonna go worse than the attempted rebrands of Facebook into Meta or Google into Alphabet, or whatever the fuck, right? Because even his sycophants are like: Why would I call this website X when none of the other stuff exists? At least with Facebook, and with Google, they did this rebrand because they had other subsidiaries and operations, and they were just like: It would be easier for us, and a better PR move, and make sense if we just had this large umbrella corporation. Twitter’s just Twitter. They’re talking all this hot shit — as he does with Tesla, as he does with SpaceX — about what’s going to happen in a month, in a year, in five years, in 10 years. None of it is here. And it’s just a little frustrating, because you can already start to see people pivot and talk about why X is a great idea, why X is going to change the world, why X is going to revolutionize Twitter. The way to understand that is: in another world, if there were more competent capitalists at the helm of the ship, I’m sure they would try — and still fail — to create a super app. And for them, it would be a great innovation. But for us, it would suck. It would be miserable. And it would make worse an already depressing state of affairs for the digital ecosystem.

PM: I obviously think that’s the case.

EO: You’ve written about this a lot!

PM: I think Elon Musk is going to fail. But it’s also that when Google and Facebook rebranded, it wasn’t like: now we need to call the Facebook social media platform Meta.

EO: Right!

PM: We don’t need to call the Google search engine Alphabet. Those are still there. It’s just: we have this holding company, where our other things are going to be part of it as well, and it’s easier to distinguish between our main product and the company that holds all the rest of our products — whereas Elon Musk is just like: This company that you’ve known as Twitter for the past 17 years, or whatever, we’re just going to call it X now, and do a totally botched rebrand where we have planned absolutely nothing out.

EO: It’s amazing to watch him do all the things that these other companies have tried, and not learn from any of the failures that they had.

PM: It also makes it so much worse.

EO: [laughs] But somehow this is the man that’s going to get us to Mars and also give us a mind-meld with objects.

PM: Absolutely. We’ve been talking about these complete failures of billionaires, and we’ve been talking about the venture capitalists, but what is the actual impact of giving this power to these men — and in most cases they’re men, of course, though there are some women involved — these people who hold this immense influence because they hold the purse strings of where money is going to flow in the tech industry and in the wider economy? What effect does that have on the type of technology that gets developed, but also on the type of society that develops as a result of those investments?

EO: I think this is a really crucial question, because this is one of the biggest drivers of the type of technology we get. If you are a venture capitalist, or a group of venture capitalists, competing for and looking for places to park money that will give you excessive returns, you’re going to prioritize business models that can achieve monopoly scale. You’re going to prioritize business models that can rapidly advance the digitization, or the privatization onto digital platforms, of daily life. You’re going to advance surveillance platforms. You’re going to advance labor exploitation platforms. You’re going to advance schemes that involve regulatory arbitrage — things that will help you move fast, break things, take advantage of loopholes, scale up, use capital as a weapon, and integrate yourself into life as a parasite so that you can’t really get ripped out. I think that as a result, we end up getting the worst versions of things that we might want or need — things that address a real problem, but only in a very perverse sense.

Let’s take the gig economy. A lot of the app-based labor platforms fill a few holes in our current system in a very superficial and perverse way. There’s a huge amount of underemployment, and people are in need of work, and they have this car, they might have a home, that they think they can squeeze more worth out of to help them make ends meet. And then there’s also the fact that in a lot of our cities, we have food deserts, we have transportation networks that are falling apart or underserviced, we have shortages of housing, or we have huge hikes in rental costs. And so as a result, there’s this idea that maybe these things can be met with the private market. But the solution to all of these problems would look very different if we were interested in going outside of the marketplace instead of through it. Because what we’re doing by handing it over to private enterprise — and specifically to a sector of private enterprise that is so maniacally focused on excessive returns in the short term — is building out platforms that are as engaging as possible, that have reserve supplies of labor in the labor platforms, and of housing in the housing ones, to try and attract customers in the early stages, and then hike up the prices later on, degrading the quality of the whole thing that they scaled up and then used to displace the public variant.

And then we’re also encouraging in people this idea — one that infects the society at large — that the way to solve some of our social problems and our political problems is to introduce market logic. We don’t need political vehicles for changing our society, we need economic ones. If people in your community cannot get to where they need to go, you don’t need a bus, you need an on-demand ride-hailing service. And if they don’t have homes, you’re gonna figure out some way to integrate the market into that approach, and so on and so forth. They don’t have food? Well, we need on-demand grocery delivery services, instead of rethinking how we provision food in the country, or in the city, or at whatever scale you want to think of. And so we end up getting really monstrous versions instead of experimenting. Because you can imagine what a public ride-hail option would look like. I mean, we have taxis, which were one component of it, but not in and of themselves a perfect example.

But a public option would be very different. Because, one, it would ideally complement mass transit, and would also come with a massive expansion of mass transit and the various modes of transit you could use — whether it’s bikes, whether it’s skateboards, scooters, whatever it is, whatever makes sense. But also, the reason why these platforms are cheap is because they have so many drivers on the back end. And to keep so many drivers on the back end, you have to lure them into predatory conditions. So you increase the pay on the front end, and then you decrease it later on. And then you also have to have some way of managing churn. And you also have to have some way of pushing them to drive more and more and more. So you introduce quota systems, or you introduce promotions, or you introduce these algorithmic overseers to try to randomize earnings and keep people hooked, trying to make ends meet. And so you end up creating, for the private version, a really exploitative thing on the back end that sucks in workers who are underemployed and now traps them in debt, traps them in worsening working conditions and health conditions, just because they don’t have coverage, they don’t have adequate funds or means to take care of themselves — some of them are living in their cars, so on and so forth.

And you’re making worse one aspect of the social problem. And then you’re getting people to use these things more and more, starving public transit systems, encouraging people to try to create their own startups that are modeled after this thing. I think it ends up creating a vicious feedback loop, where people start thinking that the solution to society’s problems is these private enterprises that dissolve the social bonds between us and that trap us, or that provide a really glitzy, appealing consumer option — but on the back end is a worker who provides that service and is exploited, and the only reason you can get it that way is because of how deeply they’re exploited. This is the real cost of it. You have the short-term parasites. And these venture capitalists have introduced this idea that short-term greed, corrosion of the society, privatization of anything that’s public, atomization of people — these are the ways to solve the political and social and economic problems of our times. And, coincidentally, they give them even more money. And so we’re just giving accelerant to the arsonists. We’re giving wrenches to the saboteurs. We’re letting the people who have created these problems make even more money off of them.

PM: As you’re describing that, exactly what I’m thinking of is that the technology is like a Trojan horse. You have these VCs and these founders who point to their technologies and say: Look, we’re doing all these wonderful things, we’re going to solve all these problems in society by rolling out our shiny technology that’s going to do all these great things. And that technology is progress. And this is just how we make the world a better place. And then within that technology, within that Trojan horse that they are bringing through the gates, are all of these market forces, all of these forces of privatization, all of these efforts to ensure that workers are precarious and have much less power to be able to push back against these forces. But they were able to successfully do this because the marketing and the PR operation for technology and the tech industry has been so successful. And the media, in many cases, has been so ineffective at actually telling us what is going on here, and has just repeated the PR lines.

EO: How many journalists? Sam Harnett wrote “Words Matter,” a really amazing, incisive critique of tech media. Almost every single journalist, except labor journalists and outright tech critics, fawned over each iterative wave of the gig economy, even when it was very clear what was going on. It was only the labor journalists and the tech critics who were like: I’m not into this privatization and extension of the self that’s visible just below the surface, below the veneer here. It’s very obvious it’s exploiting people, it’s very obvious it’s breaking the law, it’s very obvious this will never ever scale up in any profitable way. It’s very obvious their plan is to integrate themselves so deeply into cities that they make deals, and that they will continue to suck from the public treasury, and they will continue to corrode labor conditions. And they will continue to corrode transit networks. And they will continue to commodify people and their interactions with one another, in an attempt to bring everything into a marketplace.

There was that period, 2021 I think, when Dara Khosrowshahi — the chief executive of Uber — talked about how he wanted it to be the operating system for your city. And of course, it was a ridiculous idea. But there’s a nugget in there that points to what these people are trying to get at, which is urban life. We’re not going to be able to provide a profitable Uber, but we can use Uber as a vector, as a platform — because the brand is established, because there’s a baseline level of use now — to get people to use it and offload more and more tasks and services onto it. We can do Uber Eats. They were going to do on-demand labor, so workplaces would be able to get contractors to work for them through the Uber app. We’re going to try to onload travel onto here, and do Uber Freight work with trucks. We’ll work with travel companies or agencies or airlines, and so on and so forth, so you can get train tickets and airline tickets on here. We are going to be your one-stop shop.

Not because you should have a one-stop shop, but because if you have the one-stop shop on here, we can do a bunch of things: we’ll be able to inflate the prices, get more of a profit, and you won’t be able to escape it — because, well, it’ll be expensive, and maybe you’ll try to look elsewhere, but we’ll have first-mover advantage. And then, by the time you look around, everyone else will have tried to adopt the same thing that we did because of our success. And that also is another problem here, because of how insular and networked a lot of the tech industry and the VCs behind them are: copycats abound. There are a lot of shared delusions, and so if one company tries to pursue monopoly in one way, another company — you can bet your ass — is going to try to pursue it in that same way, and so on and so forth.

And so, this vision of Uber as the OS for the city speaks both to the idea that one firm should be in control of most of what you do in the city, or be integrated into your daily life, but also to a concerning vision where financiers will be looking for companies to come in and corner various parts of the urban experience, or life in general, or life outside of cities. And there is no self-awareness of how horrifying that sort of vision is — or of how, if it were to happen, if we could believe them that it would happen, how much of a degeneration it would be politically and socially. Instead, the idea is: we’re going to optimize things, we’re going to make consumption more tidy, we’re going to optimize your daily life. And I think this is analogous to the super app obsession that we’re starting to see with X, Uber, and Twitter. It’s the desire and the realization that people are not as interested in the consumption that we think they are, but maybe if we force them to move their entire lives onto these platforms, they’ll consume the way we would like them to consume.

PM: Absolutely. And as you’re describing that, I’m thinking of not just the super apps, but when Google was trying to create the smart city in Toronto, and one of the reasons that people pushed back was because the expectation was that Sidewalk Labs, this Google division, was basically going to be in control of so much of the tech that was going to be necessary just to exist in urban life and city life. And so you see this time and again. And I think that also leads to another question that I wanted to ask you, which is about the role of government in this, in the relationship between tech and venture capital and government. Because I feel like there was a while where the suggestion was more that tech and venture capital are opposed to government. They are outside of government, they are operating separately from that. And the tech industry is trying to push back on the overstepping of government, like: we’re working for you — those digital libertarian narratives that people are very familiar with.

And of course, we can discuss whether that was ever really accurate, and whether they were really that disconnected from the state, as they like to suggest. And I think it’s fair to argue that they weren’t. But even now, I feel like we’re beginning to see in the past few years a shift in that narrative and that relationship. Where they wanted to present themselves as being separate in the past, now, as they face antitrust threats from the government, there was this shift, at a similar time, to seeing China as the big enemy — the one that the United States had to protect itself and its tech prowess against. And that has created an environment where it’s much better for these tech firms and these venture capitalists to be close to the government, to be able to say, on one hand, you need strong tech companies that aren’t going to get broken up, but on the other hand, there can be a lot of mutual work between us to ensure that the American tech industry is strong and the Chinese one isn’t. So I wonder what you make of that relationship and how it’s evolved.

EO: The development of tech capitalism and tech monopolies is intimately tied to the geopolitics of the age. It’s out of the Pentagon that Silicon Valley really gets its jumpstart. It’s in collaboration with it that it gets a lot of the key consumer products that were originally military applications. And it’s out of the Pentagon that it gets a really large and consistent customer that allows it to provide business-to-business services that are more lucrative — for investors, I’m sure, and for the companies — than the consumer-facing versions. I think that if we step back and look at it, you are exactly right in tracing that development. There is a shift from trying to put up this veneer of separation, so that they can withstand antitrust scrutiny and so that they can avoid being broken up, while all of these Leviathan heads are still in close collaboration — and then the pivot to arguing that we need monopolies to fight China, because China has monopolies.

But there is a lot of slipperiness going on there. All these tech capitalists will tell you that you need to look closely at China and look at how they’ve been able to leverage these monopolies to compete with US firms across the world. But they’re not going to tell you how the monopolies developed, why the monopolies developed, what mechanisms allow them to exist, and when the government will take them away. Because, as we saw most recently with Jack Ma and his attempts to make comments criticizing the CCP’s rules on not allowing private capital to flourish that much — they detained him and disappeared him for eight months, and then they started a systematic review and breakup of his financial tech empire, even though he was one of the wealthiest monopolists inside of China. But they won’t talk about the fact that the state still attacks the monopolies. Because they’re interested in doing a few things: they’re interested in preserving themselves, and they’re interested in justifying why they should get closer and closer ties with the government, and less and less regulatory oversight, and why we should steer away from imposing guardrails that might prevent them from making this or that product, or generating this or that profit center, this or that revenue stream.

But I think there is also, in some instances, a lack of understanding about why China has these monopolies. I think, for example, of the Great Firewall, and how China was able to leverage that as a way to keep out Western firms, develop local competitors, and then take them to the international market. China has spent a lot of time integrating itself into the development of telecommunication standards, and has become integral to ICT and telecommunications across the world. This is also part of the multi-front economic war that the US is waging as it tries to purge Chinese firms and providers from the infrastructure of the United States and its allies. China has spent a lot of time, or attempted to spend a lot of time, creating its own supply chain for the electronics and materials at the frontier of information technology and advanced electronics — one that would, in theory, be durable against economic war with the United States, and durable against production shocks or supply shocks as well.

So there have been attempts to build out these firms, because they have been building them out in competition with the West, because they’ve been trying to build out their own alternative and durable supply chains, because they’ve been interested in crowding out or preventing any foreign capital from really coming in and being able to grow and displace their own firms — in a way that the US has not been. The US has mainly been concerned with, or mainly been oriented towards, being a vehicle for these firms. And I think that has led to — and this is not to say China’s monopolies are good, because they’re not; they are as problematic as ours. If you look at, for example, Meituan and the labor conditions that workers have to deal with on the delivery apps, they’re comparable to, if not worse than, what Uber and Lyft drivers have to deal with in this country.

In China, there’s a little bit more planning, decision, and intention about which firms they are going to allow, what lines and boundaries they are going to impose on them, and when they are going to pull the rug out from under them. So the fearmongering is an attempt to say: look at China because of the threat that it poses; do not look at China in terms of scrutinizing or understanding the political economy of the monopolies there. Because if you do that, then you might come to the realization that they tolerate some monopolies, but they also don’t tolerate others. And they have spent time building up the institutions to crush certain monopolies — or to crush monopolies if they think they’re going to pose a systemic threat, or a threat to their political power. And we have those problems here. But I do think this is a very clear attempt by some sectors to prevent scrutiny on that, and then by others — who don’t know that that’s what’s going on — to instead just fall back on the fearmongering because they are concerned.

PM: I think you definitely see a lot of that. And I think we need to recognize the way that this geopolitical rivalry is being used to benefit the tech industry. But I think that when we talk about that relationship between government and the tech industry and venture capital, that also gives us an in to talk about the most recent wave of investment in the tech industry. Because AI has served not just, I think, as an important development where we see the tech industry very closely, or very immediately, going to government to try to shape any potential regulation that might come on this new field.

But you also have a lot of influential people in the tech industry immediately going to government and saying: Hey, this is how AI could work in military applications; this is why we need to be developing AI so that the United States has it, and we can’t let China beat us on it. But it also serves this important role where Silicon Valley was in a difficult place as interest rates were rising and its other hype vehicles went bust. It needed something else, and then AI reemerged to provide a new cycle. So what do you make of what we’ve been seeing with AI over the past year, and how the venture capitalists and the tech industry so quickly went all in on this new type of technology as ChatGPT and these other tools gained a lot of attention?

EO: Well, because it’s bullshit. There are a lot of unfinished edges in the great work of privatizing everything: you used to have this pesky thing where labor laws are present. You used to have this pesky thing where copyright and IP is not as tight as you might like it to be. The human element imposes a lot of limits on the amount of returns that can be sought reliably, and also on decision-making power — maybe we can’t do the sort of things we might want to, because laborers will rise up or sabotage work, and we can’t make the sort of things we want to, because artists might raise some concerns, because it’s their work and we’re drawing on their work. I feel like a huge driver in the AI hype cycle — conscious or not — is this desire to throw off some of the shackles that are sitting between where we are now and the full privatization and the full immiseration of culture and labor, respectively. I think that there’s a huge opportunity here — which is to say, a huge opportunity not to automate away labor or cultural production because the AI is intelligent enough to do that, but because they’re interested in restructuring things such that, when there is human labor, it’s doing the task mainly of looking over what is supposed to replace all the human labor: making worse or shoddier cultural forms and products, making much messier, error-ridden generative creative works that human laborers and invisible workers will have to correct and moderate.

And so the goal here is to automate away labor — not in the way that a lot of the fears frame it, where it’s more productive than us, where it does a better job of doing something better and more efficiently — but because they are so self-interested and short-term oriented that they think reorienting labor around a core of this generative product, with a crust of supervisory human laborers, is equivalent to a core of human workers with some ancillary artificial intelligence, some algorithmic mediation. So on that count, it’s to continue the degradation of those things and the working conditions. And it’s also to consolidate decision-making power — to the degree that if you can remove as many laborers as possible, you remove as many steps as possible, as many people reviewing the work, complaining about the work, raising concerns about the work and its ethical obligations and dimensions, and their ability to get in the way of you just generating the returns that you want. And I think also, because — similar to the gig economy, similar to a lot of the iterative waves of tech hype that have happened, where everything is a tech company, everything is a bank — because this is a sort of frontier, early-days thing, you can lie. You can just lie about what your thing can do, or might be able to do in a few years.

PM: What, Ed, you’re suggesting that tech founders would lie to us?

EO: [laughs] Never! So, a lot of the lies — if we were to sit down and rattle off what every single tech company that exists today, that’s prominent today, said it was interested in doing throughout the years, how many of them would have told the truth? How many probably just lied last year about the rollout of some product, or the intention they had behind supplying some service, or who would work on it, or what the conditions would be like? Every single thing out of these people’s mouths is a lie. Almost every single thing, minus the stuff they can immediately get in trouble for. And I think that AI offers a huge opportunity, if you’re an investor, to get in with your friends, lie about what they’re building, lie about what they’re doing, make a lot of money off of it. And when it fails, say: Well, this stuff is really complicated, man. It turns out we don’t know how intelligence works. It turns out we don’t know how to replicate the human mind. It turns out that we can’t do any of the things that we were bullshitting about — we can’t make an AGI, we can’t make an image generation bot that makes a normal-looking human being, or a human with a normal amount of fingers.

We can’t do any of these things that you might have been interested in and valued us at $2, $3, $4, $5 billion for, but we will be able to pull the money out. And that’s the more important thing. And I think this is the other side of it — by virtue of investing in things, in the act of trying to make money, they sustain the hype cycle and keep the dream alive. I think we should be viewing investments by Andreessen Horowitz or by Sequoia into AI firms that are clearly bullshit as not just an attempt to make money off of the bullshit, but to keep the dream alive. Because other people will take that signal and try to either get in on that company, or get in on a company that’s doing the same thing, over and over and over and over again, regardless of whether there are advancements that justify it, regardless of whether it’s actually technically feasible, regardless of whether the people at the firm have the expertise or the capability to do it, regardless of whether any of this is anything more than vaporware. The act of investing generates hype to sustain the frenzy until some large deflationary event happens.

PM: I just want to echo what you’re saying. I don’t think that these AI companies are going to replace workers; I think the goal there is to deskill and then reduce the power of workers yet again, because this is what they do. And if anyone has proven that you can lie your way to the top and keep getting away with it, it’s Elon Musk, the richest man in the world. And you talked about what these venture capitalists are doing, and Andreessen Horowitz in particular, Marc Andreessen. I feel like one of the things that I think about a lot when I think about these discussions, and what Silicon Valley is doing, is, on one hand, I think back to Marc Andreessen’s “It’s Time to Build” essay from early in the pandemic.

EO: Oh, god, yeah.

PM: But he’s very much championing Silicon Valley and saying that we need to be exerting our power on society to a much greater degree to shape how it works. And we’ve already seen the impacts of how that is going. But now, in the AI push, we see people like Sam Altman saying: AI is going to do all these fabulous things, but it also puts us at risk of an AGI which could destroy humanity. But then you have someone like Marc Andreessen, who isn’t echoing that second part, who is saying: AI is going to be great, and the AGI threat is not coming; everything’s going to be wonderful, and we’re all going to have these AI assistants, and it’s going to make everyone’s life so much better. Because you can see the incentive that he has in making us think about AI in a particular way, I wonder how you think about those different narratives that are being deployed by different people, because they have different incentives to do so.

EO: I think that “It’s Time to Build” is a really great starting point here, because it’s one I like to make fun of, but also, if you step back, it is a call to arms, in that venture capitalists do understand — at least the smart ones, the ones who are a bit more self-aware, and I think Marc Andreessen, as much as I make fun of everything he writes and says, is one of the more self-aware and intelligent ones — that it’s not sufficient to just park the money in places. You have to be generating the sort of bullshit self-rationalizing narrative, you have to be generating the ideology that these people are going to rally around, something that’s a little bit more than accumulating money. And so part of him saying it’s time to build, I think, can also be understood as: we do have to build the edifice and the infrastructure for people to come and join us, and go against their own self-interest, or go against their own doubts and concerns and hesitations, and enrich us, and redesign society as we see fit. Because their investments are also attempts to change the way in which politics is done in one way or another, or economics occurs, or social relations are mediated. They are engaged in a project of trying to revolutionize or transform society.

And when we get to the AI question, and the competing interests that these people have: on the one hand, we have people like Sam Altman who, like you pointed out, are raising the alarm about extinction from AI — and doing so, I think, as Brian Merchant at the LA Times points out, as a marketing strategy. Because on the one hand, you’re saying AI is so dangerous, we need a pause. And on the other hand, you’re saying: but also buy my mixtape, please come out and support me, and I will save us all from it. Or: let me make the rules. Let me write the rules. Or you can write the rules, so long as I’m in the room with you, holding the pen and the paper, and we talk about it together. Those are the options that these people are presenting. And we saw with the leak of the EU rules that had been watered down, thanks to lobbying from OpenAI, that they’ve been able to successfully do this, that the marketing strategy is working. And we also then have people like Marc Andreessen, who — whether or not he believes the swill that he’s spewing — talks about AI as being in a position to provide eternal love to us. You get your own personal Jiminy Cricket that’s advising you on every single decision that you make, that’s optimizing your learning capabilities, that’s helping you navigate tough social situations and be the best person that you can possibly be — which all sounds nice and dandy.

And I’m sure it’s also calibrated towards an audience of people who may feel like they’re struggling with those things. People may feel alienated; people may feel awkward. People may feel like: I don’t know how to do any of this. Wouldn’t it be great if I had an angel on my shoulder that helped me do all of this? But then also, you read that essay, which is titled “AI Will Save the World,” and on the back end he immediately starts talking about China — a strange thing to pivot to if you’re going to say AI is going to save the world. But not the Chinese version: Chinese AI is actually the devil, and it’s going to destroy the world. And he starts talking about how in China, they use AI for surveillance and social control, and in the United States, we’re going to use it for love and actualization. But it’s actually the case that both places use it for surveillance and social control. And not just both places — almost everyone in the world is using these technologies to figure out how to narrow the range of possibilities for human activity, because the people in control of designing these things are interested in narrowing the range of human activity, because they’re trying to figure out ways to make things that are more profitable, and to instill or lock in political, social, and economic outcomes that ensure they’re at the top.

And so when we’re looking at the AI thing, and the hype cycle, and the way in which these people talk about it: Andreessen comes out of this circle, this group, this lineage of people who, I suspect, are so deeply interested in this project of transforming society because they have, on some level, a disgust with the very political forms, the social forms, the economic forms — but also with humans themselves, and the fact that we have all these inherent limitations on us that we need to transcend. We have limited lifespans, we have limited abilities to think or to compute — because they think we’re all spiritual machines in one way or another. These are the singularitarians who think that human beings are 1.0, the version one-point-oh of some thing that’s going to transcend the limits of humanity, that’s going to merge with the AI, merge the biological with the computational, and create something that will be spectacular — and that’s what we’re fighting for. And along the way, we can make a lot of money. In fact, we should make money along the way, because that’s the only way this is going to happen.

But also that the world we live in, the bodies we occupy, the politics we engage in, the economy that we all are a part of, the social relations that we have, are miserable or lesser than what they could be. And we are fighting for that future where everything is transcended and better, but also where we are deciding the form of it, and we are sitting at the top because of all the money that we made and the decisions that we made. So, to sum up that rant: I think the way to look at the AI thing is that you have the marketing strategy, you have this sort of plot, a conspiracy almost, to try to get people to think about AI in a certain way, and to invest in AI in a certain way, and to believe that it will come out a certain way — but also this desire both to make money in the short term and, for the first time, to think about the long term, but only in terms of: at the end of the day, it will all be worth it, because we will transcend this mortal coil, so long as we take over everything one way or another, or proliferate our cadres, or our ideas, across the population forever.

PM: I really like how you described that. And I think it does show us the dangerous ideas that undergird a lot of this AI thinking — ideas that we need to be challenging, and that we need to be critical of and not just fall for, even though the idea of having a nice little Jiminy Cricket on my shoulder sounds great. I’d be down for that.

EO: Right! These are people who — if we speak plainly about it — the ideologies they pull from, the people that they work with, their intellectual networks, are full of eugenics, full of racism, full of sexism, full of bigotry. These people would create a global apartheid system that would probably be highly rigid and hierarchical, delineated along lines of race and class and gender, and that would be regulated with violence. Because these are also the same people invested in ensuring that we have much more lethal drones and smarter systems to regulate them, and that we have hardier weaponry for police departments and for military forces. These are people who would turn a world that is already violent and tightly organized along these discriminatory lines into something even uglier, and potentially do it permanently, if they gained enough power and global sway with these ideas.

And so that’s what we are up against: people who would, if they could, institute a permanent sort of caste system along those different lines. And I think that makes them especially dangerous. It takes a while to build up to that conclusion, that realization. But it has been there from the beginning, from the earliest days of Silicon Valley, from the earliest days of the thoughts and the influences they have. And it’s there more presently today, in some of the most prominent figures, like Peter Thiel and his intellectual network and cadre. These are people who want to transform the world for the worse, because they will be better off in it.

PM: Definitely. And one of the things I was thinking about as you were describing that was Marc Andreessen, in that essay, saying we should develop AI for war, and that will make war less deadly. And it’s like: man, what reality are you living in, if you really believe that?

EO: Precision bombs are actually very accurate, then. They never kill anybody else.

PM: Exactly. And it’s always fantastic to have you on the show to get your insights on these topics, because you’re so knowledgeable about all of this. And I would just say, to close off our interview: obviously, I became familiar with your work when you were working at Motherboard, where you were doing fantastic critical journalism on the tech industry that I think was so necessary and so informative for so many people, and that I love sharing with people. And now you’ve moved on from Motherboard, but you’re writing these fantastic critical pieces for The Nation and Slate and these other places, and I’m just so excited to see the work that you do next, because I’m always thrilled to be reading it. So thanks so much for coming on the show. Always love chatting with you.

EO: Oh thank you so much, Paris! The feeling is so mutual — I’ve loved reading your work over the years, loved your book. Very excited to see what you do next. Also, I always love listening to the podcast. I’ve been a fan for so long.

PM: Thank you so much, man.
