As the 1990s wore on, however, the dream of decentralization frayed. During what would later be dubbed the Web 1.0 era, the typical internet user, though theoretically empowered to create web pages, was in practice doing little more than viewing those made by others. And as a mature economy developed around the internet, powerful companies began to centralize on top of its open protocols—like Microsoft using its operating system monopoly to take over the browser market with Internet Explorer. Then came the dotcom crash, which called into question whether the internet would ever fulfill its potential.
Hope reemerged in the mid-2000s, when new platforms and technologies allowed ordinary users to create and upload content that could reach thousands or even millions of people. If Web 1.0 saw the masses passively consuming media created by publishers, in Web 2.0, the masses would be the creators: Wikipedia entries, Amazon product reviews, blog posts, YouTube videos, crowdfunding campaigns. Time captured the spirit of the moment with its 2006 Person of the Year selection: “You.”
But something very different was happening beneath the surface. User-generated content was free labor, and the platforms were the bosses. The big winners slurped up user data and used it, along with old-fashioned mergers and acquisitions, to build competitive moats around their businesses. Today, one company, Meta, owns three of the four largest social apps in the world, in terms of users. The fourth, YouTube, is owned by Google, which also accounts for around 90 percent of all internet searches. As these companies conquered more and more of the web, it became clear that the user was less a creative partner than a source of raw material to be perpetually harvested. Escape is difficult. Meta controls access to your Facebook and Instagram photos, plus your friend lists. Want to ditch Twitter or find a streaming alternative to YouTube? You can’t take your followers with you. And if a platform chooses to suspend or cancel your account, you have little recourse.
In hindsight, there’s no shortage of explanations for why Web 2.0 failed to deliver on its early promise. Network effects. The unforeseen power of big data. Corporate greed. None of these have gone away. So why should we expect anything new from Web3? For believers, the answer is simple: Blockchain is different.
Gavin Wood, an English computer scientist who helped program Ethereum, coined the term Web3 in 2014, the year before Ethereum launched. (He first called it Web 3.0, but the decimal thing has since become passé.) In his view, Web 2.0’s fatal flaw was trust. Everyone had to trust the biggest platforms not to abuse their power as they grew. Few seemed to notice that Google’s famous early motto, “Don’t be evil,” implied that being evil was an option. To Wood, Web3 is about building systems that don’t rely on trusting people, corporations, or governments to make moral choices, but that instead render evil choices impossible. Blockchain is the crucial technology for making that happen. Brewster Kahle, the creator of the Internet Archive and the Wayback Machine, has described this goal as “locking the web open.” Or, as Chris Dixon, a general partner at Andreessen Horowitz’s crypto fund and a leading Web3 booster, puts it, “Can’t be evil > don’t be evil.”