Recently, a friend—a dear, dear friend who perhaps forgot what it's like to ask me questions regarding things I'm passionate about—asked me, "What's Web 3.0?"

“What a fantastic question,” I replied.

I was excited to answer. Not just because I've waited my whole life for someone to earnestly ask me about something I've put so much thought into. But because, considering recent national events, it was *a fantastic question,* loaded with all the important considerations of the time, if you ask me. (And hey, someone did ask me!)

Anyway. I didn't just have an answer, but an essay; one of many essays rotting in my drafts that I wrote over a few consecutive mornings and then promptly forgot about. I figure now is as opportune a moment as any to present to you my thoughts on the future of the Internet and Web 3.0. Now's the time to share the term I shout at my screen whenever I read about Mark Zuckerberg building an AI supercomputer, or Microsoft shoving LLM chatbots into Office 365. It's a term I use often, and pejoratively, as I think it will ultimately come to define Web 3.0, and I honestly hate that for us. I call it “The Computational Web.”

Every decade or so, the Internet endures a transformation. In retrospect, there is always some engulfing new technology, mixed with human behavior, that defines the era. In the 1990s, it was the rise of the consumer web—the “Information Highway” and digital malls—followed by the e-commerce boom and the subsequent tech bubble burst. The turn of the century brought us the “read/write web,” the rise of blogging, and eventually, the digital platform era. (Facebook, Google, Amazon, etc.)

The Computational Web is defined by the increasingly massive amounts of computational power required to run the modern Internet and the small group of firms that can meet those demands. Our homes, cars, businesses, even our sunglasses, increasingly rely on larger gulps of compute to run our daily lives. Meanwhile, big tech firms are using the sheer expense of building data centers and stockpiling GPUs as a barrier to entry for their competitors. In the future, our stack of monthly bills could include cryptic egress charges and computing overage fees that rival the anti-consumer practices of the most egregious telecommunications monopolies in American history. And if we don't take off those stupid fucking Meta Ray-Ban sunglasses, the Computational Web will swallow us whole.

As I said, every decade or so the web endures a transformation. We are technically on the fourth or, perhaps, fifth major iteration, though colloquially, we're still in Web 2.0 and approaching Web 3.0. (It's a mess; I'll touch on it in a moment.) Before we delve into speculation, let's take a look at past iterations of the World Wide Web.

The "proto-web" of the late 1970s and 1980s predates the invention of the World Wide Web and is defined by the many bulletin board communities populated by academics and tech nerds. The most notable of these communities were Usenet and The WELL, the latter of which is still active. Howard Rheingold, an early and influential member of The WELL, coined the term "virtual community" in his book *The Virtual Community: Homesteading on the Electronic Frontier* to describe the social nature of the proto-web.

The proto-web was decidedly non-commercial. Attitudes on the early internet leaned toward protecting the social benefits of the technology, and many internet citizens strongly discouraged profit-seeking behavior. The first "spam" email, advertising computers for sale, was met with outrage.
The second spam email didn't happen until years later.

While often used interchangeably, the "Internet" and the "World Wide Web" are two distinct concepts. Let's define them before we go any further. The Internet is a network of computers around the world, and a suite of languages called protocols that allow those computers to talk to each other. The World Wide Web, or simply "web," is how humans locate, view, and organize information on computers connected to the Internet. HTML, hyperlinks, webpages, Chrome—that's all web stuff. When my computer asks your computer for the photos from your Italy trip, that's the Internet. The browser I use to pretend to look at those photos, that's the web.
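If you want to see the seam between the two for yourself, here's a minimal Python sketch (assuming any reachable public web server; example.com is just a stand-in). The socket connection is the Internet part; the HTTP request and the HTML that comes back are the web part.

```python
import socket

HOST = "example.com"  # hypothetical peer; any public web server works

# Internet layer: two computers agreeing to talk, raw bytes over TCP.
conn = socket.create_connection((HOST, 80))

# Web layer: an HTTP request asking for an HTML page by its address.
request = f"GET / HTTP/1.1\r\nHost: {HOST}\r\nConnection: close\r\n\r\n"
conn.sendall(request.encode())

# Everything that comes back -- status line, headers, HTML -- is "web stuff."
response = b""
while chunk := conn.recv(4096):
    response += chunk
conn.close()

print(response.decode(errors="replace")[:300])  # peek at the start of the page
```

Same wire, two different jobs: the connection moves the bytes, and the web is what gives those bytes meaning.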
If the proto-web was a digital commune deep in the internet wilderness, Web 1.0 was the condos and strip malls under construction across the way. In the late 1980s, a small software company named Quantum Computer Services wanted to bring Internet-connected services to the masses. Q-Link, the company's flagship online service, provided users with an easy way to access electronic mail, chat rooms, games, and online news. By the early 1990s, Quantum Computer Services had rebranded to America Online; it went on to grow to five million users accessing online services through its proprietary portal. There's a great write-up on AOL history that I'll add to my link blog. It's called "AOL Pretends to be the Internet."

Concurrent with the rise of AOL, the World Wide Web debuted as a more open, non-commercial method for consuming and organizing information on the Internet. The World Wide Web gained enough momentum that AOL was later forced to ditch its walled garden of content and services and include a WWW-accessible web browser.

## The Information Highway

For a brief moment, it looked like the Internet would maintain its non-commercial spirit. However, in 1996, under pressure from the American telecommunications oligopoly, the Clinton Administration signed the Telecommunications Act into law, effectively transferring the public internet infrastructure to corporations. Dubbed the "Information Highway" by Al Gore, the new for-profit web had arrived, and with it, a slew of digital malls.

Fun fact: Al Gore never said he invented the Internet. He was misquoted. Sort of. In 1999 on CNN, when speaking about his accomplishments as a Senator, Gore mentioned a bill he authored that allocated public funds to build out our internet infrastructure back in the early '90s. What he actually said was, "I took the initiative in creating the Internet," which is technically correct. But who gives a shit... Because after spending hundreds of millions of our tax dollars on building the public Internet, Gore sold us out and gave the infrastructure away to the Fortune 500. Never forget: The Telecommunications Act of 1996 is what gave us Comcast. Anyway.

## Web 2.0: The Rise of the Techno-Oligarch

We can easily split Web 2.0 into two, possibly three, distinct eras. The first marked the rise of interactive websites, which gave way to personal blogging and digital storefronts in the early aughts. Then, in the 2010s, came the rise of digital platforms and ad-based business models. Facebook, Google, and Amazon all defied the e-commerce bubble that marked the turn of the century and went on to become trillion-dollar companies. Each is now a monopoly of its respective sector.

### Surveillance Capitalism

Finally, there's the Web 2.0 era of the Dead Internet and Enshittification, which lasted until around 2020. Rage bait, user-hostile practices, and surveillance capitalism are hallmarks of late-stage Web 2.0. No one is happy except a small group of elite technocrats who own the Internet and the politicians who pad their stock portfolios.

Before we continue, it's worth mentioning that web eras are similar to how we define generations of people (Gen X, Millennials, Gen Z, etc.). The macro may have consensus, but the details are often messy and disputed. No central authority can decide what Web 2.0 is and what Web 3.0 can be. That decision, especially for the future of the web, is largely up to us, even if we don't act like it.

## Web3 & Decentralization

From roughly the 2010s up through today, many people dialed into the tech world have believed blockchain, NFTs, and cryptocurrency would lead us into Web 3.0. Evangelists for these technologies have gone so far as to rebrand the iteration to "Web3." Decentralization, a major tenet of Web3, promises to topple power structures held by techno-oligarchs and too-big-to-fail banking institutions. Whether that promise holds any water is a topic for another debate. But I think it's safe to say, for the time being, that the Web3 bundle (maybe with the exception of cryptocurrency) isn't the transformational technology we were promised. Though, the night is still young.

## Artificial Intelligence

Love it or hate it, AI is a different kind of hype than its Web3 predecessor. Big tech companies have actually put their money where their mouths are, burning through cash building large data centers and buying expensive Nvidia chips. Meta alone has spent $30 billion on specialized GPUs to feed its AI models and tens of billions more on new data centers across the country. This is more than hype.

> "Computational power, or compute, is a core dependency in building large-scale AI." - AI Now, "Computational power and AI" (see link blog)

## The Computational Power Moat

But AI itself isn't the defining trait of Web 3.0. It's the computational power required to build large-scale AI (or to run blockchains, or to mine Bitcoin). Compute is expensive and increasingly difficult to scale. These hurdles make it accessible only to the largest tech firms in the world. So, shoehorning AI features into all of our apps isn't just tech bros chasing their tails. It's setting the expectation that all consumer technology requires resource-hungry AI. If all technology requires AI, and only a handful of companies are equipped to handle the computational power that AI requires, then computation becomes a moat too deep for competition to cross.

Who owns the Internet? In 1996, the US government handed over the internet's public infrastructure to corporations. Yet the mythology of an ownerless internet persists today. Google, Amazon, Microsoft, and Meta (GAMM) now own most of the steel and glass that makes the internet go vroom. Google, Amazon, and Microsoft control seventy-five percent of the cloud computing market. Meta and Google own half of the fiber optic cables supplying internet services across continents.

## GAMM—Google, Amazon, Microsoft & Meta

Most of our favorite productivity apps, retail websites, and social media platforms are beholden to proprietary infrastructure controlled by these four corporations. They own the most heavily trafficked server networks, all the GPUs, and gigawatts, and whatever. They call it the cloud, but really, that's just the internet.

## Fifty-dollar solutions for fifty-cent problems

Web 3.0 probably won't involve the blockchain or NFTs in any meaningful way.
We all may or may not one day join the Metaverse and wear clunky goggles on our faces for the rest of our lives. But none of that really matters. We keep waiting for the next iteration of the web, or the internet, but the future is now. We're living it at this very moment. It snuck through the back door when no one was looking. It's consumption. It's monopolistic control. It's compute-hungry magic tricks thrown at the wall, hoping something sticks. The next iteration of the web by way of the internet is just one long infomercial of fifty-dollar solutions to fifty-cent problems.

Today, computing power doesn't feel like a big part of our lives. But don't be fooled. The internet, too, once felt small and obscure until AOL made it indispensable. Before the iPhone, our cellphone carriers charged us by the minute. Today, our phone bills are measured in data. Big tech is burning through cash, building next-gen infrastructure. It's operating its most compute-hungry, consumer-facing features at a loss in hopes of attracting the masses. In short, GAMM is selling printers at a loss so that later it can fuck us on the price of ink.

This is the Computational Web: a modern Internet that demands ever-larger gulps of compute, owned by the small group of firms that can meet those demands, and a future stack of monthly bills padded with cryptic egress charges and computing overage fees.

## Communities, Not Markets

Ultimately, for now at least, we are the ones who decide what Web 3.0 will become. But first, we have to understand the threat. Then, perhaps, we can revisit a time in Internet history when communities governed markets, and not the other way around.