For two grand I can buy a laptop with unfathomable levels of computational power. Stock, that laptop comes with silicon made of transistors so impossibly tiny, it operates under [different laws of physics](https://youtu.be/Jh9pFp1oM7E). For just a few hundred dollars more, I get a hard drive that stores more documents than I could write typing 24 hours a day for the rest of my life. It fits in my pocket. If I'm worried about losing those documents, or if I want them synced across all my devices, I pay Apple for iCloud. Begrudgingly.
If I were on a budget, any two-hundred-dollar laptop and twenty-dollar USB drive could handle the computational load required for the drivel I pour daily into plain text files ([shout](https://sive.rs/plaintext) / [out](https://www.williamhern.com/living-in-a-single-text-file.html) / [to](http://www.textfiles.com/100/) / [.txt](https://mnmlist.com/a-case-for-storing-all-your-info-in-text-files/)).
Yet the average note-taking app [charges ten bucks per month](https://fromjason.xyz/p/notebook/i-guess-i-ll-just-pay-til-i-die-why-i-m-switching-from-ulysses-to-ia-writer/) in perpetuity. It stores my writings in proprietary file formats that lock me into the app. In exchange, I get access to compute located 300 miles away, storage I don't need, and sync-and-share capabilities that I already pay for. Now I can also expect a 20% hike on all my subscriptions for beta-level AI solutions desperately searching for a problem to solve.
Welcome to the **Computational Web**.
I define the “Computational Web” as the increasingly gargantuan levels of computational power (compute) required to run the modern Internet, controlled by a small group of firms uniquely positioned to meet those demands.
The Computational Web is the commodification of computational power. The Computational Web marks the achievement of absolute control over the modern technology stack. The Computational Web signals a future where all of our personal computers devolve into mere portals to the cloud. These devices are sleek, thin and inexpensive. These devices are incapable of answering "how many 'R's are in the word st**r**awbe**rr**y” without an internet connection (and maybe not even then).
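For the record, the strawberry question is a one-liner on any local machine, no internet connection required (a trivial sketch in Python, offered only to underline the point):

```python
# Counting letters is a deterministic local computation.
# No cloud round trip, no GPU cluster, no subscription needed.
word = "strawberry"
count = word.count("r")
print(count)  # 3
```

That's the whole job: a task a two-hundred-dollar laptop completes in microseconds, which a cloud-tethered chatbot has famously fumbled.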
The cloud used to be the place where we stored our files as backups and kept our devices synchronized.
But increasingly, we are seeing the [cloud take over everything](https://www.tiktok.com/@jasonkpargin/video/7591592052954025246) our computers are capable of doing. Tasks once handled by our MacBooks' CPUs and GPUs are being sent to an edge server to finish. No product on the market facilitates this process better than cloud-based AI solutions.
Aside: *(I think) Sam Altman fundamentally misunderstood the role his product plays in the Computational Web. Now [he's scrambling](https://www.wsj.com/tech/ai/openai-sam-altman-asia-middle-east-7b660809), begging companies with infrastructure for [more](https://www.tiktok.com/@sonypicsathome.uk/video/7445302185706982688) compute.*
*Artificial intelligence, manifested into chatbots and agents, isn't the product. The product is the trillion-dollar data center kingdoms required to power those bots. ChatGPT might be OpenAI's Ford F-150, but data centers are Microsoft's gasoline. Without Microsoft's infrastructure, ChatGPT is a $500 billion paperweight. I don't know when Sam Altman realized AI is just a means to sell retail compute to the masses. Probably just before the ink dried on the pair's partnership agreement.*
Compute is expensive and difficult to scale. AI is the most compute-hungry consumer technology in the history of the web. So, shoehorning AI features into our apps isn't just tech bros chasing their tails. It's setting the expectation that all consumer technology requires AI. If all technology requires AI, and only a handful of companies are equipped to handle the computational load that AI requires, then compute itself becomes a moat too deep for competitors to enter and for consumers to escape.
Compute is a scarce resource, turning the tech industry into a cloud oligopoly *(Google, Amazon, Microsoft, and Meta (GAMM))*. Our devices—laptops, desktops, phones—have grown dependent on the cloud, not just for storage, but to complete the types of tasks that our devices are largely capable of handling themselves.
Our dependency on cloud computing has made local computing largely redundant. Something has got to go. Can you guess which it'll be?
---
An interesting side effect of late-stage capitalism is the gained ability to forecast business strategies. Some in the blogosphere loathe this type of navel gazing, but I find it fun. Because all you must do to extrapolate big tech's strategies is realize that morality, ethics, and sometimes even the law, are not considerations when one develops a “corner the market” business plan. It's Murphy's Law but for big tech. If it's possible and profitable; if it causes dependency and monopolies, then that's the plan. [Competition is for losers](https://www.wsj.com/articles/peter-thiel-competition-is-for-losers-1410535536), after all.
My silly 2026 prediction is we'll see a mainstream politician and/or tech elite call for outlawing local compute. This is big tech's end goal—make AI critical infrastructure to run all of our apps, then work towards a cloud-tethered world where local compute is a thing of the past.
All it would take is a picture of a brown man next to a group of daisy-chained Mac Minis, and the headline *AI-assisted Attack*.