The World Wide Web has completely changed how we find and consume information. When I started working on websites in 1995, I was convinced (like almost everyone at the time) that this new communication medium would be an unquestioned force for good. The logic felt airtight: if knowledge is power, then making knowledge easier to find must inevitably make humanity better.

Thirty years later, it’s clear that assumption deserves a second look.

The early web was built on a deceptively simple idea: hypertext. Pages linked to other pages. Small ideas pointed to bigger ones. You might be reading a research paper and stumble across a term you didn’t understand, and (miraculously) there was a link. You could follow your curiosity instantly, at your own pace, or ignore it entirely. The author didn’t decide your journey. You did.

That might sound obvious now, but it was revolutionary at the time. Before the late 1990s, your word processor didn’t have a little chain icon. Links didn’t exist in the documents most people read and wrote. There were footnotes, but those lived mostly in academic journals. Hypertext blurred the line between finding information and consuming it. It let curiosity drive.

Before hypertext, information lived behind deliberate barriers. Libraries had card catalogs. You searched first, then you read. Hunting and consuming were separate mental modes. Magazines were the closest thing to a bridge—designed to be skimmed, browsed, and explored. Pull quotes, sidebars, “continued on page 24.” You could get the gist in minutes or sink into a long feature if something grabbed you.

Television, by contrast, was famously passive. In my house in the 1980s, it was called “the boob tube,” and not affectionately. Five channels. No cable. Advertisers interrupting whatever you were watching to tell you what you should want. The lack of agency was the point, and it made TV feel fundamentally untrustworthy.

The early web felt like the opposite of that.

If you could wrangle HTML and an FTP client, you could publish a page. If other people linked to it, search engines like AltaVista, Lycos, and eventually Google might notice. Google’s breakthrough was elegant: pages mattered more if other pages linked to them. We were collectively weaving a network of knowledge.
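For the technically curious, here’s a minimal sketch of that idea. It’s a toy, simplified PageRank-style loop over a made-up three-page web, not Google’s actual algorithm, but it captures the core intuition: a page’s score is fed by the scores of the pages that link to it.

```python
# Toy, simplified PageRank-style scoring: pages earn importance from
# the pages that link to them. Illustration only, not Google's
# production algorithm.

def rank_pages(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = set(links) | {p for targets in links.values() for p in targets}
    rank = {p: 1.0 / len(pages) for p in pages}

    for _ in range(iterations):
        # Every page keeps a small baseline score...
        new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
        # ...and passes the rest of its score to the pages it links to.
        for page, targets in links.items():
            if not targets:
                continue
            share = damping * rank[page] / len(targets)
            for target in targets:
                new_rank[target] += share
        rank = new_rank
    return rank

# A hypothetical three-page web of hobby sites.
toy_web = {
    "recipes.html": ["gardening.html"],
    "gardening.html": ["recipes.html", "astronomy.html"],
    "astronomy.html": ["recipes.html"],
}
print(rank_pages(toy_web))
```

Run it and the most-linked-to page comes out on top, which is the whole point: the network of links itself becomes the map of what matters.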

Using the web was active. Click, read. Click again. Follow a trail. Get lost. Get curious. It wasn’t something happening to you. We “surfed” the web.

The utopia cracked when reality set in.

Building websites was hard. Early pages were functional at best and ugly at worst. As design capabilities improved—CSS, JavaScript, Flash, middleware—the technical stack grew more complex. Expressing an idea online started to feel harder, not easier.

This is where people like me came in. Information architecture. Usability. User experience. We got paid to remind organizations that users don’t think in org charts. They think in questions. Needs. Intent. But while we were busy helping companies organize their marketing messaging, we quietly punted on the harder question: how should all the new information humanity could now easily create be organized?

The answer that won was the simplest one imaginable.

Blogs.

Everyone could publish. That was the win. Organizations defaulted to “most recent first.” Not because it was best, but because it was easy. The pile of papers on your desk works that way. The groceries in your fridge. Chronology feels natural when you don’t know what else to do.

Content management systems focused on content creation. Organization became an afterthought. Either you could structure things exactly how you wanted (if you were an expert and knew how to think about that), or you got a reverse-chronological feed.

Social media doubled down on this choice.

Billions were poured into making it effortless to post photos, videos, and thoughts. How that content is organized has barely evolved beyond piles of papers on desks and filing cabinets. Early on, users made it clear they didn’t want to pay directly, so platforms turned to venture capital with a promise: we’ll figure out the money later.

Eventually, the bill came due.

Advertising became the business model. Algorithms took over organizing all information. The goal of these algorithms is unabashedly to consume our attention and sell it to advertisers. There is no desire to help humans understand something in a new or deeper way; it’s simply who can get me to click on an exercise plan for men over 50 the fastest.

Now we live in a world where creating content is easier than ever, but deciding how it’s grouped, summarized, or presented has been outsourced to systems we don’t understand, optimized for incentives we didn’t choose.

Imagine if your phone call with a family member were interrupted every five minutes by a sales pitch, tailored precisely to what you were talking about. You wouldn’t tolerate it. Then again, you paid for that phone service directly.

Imagine a library where the card catalog gave you different results every time you searched, sprinkled with ads for books you already knew you didn’t want.

That’s the trade we’ve made.

By giving up on curation, summarization, and intentional structure, we handed the most powerful part of learning to financial systems optimized for persuasion, not understanding.

It doesn’t have to be this way.

What we need now are tools that make it easy to collect, curate, and present content in ways that invite exploration. Tools that favor grouping over feeds. Context over chronology. Craft over volume.

We need to remember the joy of zines and early desktop publishing, when people cared deeply about how ideas were arranged on a page. When structure was an act of hospitality. When design said, “I thought about how you might want to read this.”

The future of the web isn’t about creating more content.

It’s about organizing what we already have so it serves human curiosity instead of advertising models.

That future is still ours to build.
