If you see Web1 as a gigantic, albeit silent, digital library, then Web2 is a bustling square. No longer a collection of static pages to click through, but a living environment where people created, shared, and generated value from content themselves.
The term Web 2.0, and with it the observation that the internet had taken a new direction, was popularized around 2004-2005 by O'Reilly Media. Tim O'Reilly described an internet that improves as more people use it: a platform in which data and network effects are central.
However, those who see Web2 merely as a marketing term miss what was actually happening technically and economically. At the beginning of the new millennium, the focus of the web shifted fundamentally: from publishing to participating, from websites to platforms, from documents to data streams.
Users Become Producers in Web2
The seed of Web2 was sown a bit earlier. In January 2001, Wikipedia went live: an encyclopedia written not by an editorial team but by volunteers. The idea that millions of people could build a body of knowledge together, without central control, was revolutionary at the time.
In February 2004, Facebook was launched. What started as a university network grew into a global social system where identity, relationships, and content came together. YouTube followed in 2005 and made video distribution trivially accessible.
The real revolution was not that people created content - forums and blogs had allowed that earlier - but in the scale and architecture behind it. Web2 platforms were built around user activity. It was easier to post, easier to search, and easier to share than ever. The more people participated, the more valuable the system became. Network effects became the engine of growth.
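One common way to make that network effect concrete is Metcalfe's law (not named in the text, but the standard formalization): the number of possible connections between users grows quadratically, so each new user adds more potential value than the last. A minimal sketch:

```typescript
// Metcalfe's law: n users can form n * (n - 1) / 2 distinct pairs,
// so the number of possible connections grows quadratically with n.
function possibleConnections(users: number): number {
  return (users * (users - 1)) / 2;
}

// 10 users -> 45 possible connections; 100 users -> 4950.
// Growing the user base 10x grows the connection space ~100x.
const small = possibleConnections(10);
const large = possibleConnections(100);
```

Whether value really scales quadratically is debated, but the asymmetry itself explains why Web2 platforms prioritized user growth above almost everything else.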
The Browser Becomes an Application Platform
Under the hood, just as much changed as on the surface. In 2005, the term AJAX (Asynchronous JavaScript and XML) was coined by Jesse James Garrett. Asynchronous communication between browser and server made it possible to refresh parts of a page without a full reload. That sounds obvious now, but at the time it was a huge step.
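The pattern Garrett named can be sketched as follows. The `/api/inbox` endpoint, the `Message` shape, and the helper names are invented for illustration, with a `fetch`-style function standing in for the `XMLHttpRequest` object of the time; the fetch implementation is passed in so the sketch also runs outside a browser:

```typescript
// A sketch of the AJAX pattern: request new data asynchronously and
// replace one fragment of the page instead of reloading everything.
// The "/api/inbox" endpoint and the Message shape are invented.

interface Message {
  from: string;
  subject: string;
}

// Fetch-like signature, injected so the sketch is testable without a browser.
type FetchLike = (url: string) => Promise<{ json(): Promise<unknown> }>;

// Pure rendering step: turn fetched data into an HTML fragment.
function renderInbox(messages: Message[]): string {
  return messages
    .map((m) => `<li><strong>${m.from}</strong>: ${m.subject}</li>`)
    .join("");
}

// Asynchronous step. In 2005 this used XMLHttpRequest; in the browser
// the returned fragment would be assigned to one element's innerHTML,
// leaving the rest of the page untouched.
async function refreshInbox(fetchImpl: FetchLike): Promise<string> {
  const response = await fetchImpl("/api/inbox");
  const messages = (await response.json()) as Message[];
  return renderInbox(messages);
}
```

The key design point is the split: the page keeps its state while only the data travels over the wire, which is exactly what made Gmail feel like software rather than a sequence of documents.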
Gmail (launched in 2004) suddenly felt less like a website and more like software. Interfaces became richer, more responsive, and more interactive. The browser was no longer just a document viewer, but a runtime environment.
This had far-reaching consequences. Front-end development became more serious. JavaScript grew from a helper script to a core technology. The line between desktop applications and web apps blurred.
Data Becomes the New Fuel
Web2 companies discovered something fundamental: data is not just a byproduct but a core asset. Google had already demonstrated in 1998 with PageRank that links could be used as a signal of relevance. In Web2, that principle was extended to user behavior. Clicks, likes, shares, searches - everything became measurable. That measurability unlocked:
- Personalized search results
- Recommendation systems
- Targeted advertising
The user experience was optimized, and the web became smarter as it was used more. But that also meant that users themselves became the product.
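The link-as-vote principle behind PageRank can be sketched in a few lines. The three-page graph and iteration count below are invented for illustration; the 0.85 damping factor is the value from the original PageRank formulation:

```typescript
// Toy PageRank: every page's rank is divided among the pages it links
// to, so a link acts as a weighted vote for relevance.
type Graph = Record<string, string[]>; // page -> pages it links to

function pageRank(
  graph: Graph,
  iterations = 50,
  damping = 0.85
): Record<string, number> {
  const pages = Object.keys(graph);
  const n = pages.length;
  let rank: Record<string, number> = {};
  for (const p of pages) rank[p] = 1 / n; // start with equal rank

  for (let i = 0; i < iterations; i++) {
    const next: Record<string, number> = {};
    // Base probability of "jumping" to any page at random.
    for (const p of pages) next[p] = (1 - damping) / n;
    // Each page passes its rank, split evenly, to its link targets.
    for (const p of pages) {
      const share = rank[p] / graph[p].length;
      for (const target of graph[p]) next[target] += damping * share;
    }
    rank = next;
  }
  return rank;
}

// "home" receives the most inbound links, so it ends up ranked highest;
// "blog" receives none and sinks to the random-jump floor.
const ranks = pageRank({
  home: ["about"],
  about: ["home"],
  blog: ["home"],
});
```

Web2's step was to feed the same kind of iterative scoring not just with links but with clicks, likes, and watch time, which is what turned feeds and recommendations into ranking problems.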
Advertising models shifted from contextual (based on the content of the page) to behavioral (based on the user's profile). Google's advertising platform AdWords, launched in 2000, grew into a dominant source of revenue. Data analysis, scalable storage, and later machine learning became core competencies in IT.
Cloud: The Silent Revolution Behind Web2
Perhaps the most important, but least visible shift occurred in infrastructure. In 2006, Amazon Web Services launched its first major public cloud services, including S3 and EC2. For the first time, startups worldwide could rent scalable infrastructure instead of buying their own servers.
This fundamentally changed the game. While Web1 companies had to invest in data centers, Web2 startups could scale almost infinitely at variable costs. It lowered the barrier to entry and made experimentation cheaper.
Cloud enabled Facebook, YouTube, and later countless SaaS companies to grow exponentially without traditional infrastructure constraints. Platforms had free rein to add users and keep scaling.
Mobile Internet Accelerates Everything
Then came the addition that turned a maturing Web2 into an ever-present internet. The introduction of the iPhone in 2007 made the web mobile and always at hand: the internet now fit in your pocket.
Real-time notifications changed user behavior. Social networks were no longer a destination but a constant stream.
Technically, this meant:
- Stricter requirements for latency
- Global content delivery networks
- New security challenges
- Enormous data growth
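The latency requirement in particular explains the rise of content delivery networks: physics puts a floor under round-trip time. A back-of-the-envelope sketch, with approximate distances chosen for illustration:

```typescript
// Light in optical fiber travels at roughly 200,000 km/s (about 2/3
// of the speed of light in vacuum), so distance alone sets a minimum
// round-trip time, before any server processing at all.
function minRoundTripMs(distanceKm: number): number {
  const fiberSpeedKmPerMs = 200; // ~200,000 km/s expressed per millisecond
  return (2 * distanceKm) / fiberSpeedKmPerMs;
}

// Amsterdam -> San Francisco is roughly 8,800 km; a CDN edge node
// might sit ~50 km away. The physical floor differs by two orders
// of magnitude: ~88 ms versus ~0.5 ms.
const transatlantic = minRoundTripMs(8800); // 88 ms
const nearbyEdge = minRoundTripMs(50); // 0.5 ms
```

Serving content from a nearby edge is the only way to beat that floor, which is why mobile-era platforms pushed static content, and later logic, toward the edge.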
Mobile made Web2 more intimate and more addictive, and laid the foundation for the internet we use today.
Web2 in the Now
In this series, 2012 marks the end of the Web2 era, but its foundation is still visible. Where Web1 mainly struggled with technical maturity, Web2 faced societal and governance issues that are still not fully answered today.
Central data stores became attractive targets for attack. XSS, SQL injection, and large-scale data breaches became structural risks.
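SQL injection shows concretely why centralized data became a structural risk. A minimal sketch with an invented login query; no real database is involved, the point is only how untrusted input changes the structure of the query:

```typescript
// Vulnerable pattern: user input is pasted straight into the SQL text,
// so the input can rewrite the query itself.
function unsafeQuery(username: string): string {
  return `SELECT * FROM users WHERE name = '${username}'`;
}

// A classic injection payload: closes the string and adds an
// always-true condition, so the query matches every row.
const attack = "' OR '1'='1";
const injected = unsafeQuery(attack);
// injected === "SELECT * FROM users WHERE name = '' OR '1'='1'"

// The structural fix is a parameterized query: the SQL text and the
// values travel separately, so input is always treated as data and
// never parsed as SQL. Sketched here as a plain object; real drivers
// (e.g. node-postgres) use this same text/values shape.
const parameterized = {
  text: "SELECT * FROM users WHERE name = $1",
  values: [attack], // sent as data, never interpreted as SQL
};
```

The same "keep code and data apart" principle underlies the fix for XSS as well, via output encoding instead of query parameters.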
Additionally, questions arose that are still relevant today: Who decides what is visible in a feed? How is misinformation addressed? Who owns user data? And what happens when one platform becomes the dominant infrastructure? These tensions would later form the breeding ground for Web3 ideas about decentralization.
What did change, and where the groundwork for Web3 was laid, was the level of maturity. Platforms became regulated. Privacy legislation (such as the GDPR) forced a reconsideration of data use. Cloud became the standard. DevOps and continuous deployment went mainstream.
Web2 laid the foundation for the digital economy as we know it now:
- SaaS
- Platform companies
- Data-driven decision-making
- Advertising-driven business models
- Social infrastructure as the core of communication
The History of the World Wide Web Rolls On
Web2 was not a cosmetic upgrade of Web1. It was a fundamental reconfiguration in the history of the World Wide Web. It changed the role of the user, from reader to participant. It changed the role of the browser, from viewer to application platform. It changed the role of data, from byproduct to core asset. And it changed the power structure, from distributed sites to centralized platforms.
It is perhaps the most important part of this history of the World Wide Web. What began as a technical experiment in the early 2000s grew into the dominant model of the internet. A model that enabled unprecedented innovation but also created new dependencies.
In the next part of this series, the focus shifts to Web3: an attempt to break those dependencies through decentralization, cryptography, and digital ownership.