This year, 2012, the "Internet" will be 43 years old. Well, that's not exactly true: the modern Internet wasn't born until 1981, and while its immediate predecessor, the ARPANET, went operational in 1969, it of course had its roots in several years of preceding research.
Regardless, most historians date the origins of the Internet to the ARPANET coming online. And while the network has indeed had a long life, it can be more or less divided into distinct eras, each about ten years long.
The first era was the ARPANET era, lasting from roughly 1969 until 1981. In this era, the network is characterized by synchronous protocols and 36-bit machines such as the DEC PDP-10 and the Honeywell H6180. Most machines on the network are mainframes. Dominant applications include network mail, file transfer (FTP) and remote login (TELNET), but the focus is on research, and many applications are specialized accordingly. Towards the end of the era, a number of 16-bit minicomputers are attached to the network as hosts. Human/computer interaction is characterized by text: most users connect using textual display terminals, either teletypewriters or "green screen" terminals. Presumably some research into networked graphical applications is occurring, but it is not common.
Next is the Internet era, beginning with the adoption of the TCP/IP protocol suite circa 1981 and lasting through 1991. This era is characterized by reliable transport layers built on top of unreliable datagram protocols. Most hosts are 32-bit minicomputers, mainframes or workstations such as the DEC VAX, the IBM System/370, Sun SPARCstations, MIPS-based workstations, and machines built around the Motorola MC68020 architecture and its successors (e.g. the NeXT, early Sun models and others). TELNET, FTP and electronic mail remain the network's major applications, and the number of hosts grows exponentially. Human interaction with the network is still largely text-oriented, but graphical applications start to become common towards the end of the decade, coinciding with growing access to workstations with high-resolution, bitmapped displays and to applications such as the X Window System.
Following the Internet era is the World Wide Web era, from roughly 1991 until 2001. It is characterized by a major shift to the Hypertext Transfer Protocol (HTTP) as the dominant application protocol, by the web, accessed by users through client "browser" software, emerging as the dominant application, and by the shift from a research network to a commercial Internet. The number of users increases vastly as the network is opened to consumers, coincident with the commoditization of hardware and increases in access speeds. "Dial-up" Internet access is common, but demand for greater bandwidth grows as the web becomes the network's "killer app." The limitations of 32-bit address spaces become evident during this era as demand grows to process ever-larger data sets: most of the groundwork for the move to 64-bit computers is laid, with such machines becoming common towards the end of the era. Typical computers range from high-end RISC-based servers running Unix from companies such as HP, Sun and IBM, to a growing number of low-end commodity systems running Microsoft's Windows family of operating systems from companies such as Dell, HP and IBM. Linux and the free distributions of the BSD Unix branch have considerable, growing market share. There is talk of a "new economy" based on web applications, and while initially promising, the era is closed by a major economic downturn and worldwide political, religious and ethnic unrest; many of the companies that had emerged during the exponential growth of the "Internet" (really web) or "dot-com" period of this era are shown to be non-viable and cease operations. In retrospect, the "dot-com bubble" seems laughably absurd to most observers.
The era since 2001 has been the Web 2.0 era, in which the World Wide Web continues to be the most significant application of the Internet, but with a major push to make it more usable and transparent to the end user. Wholesale replacement of the web is considered impractical due to the massive installed base of users and client software, but viable, widely implemented standards emerge that greatly simplify application development. However, fundamental physical limits on venting heat from silicon-based processors have led to a stall in the previously exponential increases in processor clock speed. The famous "Moore's Law" still holds, but now only in its pure form: the number of transistors that can be placed on a chip doubles roughly every 18 to 24 months. The law's corollary, that the speed of computers doubles on the same schedule, also holds, but no longer because of rising clock speeds. Instead, the extra chip real estate is dedicated to additional functional units in the form of multiple CPU cores on the same chip; effectively, more than one CPU on a single piece of silicon. This in turn has made multi-processor computers commonplace: so-called "multicore" machines.
However, this implies that speed increases for particular computations now depend on parallelizing those computations. From a software perspective this is enormously difficult and error-prone, and in many cases not even possible: some problems are inherently sequential.
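The ceiling described here is usually quantified by Amdahl's law (not named above, but the standard formulation of this limit): if only a fraction p of a computation can be parallelized, then n cores can speed it up by at most 1/((1 - p) + p/n). A minimal sketch:

```python
# A sketch of Amdahl's law: the sequential fraction of a program caps
# the speedup that adding cores can ever provide.

def amdahl_speedup(p, n):
    """Maximum speedup on n cores when fraction p of the work parallelizes.

    The (1 - p) term is the inherently sequential part; it is unaffected
    by n, so as n grows the speedup approaches 1 / (1 - p), not n.
    """
    return 1.0 / ((1.0 - p) + p / n)

# Even with 95% of the work parallelizable, 64 cores fall far short of 64x:
for cores in (2, 4, 16, 64):
    print(f"{cores:3d} cores -> {amdahl_speedup(0.95, cores):.2f}x speedup")
```

The values p and the core counts here are illustrative, not drawn from the text; the point is simply that a fixed sequential fraction, however small, eventually dominates.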
Further, there is a growing shift away from the traditional model of human/computer interaction, in which a person works with a computer in a fixed "work" location and position: whereas for approximately the past 20 years people have mostly interacted with computers via a keyboard, mouse and large bitmapped display, they are now starting to interact through mobile devices that lack these peripherals. Moreover, very little processing is done on the local device; most of it is done by remote servers hosted in dedicated data centers. In many ways, this is reminiscent of the mainframe model of centralized computing.
It seems that we are shifting to a new era, dominated by multicore processors, mobile computing and shared "cloud" resources. Such a shift is to be expected if the roughly ten-year span of the previous eras is any indicator.
What shall we call it? The cloud era? The mobile era? The multicore era? That is still unclear.