What were the computer revolutions before Dot Com and Cloud?
January 29, 2010 8:21 AM

What were the computer revolutions before Dot Com and Cloud?

Around 1995(?), Dot Com was the next "big thing."

Now, Cloud computing is the next "big thing".

But, what were the "next big things" before the Dot Com era?
posted by jacobean to Computers & Internet (24 answers total) 1 user marked this as a favorite
 
Client-server, "thin client" (well, it still kind of is), object-orientation, multimedia — it's a long list.
posted by enn at 8:25 AM on January 29, 2010


Desktop computing

Portable computing
posted by rlef98 at 8:26 AM on January 29, 2010


Human computing
Automated computing
Programmed computing
Algorithmic computing
Networked computing
Personal computing
posted by carsonb at 8:27 AM on January 29, 2010


There were microchips in the '70s, networking in the '80s, BBSes and online services like CompuServe in the '80s, and "multimedia" and CD-ROMs in the early '90s.
posted by cmonkey at 8:27 AM on January 29, 2010


WYSIWYG, GUIs, "push" technology are a few others I remember.
posted by deadmessenger at 8:27 AM on January 29, 2010


Cyberspace
posted by Blazecock Pileon at 8:29 AM on January 29, 2010


More specific than GUI: the concept of windows, rather than a DOS-like interface. (My technobabble is weak at the moment.) Windows was a big deal in business/tech.
posted by AugieAugustus at 8:34 AM on January 29, 2010


Are you looking specifically for network-related fads, or computing trends in general?

Peer-to-peer came between dot com and cloud. People had high hopes for P2P, but it mostly didn't pan out except for distribution of illegal content.

Also, in the HPC realm, grid computing, which has sort of been subsumed by cloud computing.
posted by qxntpqbbbqxl at 8:37 AM on January 29, 2010


If you're really curious, you can read through InfoWorld back issues.
posted by cmonkey at 8:39 AM on January 29, 2010 [1 favorite]


Before the early 1990s, computer technology hadn't come close to the massive cultural presence it has today. VisiCalc was the first spreadsheet program for the personal computer, and it sold only 600,000 copies in six years. That was actually a bit of a revolution, but it didn't get much media hype, because it wasn't something most people 1) could use, or 2) had any real use for. Even today, most serious spreadsheet users are business types.

Basically, it wasn't until the mid-1990s that it became apparent that computers really were going to be for everyone. It took the industry a couple of decades to realize that mainframes were not the way of the future, and another two decades for prices to reach the point where most Americans could afford a computer of their own. The original Apple II, a version of which sold until the early 1990s, cost almost $1,300 in 1977 dollars for the cheapest configuration, about $4,500 in 2008 dollars. The IBM PC Model 5150 sold for $2,880 in 1981, almost $7,000 in current dollars.
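
(A rough sketch of the inflation arithmetic behind those conversions, in Python; the index numbers are approximate annual-average CPI-U values, so the exact results depend on which index and year you use.)

    # Rough CPI-based inflation adjustment for the prices quoted above.
    # The index values are approximate annual-average CPI-U figures.
    CPI = {1977: 60.6, 1981: 90.9, 2008: 215.3}

    def to_2008_dollars(price, year):
        """Scale a historical price by the ratio of CPI index values."""
        return price * CPI[2008] / CPI[year]

    print(round(to_2008_dollars(1300, 1977)))  # Apple II, 1977: ~4600 ("about $4,500")
    print(round(to_2008_dollars(2880, 1981)))  # IBM PC 5150, 1981: ~6800 ("almost $7,000")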

Look at this graph. Annual PC sales first hit 50 million in about 1995. We're at over four times that now.

I think the Internet was the first time people really started talking about a "big thing" with respect to computers outside the then rather small world of computer users.

This is not to say that there weren't revolutions in computing. There were. Desktop computing, the move away from mainframes generally, GUIs, heck, digital computing itself, all of these are huge deals. But none of them made much of a splash in the media.
posted by valkyryn at 8:40 AM on January 29, 2010 [1 favorite]


I think that the closer you get to the birth of digital computers, the more revolutionary the ideas and their implementations become. So here's my list:

timesharing
high level languages
assembly language
integrated circuits
mini-computers
operating systems
function libraries
posted by rdr at 9:39 AM on January 29, 2010 [1 favorite]


Transistors.
posted by DevilsAdvocate at 9:55 AM on January 29, 2010


A lot of the most important innovations were early ones, things we take for granted now.

You can look these up on Wikipedia: the invention of electronic computers (Mark I, ENIAC), stored programs, programming languages (Betty Holberton, UNIVAC), compilers (Grace Hopper, UNIVAC), optimizing compilers and higher-level languages (FORTRAN, IBM), computer operating systems (IBM), monitors (old computers didn't have these), personal computers and graphical user interfaces (Xerox PARC), the internet (DARPA), internet service providers ... (and I'm pretty much leaving out changes in hardware and storage).
posted by nangar at 10:11 AM on January 29, 2010


Apple really thought the Newton was the big thing in 1993...
3Com really thought the Audrey was going to be big in 2000...
We now have the iPhone, the iPod touch, and soon the iPad... *probably* by this time we're ready for the tech.

But as to earlier expansions...
The network card was huge... before that it was dialing in to a switch.
Distributed computing in the early 1990s was the brainchild of HP, Digital, and IBM (among others).
As for OSes, Apple revolutionized the GUI in 1984 with the release of the Mac (predating GEOS).


I'm thinking we don't want to be talking about punch cards, transistors, desktop computing, multi-layer boards, the microchip, HP-UX, ENIAC, Charles Babbage, or the abacus....
posted by Nanukthedog at 10:17 AM on January 29, 2010


In the PC world, "Plug & Play" was a big buzzword for a long time before it became a reality. As someone who once spent a couple of days messing with jumper settings to get a sound card working, I feel it should get more recognition. Here's Bill Gates demoing it for Windows 98.

If you're looking for a book tip, Steven Levy's Hackers chronicles four previous computer revolutions (the advent of AI research, personal computing, computer gaming and free software).
posted by themel at 10:31 AM on January 29, 2010


The biggest computer revolution of the '80s was probably the phrase "Home Computer". In January 1983, Time named the computer its "Machine of the Year" in place of the usual Man of the Year, with a cover story about how it was "moving in".

The shift from computers as mainframes and terminals (and even PCs) in an office or engineering facility to computers as a device inside a home (and suitable for a kid or teenager) was as culturally relevant as the growth of mainstream Internet use in the mid-90s. WarGames was one of 1983's top-grossing movies, and is still a touchstone for a lot of (admittedly greying) geeks. If it had been written ten years earlier, it would have been a work of pure fantasy instead of not entirely implausible fiction.

This was the era of the Apple IIe, the Commodore 64, and a whole line of Atari XLs, all trying to get into the living room. This was a pretty big deal because it was still seen as somewhat risky (IBM was very late to the game here... because they weren't sure about the impact of their very serious machines playing *gasp* games. They didn't want their powerful, business-critical machinery to be seen as a toy for children.) In 1975, the idea of a computer in your home was unthinkable (perhaps in the basements of inventor types or ham radio operators... but for a normal, suburban family? no way). By the mid-1990s, a home computer was commonplace. Somewhere around 1985, the tide shifted.

At the time "Desktop Publishing" seemed more evolutionary than revolutionary, but the marketers tried hard (think: housewives printing "Family Newsletters" at home). But when you look back, perhaps it was revolutionary after all.

MeFi is full of professional graphic designers, and I suspect that none of them still cut with an X-Acto and paste with hot wax. The impact of this was felt mostly on the professional side of the fence, and among Macintosh users, because of the availability of the LaserWriter, which could actually deliver WYSIWYG output. The PC side didn't really catch up until the LaserJet II and IIP showed up in 1987 and 1989. It's not too far-fetched to say that the Mac absolutely owned the publishing industry from the first day the industry became computerized, and Apple's continued dominance in that sector has a lot to do with that history (and, of course, the Mac's suitability for the job, both then and now).

The Sound Blaster arrived in 1989. Prior to that, most PCs had little more than basic bloop-beep-buzz sound capabilities and had ceded the game market to Atari, Commodore, and the game consoles. The Sound Blaster brought a new era of PC gaming, which helped bring PCs into people's homes again.

In 1990, Microsoft started selling Windows 3.0. More importantly, through the early '90s, PC makers started bundling it with DOS... and we'd finally gotten to the point where essentially any machine running MS-DOS 5.0 or better and Windows 3.0 or better could run any PC software on the market (previously, clones had a history of frustrating compatibility issues).

So, as Atari was going under (again), and Commodore and Apple were trying to figure out what their future held (wait, is it the C64, the 128, or the Amiga? the Mac II, the Apple III, or the Apple IIGS?), the market very quickly went almost entirely to the PC clones. Whitebox builders started turning out cheap machines with SVGA graphics, a Sound Blaster, a mouse, and a copy of MS-DOS 5 and Windows 3 -- PCs and PC parts were becoming commodity items. Computer Shopper was thicker than some towns' phone books. By the time Windows 3.11 shipped in 1993, it was clear that outside of certain professional creative industries (publishing, music production), Windows-based PCs had won.

It was that ecosystem of grey-market imported graphic cards, interchangeable motherboards, and soulless beige cases that led directly to the Microsoft/Intel near-monopoly, because now it wasn't Atari vs IBM, it was Apple vs. every independent system builder in the nation (and a few hobbyists, to boot).

...and don't get me started about BBSes.
posted by toxic at 10:32 AM on January 29, 2010 [4 favorites]


I forgot parallel processing and cluster computing, without which there'd be no Google, no PayPal, and not much 3D animation ... (Not as old, but still hugely important.)
posted by nangar at 10:43 AM on January 29, 2010


A lot of things in technology are circular. Cloud computing really harks back to an era when computing power was concentrated in mainframes and everyone used remote terminal access. Cloud computing takes those same ideas (you don't have to buy or maintain your own hardware; let someone else do it and just run your code remotely) and adds the twist that the "remoteness" can be over the internet rather than over a serial cable.

Another one is virtualization, which used to be a really big deal in the mainframe days but then faded into obscurity because the technology to efficiently virtualize desktop processors didn't exist; with recent hardware and software it's become practical again, and virtualization is now a hot topic.

And of course a rather well-known circular technology is the idea of desktop apps being web-based -- that one scared the shit out of Microsoft so much in the mid-90s that they took on Sun's Java and Netscape and fought them tooth and nail. Java didn't really work out on the client side, Netscape was famously ground down to dust, and everyone sort of forgot about the idea for a long time, until "web 2.0" came around with things like Google Docs. Combined with the popularity and portability of cheap netbooks and smartphones, everything old is new again and once again the "web OS" is a hot topic.
posted by Rhomboid at 10:43 AM on January 29, 2010


Desktop Publishing was the one immediately preceding the Internet, but you're kind of asking for things that didn't exist before computing became a part of the popular landscape. Both rdr and carsonb list things that people hoped would bring computers to a wider audience, but they were still tools back then. Frankly, when I think about the '80s, what comes to mind is hard drive and memory growth, along with CPU speed and graphics. Really, it was computers themselves that were "the next big thing."

Thing is, the computer industry was a lot more fragmented before the Internet. Computers were more dedicated-use back when multitasking wasn't yet common, so interest in the future lay mostly within each niche: graphics people were interested in workstations, numbers people in the next spreadsheet package, words people in WYSIWYG, and so on. Heck, I temped for a large national market research firm in downtown San Francisco in 1997, where I created PowerPoint decks on the only computer in the office. Before that I worked in a real estate office where one of my main jobs was hooking the modem up to a computer and printer so they could access the MLS and print out information and pictures of houses.
posted by rhizome at 10:50 AM on January 29, 2010


Multicasting on the MBone
posted by Mahogne at 11:06 AM on January 29, 2010 [1 favorite]


To me, it seemed like the biggest thing prior to the Internet was CD-ROMs. They held so much more info than a floppy that they were seen as revolutionary. Y'all don't remember Encarta?
posted by timepiece at 1:04 PM on January 29, 2010


If I may:

The monitor (as opposed to inputting via cards and receiving the results)
The command line
WIMP (Windows Icons Menus Pointer)
Separate graphics and sound processors
The compiler
The Operating System
The programmable computer (as opposed to single-purpose calculating hardware)
The IBM PC (standardised modular hardware)
Open Source
posted by Wrinkled Stumpskin at 1:16 PM on January 29, 2010


They got smaller - silicon chips
They got easier to use - development of operating systems
They started talking to each other - networking
They got cheaper
posted by mattoxic at 4:41 PM on January 29, 2010


In the beginning, there were vacuum tubes*. Then transistors, then integrated circuits.

Before it was called the Internet, a nationwide (US) experimental network, the ARPANET, was set up under DARPA. It connected only a handful of laboratories and universities. The privileged researchers on this network exchanged email and transferred files by FTP.

Later, at CERN (which by then was also connected to the network), Tim Berners-Lee thought it would be ever-so-good if researchers could communicate better over the network, so he invented HTTP (along with HTML and URLs). Thus the foundation of the Web was laid.
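
(To give a sense of how simple that foundation was, here is a minimal sketch of an HTTP GET over a raw socket in Python; example.com is just a placeholder host used for illustration.)

    # Minimal sketch of an HTTP/1.0 GET over a raw socket.
    # "example.com" is a placeholder host used purely for illustration.
    import socket

    sock = socket.create_connection(("example.com", 80))
    sock.sendall(b"GET / HTTP/1.0\r\nHost: example.com\r\n\r\n")

    response = b""
    while True:
        chunk = sock.recv(4096)
        if not chunk:          # server closes the connection when it's done
            break
        response += chunk
    sock.close()

    # The first line of the reply is the status line, e.g. "HTTP/1.0 200 OK"
    print(response.split(b"\r\n")[0].decode("latin-1"))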

Then came Mosaic, the first web browser to reach a wide audience, followed by the Netscape browser. Finally AOL arrived on the scene and the Internet's web and email became available to ordinary mortals.
*Yes, the first computers were built with vacuum tubes.
posted by exphysicist345 at 7:36 PM on January 29, 2010

