Tell me something I don't know about computers
June 30, 2005 6:04 PM

What interesting things did people do with "big iron" computers in the late '70s and early '80s that are not generally well known but could be useful to modern computer users?

Computer and finance history books point to the Apple II and VisiCalc, as well as the HP-12C, as important "everyman" computing devices from that era. Pocket calculators and spreadsheets survive and are still very useful today. I would like to learn more about the more obscure stuff from back then that could probably be done easily on a modern OS X or Linux box.

What big iron was in use back then, how did people use it, and what references are there for more information?
posted by b1tr0t to Computers & Internet (25 answers total) 2 users marked this as a favorite
 
"You are in a twisty maze of passages, all alike."
posted by caddis at 6:32 PM on June 30, 2005


caddis beat me to it! "You are in a twisty maze of passages, all alike."
posted by ericb at 6:38 PM on June 30, 2005


What big iron was in use back then. . . ?

Mmmm - 3033.
posted by caddis at 6:54 PM on June 30, 2005


I can't give you a lot of specific online references off the top of my head, but from a _lot_ of professional experience I can tell you that a lot more "big iron" than you'd think is actually still in service, acting as the underpinning for many e-commerce services today. Many of the industries, like airlines and travel, that were ahead of the curve 30 years ago are now tethered to those same big iron anchors they were so proud of back then.

90% of the time or more, whenever you use an airline website, an insurance company's quote estimator, or a bank's online account tools, you're really tapping into some jury-rigged connection back to a mainframe running COBOL or FORTRAN. Newer companies that have been able to start from scratch are able to build up from a more modern infrastructure, but they've also tended to fail at a much higher rate--it's really remarkable how much of the existing e-commerce world is really based on massive old mainframes from way back when.
posted by LairBob at 7:02 PM on June 30, 2005


This is an interesting question.

Start here, with Google's history of Usenet.

Usenet was something people did when they started hooking their big iron together into networks - basically a big bulletin board that anyone connected to the network could read or post to. Stop a moment and consider that everyone with access to big iron was a technically proficient geek, many with advanced degrees. You can go back and eavesdrop on their public conversations in the Google archive (and wow, does it ever give a new meaning to "good signal-to-noise ratio." I miss that old Usenet.) They talk about a lot of things; a very popular topic was 'what is today's biggest iron and what nifty thing can we do with it.' Advertising, spam, and for the most part rank stupidity were absent.

Your question gets it the wrong way round, as I see it. The folks who needed computational power found a way to get it. An example:

My own father, who, more and more, I suspect was an unheralded genius in his own day, wanted to compute approximate ballistic trajectories for projectiles with given ballistic coefficients, given certain amounts of propellant, barrel length, temperature, altitude, atmospheric pressure, and prevailing wind conditions over the first 1000 yards of travel. He wanted to use this information to optimize his rifle handloads, which was one of his hobbies. So he wrote the program in Applesoft BASIC on an Apple ][ and discovered that it would run for an hour and then produce an out-of-memory error.

So he learned 6502 assembly and wrote the computational parts in assembly. But the finished project crashed his assembler.

So he wrote a primitive assembler in 6502 machine code to help him handle the large amounts of memory he needed for his computation, had it assemble the computationally intense bit of machine language, and wrote an Applesoft front end to that. It would churn for several minutes and produce the desired data (and print it out on an old Epson FX-80 dot-matrix printer, with graphs, if you wanted).
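Roughly the kind of number-crunching involved, sketched in Python: a minimal point-mass model with quadratic drag, integrated with a fixed-step Euler loop. The drag model, constants, and step size here are purely illustrative, not anything from his actual program.

```python
# Minimal point-mass trajectory with quadratic drag; all constants illustrative.
import math

def trajectory(muzzle_fps, angle_deg, bc=0.5, dt=0.001, max_yards=1000):
    g = 32.174                  # gravity, ft/s^2
    k = 0.0001 / bc             # crude drag constant scaled by ballistic coefficient
    vx = muzzle_fps * math.cos(math.radians(angle_deg))
    vy = muzzle_fps * math.sin(math.radians(angle_deg))
    x = y = 0.0
    marks, next_mark = [], 100
    while x / 3.0 < max_yards:          # positions in feet, range marks in yards
        speed = math.hypot(vx, vy)
        vx -= k * speed * vx * dt       # drag opposes motion
        vy -= (g + k * speed * vy) * dt # gravity plus drag
        x += vx * dt
        y += vy * dt
        if x / 3.0 >= next_mark:
            marks.append((next_mark, y * 12.0))   # drop in inches at each mark
            next_mark += 100
    return marks

for yards, drop in trajectory(muzzle_fps=2700.0, angle_deg=0.1):
    print("%4d yd  %8.1f in" % (yards, drop))
```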

On a modern computer, the computations would take less than a second - but you probably couldn't teach yourself to write customized code the way he did because of the machine's greater complexity. Stop and think about that for a minute before you over-venerate your modern OS X or Linux box. "Everyman" can't learn how to code for OS X in the 6 weeks he spends laid up from a myocardial infarction.

My point, if I have one, is that the amount of processing power or connectivity is not really the determinant of how people used the computer. Interface and innovation made a lot of difference, and in general progress has been made, in the sense that things people frankly take for granted today (easily laying out a newsletter in Quark, or producing a visually appealing flyer in Photoshop) were once arcana on par with removing a malignant tumor from the center of the brain - either one spent one's life training to acquire the skills and equipment necessary, or else one paid someone else to do it.

The idea that people were secreted away somewhere doing really cool stuff on high-powered boxes that then somehow faded from use seems unlikely to me. Most, if not all, of the good stuff those secretive geeks were up to is now available on your desktop with just a couple of clicks.

I do ramble on, don't I?
posted by ikkyu2 at 7:18 PM on June 30, 2005


ikkyu2 writes "I do ramble on, don't I?"

Over and over again, apparently.

[That's just a joke...I thought that was a totally cool story.]
posted by LairBob at 7:33 PM on June 30, 2005


LairBob writes "it's really remarkable how much of the existing e-commerce world is really based on massive old mainframes from way back when"

Yep, if you want raw transaction volume the big iron is where it is at. And American Airlines has sometimes made more money from its SABRE reservation system than from flying planes.
posted by Mitheral at 7:49 PM on June 30, 2005


LairBob: I can't give you a lot of specific online references off the top of my head, but from a _lot_ of professional experience I can tell you that a lot more "big iron" than you'd think is actually still in service, and acting as the underpinning for many e-commerce services today.

This is interesting. What does the big iron do that more modern machines are not so good at doing? Or is it just inertia?

ikkyu2: Stop and think about that for a minute before you over-venerate your modern OS X or linux box.
...
The idea that people were secreted somewhere doing really cool stuff on high powered boxes that then somehow faded from use seems unlikely, then, to me. Most, if not all, of the good stuff those secretive geeks were up to are now available to your desktop with just a couple clicks.


I think my Mac came with Perl, Ruby and Python out of the box. With a few clicks, MySQL can be added. Once you have a high-performance scripting language with great string processing and a decent relational database, an entirely different class of problems becomes tractable. It happens that you need more memory and disk space in order to run a database, but it is the database more than the resources that makes a new class of problems easy to attack.
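To make that concrete, here is a minimal sketch of the combination (Python with its bundled sqlite3 standing in for MySQL; the access.log filename and the log format are just placeholders):

```python
# Parse free-form text with a regex, load it into a table, and let SQL do the
# aggregation. sqlite3 ships with Python; MySQL would slot in the same way.
import re
import sqlite3

LOG_LINE = re.compile(r'^(?P<host>\S+) .* "(?P<method>[A-Z]+) (?P<path>\S+)')

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE hits (host TEXT, method TEXT, path TEXT)")

with open("access.log") as logfile:          # any Apache-style log will do
    for line in logfile:
        m = LOG_LINE.match(line)
        if m:
            db.execute("INSERT INTO hits VALUES (?, ?, ?)",
                       (m.group("host"), m.group("method"), m.group("path")))

# Ten most-requested paths: one SQL statement instead of a pile of loops.
for path, count in db.execute("""SELECT path, COUNT(*) AS n FROM hits
                                 GROUP BY path ORDER BY n DESC LIMIT 10"""):
    print(count, path)
```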
posted by b1tr0t at 8:07 PM on June 30, 2005


"Everyman" can't learn how to code for OS X in the 6 weeks he spends laid up from a myocardial infarction.

I appreciate the basic point of the story, but I'm pretty sure this isn't true. It may be true that the average person couldn't learn Objective-C plus the OS X APIs in 6 weeks, even given sufficient motivation. I'm pretty sure it's false that the average person couldn't become a competent beginner in something like Python or Ruby given 6 weeks of free time and sufficient motivation. Python in particular ain't rocket science, and some really good tutorials exist. Many people lack sufficient motivation, and many simply freeze up on things like programming even though they'd be perfectly capable if they could get past the freezing, but I think by and large anyone has the capability to learn to program in a language like this.
posted by advil at 8:11 PM on June 30, 2005


My partner writes IBM mainframe software, MVS. So I'm quite sensitive to the fact that big iron is still relevant.

The Mythical Man-Month is still one of the best books about software project planning. It's all about the development of OS/360 for the IBM System/360.

Virtual machine architectures are back in vogue. VMware is the best known, but there are a bunch of Linux virtualization systems as well. We're basically rehashing the design space of VM for IBM big iron.
posted by Nelson at 8:12 PM on June 30, 2005


I apparently forgot to include my conclusion, though b1tr0t didn't - being a competent beginner in python is indistinguishable from being a competent OS X programmer.
posted by advil at 8:12 PM on June 30, 2005


b1tr0t writes "What does the big iron do that more modern machines are not so good at doing?"

Transactions or more generally massive I/O.
posted by Mitheral at 8:28 PM on June 30, 2005


What do you get when you give a mainframe to a bunch of college students at the University of Illinois in the early '70s? PLATO. Various projects on PLATO led directly to Lotus Notes, NetHack (via dnd, the first dungeon crawl), Microsoft Flight Simulator, FreeCell, and multiplayer games of all types (via the first networked game, Empire, direct predecessor to Netrek). Yeah, so they didn't really get much work done... but it was basically the first network community of any decent size and importance.
posted by JZig at 9:30 PM on June 30, 2005


Transactions or more generally massive I/O.

Exactly. The processors aren't necessarily all that much faster (though there are often more of them), and often mainframes don't even have all that much more memory than you'd expect. But they WAIL on input and output. Remember the big to-do over the "Micro Channel" architecture IBM was putting in some of its PCs back in the '80s? The reason IBM thought it was a big deal was that it was a scaled-down version of the channel-based architecture they used in their mainframes, and it was FAST. And indeed, it's a much better I/O bus than ISA. Unfortunately for IBM, ISA was "good enough" for the time and much cheaper. If Micro Channel had stuck, today we might all be using machines with IBM mainframe-derived buses rather than PCI buses.
posted by kindall at 9:44 PM on June 30, 2005


b1tr0t, I think the main factor might be stability of the infrastructure. If you aren't sure what will change in an OS in the next five years it doesn't make any sense to make a big investment in it. Even if the economics show that it is time to take the plunge, nobody wants to take the risk when something better might be just around the corner. After all, what they have is still working perfectly well.

As for new companies not really adding innovation, there are companies like Amazon, eBay and Google who have obviously been able to make new technology work. The industries the big iron is still working in don't have a lot of serious competition from new blood, and the economic advantage of new technology just isn't enough to be disruptive, I guess...

I think there are some other problems with the new technology as well... Now, I'm a control systems guy, so I am into real time and I don't know much about the big database stuff, but if my observations are terrible somebody is sure to correct me :)

These points are really about infrastructure stability too, but in a different sense... There are some things about the way the old systems were done that you have to have a very level head to apply nowadays. You really don't want all the crap that comes with a modern OS, you just want the stability and the raw crunching power. It must be pretty hard to decide to build a major database on a QNX microkernel, but it would probably be a good move in many ways. I'm sure start-ups have tried to develop this kind of thing on Windows and suffered lots of headaches, it just isn't the right tool for the job. I suspect that even Linux builds too much into the kernel to be appropriate for this type of thing...
posted by Chuckles at 9:58 PM on June 30, 2005


Agreed that one can teach oneself the rudiments of Perl or Java on a Mac in 6 weeks or so - I've written rudimentary programs that accomplished simple tasks in each - but you're running in a Terminal window.

I've tried multiple times to grok the various APIs and hooks and God knows what all needed to make a proper Aqua / Cocoa app, and maybe I'm just obsolete, but I feel like a little formal training in the model-view-controller paradigm would go an awful long way. Or even an introduction to Project Manager. But it doesn't seem like something I could do in 6 weeks.

Dad had no formal training in computers - his degree was a bachelor's in mechanical engineering, 1950, on the GI Bill, and he was the first in his family to attend college. He went from zero to hand-coding an assembler on that 6502 in 6 weeks - it remains one of the more astonishing things I've ever seen. Prior to that he'd used a Tektronix programmable calculator with a plotter output in his engineering work.

Fuck, I really miss him.
posted by ikkyu2 at 10:25 PM on June 30, 2005


What does the big iron do that more modern machines are not so good at doing? Or is it just inertia?

When I worked as a developer in a bank with an IBM mainframe at its core, it was put to me like this. We have daily processes, weekly processes, monthly processes, and year-end processes. The system embodies all sorts of weird shit to accommodate our local financial regulation. If we want to bring up a new system, we have to run it in parallel for at least a year, matching each transaction with its equivalent on the old one, before we can safely decommission the old system.

It's just not worth it when the old system is still efficient and rock-solid stable. Believe it or not, new mainframe-type systems are still being built and sold, and they're cheaper, smaller and faster than their predecessors, just like all other computers. So if the current system didn't scale, in all likelihood it would be replaced with a newer, shinier mainframe (running all the same code, of course, from the assembler on up).

At the bank we used a middleware layer to translate requests and responses to/from the mainframe into XML messages. I knew nothing of the icky COBOL and FORTRAN and assembler running inside, and I didn't need to. New services might or might not be implemented on the host or a newer system - I wouldn't see it because the middleware layer of message queues and XML abstracted it away from me. (In IBM lingo, the mainframe is "the host", which is very confusing to those of us who think of "host" in the TCP/IP sense).
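As a sketch of what the developer-facing side of that looks like: you build and parse XML, and you never touch the COBOL. The element names, the account format, and the commented-out queue calls below are made up for illustration; they are not the bank's actual message format.

```python
# Hypothetical balance-inquiry messages of the kind a middleware layer might
# shuttle to and from the host over message queues.
import xml.etree.ElementTree as ET

def build_balance_request(account_id):
    req = ET.Element("BalanceInquiry")
    ET.SubElement(req, "Account").text = account_id
    return ET.tostring(req, encoding="unicode")

def parse_balance_reply(xml_text):
    reply = ET.fromstring(xml_text)
    return {"account": reply.findtext("Account"),
            "balance": reply.findtext("Balance")}

request_xml = build_balance_request("12-3456-0789012-00")
# queue.put("HOST.BALANCE.REQUEST", request_xml)   # imaginary MQ put...
# reply_xml = queue.get("HOST.BALANCE.REPLY")      # ...and the matching get
reply_xml = ("<BalanceReply><Account>12-3456-0789012-00</Account>"
             "<Balance>1024.50</Balance></BalanceReply>")
print(request_xml)
print(parse_balance_reply(reply_xml))
```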

Part of it is philosophy as well. Mainframes have redundancy for all parts in a way that other architectures are only now rediscovering. Mainframe operators have ways of managing releases for minimal risk and downtime that their Unix and Windows counterparts haven't fully adopted. Mainframes are forever, baby.
posted by i_am_joe's_spleen at 10:26 PM on June 30, 2005


Also, see Wikipedia.

Possibly the notion of migrating your running processes to another node and funkier kinds of clustering are things that might now be feasible with comparatively small mods to a Unix/Linux/OS X system. Can't think of anything in the application space though.
posted by i_am_joe's_spleen at 10:32 PM on June 30, 2005


That these things are still around is at least partly the reason why service-oriented architecture is the current fad (it's a framework for integrating "legacy systems" into "the enterprise").
posted by andrew cooke at 11:26 PM on June 30, 2005


I learned to program Python in mere days, hit the 80:20 rule within the week, and was doing full wxWindows-supported interfaces within the month.

The hard stuff is still in developing the core algorithms, tests, and flow.
posted by five fresh fish at 1:37 AM on July 1, 2005


f f f: Modern OSes certainly have made it easier to solve certain tasks. Scripting languages can do anything, but you really don't know what's going on underneath the interpreter. Even knowing the syntax of C, you still wouldn't have a clue how to make use of the OS features or reach deep into the internals of the machine without resorting to some high-level runtime engine, virtual machine or interpreter environment.

These intermediate layers are good, since they prevent reinvention of wheels, make code portable and make sure various programs can work together. Still, there is something special about having full control of a computer. If you've used simple microcontrollers such as the AVR or PIC, you know what I mean. Tweaking processor registers with assembler instructions is not something you can do on a PC or Mac today.
posted by springload at 4:35 AM on July 1, 2005


My company is still trying to get off its dependence on VAX, since the cost of converting content to display properly is becoming prohibitive.
posted by sciurus at 9:48 AM on July 1, 2005


The Moon.
posted by kc0dxh at 10:58 AM on July 1, 2005


[removed ikkyu2's accidental double]
posted by jessamyn (staff) at 8:13 PM on July 2, 2005


Nelson's point about the arrival of machine virtualization on PC architectures is a good one. It's just the latest in a long line of examples of designs and technologies that have migrated down from "bigger iron."

"mainstream' computing has seen three major generations before the PC.

Mainframes -- starting with IBM's business computers. Today there are a handful of survivors, but IBM is the most notable.

Minicomputer -- DEC really gave birth to this generation, though it was joined by others, like HP and Wang. I'm not sure there are really any survivors.

Workstation -- This is where Sun and SGI got their start. This generation grew out of the twin forces of UNIX and the arrival of minicomputers-on-a-chip like the Motorola 68K series, though it quickly embraced RISC.

The workstation generation was actually born after the PC generation. Sun and SGI were founded in '82, while the IBM PC was released in '81 and the Apple ][ in '77, but the workstations had greater capabilities in terms of both hardware and software than the PCs of the day.

Each generation tended to disrupt the previous generation. In other words, they generally debuted with fewer capabilities than their elders, but, as they matured and gained capabilities, started to take over niches previously exclusive to their elders.

As they matured, each adopted technologies and responsibilities that were first seen in earlier generations.

In addition to the machine virtualization mentioned by Nelson, which made a stop in the "big iron" of SGI and Sun in the late '90s, we've also seen the following, in roughly descending order of arrival in the PC world:

-- Filesystem snapshots
-- Clustering. I'm not sure if there are mainframe-age examples, but DEC's VMS is the prime example from the minicomputer generation.
-- Multiprocessing
-- Timesharing
-- Journaled filesystems
-- Preemptive multitasking
-- Virtual memory & protected memory
-- CPU caches
-- Disk caches
...
-- Hard disks

I'm leaving lots out, I'm sure, but I've just realized that it's a gorgeous day and I'm spending my time writing about old computers.
posted by Good Brain at 1:32 PM on July 3, 2005

