Crash course in computer science.
December 30, 2010 7:19 PM   Subscribe

How do I gain a good understanding (both macro and micro) of the modern computer/internet/electronics?

I don't even know how to ask this question effectively; I'm that clueless about such matters... and moreover, I don't know if this is even a reasonable goal.

They've made computers/electronics/internet stuff so user-friendly these days that I find that even though I know how to operate these devices, I don't actually know what's going on. And I would like to.

I've taken some programming classes so I know abstractly how a computer should work, but I feel that my knowledge is very limited and I don't have a good overview. I'm also not sure how this all connects to new emerging technologies. (SATA? What? It's probably not 'new' - more that I'm terribly behind the times...)

I would like to gain a good structured understanding, pretty much starting with the basics so I have a solid foundation; that means something a little more organized than just Wikipedia articles on the individual subjects. Ideally, I'd also like to go into some depth, so I'm looking for more than just the "computer-for-dummies" approach.

I'd like your advice on where to start, and where to go after I've started. Please recommend any books/resources you think might be helpful.

I've taken isolated computer science courses (intro to programming, data structures and algorithms), but it was a while ago, and I don't feel like I know where it all fits into the "big picture". Please assume that I otherwise have no computing experience beyond that of the average computer user/modern world person.

I'm not averse to technical readings/textbook readings.
I can spend some money buying things to fiddle with, if that might be helpful.

Convoluted question, sorry, thanks for your help!
posted by oracle bone to Technology (8 answers total) 9 users marked this as a favorite
 
As for the big picture of how computers work, the term you're looking for is "computer architecture" or "computer organization".

You can learn it by starting with the small/concrete things, and moving toward the bigger/more abstract ones: for example, Introduction to Computing Systems: From Bits & Gates to C & Beyond will teach you about transistors, logic gates, digital circuits, the design of a CPU, machine language, assembly language, and C programming (with an emphasis on how it gets translated into machine code), all on a simple imaginary CPU whose design is explained in the book. You can program it using a simulator.
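To make the very bottom of that progression concrete, here is a minimal sketch in C (an illustration of the general idea, not an example from the book): every other gate can be built from NAND alone, and two derived gates already give you one-bit addition, a half adder.

```c
#include <stdio.h>

/* The one primitive gate we allow ourselves. */
static int nand(int a, int b) { return !(a && b); }

/* Everything else derived purely from NAND. */
static int g_not(int a)        { return nand(a, a); }
static int g_and(int a, int b) { return g_not(nand(a, b)); }
static int g_xor(int a, int b) { return nand(nand(a, nand(a, b)), nand(b, nand(a, b))); }

int main(void) {
    /* Half adder: sum = a XOR b, carry = a AND b. */
    for (int a = 0; a <= 1; a++)
        for (int b = 0; b <= 1; b++)
            printf("%d + %d -> sum=%d carry=%d\n", a, b, g_xor(a, b), g_and(a, b));
    return 0;
}
```

Chain a few of these together with carry handling and you have the arithmetic unit of a CPU; that is essentially the climb the book walks you through.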

That won't tell you anything about the internet or stuff like SATA. I never learned that kind of thing myself, but it seems that the people with that kind of knowledge just pick it up by spending lots of time messing around with computers, because they love it. I don't think there's a faster/less haphazard route to "general computer knowledge" (if there were, I would have tried it), but maybe there is for very specific narrow topics.
posted by Chicken Boolean at 7:55 PM on December 30, 2010


Read Gödel, Escher, Bach. It isn't precisely about computers, but it will help a lot in thinking about them.

But really, there's no way to know everything about computers. You might want to focus on the fundamentals that Chicken Boolean talked about, and then focus on one aspect of computers at a time as it interests you -- for example data modeling, or storage, or encryption and so on.

Asking about 'computers' is like asking to learn about 'math' or 'science'. You kind of have to start with the basics and build from there.

Btw, one unorthodox way to learn how computers work is playing with redstone in Minecraft. You can build any kind of logic gate in Minecraft using only two different objects. I found it very enlightening to play around with.
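Roughly why two objects are enough, sketched in C (this is my informal model of how the pieces behave, not anything official from the game): redstone dust merges signals, which behaves like OR, and a redstone torch inverts, which behaves like NOT. A torch fed by dust therefore gives you NOR, and NOR is functionally complete.

```c
#include <stdio.h>

/* Informal model of the two redstone objects: a torch inverts, dust merges. */
static int torch(int in)      { return !in; }        /* NOT */
static int dust(int a, int b) { return a || b; }     /* OR  */
static int nor(int a, int b)  { return torch(dust(a, b)); }

/* De Morgan: AND(a, b) = NOR(NOT a, NOT b), built from torches and dust alone. */
static int g_and(int a, int b) { return nor(torch(a), torch(b)); }

int main(void) {
    for (int a = 0; a <= 1; a++)
        for (int b = 0; b <= 1; b++)
            printf("AND(%d,%d) = %d\n", a, b, g_and(a, b));
    return 0;
}
```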
posted by empath at 8:08 PM on December 30, 2010


If you want to understand the internet, then almost the first thing you have to understand is the OSI 7-layer model.

The concept behind it (how higher-level objects communicate with each other by using services from lower-level objects) is also important for understanding modern operating systems. But it is essential to understand it conceptually, or you'll never figure out how things like TCP, IP, UDP, HTTP, and FTP all work, and how they interact with each other (if they do).

It's also essential for understanding how you can do HTTP (web browsing) over Ethernet, or over WiFi, or with a telephone modem, or with a USB connection to a cable modem, even though all of those are drastically different from each other.

The concept of layers which provide services to higher layers and use services from lower layers is one of the most important in modern computer science.
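To see what the layering looks like from a program's point of view, here is a minimal C sketch assuming POSIX sockets (example.com is just a stand-in host). The entire HTTP "layer", as far as this program is concerned, is plain text written onto a TCP stream; everything below TCP, be it Ethernet, WiFi or a modem, is invisible to it.

```c
#include <stdio.h>
#include <string.h>
#include <unistd.h>
#include <netdb.h>
#include <sys/socket.h>

int main(void) {
    struct addrinfo hints = {0}, *res;
    hints.ai_socktype = SOCK_STREAM;              /* we want TCP */

    /* Name resolution: a lower layer's problem, wrapped up in one call. */
    if (getaddrinfo("example.com", "80", &hints, &res) != 0) return 1;

    int fd = socket(res->ai_family, res->ai_socktype, res->ai_protocol);
    if (fd < 0 || connect(fd, res->ai_addr, res->ai_addrlen) < 0) return 1;

    /* The whole "HTTP layer", from this program's point of view: */
    const char *req = "GET / HTTP/1.0\r\nHost: example.com\r\n\r\n";
    write(fd, req, strlen(req));

    char buf[4096];
    ssize_t n;
    while ((n = read(fd, buf, sizeof buf)) > 0)   /* TCP hands back a byte stream */
        fwrite(buf, 1, (size_t)n, stdout);

    close(fd);
    freeaddrinfo(res);
    return 0;
}
```

Error handling is stripped to the bone; the point is only that each layer asks the one below for a service (HTTP asks TCP for a reliable byte stream, TCP asks IP to deliver packets) and never looks further down.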
posted by Chocolate Pickle at 8:26 PM on December 30, 2010


Response by poster: Thanks for the responses so far! I'll check out all the links/books.

Quick question: where do electronic appliances/gadgets fit into all of this? (I'm thinking along the lines of mobile phones/iPods/etc.) Are they the same at the lower levels and then diverge, or are they something completely different?
posted by oracle bone at 8:35 PM on December 30, 2010


I'm not quite sure how to answer that. They're going to be different at pretty much every level (assuming different hardware, OS, applications, etc.).

However, computation is computation, and anything which is computable on any computer can always be translated into something which will run on a theoretical device called a Turing machine (assuming an arbitrarily large amount of time and tape is available), so in that sense, they are all the same.
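If you have never seen one: a Turing machine is just a tape, a read/write head, and a table of rules. Here is a toy one in C (a deliberately trivial machine, made up for illustration) that flips the bits of its input, using the same (state, symbol) -> (write, move, next state) scaffolding that any Turing machine runs on.

```c
#include <stdio.h>

/* One rule per (state, symbol): what to write, where to move, next state. */
typedef struct { char write; int move; int next; } Rule;

int main(void) {
    /* A one-state machine: flip each bit, move right, halt at the first blank. */
    Rule table[1][3] = {
        /* read '0'       read '1'        read blank   */
        { {'1', +1, 0},   {'0', +1, 0},   {0, 0, -1} },
    };

    char tape[64] = "10110";    /* the input, written on the tape */
    int head = 0, state = 0;

    while (state != -1) {       /* next state of -1 means halt */
        int sym = (tape[head] == '0') ? 0 : (tape[head] == '1') ? 1 : 2;
        Rule r = table[state][sym];
        if (r.write) tape[head] = r.write;
        head += r.move;
        state = r.next;
    }
    printf("%s\n", tape);       /* prints 01001 */
    return 0;
}
```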
posted by empath at 8:41 PM on December 30, 2010


They've made computers/electronics/internet stuff so user-friendly these days that I find that even though I know how to operate these devices, I don't actually know what's going on. And I would like to.

I've been programming computers for money for thirty years, and I don't actually know what's going on either. In my estimation, it was no longer possible to know what was going on by about 1995.

Object oriented programming was widely touted as The Solution to the problem of making the cobbling-together of software more like building hardware and less like handmaking leather shoes. What it's actually done instead is render the entire process of programming a computer so abstract that no human being can get his head around what's actually going on in there any more. Which means that every major project now in production is roughly 5% structural materials and 95% glue and recycled leftovers, and requires roughly 10x the hardware to get the same performance as comparable code from five years ago. And it's still no easier to walk into a major project and get up to speed with the architecture than it ever was.

OO has not stopped programmers from re-inventing the wheel; it's just made it possible for them to continually re-invent the car instead. Web Services are a case in point.

Used to be that we had this thing called TCP, and when one computer wanted to talk to another over the Internet it would open a TCP connection to it and fire away. There was also a thing called RPC, which ran on top of UDP, and allowed one computer to run code remotely on another. There was a standard called XDR that specified how to package data going over the wire with RPC, so that both ends would understand it the same way. It worked pretty well, and it was reasonably efficient; if I wanted to send an array of 32-bit quantities from here to there, each of them would end up occupying 32 bits on the wire. As a netadmin, I could choose which TCP-based services were allowed into my building with fairly simple firewall rules. Life was tolerable.
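For the record, the packing part looked roughly like this in C (a simplified sketch of the idea, not actual Sun RPC code): XDR specifies big-endian 32-bit fields, so putting an array on the wire is little more than one htonl() per element, four bytes each, no parser required.

```c
#include <stdio.h>
#include <string.h>
#include <stdint.h>
#include <arpa/inet.h>   /* htonl(): host byte order to big-endian network order */

int main(void) {
    uint32_t values[3] = { 1, 2, 70000 };
    unsigned char wire[sizeof values];

    for (size_t i = 0; i < 3; i++) {
        uint32_t be = htonl(values[i]);   /* XDR mandates big-endian 32-bit fields */
        memcpy(wire + 4 * i, &be, 4);
    }
    printf("%zu bytes on the wire\n", sizeof wire);   /* exactly 12 */
    return 0;
}
```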

But none of that, apparently, is modern and newfangled enough for the rising generation, so now we have all this stuff like SOAP that runs on top of HTTP (which itself runs on top of TCP, all of it on port 80, so good luck with your firewall filtering). Efficiently packed binary data and pre-agreed formats generally are Out; XML and self-describing data that requires both a parser and pre-agreed understandings about the parser's outputs are In. Fuck knows what any of it is doing under the hood, or where to start looking when something breaks, or how much bandwidth the data consumes on the wire; the point is you're no longer supposed to care what goes on under the hood. You bought your engine from somebody you were assured was a reputable vendor, and that's supposed to be enough.
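For contrast, here are the same three integers in self-describing XML style (the element names are invented for illustration): the tag names repeat for every value, the twelve bytes of payload swell to five times the size, and the receiving end needs a parser instead of a fixed layout.

```c
#include <stdio.h>
#include <string.h>

int main(void) {
    const char *xml =
        "<array>"
          "<item>1</item>"
          "<item>2</item>"
          "<item>70000</item>"
        "</array>";
    printf("%zu bytes on the wire\n", strlen(xml));   /* 61 bytes, vs 12 packed */
    return 0;
}
```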

Nobody except possibly kernel hackers writes code any more. Everybody just cuts and pastes snippets out of a heaving formless mass of API documentation and slaps them together in various combinations until they appear to be getting the job done. It's mayhem out there.

Young 'uns will tell you that grizzled old neckbeards like me are merely curmudgeonly, worn-out relics who are intimidated by the pace of change and simply have no idea just how cool the new stuff is. In fact if I'm curmudgeonly it's only because none of those little fuckers on my lawn seem to have any clue at all just how cool the old stuff was, or how learning and applying the concepts that underlie effective application of the old stuff would reveal so much of the new stuff to be the extraneous cruft it surely is.

The only way left for tech newbies these days to gain anything approaching the level of detailed understanding that the Apple II gave my generation is by playing with comparably simple and limited architectures. The only place to find that stuff now is in 8-bit microcontrollers.

Go grab an Arduino development kit, and learn to make it do something useful. Once you grok microcontrollers well enough to be able to explain exactly why the Atmel chips underneath Arduino are better designs than the PIC family (which will take you roughly five years) you can move on to something else. But you will never, never, never get a handle on, or any kind of overview of, programmable technology generally. Nobody has that any more. Fuck knows where it's all taking us. I wore an onion on my belt. It was the fashion in those days.
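If you do grab one, the traditional first exercise looks like this: the canonical blink program, written in Arduino's C-style dialect (pinMode, digitalWrite and delay are real Arduino calls; pin 13 drives the on-board LED on the classic boards).

```c
const int LED_PIN = 13;            /* on-board LED on the original boards */

void setup(void) {
    pinMode(LED_PIN, OUTPUT);      /* drive the pin rather than read it */
}

void loop(void) {                  /* runs forever once setup() is done */
    digitalWrite(LED_PIN, HIGH);   /* LED on */
    delay(500);                    /* wait half a second */
    digitalWrite(LED_PIN, LOW);    /* LED off */
    delay(500);
}
```

Once blinking feels obvious, wire the pin to something else and make the program react to an input; that is most of embedded programming in miniature.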
posted by flabdablet at 2:53 AM on December 31, 2010 [3 favorites]


I strongly recommend Charles Petzold's Code: The Hidden Language of Computer Hardware and Software. Though it's a little outdated (it was written in 2000), almost everything is still relevant. I've got a Computer Engineering degree and it still cleared up a lot of topics and taught me new things.
posted by Tu13es at 4:59 AM on December 31, 2010


Yeah, Petzold is one of the good guys. No bullshit in that book at all, IIRC.
posted by flabdablet at 5:39 AM on December 31, 2010

