Explain RAM to me like I'm a small child
September 16, 2019 3:16 AM

I am a lit person, working on a piece of writing where RAM (random-access memory) crops up, as a metaphor. Help me grasp what RAM actually is, its core functions and properties, in very basic terms?

I've Googled, but can't find any resources which are actually intelligible to me. I don't need to know everything, just the principles at a basic level. I'm more interested in the theory of RAM rather than e.g. what materials are needed to make it, technical specs, whatever.

Hoping MeFites can break this down for me, and/or recommend intro-level resources online for this.

Thank you in advance!
posted by thetarium to Computers & Internet (25 answers total) 8 users marked this as a favorite
 
Probably the oldest explanation of RAM:

'Tom Kilburn and I knew nothing about computers, but a lot about circuits. Professor Newman and Mr A.M. Turing ... knew a lot about computers and substantially nothing about electronics. [This is not entirely fair to Turing. BJC] They took us by the hand and explained how numbers could live in houses with addresses and how if they did they could be kept track of during a calculation.' (Williams, F.C. 'Early Computers at Manchester University' The Radio and Electronic Engineer, 45 (1975): 327-331, p. 328.) source

The first one I read was in a 1980s Osborne computer book, which is now available for free as a PDF - it's on pages 8 and 9. The book describes obsolete 1980s home computers but the principles are the same. I still think of memory as mailroom pigeonholes.
posted by BinaryApe at 3:53 AM on September 16 [5 favorites]


People who use RAM as metaphor in writing often use it quite vaguely as a term for "memory", without much regard to the specifics of real RAM in modern computers. The author likely just meant it as synonymous with "memory" or "working memory".

The Osborne book linked above is a pretty decent summary of how RAM works. You have a billion tiny boxes, each of which is numbered. The computer can store or fetch one number in each box. The RAM controller accepts commands like "store 123 in box 987" or "read from box 987".
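That "numbered boxes" model is simple enough to sketch in a few lines of Python. This is a toy illustration only -- real RAM is hardware, not a Python list -- but the store/read interface is exactly the one described above:

```python
# Toy model of RAM as numbered boxes: store a number in a box, read it back.
# (Illustration only -- real RAM is electronic circuitry, not a list.)

class ToyRAM:
    def __init__(self, size):
        self.boxes = [0] * size  # every box starts out holding zero

    def store(self, address, value):
        """'store value in box address'"""
        self.boxes[address] = value

    def read(self, address):
        """'read from box address'"""
        return self.boxes[address]

ram = ToyRAM(1024)
ram.store(987, 123)      # "store 123 in box 987"
print(ram.read(987))     # "read from box 987" -> 123
```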
posted by richb at 3:59 AM on September 16


You can think of RAM as your computer's "working memory," where it stores information that it's about to use.

Long-term non-volatile (meaning, the data remains even if you turn off the power) data storage tends to be slow in computer terms. Historically, hard disks were used. A hard disk is a bunch of spinning metal platters; data is encoded on them with tiny electromagnets and read back with magnetic sensors. To read a piece of data, your disk needs to physically move a "read head" over the correct physical location, and wait for the disk to spin around under it. All this happens very fast (on the order of milliseconds in a modern hard disk), but that's still much slower than the Central Processing Unit (CPU, the part that actually does the computation) runs. Newer types of nonvolatile storage are much faster, but the same general characteristics hold -- big storage size, nonvolatile, relatively slow access times.

The CPU can store relatively few pieces of information at any given moment, compared to "all the information in your computer" -- it mainly acts on instructions. (Yes, I'm glossing over cache here, but thetarium asked for high-level!) Before it can do any computation, it has to load the information it needs. If you had to wait for a disk read every time your computer needed a piece of data, your computer could never be faster than your disk / the data connection ("bus") between your disk and the CPU. Thus, Random Access Memory.

RAM's main characteristic is that it's fast to read from and write to. In exchange for speed, you give up non-volatility -- data in RAM persists only as long as power is available. When you open, say, a Word document, logic in your operating system makes a copy of the document (and of Word!) in your system's RAM. From there, the computer can quickly read the bits it needs, modify them, and record them back into RAM. This all happens as you're typing in Word -- the document representation is hanging out in RAM. Incidentally, this is why you can lose unsaved changes if your computer loses power before you hit 'save!' Remember, RAM is volatile -- the contents go away if not actively maintained. When you use the 'save' command, the representation of the document in memory/RAM gets copied to a file on your disk drive, so you can come back to it later!

Your CPU actually has several layers of smaller/faster memory onboard, too (the cache I alluded to up there) -- a whole subfield of computer science has to do with managing where to store data such that it's available as quickly as possible, so that your computer has the next bit of information it needs ready to go and doesn't have to wait on slower memory any more than absolutely necessary!
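The open/edit/save flow described above is easy to sketch in Python: edits happen to an in-memory value, and nothing survives a power loss until an explicit save copies it to disk. The filename here is made up for the example:

```python
# Sketch of the open/edit/save flow: the working copy lives in a Python
# string (standing in for RAM), and nothing reaches disk until you save.

document = "Hello"          # working copy in RAM (volatile)
document += ", world"       # edits happen in memory as you type

# If the power died right here, 'document' would simply be gone.

with open("draft.txt", "w") as f:   # 'save' copies RAM -> disk
    f.write(document)

with open("draft.txt") as f:        # the saved copy survives a restart
    assert f.read() == "Hello, world"
```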
posted by Alterscape at 4:03 AM on September 16 [11 favorites]


One distinguishing feature of RAM is that it is ephemeral -- it is all lost when the computer is turned off. This is distinct from the "hard disk" in a computer, which retains info when powered off.

So when RAM is used as a metaphor for memory, it is usually alluding to short term or "working" memory, rather than long term memory of past events.

(This is no longer always true in the real world, and there are complicating factors like machines which use batteries to keep RAM alive when the device is apparently off, such as the Nintendo Switch, but it's usually what people are thinking of in a literary context.)
posted by richb at 4:04 AM on September 16 [2 favorites]


In general and with some exceptions:

A computer's processor needs to work with data to do its job. There are a number of places that data can be stored, each of which represents a particular trade-off between speed, cost and capacity. RAM is one such place.

The fastest storage is in registers -- data the CPU is actively using right now. These are built in to the CPU and usually hold from a few to a few dozen values. Accessing data in registers is as fast as it gets, but because you only get a handful of them, you have to pick and choose what you keep there. Moving things into and out of registers takes explicit instructions.

Next, there's cache. This is very fast, very expensive special-purpose memory, either built in to the CPU chip or connected directly to it. It holds a few hundred to a few hundred thousand values, typically the most recently used stuff not in registers. Cache is (from the perspective of the program running on the CPU) transparent -- you ask for a value, and the memory circuits check the cache first. If what you want isn't there, the memory circuits fetch it from somewhere slower and further away (and also store it in cache for next time -- after making space by discarding the least-recently-used stuff in cache). Some modern CPUs have multiple layers of cache.
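That check-the-cache-first, evict-the-least-recently-used behavior can be sketched in a few lines of Python. This is a toy model, not how hardware cache actually works, but the logic is the same:

```python
# Toy cache in front of slower storage: hits come straight from the cache;
# misses go to the slow read, and the least-recently-used entry is evicted
# when the cache is full.

from collections import OrderedDict

def make_cached_reader(slow_read, capacity=4):
    cache = OrderedDict()

    def read(address):
        if address in cache:
            cache.move_to_end(address)      # mark as most recently used
            return cache[address]
        value = slow_read(address)          # miss: fetch from slower memory
        cache[address] = value              # store it for next time
        if len(cache) > capacity:
            cache.popitem(last=False)       # discard least-recently-used
        return value

    return read

backing = {i: i * 10 for i in range(100)}   # stand-in for slower RAM/disk
read = make_cached_reader(backing.__getitem__)
print(read(7))   # -> 70 (miss: fetched from backing, then cached)
print(read(7))   # -> 70 (hit: straight from the cache)
```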

After that is RAM (random-access memory). This is typically on separate chips, on the same board as the CPU. It's much slower than cache, but still very fast (and hugely faster than disk). It's relatively inexpensive, and holds millions or billions of values. It's called "random access" because you can read or write any part of it in any order (as opposed to something like a tape, where you have to start at the beginning). RAM (like registers and cache) loses all its contents when you turn power off. On a modern desktop computer, it's not unusual for a running application and all the data it is currently working with to fit entirely in RAM.

Beyond RAM is local persistent storage (disk or flash). This is slow and cheap -- and keeps its contents even when you turn the computer off. It might hold billions or trillions of values.

Networked computers may have layers beyond that: SANs, network file systems, "the cloud". And there's always offline storage (a box of tapes in your garage).
posted by sourcequench at 4:04 AM on September 16 [3 favorites]


RAM is a wall of cubbyholes where the computer stores its information temporarily pending a direction to file a specified collection of it in the filing drawers across the room. The information stays in the cubbyholes until it is filed, if it is. If the computer is powered down (or crashes), the information in the cubbyholes is gone and they need to be refilled. Only the stuff in the file drawers remains.
posted by megatherium at 4:41 AM on September 16 [4 favorites]


Here's an explanation using a library metaphor.

You're in a big library (maybe a university, maybe a big city library). You're working on a project, and you need references. You begin by looking for a few particular books that you're pretty sure will be useful. Other books near those titles look like they have potential, so you grab them too, to save time so you don't have to go all the way back to this same shelf later. (It's a big library, after all.)

It takes you a while, but you end up with a few armfuls of books from the stacks, which you take over to a table and spread out. You look through the first one, maybe taking notes, and that process leads you to another book on the table, and so on. Eventually, a few titles get used, and some end up not being needed at all, but you're glad you had them handy just in case.

You leave the library, leaving your stacks of books on the table, to be reshelved by the librarians later.

data/program instructions = all the books in the library
hard drive/permanent storage = the stacks
RAM = the table
contents of RAM = the armful of books you brought over to the table
RAM's volatility = the librarians reshelving the books you left on the table

I used a variation on this explanation just last week in an intro to computing course. It didn't go over as well as I had hoped. I suspect 18-year-olds just don't have the experience with physical libraries that us oldsters do...
posted by SuperSquirrel at 4:42 AM on September 16 [11 favorites]


I think of it like this. You know how when you're working on a project, you pull out the files, read them over, and remind yourself of all the related information? The amount you can mentally juggle is RAM.

You have other memories stored -- but while you're working on that project, you're not remembering all of the random experiences you had with your high school friends, for example.
posted by slidell at 4:47 AM on September 16


Can you quote an example of the metaphor? It might help to see which aspect of RAM is being used.
posted by pharm at 5:35 AM on September 16 [2 favorites]


The name Random Access Memory was used to distinguish it from tape memory. Reading and, especially, writing on tape is essentially an end-to-end process. To find something in the middle, you have to start at the beginning and read all the way through to the data you want. (It was actually more sophisticated in its best versions, but that is not relevant to the nomenclature.)

Random access, beginning with disk drives, lets you get at data anywhere in the middle without going back to the beginning.

RAM, as the term is used today, is a pretty good metaphor for memory in the brain, including that the info is lost when the computer is turned off, or when the organism dies.
posted by SemiSalt at 5:42 AM on September 16 [2 favorites]


If you want an even cruder analogy: RAM is like your hands in that you can only hold one beer at a time in each hand to drink from. Storage is like pockets: with the right pants and hoodie on, you can carry like eight other beers -- with the caveat that you can't drink from them while they're in your pocket. If you put down a beer you're drinking then you can reach into a pocket and open a different one (and tuck away the empty for recycling later).

(Solid State Disks blow up this analogy unless you suggest they are like that hat which holds two beers with straws in them, which just reminds me that all analogies are imperfect.)
posted by wenestvedt at 6:04 AM on September 16 [2 favorites]


A few practical facts about RAM:
  • Every program on your computer uses RAM basically all the time. For example, when you go to the national geographic site in your browser, all those gorgeous pictures and videos on the website are stored in RAM.
  • If you close a program, it will stop using its RAM and put it back into the pool so that other programs can use it instead.
  • RAM is a finite resource (you have a fixed amount, like 8GB), and if it all gets used by programs then your computer will get REALLY slow. Because of this, computer power users often try to get a lot of RAM, enough so that their programs will never run out.
  • The reason your computer will get REALLY slow if all its RAM is in use is that it needs to store the data it's working with SOMEWHERE, and if it can't store it in RAM it will often start storing the data on the hard drive. Accessing data on the hard drive is like 4x-1000x slower (depending on the access patterns and the kind of drive), so depending on how much data your program is accessing it can literally run 100x slower if it's not able to use RAM. But even running 5x slower is very bad.
  • Web browsers like Chrome or Firefox often use a LOT of RAM, right this second mine is using 4GB (which is the same as storing like 8 full length videos or all of the photos I have on my computer right now). I believe this is basically because webpages are complicated and every single detail of every webpage you have open needs to be stored in RAM in case you want to look at it. Browsers are kind of famous for using a lot of RAM.
  • Computer games are another kind of program that typically need a lot of RAM, because they have a lot of images and video that they need to show to you. Also image/video editors need a lot of RAM.
historical notes:
  • The amount of RAM that computers have has grown a lot: My first computer in 2002 or something had 64MB of RAM, my laptop from 2013 has 8GB of RAM. That's 128 times more. Growth has sort of stalled in the last 10 years though, and getting significantly more than 8GB of RAM still isn't that cheap.
  • Large amounts of RAM are fairly expensive, like this search on newegg.com shows that buying 256 GB of RAM will run you about $1,200.

posted by oranger at 6:13 AM on September 16


There is Read-Only and Read-Write. Read-Only is fixed, you can't change it. Read-Write is mutable, you can change the value at a location.

Sequential is like a tape. You have to start at the beginning and go forward until you get to the part you want. This is still true for hard disks and even USB/Flash memory to some extent.

Memory is an address that holds a value. There are many metaphors that apply. I like the numbered pigeon hole. Every hole has a number. They go from 0 to BIGNUMBER. Every hole holds a value (usually a byte).

With Streaming, you have to rewind and start at 0 and go forward until you get to the right hole, then you can read the value in that hole. If it's Read-Only, you can't put a new value in that hole, the value never changes. If it's Read-Write, you can read the value and if you want, put a new value in the hole. Being Streaming or Sequential, the next numbered hole is easily available, you can move forward and read or read/write easily. But if you need a different hole, you have to go back and start again from the beginning.

The random part is that you can go directly to any hole immediately. You can go from 0 to 1000 to 50 to 3456 to 12 to etc. without starting over.

The problem arises when common terminology isn't really specific.

Read Only Memory (ROM) is still random access memory. It's really Read-Only random access memory.

Random Access Memory (RAM) is still random access memory. It's really Read-Write random access memory.

Disks and Tapes and Physical media of many sorts aren't really random access. They are indexed random access: what you really do is read some chunk of values (512 or 1024 or 4096 holes) at once into RAM, change the values in RAM, and write the chunk back out (writing all of the values you read and maybe changed in one big Sequential write).

Sequential (Streaming) just means you have to find a hole (starting at the beginning probably) and can only access that hole and the one after it, and the one after that one, and the one after that second one, etc.

Random is when you can get to any hole in any order.

Whether you can only read the value or can read and write the value is a different thing. Whether you have to read/write chunks of values is a different thing. Whether the Memory is persistent without constant power is a different thing.

In general, RAM is random access non-persistent, individual Read-Write hole accessible memory.
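The sequential-versus-random distinction above can be sketched in Python, using a plain list as the "holes" (purely illustrative -- real tape and RAM are hardware):

```python
# Sequential access walks forward from the start, one hole at a time;
# random access jumps directly to any hole.

holes = list(range(10_000))

def sequential_read(target):
    position = 0
    while position != target:   # "rewind and go forward" from hole 0
        position += 1
    return holes[position]

def random_read(target):
    return holes[target]        # jump straight to any hole, in any order

# Both return the same value; the sequential version just took 9,999
# steps to get there.
assert sequential_read(9_999) == random_read(9_999) == 9_999
```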
posted by zengargoyle at 7:02 AM on September 16


"It takes you a while, but you end up with a few armfuls of books from the stacks, which you take over to a table and spread out."

This is basically the same analogy I've used when trying to describe it to people. Books are closed and shelved (stored on a hard disk drive). Books closed and shelved do you no good, so you have to bring some to a table and open them up. The only thing I add to this is that there's a finite amount of space on the surface of this table (ie how much RAM is in the system) and sometimes you can't fit another book on there, so you have to put one of them up in order to open another.

Tiny table means time spent swapping out information in order to work on it. Larger table means ability to track more things at once.

Why not spend the money on tables instead of shelves? Historically shelves are, for whatever reason, far cheaper than tables. Hard disk space is cheaper than RAM. And besides, why have a table the size of a warehouse and all of your books open on it if your reading speed (aka the CPU) can't process all of them at once?

plus when you turn the lights off the cleaning staff takes everything off the table and re-shelves it so having one big table and no shelves doesn't work.
posted by komara at 7:45 AM on September 16 [2 favorites]


I notice that the simplest form of RAM hasn't yet been discussed here, so let's talk about what RAM is: bit storage.

Let's say we want to store the result of a coin flip. We have two states. Heads and tails. We can represent this in electronics with either the presence or absence of a voltage on a pin. This state of a voltage being on or off we call a bit.

There are circuits that will persist in one state or the other. Technically they're called bistable multivibrators. We can thus have a circuit that "remembers" whether it's on or off, and stays the way we put it, until we change it on purpose. These circuits are commonly called flip-flops or latches.

Flip-flops offer electronic designers a simple way to store single bits of information. By grouping them, one can store larger values such as nibbles (4 bits) or bytes (8 bits). These can be used for simple counters or registers in electronic projects.

Nowadays, the use of such discrete logic components has fallen into obsolescence, as microcontroller designs are simpler and far more powerful. Still, the venerable 7474 dual-D flip-flop is a useful chip for many designs. I recently designed a model of the TARDIS control console that charges phones, and cascading 7474 flip-flops were used to blink LEDs as part of the design. I could have used a microcontroller to do the same thing, but designing the old-fashioned way was a fun learning experience.

RAM chips are really just a scaled-up version of bit storage, where one can supply an "address" to access the byte of data stored at that address. There are different types of RAM, with wildly differing access speeds and refresh requirements, but they all work on the same basic principle of storing the states of bits for later access.
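The "grouped flip-flops with an address" idea can be sketched in Python: a toy model where each address selects eight stored bits that together make one byte (illustrative only, of course -- real chips do this with transistors, not lists):

```python
# Toy byte-addressable memory: each address holds eight bits (stand-ins
# for eight flip-flops), which together store one byte.

memory_bits = [[0] * 8 for _ in range(16)]   # 16 addresses, 8 bits each

def write_byte(address, value):
    for i in range(8):
        memory_bits[address][i] = (value >> i) & 1   # set each "flip-flop"

def read_byte(address):
    return sum(bit << i for i, bit in enumerate(memory_bits[address]))

write_byte(3, 0b10110010)
assert read_byte(3) == 0b10110010   # the bits persist until rewritten
```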
posted by sydnius at 7:48 AM on September 16 [1 favorite]


Hey, ya nerds. Anyone seen the OP lately? Or are we all just talking to each other now? :7)
posted by wenestvedt at 8:26 AM on September 16 [3 favorites]


Here's an analogy that I've used in classes that has worked well to illustrate this at a gut / intuitive level. The library metaphor is technically more accurate, but I think a bit more confusing; cooking metaphors are helpful because we innately understand them to be about sequences of processes, etc.

--

You're a sandwich making wizard.

Imagine you're making a lot of PB&J sandwiches in your kitchen. You take stuff out of your pantry or your fridge, and then you dump them on your kitchen counter. You arrange the slices of bread, put peanut butter on one slice, jelly on another slice, put them together, and set them aside.

Imagine that you have very little counter space -- for example, you have a small kitchen counter in a cramped apartment, or you're out camping and don't have a real kitchen. You can only spread out a few slices of bread at the same time. Even if you're the world's fastest PB&J maker, being cramped actually makes you slow.

BUT imagine that you now work in a giant industrial kitchen, with a huge long gleaming clean counter! Then you could lay out dozens of slices of bread, spread peanut butter and jelly much more easily, and generally assemble things really rapidly. Even if your speed is the same, having a large counter to lay things out and spread things around is really helpful!

You are the CPU, handling the processing of data. Your pantry/fridge is your hard drive, where information sits in long-term storage. Your countertop size is how much RAM you have -- a temporary working area where you can place and retrieve the information being handled by the CPU.

To extend the metaphor further: if you're making several types of sandwiches at once -- PB&Js, BLTs, and egg salad sandwiches, then the large countertop becomes more crucial, because there are multiple different types of things you need to have spread out. It would be so inefficient to grab things from the fridge/pantry each time you need them, even if you're fast at making sandwiches. This is why RAM is important over CPU speed, sometimes, especially when dealing with running many applications at once.

Like others have mentioned, RAM is high-speed but only exists while the system has power -- so, imagine that the pantry is huge and slow to navigate, while picking things off the kitchen countertop is really fast since it's all right there. However, the kitchen countertop is cleaned off to a blank slate every night -- it's not long term storage.
posted by many more sunsets at 9:39 AM on September 16 [2 favorites]


It really depends on what aspect of RAM is being emphasized for the purposes of the metaphor, because RAM has a few different properties that could be relevant. It has less capacity than other kinds of storage – but it's also faster and more immediate. It's also transient and impermanent (because it's wiped clean every time you restart your computer).

Consider a backup. (In the olden days, backups took the form of physical media – e.g., DVD-RW or floppy disks. These days, backups are more likely to live "in the cloud", in some data center somewhere. Either way, the principle is the same.)

A backup is like the shelf in your basement where you keep rarely-used kitchen gadgets: the ice-cream maker, that big serving platter that you only use for holidays, etc. If you find yourself needing the ice-cream maker, then it's going to take a while to go down to the basement, find it, dust it off, and haul it back upstairs – but you know that it's there if you need it. And there's plenty of room in the basement for storing that kind of thing.

Your hard drive is like your kitchen cabinets. That's where you keep the stuff that you use at least semi-regularly. It's quicker and more convenient to grab things out of the cabinets – but they're more limited, space-wise, than the basement.

RAM is like your kitchen counter. It's where you put stuff that you're actively using at that moment, or which you're likely to use in the near future. It's even quicker and more convenient to grab something from the counter – but it's even smaller than the cabinets.

But the counter (and RAM) aren't really "storage" – they're more like temporary workspaces. When you're done making that carbonara, you're going to clean off the counters, and put all the stuff back in the cabinets (or the basement).
posted by escape from the potato planet at 9:43 AM on September 16 [2 favorites]


Oh, man, too complicated.

A computer's RAM is like your desk (or like escape's kitchen counter, above). If you have a big desk, you can work on more things at once and easily see everything you're working on. If you have a little desk, you can only handle one or two things at once.

If a computer has very little RAM, it can only do its work slowly because it can't easily assemble all the stuff it needs to do the work.

So if somebody says, "I just don't have the RAM to handle that new project right now," it means their mind (or their desk) is full of other stuff they're working on, so that trying to think about any new stuff (or metaphorically spread it out on their desk) for a new project will just cause confusion, so no, they can't handle it.
posted by JimN2TAW at 1:49 PM on September 16 [1 favorite]


Another thing that might help: RAM is smaller than your hard disk partly because it’s more expensive per unit, but a smaller space is also easier for the computer to describe and navigate. Imagine finding a building on a single block by street number, instead of looking up a building somewhere on earth by country, state/province, city, postal code, street address, and apartment number. For your hard disk, access is slow anyway, so it’s OK that the computer has to consult a map (the file system’s index) to find the part you want. RAM, which has to be fast, keeps a little table of everything it’s stuck there right now, and can use a much faster addressing scheme to go get it (like grabbing something from a cubby vs. driving across town).
posted by Alterscape at 2:26 PM on September 16


It might be interesting to consider magnetic core memory. It was the technology for RAM before semiconductors: essentially tiny rings of magnetizable material that can be flipped to one of two states, and that state can then be sensed. You may find that it's easier to picture what RAM actually does if you think of core memory.
posted by chrchr at 3:01 PM on September 16


Imagine you have a desk, and next to the desk is a bookshelf filled with books of all different sizes. You take books off the shelf and open them up on the desk one by one. Sooner or later, you run out of room on the desk. You have to either put some books away to make room, or you have to get a bigger desk.

Bookshelf: your computer's hard drive
Desk: RAM
posted by emelenjr at 4:02 PM on September 16 [1 favorite]


To keep this simple, I think the property you need to focus on is the 'random' part of RAM.

Think of a typical kitchen in which you need to store items: pots and pans, crockery, cutlery and glassware. Most people will store things organised into different cupboards, the cutlery in one drawer, all the glassware together in another cupboard etc, the one disadvantage of this is that sometimes you have to move things around to find a space - take one pot out so you can then squeeze a smaller pot in next to it.
This is very much not the way computers store things.

The computer's way of storing items would be to have hundreds of tiny thimble-sized cupboards lining all of the walls all the way up to the ceiling, then all the kitchen items would be chopped up into small pieces - all the same size, so that any piece can fit into any cupboard.

So now to put away that red casserole dish, it gets chopped into 175 different pieces and stored away into 175 randomly chosen little cupboards dotted around the kitchen. If you want to get it out again, they all get retrieved, in order and put back together again.

Now in the real world this would be madness - you would be spending hours sawing things apart and then gluing them back together again - but in a virtual world digital files can be split apart and put together in milliseconds, so that's not a problem. Also, remembering where all the different bits are stored would be hard for a human, but is easy for a computer.

The reason computers do this is for speed: if it had to store that casserole dish in one piece, it would first have to move other files, perhaps the files for a frying pan, but then to move the frying pan files it would also have to move the files for all the glassware, and so on. Without Random Access Memory, saving one thing might take several minutes while everything else was moved about to make the right space.
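The chopping-into-equal-pieces idea can be sketched in Python: split data into fixed-size blocks, drop each block into whatever slot happens to be free, and keep an index so it can be reassembled in order. (A toy model only -- real file systems and memory allocators are far more elaborate.)

```python
# Toy block storage: data is chopped into same-size blocks, scattered
# across whatever slots are free, and reassembled via an index.

import random

BLOCK_SIZE = 4
storage = {}                          # slot number -> block of data
free_slots = list(range(100))
random.shuffle(free_slots)            # slots are handed out in no particular order

def store(data):
    blocks = [data[i:i + BLOCK_SIZE] for i in range(0, len(data), BLOCK_SIZE)]
    index = []
    for block in blocks:
        slot = free_slots.pop()       # any free slot will do
        storage[slot] = block
        index.append(slot)
    return index                      # the record of where the pieces went

def retrieve(index):
    # fetch the blocks in order and "glue" them back together
    return "".join(storage[slot] for slot in index)

where = store("red casserole dish")
assert retrieve(where) == "red casserole dish"
```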
posted by Lanark at 4:28 PM on September 16 [1 favorite]


Thank you for all the answers! I'm grasping my way closer to getting it, I think. Not had a chance to read and re-read, and re-read all the responses yet, but they are definitely helping! Also, hi wenestvedt :)

To answer pharm: the metaphor relates to a historical person (X) (or rather their identity) and the way in which modern people bring X back to life, use X's identity for their own purposes, specifically online and on social media.

People claim their allegiance to X, especially pretending to be X on social media - X does not belong (only) in linear, teleological history. Here, the ordered, linear version of history is akin to sequential memory - there is a beginning, middle and an end. Periodization (sequentiality) matters. And historical person X exists in this plane of course. But when modern people claim X in the now, and as part of their own identity, they are accessing a kind of read-write memory, one which is non-sequential. And an iteration of X conjured/embodied by modern person Y (etc) is ephemeral, is not stored forever as part of legitimized historical record.

X has become a multiple-use name (and identity) of a bunch of people at certain times - think "I'm Spartacus". And the reference to X, the re-writing of their history occurs, essentially, in RAM as this is where "disorganization" is powerful, fundamental. So to some extent, X persists in read-write RAM unless and until the (cultural, etc) machine shuts down - or until that machine decides to shunt that data over to "permanent" storage, the disk.

Does any of the above actually make sense, in terms of RAM functionality? After reading the answers, I think that this metaphor does work, at a core level. But again, perhaps I'm missing something due to my lack of tech expertise here.
posted by thetarium at 7:10 AM on September 18


Almost as if the version of X being claimed is one maintained by something more akin to an oral culture, where the knowledge is kept alive by repeated retelling (and re-contextualising), whereas the ur-X is to be found in history books where they are pinned down and defined once and for all?

Yes, I think the metaphor works, inasmuch as any metaphor works - the obvious “the map is not the territory” can always be applied.

(In fact RAM actually has to be read & then written back on a regular basis otherwise it decays - just as an oral culture must re-tell its stories lest they be forgotten. That might be a technical detail that was beyond the original author of the text but it’s a nice touch nonetheless.)
posted by pharm at 7:25 AM on September 18

