What causes electronics to stop working when exposed to static electricity or water? (with misc electronics questions)
April 3, 2009 9:59 PM

What causes electronics to stop working when exposed to static electricity or water? (with misc electronics questions)

I'm looking for an informative layman's explanation of the above question.

Also, I'm really interested in soldering electronics as a hobby. I've had many false starts with this, and I think it's because I always end up reading reference or technical books. I think it was MAKE that had videos about the history of the LED and the resistor, which I really enjoyed, but I haven't found anything similar since.

My main goal is the ability to identify the function of common parts on a PCB. I think that will pique my curiosity and I'll naturally dig deeper.

I'm also confused about basic electronics vs processing chips. Are all processors created for a specialized purpose? Could you take a sound processor or a GPU out of a computer and then re-purpose it into your hobbyist work? Can you interface with any processor through software easily? Is it even worth doing this if you don't know any programming languages? It seems like the more interesting aspects of electronics beyond simple open/closed circuit motor kits require programming. I'm not against learning a programming language; in fact, I'd love an excuse to, but I just want to understand what being an electronics hobbyist means.

I learn new things fast, but it requires that I find information in this counterintuitive format. If you have any websites or books you think would help, I would really appreciate it.

Thanks!
posted by laptolain to Science & Nature (17 answers total) 8 users marked this as a favorite
 
What causes electronics to stop working when exposed to static electricity or water?

In short, the water contains dissolved minerals that conduct electricity, leading to short circuits. In theory, if you used absolutely pure water, you could run electronics underwater. However, in practice, getting absolutely pure water is nearly impossible. Even tiny bits of the metals and plastics leaching off the electronics themselves can give you enough ions in the system to conduct electricity.
posted by chrisamiller at 10:06 PM on April 3, 2009


Response by poster: Thanks for the answer. What makes a short circuit cause irreversible damage (assuming it is irreversible)? Is the damage to a specific part of the component that could be replaced or is it PCB wide? How variable is the cause/effect in a short circuit situation?
posted by laptolain at 10:13 PM on April 3, 2009


I've had luck rehabilitating electronics that stopped working due to water by soaking them in distilled water and then drying them completely.
posted by acro at 10:14 PM on April 3, 2009


As far as static electricity, most microchips use very small amounts of electricity, like microvolts. When you've got a good electrical charge going and you touch some random spot on a motherboard, components that were never meant to receive much power suddenly get exposed to a whole bunch.

Think about toasters. The coils heat up because you run electricity through them, and they resist the flow. All metals have a certain amount of resistance, so any time you run electricity through a metal, it emits a certain amount of heat. More electricity means more heat. Too much heat, and the metal melts.

Smaller wires have more resistance (which is why audiophiles say you should use larger-gauge wire for your speakers), and the equivalent of wires in microchips are very small indeed.

To sum up: when small wires with resistance meet a large spike of electricity, those small wires melt, and things go wrong.
posted by fnerg at 10:21 PM on April 3, 2009 [1 favorite]


In theory, if you used absolutely pure water, you could run electronics underwater.

Not quite. Pure H2O is a pretty decent insulator, but enough voltage will punch through. In practice, though, it can work: Fermilab uses a great deal of deionized water for cooling in the Tevatron and Main Injector accelerators -- they call it LCW, Low Conductivity Water.

You're right on water shorts. Static electricity causes damage from its very high voltage -- thousands of volts -- despite the extremely low current (and thus, low energy) of the shock. CMOS-based semiconductors are easily damaged by overvoltage -- the oxide that forms the gate has limited ability to resist high voltage, and some are damaged by as little as 100V.

This would often be a show-stopping problem, but metal-oxide semiconductors have so many winning features that the overvoltage problem is worth dealing with. MOSFETs and the CMOS logic built from them have changed the world -- arguably, even more than the original bipolar junction transistor.

Most analog and passive components can easily handle static electricity -- they're damaged more by overcurrent than by overvoltage, and static electricity has almost no current to speak of.
posted by eriko at 10:24 PM on April 3, 2009


What causes electronics to stop working when exposed to static electricity or water?

Many electronic devices use MOS transistors. "MOS" refers to the structure of the transistor on the silicon substrate: "Metal" (usually aluminum), "Oxide" (silicon dioxide, an insulator), and "Semiconductor" (silicon with a little bit of something else added to make it somewhat conductive). The silicon dioxide layer is very, very thin, and static electricity (which can be thousands of volts, though there's only a little charge) can arc through it. Once a hole is burned through the oxide, the transistor doesn't work any more.

Water is not so bad; it's the stuff in the water that causes problems. Non-pure water may be somewhat conductive, so things that aren't supposed to be connected get connected by the water. Wet electronics often start working again if you dry them out carefully and you're lucky enough that nothing that was accidentally connected when wet got zapped by something else.

As far as electronics as a hobby, there are dozens (hundreds?) of kits, from 10-minute battery-powered LED blinkers to radios, amplifiers, and computers. Check the back of Make magazine for starters. Constructing kits like that won't usually teach you much about electronics; it's like a paint-by-numbers project (which I don't think will teach you much about composition, color blending, and so forth, even if you do end up with a picture you like).

It's sort of hard to learn to identify chips on a board (and useless, to boot) unless you've advanced enough in designing circuits that you've used something like it for your own purposes, or you can identify surrounding components and say "that must be the amplifier section," which gives you a hint as to what must be going on in the device you don't recognize.

You could repurpose chips out of a computer, but it would be vastly more trouble than it's worth. Manufacturing processes are so sophisticated now that it's really tough to get useful stuff off a board that was designed for some other purpose. (When I was a lad, I would buy junked boards & computers & stuff, just to pull parts off them. No more.) The difference between that and what circuit benders do is a bit murky (and I'm not a bender, so I can't say for sure) but I think bending usually tweaks signals going between sections of a circuit, or maybe connects sections to something they weren't expecting to be connected to, to get interesting results. Sounds like fun, but often pretty arcane stuff.

A lot of the world today is run by computers, so it's worth learning a thing or two about programming. A PC is as good a place to learn programming as any other computer, and it's often a lot easier to get started there. It's not as rewarding as seeing something move or blink, though. But if you can write simple programs on a PC, you can move more easily to programming microcontrollers, which are almost always implicated in gadgets that do interesting things.

You can probably find a microcontroller hacker group near you. In Portland, OR, there's Dorkbot PDX, which meets weekly at a local brewery and everybody shows off what they've been doing. Newbies are welcome. Think of a project that you'd like to build, and chances are, they'll be able to steer you in the right direction, and tell you approximately how difficult it will be to achieve.
posted by spacewrench at 10:30 PM on April 3, 2009 [3 favorites]


What makes a short circuit cause irreversible damage (assuming it is irreversible)

One thing you learn in basic circuit design is that you've got to balance voltage vs. resistance in any circuit... bridging the + terminal to the - terminal of a battery will quickly foom it.

Or, as Dr Ohm said, I = V/R.

Current (I, electron flow) is the ratio of drive voltage (V) to circuit resistance (R). A short circuit, whether via a solder bridge or water, connects two points that weren't designed to be connected, meaning you're quite likely to end up with an imbalanced circuit (too much V and not enough R). The current spikes, heating the most vulnerable element(s) in the circuit until they emit enough magic smoke to restore the R side of the equation.
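
A rough sketch of that arithmetic in Python (illustrative numbers only, not measurements from any real circuit; the 0.1-ohm "short" is just a stand-in for a blob of solder or dirty water):

# Ohm's law: drop the resistance and the current -- and the heat -- skyrockets.
SUPPLY_VOLTAGE = 5.0  # volts, a common logic supply

def current_and_power(voltage, resistance_ohms):
    """Return (current in amps, heat dissipated in watts) for a simple resistive path."""
    current = voltage / resistance_ohms      # I = V / R
    power = current ** 2 * resistance_ohms   # P = I^2 * R -- heat that has to go somewhere
    return current, power

print(current_and_power(SUPPLY_VOLTAGE, 10000))  # normal path: ~0.0005 A, ~0.0025 W
print(current_and_power(SUPPLY_VOLTAGE, 0.1))    # shorted path: 50 A, 250 W -- magic smoke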
posted by mrt at 11:14 PM on April 3, 2009


I'm also confused about basic electronics vs processing chips.

Electronically, processors are just very complex integrated circuits. One big feature that a processor will have that many other (especially passive) components will lack is some sort of a clock signal. It's this clock signal, essentially, that creates what we consider instructional logic. Furthermore, the general concept of a processor includes reprogrammability--maybe not once it's been soldered into the production package, but at least during the engineering process.

Conceptually, obviously a processor is going to be different from an op-amp or something like that.

Are all processors created for a specialized purpose?

What do you mean by "specialized purpose"? The central processor in a computer is a general-purpose processor. It does lots of things very well, but nothing astoundingly well. Likewise, there are plenty of general-purpose microcontrollers in electronics goods and hobbies: Microchip makes a very nice line called the PIC. They're commonly available in everything from 8-bit parts running at 1 MHz on up to 16-bit parts with clock speeds in the hundreds of MHz (as much oomph as my first computer).

However, as you point out, there are lots of special-purpose processors. These tend to do a specific task much more quickly than could be done on a general-purpose processor. Indeed, many of these processors are barely reprogrammable--offering some adjustable parameters, but no option to change the fundamental operation.

Could you take a sound processor or a GPU out of a computer and then re-purpose it into your hobbyist work?

Oh boy... not really? Modern CPUs appear to have damn-near a thousand pins. I can't imagine that a modern GPU is going to have fewer than several hundred (and probably a pin count comparable to the CPU's, really). Those pins aren't there just to be pretty; they're meant to be tied into a giant goddamn morass of support circuitry: bridges, memory controllers, coprocessors, authentication devices, clocks, voltage supplies, buses, and lots of shit I can't even imagine.

In order to run one of these repurposed processors, you'd have to provide all the support circuitry. Furthermore, the only reason these particular processors are so useful is because they've been integrated into a system that does something useful.

For instance, what the fuck are you going to do with a GPU on its own? Even assuming you give it the right clocks and buses and RAM and everything, you can't talk to the chip without PCI, and you can't render without RAMDACs. And then, of course, there's all the software necessary to get the GPU doing its thing. It's awful hard to load drivers when you've got no system.

Can you interface with any processor through software easily?

Huh? Do you mean, can you take two processors and communicate between them with software?

The software is generally the easy part for me (a programmer) in getting two electronic components to talk to each other. What usually trips me up is mismatched voltage levels or very high-frequency timing information that couldn't possibly be handled in software.

If you mean, can you always program any processor? The answer is not so much. You pretty much can't program something you don't have at least some documentation for. You might reverse engineer the language, but that's a lot of damn work. And pretty sophisticated.

Is it even worth doing this if you don't know any programming languages? It seems like the more interesting aspects of electronics beyond simple open/closed circuit motor kits require programming. I'm not against learning a programming language; in fact, I'd love an excuse to, but I just want to understand what being an electronics hobbyist means.

Well, I'm a software guy... so, I think everybody should learn to program.

But, you are right that the most remarkable hobby projects today are most easily completed with microprocessors. These must be programmed. Which means learning a programming language.

But, please, please, do not learn to program by programming these chips. Instead, learn to program on a PC in some friendly language (Python, Ruby, Smalltalk, etc.) and then learn the C necessary for most microprocessor development. Once you've learned to program, and learned your first language, picking up another language will take you literally a matter of days. But if you teach yourself to program on a micro, you're going to come out of the experience programming like you're from the '70s.
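
For a sense of what "friendly" looks like here, a minimal sketch in plain Python -- nothing microcontroller-specific, just the classic blink-an-LED exercise simulated with print statements:

import time

# The "hello, world" of hardware hacking, faked in software:
# toggle an imaginary LED once a second, ten times.
led_on = False

for _ in range(10):
    led_on = not led_on
    print("LED is", "ON" if led_on else "off")
    time.sleep(1)  # one second between toggles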
posted by Netzapper at 11:31 PM on April 3, 2009


As far as static electricity, most microchips use very small amounts of electricity, like microvolts. [..] All metals have a certain amount of resistance, so any time you run electricity through a metal, it emits a certain amount of heat. [..] and the equivalent of wires in microchips are very small indeed

1) Most electronic circuitry runs off 1 to 10 volts, not microvolts.

2) Static electricity destroys chips by punching through insulating layers, not by fusing conductors.
posted by ryanrs at 12:32 AM on April 4, 2009


I can't imagine that a modern GPU is going to have fewer than several hundred [pins].

Hah! 1000+ pins on a modern GPU. You need at least 600 pins for that 512 bit memory bus. Another 100 for the 16 lane PCI Express interface. Plus several hundred more for power and ground. (Modern GPUs pull 100+ amps. You need a lot of pins to supply that kind of current.)
posted by ryanrs at 12:46 AM on April 4, 2009


If you do want to toy more with programming microcontrollers, you might want to learn about the BASIC Stamp, which is central to a few Make projects: http://en.wikipedia.org/wiki/BASIC_Stamp

It's a microcontroller that can be programmed with a programming language that was initially designed to be easy for learners. Starting out with kits is the right way to go, since modern electronics are too complex and too tiny to be taken apart. Even straightforward hacks or fixes often require a surgeon's hand to solder.

I once took a class in college where we had to build a computer very nearly from scratch to get an A. I got a B. But I do remember quite vividly the itty-bitty, tiny little atom bomb that went off when one of my connections short-circuited.
posted by Skwirl at 12:51 AM on April 4, 2009


Modern GPUs pull 100+ amps

When plugged into a typical 15 amp household circuit, through a power supply usually rated for under 5? Neat trick.
posted by toxic at 2:04 PM on April 4, 2009


Modern GPUs pull 100+ amps

When plugged into a typical 15 amp household circuit, through a power supply usually rated for under 5? Neat trick.


It is a neat trick. The transformer was invented by Hungarian engineers Zipernowsky, Bláthy and Déri in 1884 and refined by George Westinghouse in the U.S. You can use a transformer to convert 15 amps at 120 volts to 150 amps at 12 volts.

But ryanrs made a small mistake. He meant to say 100 watts, not 100 amps. And that would be for all the components on a graphics card including RAM. The GPU would only use maybe 20 watts alone, say, 15 amps at 1.25 volts.
posted by JackFlash at 2:51 PM on April 4, 2009


I made no mistake. The fastest modern GPUs consume over 100 watts at approximately 1 volt, hence 100+ amps. Since I was discussing pin counts, current is more directly relevant than power.

The fastest GPUs consume way more than 20 watts. That's why they have ridiculously huge heatsinks. The RAM consumes much less power.
posted by ryanrs at 4:40 PM on April 4, 2009


Many electronic devices use MOS transistors. "MOS" refers to the structure of the transistor on the silicon substrate: "Metal" (usually aluminum), "Oxide" (silicon dioxide, an insulator), and "Semiconductor" (silicon with a little bit of something else added to make it somewhat conductive). The silicon dioxide layer is very, very thin, and static electricity (which can be thousands of volts, though there's only a little charge) can arc through it. Once a hole is burned through the oxide, the transistor doesn't work any more.

Water is not so bad; it's the stuff in the water that causes problems. Non-pure water may be somewhat conductive, so things that aren't supposed to be connected get connected by the water. Wet electronics often start working again if you dry them out carefully and you're lucky enough that nothing that was accidentally connected when wet got zapped by something else.


These are excellent answers and are essentially what I was going to write. MOS gate oxide breakdown is the major cause of damage due to static electricity.

"Modern GPUs pull 100+ amps"

When plugged into a typical 15 amp household circuit, through a power supply usually rated for under 5? Neat trick.


Okay, ryanrs has already addressed this too. I'd like to add that 100 A at even 1.5 V is only 150 W, which is only 1.25 A at 120 V. So very high currents are reasonable at such low voltage. A computer power supply is not a transformer (at least now that we have good power electronics), but it behaves much like one... only "better" in certain senses.

I'm not against learning a programming language; in fact, I'd love an excuse to, but I just want to understand what being an electronics hobbyist means.

It depends. Some people learn analog electronics and build amplifiers or ham radio transmitters and receivers or guitar effects pedals or theremins. Some people build artistic things with blinking lights. Some people do embedded systems, which is essentially anything that has a microprocessor but isn't what you'd normally think of as a "computer." (While it's perhaps over-hyped, a lot of people recommend the Arduino platform for those interested in getting their feet wet fairly painlessly with embedded systems.)

If you have the patience for it, it would be well worth it to read a basic circuits textbook in your spare time, like this. But buy an older edition (3rd, 2nd) to save money, and buy it used. You have the advantage over the university students; you're not required to buy the latest warmed-over edition!

If you want to do embedded systems, you can't really escape learning at least C and eventually wrapping your head around assembly languages. These can be fairly steep learning curves. If you're interested in a more general-purpose programming language used mostly on normal "computers," then I highly recommend Python.

Are all processors created for a specialized purpose? Could you take a sound processor or a GPU out of a computer and then re-purpose it into your hobbyist work?

Well, a lot of processors are general-purpose. Examples include desktop computer CPUs, or certain microcontrollers such as the PIC or AVR families. Some microcontrollers, such as those found in iPods, are very speedy and capable. Others are extremely simple and have limited capabilities (but low power consumption and cost). Some of these have specialized hardware to perform a particular task, such as motor control. That doesn't mean they can't be adapted to other purposes, but they're best suited to a particular one.

Some kinds of processors, rather than being specialized to save on price and power consumption, are designed to perform a particular kind of task with much higher performance than a general-purpose processor. DSPs (Digital Signal Processors) are designed to be very good at filtering and otherwise processing signals and not very good at other tasks... like, say, running an operating system. Here are some popular DSPs made by Analog Devices. Some DSPs also implement floating-point arithmetic, which is common in high-performance CPUs like those in desktop computers but uncommon in microprocessors designed for embedded systems. (Why? Floating-point arithmetic takes more "work," so the chips need more logic on them--meaning more cost--and they burn more power.)

Another common kind of specialized processor is a GPU, which has highly parallel hardware designed to perform graphics tasks very quickly. Because GPUs are specifically designed to render graphics, they have very particular kinds of registers. For example, they are usually optimized for single-precision floating-point arithmetic and are usually not really capable of higher precisions. In addition, some sacrifices in exact accuracy are often accepted in exchange for higher performance, because the slight inaccuracy will never be noticed, but poor performance would be. Despite these specializations, scientists have had some luck in using GPUs to perform certain kinds of scientific computing much, much faster than general-purpose processors could.

One thing you have to consider when talking about repurposing processors from other applications is that the kinds of processors and microcontrollers you're likely to find in most modern electronics may require fairly extensive support circuitry that is built onto the circuit board you're removing them from. More importantly, most modern chips are at least packaged in things like TSOP, which you can solder by hand if you are an experienced solderer with a decent temperature-controlled iron. However, a lot of chips are starting to use packages like BGA, which is actually impossible to hand-solder (and this doesn't count).

IAAEE (I Am An Electrical Engineer).
posted by musicinmybrain at 6:41 PM on April 4, 2009


It looks like everybody's giving complicated answers, so I'll try to stick to simple ones instead:

1. Water conducts electricity (or rather, most water does -- distilled water doesn't because it doesn't have any dissolved ions), which will screw up the electrical circuit and make it not work properly, or possibly even damage the device, depending on its design. A simple circuit like, say, a computer keyboard, generally works again once it's dried out.

2. Static electricity can destroy transistors because they're very sensitive to high voltage, and static shocks are high voltage (but low current, which is why they don't kill you!). Most ICs (integrated circuits -- "chips", so to speak) have a bunch of transistors in them (millions of 'em, these days).

3. Identifying components on boards can be tricky. There are several distinct types of capacitor, for instance. Lots of boards these days are surface-mount, which means the components are made as small as possible -- usually they're just anonymous little rectangular things. Sometimes you can identify them based on the board markings -- "C22" would be a capacitor, "R3" a resistor, etc. Interesting to look at, but most consumer boards are tricky to puzzle out unless you know plenty about electronics.

4. There are lots of ways to program logic on a board. I definitely second the recommendation of the Arduino. The Arduino site has a bunch of introductory projects you can try if you want to get your feet wet. The programming language is pretty simple, too (although doing complicated stuff on it is, well, more complicated).

If you're looking to order some stuff to get started tinkering in electronics, try Sparkfun -- your typical electronics supplier website (like Digi-Key) is ridiculously impenetrable unless you know exactly what you want, but Sparkfun is much more approachable. They stock Arduinos, too.
posted by neckro23 at 7:10 PM on April 4, 2009


The more I think about it the more I think you should learn to program now.

Your computer is an infinitely reconfigurable electronic device. For instance, the parallel and MIDI ports can be used as a bank of software-controlled switches. They're usually easy to program, and using the parallel port on your PC you can control all sorts of projects without any other processors at all.
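
As an illustration (not a complete recipe), here's roughly what driving the parallel port's data pins looks like from Python, assuming the pyparallel library is installed and your machine actually still has a parallel port:

import parallel  # pyparallel -- assumed to be installed and the port accessible

port = parallel.Parallel()  # opens the first parallel port (e.g. /dev/parport0 on Linux)

# The eight data pins become eight software-controlled switches.
port.setData(0b00000001)  # drive pin D0 high, the rest low
port.setData(0b00000000)  # everything back off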

Likewise, you can build electronic projects with processors in them and connect them to your computer via serial line (and USB/serial converters) very, very easily.

For instance, I once built a cellphone. It consisted of a (very smart, self-contained) GSM radio module, a whole bunch of breadboard, a whole whack of various kinds of chips, a USB cable, and my laptop. I wrote custom software on the laptop that communicated with the radio to place and indicate calls, send and receive text messages, etc. Since my interest wasn't (at that point) in building a complete cellphone, but rather in exploring the radio and what it could do, this made life much easier than it could ever have been trying to program an embedded microprocessor to drive a screen. Not to mention the godawful extra circuitry I would have spent weeks puzzling over and failing to build as I attempted to implement a keyboard bus.
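
As a rough sketch of the PC-side half of a project like that (not the actual code; it assumes a module that speaks standard AT commands, a USB/serial adapter showing up as /dev/ttyUSB0, and the pyserial library):

import serial  # pyserial

# Open the USB/serial adapter the radio module hangs off of.
radio = serial.Serial("/dev/ttyUSB0", baudrate=115200, timeout=2)

radio.write(b"AT\r\n")                # basic "are you there?" check; most modules answer OK
print(radio.readline())

radio.write(b"ATD+15555551234;\r\n")  # dial a (made-up) number as a voice call
print(radio.readline())

radio.close()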

Basically, your computer can serve as a controller and processor for just about anything you build... often vast orders of magnitude more easily than an embedded processor. You can build the interesting piece of a project, explore it fully with your computer controlling it, and then build a dedicated hardware interface when you're certain of your intended design.
posted by Netzapper at 11:18 PM on April 4, 2009

