When the Voltage Divider Equation fails on the simplest of circuits...
August 4, 2006 7:16 PM   Subscribe

My voltage divider doesn't work! Try this: set up two 1-Megaohm resistors in series, with a 100mVpp signal at 4kHz across them. Using an oscilloscope, measure the signal between the two resistors: 20mVpp (referenced to ground), instead of the expected 50mVpp. Swap the resistors: still 20mVpp from the midpoint to ground.

It's an odd sort of problem; I think it has something to do with the inherent inductance of "standard" (film) resistors. I've had professors look at it and say, "Oh. That's funny." I've had lab techs look at it and say "Huh. Strange." And EE grad students are like "WTF."

Does anyone have any idea what it could be? The problem is the same with multiple oscilloscopes, an NI data acquisition card, and a lock-in amplifier. Thanks.
posted by arrhn to Technology (21 answers total)
 
The obvious way to find out if you've inadvertently set up a low-pass filter is to test your divider with DC. And then test it with other frequencies.

I wouldn't tend to think that film resistors would have any significant inductance. I'd be more inclined to suspect that your circuit has some parasitic capacitance in it. (A resistor in series and a parasitic capacitance to ground would be low-pass.)
posted by Steven C. Den Beste at 7:52 PM on August 4, 2006


where are you grounding it?
posted by sergeant sandwich at 7:55 PM on August 4, 2006


also, you aren't doing something bonehead like having the attenuator switch on your scope probe set to the wrong thing, right?
posted by sergeant sandwich at 7:56 PM on August 4, 2006


Does the same thing happen with smaller resistors? It sounds to me like the current flowing through the oscilloscope is affecting the circuit. Try it with resistors a few orders of magnitude smaller.
posted by cillit bang at 7:56 PM on August 4, 2006


Response by poster: I've tried it with DC. The signal starts out at 50mVpp between the two resistors, as expected. Around 800 Hz, it starts to drop off significantly. By the time it's at 4kHz, it's down to about 20mV.

I've tried using multiple signal generators. Checking the waveform with a square wave, it gives the "shark fin" look indicative of reactance.

Interestingly, two 1kOhm resistors in the exact same configuration give a 50/50 split as expected. I've bought a new breadboard, cables and oscilloscope probes, and still have the same behavior. The source is on High Z. The oscilloscope probes are on 1x.
posted by arrhn at 7:58 PM on August 4, 2006


o i bet cillit bang's got it with the scope impedance!
posted by sergeant sandwich at 7:59 PM on August 4, 2006


Your scope probe has a 1meg input impedance, which effectively appears in parallel with the resistor to ground.

If that was the only extra effect going on, you would read 33mVpp from ground to mid point, so there must also be a frequency response issue..
posted by Chuckles at 8:04 PM on August 4, 2006


Try it with a different oscilloscope or some different signal acquisition. It seems unlikely to me that parasitic inductance should have as large an effect as you mention at a few kHz. It's much more likely that the measurement is incorrect or somehow loading the circuit.
posted by splitpeasoup at 8:10 PM on August 4, 2006


As usual, wikipedia knows all: Test probe. Here is another article, from Yale.

I guess I should have said the scope has a 1meg input impedance, actually..
posted by Chuckles at 8:10 PM on August 4, 2006


The source is on High Z.

That would account for the rest of the drop.. No frequency response issue at all, I would say.
posted by Chuckles at 8:12 PM on August 4, 2006


Checking the waveform with a square wave, it gives the "shark fin" look indicative of reactance.

It could be a slew rate limit, rather than simple frequency response..
posted by Chuckles at 8:28 PM on August 4, 2006


I think the other responders have got it right: your scope probe impedance is too similar to the divider resistor impedances, and the scope is throwing the circuit off.

You didn't say whether you have something else connected to the center point of the divider. If you're not feeding enough current through the divider, a load at the center point could be pulling it down.

It definitely sounds like you need to use smaller resistors in the divider. But with 1M resistors, the scope probe alone would do it.
posted by Steven C. Den Beste at 8:38 PM on August 4, 2006


I've tried it with DC. The signal starts out being 50mVpp between the two resistors as expected.

This is the only thing that is strange about what you describe, and it is really strange..
Well.. That and the fact that nobody else can figure out the problem..

Neglecting frequency response issues (which should be valid at 4kHz), your measurement looks like this:
          Source
        Impedance
   ------/\/\/\-------
   |                  |
   |                  |
ideal                 \
source                /
   |                  \  R_1
   |                  /
  ---                 \
   -                  /
                      |
                      |-------   <-- V_measured
                      |       |
                      /       \
                      \       /
                 R_2  /       \  Scope &
                      \       /  Probe
                      /       \
                      \       /  Input Impedance
                      |       |
                      |       |
                     ---     ---
                      -       -
So your reading will be:

V_measured = V_source x (R_2 || R_in) / ( R_s + R_1 + (R_2 || R_in) )

The typical assumptions are:
R_in --> infinity
and
R_s --> zero
but if they don't hold..


Anyway, switch your scope probe to x10; that will up the input impedance to 10 meg.
posted by Chuckles at 8:54 PM on August 4, 2006
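
To put rough numbers on that formula, here is a minimal sketch in Python, assuming an ideal generator with a 50 ohm output impedance and a purely resistive 1 MOhm scope input (both are assumptions for illustration, not measured values):

    # Loaded divider: V_measured = V_source * (R_2 || R_in) / (R_s + R_1 + (R_2 || R_in))
    # All component values below are assumptions, not measurements from the actual setup.
    def parallel(a, b):
        return a * b / (a + b)

    V_source = 100e-3   # 100 mVpp drive
    R_s = 50.0          # assumed generator output impedance
    R_1 = 1e6           # upper divider resistor
    R_2 = 1e6           # lower divider resistor
    R_in = 1e6          # scope input resistance on a 1x probe

    R_bottom = parallel(R_2, R_in)
    V_measured = V_source * R_bottom / (R_s + R_1 + R_bottom)
    print(round(V_measured * 1e3, 1), "mVpp")   # ~33.3 mVpp from resistive loading alone

Resistive loading alone only gets you from 50 mVpp down to about 33 mVpp; the remaining drop at 4kHz points at the scope's input capacitance, which later answers take up.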


Get a 10x probe instead of a 1x probe. This increases the input impedance from 1M to 10M. You will still have an error of about 6%, but that is much better than the 50% error with a 1x probe. There is a trade-off: the higher-impedance probe attenuates the input signal by a factor of 10, which the scope then has to amplify back up, so the signal-to-noise ratio is decreased.

You always have to be careful about taking everything you see on a scope literally. The scope has finite input capacitance and resistance, which can affect high-impedance circuits. It is not uncommon for digital circuits to suddenly start or stop working as soon as you touch them with a scope probe. Also, your grounding scheme can affect the appearance of high-frequency signals.
posted by JackFlash at 9:00 PM on August 4, 2006


4kHz is not high frequency at all, really. You shouldn't have to worry about parasitics that much. (The exception is with the breadboard. Using a breadboard equals stuffing picofarad range capacitors between all points of your circuit.)

I'm surprised techs and professors are like "WTF"; just skimming your question, I knew you had a scope probe set on 1x.

Anyways, why are you using megohm resistors? Unless your application has some weird signal source, it ought to be able to drive loads with much lower impedance. If it can't drive a load of 10s-100s of Kohm then it probably should be buffered. Then, consider that using megohm resistors requires you to have a very high input impedance on the next stage. Not a problem if you are using ICs with teraohm input impedances, but even then it leads to problems in testing, as you saw.

Yes, you theoretically use less current by having megohm resistors, but do the math to consider how much 100 mV across 10Kohm actually uses.
posted by TheOnlyCoolTim at 9:21 PM on August 4, 2006
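
For the sake of that back-of-the-envelope check, a small sketch (the 10 kOhm figure is TheOnlyCoolTim's example, and 2 MOhm is the two 1 MOhm resistors in series):

    # Current and power drawn by the divider itself at 100 mV, per Ohm's law.
    V = 100e-3
    for R in (10e3, 2e6):          # 10 kOhm example vs. two 1 MOhm in series
        I = V / R                  # current through the divider
        P = V * I                  # power dissipated in the divider
        print(R, "Ohm:", round(I * 1e6, 2), "uA,", round(P * 1e9, 1), "nW")

That works out to about 10 uA and 1 uW for the 10 kOhm divider, versus 50 nA and 5 nW for the 2 MOhm one.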


I've tried it with DC. The signal starts out being 50mVpp between the two resistors as expected.

This is the only thing that is strange about what you describe, and it is really strange..


Yeah, I wasn't quite sure of that either. AC coupling on the scope putting capacitance directly in series with the input resistance? I'd think that would give out way before 800 Hz.
posted by TheOnlyCoolTim at 9:27 PM on August 4, 2006


Metal film resistors are in fact sometimes inductive, because the film is cut in a spiral pattern. You can get non-inductive metal film resistors, though.

You say the response starts to drop off at 800 Hz. Does it continue dropping past 4kHz? A simple RC filter with R = 500kΩ and Fc = (wild guess at 3dB point) 2000 Hz would have C = 150 pF, which is not an unreasonable amount of stray capacitance. Actually, that might be the scope's actual input capacitance — the usual 10-20 pF spec is with a 10X probe. Take a look at your scope and see what it says on the front panel.

If you want to fully understand this, take measurements from 100Hz through 10kHz, plot them on a log-log scale, and compare it to the calculated response given the oscilloscope's input impedance (resistive and capacitive). I'm guessing they'll match pretty closely.
posted by hattifattener at 12:58 AM on August 5, 2006
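
Here is a minimal sketch of that calculated response, modelling the scope input as 1 MOhm in parallel with hattifattener's 150 pF estimate (both values are assumptions; the real ones should come off the scope's front panel or data sheet):

    # Response of the loaded divider: R_1 in series with (R_2 || scope input),
    # where the scope input is modelled as R_in in parallel with C_in.
    import math

    R_1, R_2 = 1e6, 1e6
    R_in, C_in = 1e6, 150e-12      # assumed values; check your scope/probe
    V_source = 100e-3              # 100 mVpp drive

    for f in (100, 400, 800, 1600, 4000, 10000):
        w = 2 * math.pi * f
        Z_scope = 1 / (1 / R_in + 1j * w * C_in)    # scope R and C in parallel
        Z_bottom = 1 / (1 / R_2 + 1 / Z_scope)      # lower leg loaded by the scope
        V = V_source * abs(Z_bottom / (R_1 + Z_bottom))
        print(f, "Hz:", round(V * 1e3, 1), "mVpp")

With those assumed values it comes out around 33 mVpp at low frequency and roughly 20 mVpp at 4kHz, which is in the neighborhood of what the poster measured.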


Response by poster: For some reason, I foolishly neglected to think about the input impedance of the oscilloscope and other measurement utilities.

So, based on the responses, I am going to try using a JFET op amp against the resistors on Monday, to correctly match the impedances. It should have an input impedance of somewhere around a teraohm, with a low output impedance, so that should fix up the signal nicely.

The high impedance is required to limit current to 100nA, just small enough to run through cancer cells for around 48 hours without affecting them via power dissipation.
posted by arrhn at 2:38 AM on August 5, 2006


Scope inputs are usually spec'd as a certain resistance AND parallel capacitance... typically 15 pF, or so. It is not all 'R'.

Since the DC reads properly, looks like reactance is causing your problem. It's obviously frequency dependent.

Quick calc... I get 2.65 MOhm of capacitive reactance at 4 kHz with 15 pF.

In parallel with the 1 MOhm in the divider and the 1 MOhm in the scope, this yields 569K. The other 1 MOhm in series with this would result in a divider output of 36 mV, more or less.

A few pF either way, more or less, could cause the measurement deviation from the expected value.

I'll bet that's your culprit. Use a low-capacitance FET probe and see if the reading changes. Failing that, make a buffer amp with a JFET-input op-amp at unity gain and use it to isolate the scope input from the measurement node, and see if that corrects the problem. Then you'll know for sure.
posted by FauxScot at 2:56 AM on August 5, 2006


Neglecting frequency response issues (which should be valid at 4kHz), your measurement looks like this:

Apparently not, as FauxScot and hattifattener point out. Don't be lazy, always do the math!

For the record, for a typical 60 MHz probe (which will have a 15 MHz bandwidth on the x1 setting):
at x1 the input capacitance is ~50pF,
at x10 the input capacitance is ~15pF.
posted by Chuckles at 5:05 AM on August 5, 2006


If you are trying to limit maximum current using a high impedance voltage source, then the output voltage is going to depend on your load, the resistance of the cancer cells. Given a constant load resistance, you can't independently select the load current and load voltage. They depend on each other according to Ohm's Law. If you want a fixed voltage, then the current is determined only by the load and you may as well use a 1K ohm voltage divider. If you want a fixed current, then you should use an op amp current source, but in that case the voltage is dependent on the load resistance.

If you are going to buffer the voltage divider with a JFET, then the output current is limited only by the load and the high impedance of the 1M divider is irrelevant. That seems kind of pointless.

It isn't really clear what you are trying to do here. If you are trying to limit the voltage and current in the load, you need to characterize the load impedance, but in any case you can't arbitrarily choose both the current and the voltage. Ohm -- not just a good idea, it's the law.
posted by JackFlash at 9:40 AM on August 5, 2006
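
A minimal sketch of that constraint, with the divider replaced by its Thevenin equivalent and a few hypothetical load resistances (the actual cell-chamber impedance would have to be measured):

    # Thevenin equivalent of the 1M/1M divider, then the load current and
    # voltage that result; you only get to choose one of them.
    # Load resistances below are hypothetical placeholders, not measured values.
    V_source = 100e-3
    R_1 = R_2 = 1e6

    V_th = V_source * R_2 / (R_1 + R_2)    # open-circuit voltage, 50 mV
    R_th = R_1 * R_2 / (R_1 + R_2)         # source resistance seen by the load, 500 kOhm

    for R_load in (100e3, 1e6, 10e6):      # hypothetical load values
        I = V_th / (R_th + R_load)         # Ohm's law fixes the current...
        V = I * R_load                     # ...and with it the load voltage
        print(R_load, "Ohm:", round(V * 1e3, 2), "mV,", round(I * 1e9, 1), "nA")

With these placeholder loads the current stays below 100 nA, but the voltage across the load swings from about 8 mV to 48 mV, which is JackFlash's point: pick the load and one of V or I, and the other follows.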


This thread is closed to new comments.