I know there's some sort of Law of Why This Shit Works
October 29, 2008 12:36 AM

MathFilter: When you take the average of a series of ratios expressed as fractions, you get the same result whether you add together all of the fractions and then divide by the size of the series, or divide the sum of all of the numerators by the sum of all of the denominators. What's this called, and what does the formal proof look like?



If you average those together, you get the same result as dividing 14 (the sum of the numerators) by 25 (the sum of the denominators). Here's a longer example (with decimal approximations so that I don't have to deal with really big common denominators):

2/6 = .333
3/7 = .429
6/6 = 1
3/8 = .375
4/9 = .444
7/9 = .778
6/7 = .857
7/8 = .875

5.091 / 8 = .6364

Sum of numerators/sum of the denominators:
38/60 = .6333

I know this has to do with commutative properties of something or other, but I can't recall just what.
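The claim is easy to test numerically. Here's a minimal Python sketch (the names `fracs`, `true_avg`, and `shortcut` are mine) using the eight fractions above, with exact arithmetic so rounding can't muddy the comparison:

```python
from fractions import Fraction

# The eight fractions from the example above, as (numerator, denominator)
fracs = [(2, 6), (3, 7), (6, 6), (3, 8), (4, 9), (7, 9), (6, 7), (7, 8)]

# True average: sum the fractions exactly, then divide by the count
true_avg = sum(Fraction(n, d) for n, d in fracs) / len(fracs)

# "Shortcut": sum of numerators over sum of denominators
shortcut = Fraction(sum(n for n, _ in fracs), sum(d for _, d in fracs))

print(float(true_avg))   # ≈ 0.6364
print(float(shortcut))   # ≈ 0.6333 -- close, but not equal
```

With exact rationals the two results are visibly different fractions (1283/2016 vs 19/30), so the near-match isn't a rounding artifact.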
posted by ignignokt to Science & Nature (14 answers total)
I don't see this. 1/2 plus 1/3 is 5/6, and half that is 5/12 (or 0.41666…), which is halfway between 1/2 and 1/3. Adding the numerators and denominators gets you 2/5, which is 0.4 exactly.
posted by wanderingmind at 12:41 AM on October 29, 2008

It can't be a law since there are counterexamples. I'm surprised how well it works for small numbers in numerator and denominator while ignoring rounding, though.
posted by dhoe at 1:03 AM on October 29, 2008

What I want to say is, there might still be a name for it. Maybe there are professions where people often have to compute the average of fractions with small numbers in their head, similar to the Rule of 72 in finance.
posted by dhoe at 1:08 AM on October 29, 2008

Response by poster: Yeah, it doesn't yield exactly the same value both ways, but it seems pretty close every time. It's late, I'm tired, and I assumed the differences came from rounding, but wanderingmind proved that's not the case.

Now, I'm even more interested in why it gets close.
posted by ignignokt at 1:51 AM on October 29, 2008

For one, if a/b < c/d, then a/b < (a+c)/(b+d) < c/d, so both this bogus average of two fractions and their real average lie between them. The bogus one might be closer to one fraction than the other, but maybe this is somewhat balanced when "averaging" multiple fractions. (It's too late/early to do any more serious investigation.)
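The bracketing inequality parudox states is the classic "mediant" property; a quick randomized spot-check (sampling scheme is my choice):

```python
from fractions import Fraction
from random import randint, seed

seed(0)
for _ in range(1000):
    a, b, c, d = (randint(1, 99) for _ in range(4))
    lo, hi = sorted([Fraction(a, b), Fraction(c, d)])
    mediant = Fraction(a + c, b + d)
    # The mediant always lies (weakly) between the two fractions
    assert lo <= mediant <= hi
```

The inequality is weak because if a/b equals c/d, the mediant equals both of them.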
posted by parudox at 2:08 AM on October 29, 2008


sum is 10.4

sum of numerators is 24
sum of denominators is 24
(sum of numerators)/(sum of denominators) = 1

So it's not true, but if you randomly pick single-digit numerators and denominators, restrict yourself to fractional values less than one, and sum long sequences of fractions, it'll appear to be a good approximation if you don't look too closely.
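rdr's observation can be simulated: pick random single-digit numerators and denominators, keep only values below one, and compare the two averages over a long sequence (a sketch; the sampling scheme and names are my assumptions):

```python
from random import randint, seed

seed(1)

def sample_fracs(k):
    """Random single-digit fractions restricted to values < 1."""
    out = []
    while len(out) < k:
        n, d = randint(1, 9), randint(1, 9)
        if n < d:
            out.append((n, d))
    return out

fracs = sample_fracs(1000)
true_avg = sum(n / d for n, d in fracs) / len(fracs)
shortcut = sum(n for n, _ in fracs) / sum(d for _, d in fracs)
print(abs(true_avg - shortcut))  # typically small, but not zero
```

Under this restricted sampling the gap stays small, which is exactly why the shortcut looks like a law if you don't look too closely.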
posted by rdr at 2:08 AM on October 29, 2008

The actual identity is
a/b + c/d = (a*d + b*c) / (b*d)
This is due to the multiplicative identity x*1=x and its equivalent written using division, y/y=1. Using these two rules, you may rewrite the left-hand side of the equation as
(a/b)*(d/d) + (c/d)*(b/b)
Further application of the properties of multiplication gives
(a*d)/(b*d) + (b*c)/(b*d)
followed by "undistribution" of the common factor 1/(b*d) over addition:
(a*d + b*c)/(b*d) ∎
I don't see any reason why the rule you propose would hold, unless it is this rule.
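jepler's addition identity can be spot-checked with exact rational arithmetic:

```python
from fractions import Fraction
from random import randint, seed

seed(2)
for _ in range(100):
    a, b, c, d = (randint(1, 50) for _ in range(4))
    # The actual addition identity: a/b + c/d = (a*d + b*c) / (b*d)
    assert Fraction(a, b) + Fraction(c, d) == Fraction(a * d + b * c, b * d)
```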
posted by jepler at 5:15 AM on October 29, 2008 [2 favorites]

I'm afraid you are simply wrong. The result you state is false: (a+c)/(b+d) ≠ avg(a/b, c/d).
posted by mary8nne at 7:00 AM on October 29, 2008 [1 favorite]

The "law" you're looking for is called "the law of coincidence."
posted by toomuchpete at 7:07 AM on October 29, 2008

Something else to note: if you express some of the fractions differently, the fake average changes while the real one does not. For example, if you replace the 6/7 in your first example with 6000000000/7000000000, then the real average is unchanged, but the fake average becomes very very close to 6/7.

In other words, the real average of two fractions is halfway in between them; the fake average is somewhere in between them, but is drawn closer to the one whose terms are larger. In your examples, all the fractions have similarly-sized terms (single digits for all), so the fake average isn't very far from the real average. But if you start increasing the disparity between the term sizes, the averages drift apart. You can use rdr's construction to make the difference between them as large as you want (just replace the 10 with a large enough number).

Also, requiring the fractions to be in simplest terms still doesn't force the averages to be close: in the first example, replace 6/7 with 59999/70000.
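A quick sketch of the rescaling effect described above (the `averages` helper and the list names are mine):

```python
from fractions import Fraction

# The first example's fractions, minus 6/7
base = [(2, 6), (3, 7), (6, 6), (3, 8), (4, 9), (7, 9), (7, 8)]

def averages(fracs):
    real = sum(Fraction(n, d) for n, d in fracs) / len(fracs)
    fake = Fraction(sum(n for n, _ in fracs), sum(d for _, d in fracs))
    return float(real), float(fake)

# Original list with 6/7...
print(averages(base + [(6, 7)]))
# ...and with 6/7 rewritten with huge terms: the real average is identical,
# but the fake average is dragged almost all the way to 6/7 ≈ 0.857
print(averages(base + [(6_000_000_000, 7_000_000_000)]))
```

The real average can't see the rewriting because Fraction(6000000000, 7000000000) reduces to 6/7; the fake average weights each fraction by the size of its terms.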
posted by equalpants at 8:50 AM on October 29, 2008

This is false and it isn't even close. Consider:

1/2, 1/2, 1/2, 1/2, 1/100

Average is .402

(1+1+1+1+1)/(2+2+2+2+100) = .04629..
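Checking those numbers directly (variable names are mine):

```python
fracs = [(1, 2), (1, 2), (1, 2), (1, 2), (1, 100)]

real = sum(n / d for n, d in fracs) / len(fracs)
fake = sum(n for n, _ in fracs) / sum(d for _, d in fracs)

print(real)  # ≈ 0.402
print(fake)  # ≈ 0.0463 -- off by nearly a factor of ten
```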
posted by Class Goat at 9:19 AM on October 29, 2008 [1 favorite]

Response by poster: Ah, never mind, then. I'm kinda curious about the ranges for which it works, but not that curious. Thanks for checking that out, guys.
posted by ignignokt at 7:50 PM on October 29, 2008

When it comes to mathematics, either you're exactly right or you're completely wrong. This is completely wrong. It doesn't "work" reliably for any ranges at all.
posted by Class Goat at 8:04 PM on October 29, 2008

Best answer: For the two-fraction case (a/b and c/d), the difference between the fake average and the real average is:

[a(bd − d²) − c(b² − bd)] / (2b²d + 2bd²)

...so the ranges for which it "works" (i.e. the fake average is close to the real average) are the ranges where this value is small. Just glancing at it, you can see a couple things:
-If b and d are equal, then the terms in parentheses on the top become zero, and the fake average equals the real one.
-In general, the top is small when the imbalance between the parenthetical terms is "balanced out" by the difference between a and c. For example, if the terms in parentheses ended up being 5 and 8, that would be balanced out if a and c were 8 and 5: the top would be 8*5 - 5*8. So the fake average gets more accurate when the difference between b and d is countered by the difference between a and c, or when both of those differences are small.
-The bottom increases with b and d, so all other things being equal, the fake average gets more accurate as b and d increase.

The many-fraction case is a lot more complicated, but I think it ends up being pretty similar, in that the accuracy gets better when the numerators/denominators get closer together.
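For the two-fraction case, the closed form above can be verified against a direct computation with exact arithmetic. (Reading the displayed expression as fake minus real, which is how the sign works out; the sampling range is my choice.)

```python
from fractions import Fraction
from random import randint, seed

seed(3)
for _ in range(200):
    a, b, c, d = (randint(1, 30) for _ in range(4))
    real = (Fraction(a, b) + Fraction(c, d)) / 2
    fake = Fraction(a + c, b + d)
    # [a(bd - d^2) - c(b^2 - bd)] / (2b^2 d + 2b d^2)
    closed_form = Fraction(a * (b * d - d * d) - c * (b * b - b * d),
                           2 * b * b * d + 2 * b * d * d)
    assert fake - real == closed_form
```

Note that when b = d the numerator vanishes termwise, matching the first bullet point above.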
posted by equalpants at 7:25 AM on October 30, 2008
