# How do I determine the growth rate of these numbers?

November 16, 2007 10:47 AM

I have several arrays of numbers. I want to know how they correlate to another array. I think this is called exponential growth, but I can't figure out the formula.

For example:

35, 42, 51

41, 49, 60

47, 56, 69

All relate to

100, 120, 150

So if they were coordinates on a graph it would be (100,35) (120,42) (150,51) and so on

How do I determine the number which relates to 160 or 90?

I have tried getting the average growth from 100 to 150, but it does not come out the same. Each group will have its own relationship to the 100, 120, 150 group. How do I figure out what the growth rate is?

Thanks a ton! And no, it's not homework, I'm not even in school anymore ;) Maybe if I'd stayed longer I'd know how to do this. :) I figured if I had one of those fancy graphing calculators it might know how to estimate a formula, but alas.


42/35 = 1.2

51/42 = 1.214

49/41 = 1.195

60/49 = 1.224

56/47 = 1.191

69/56 = 1.232

In general, each number is larger than the previous by about a factor of 1.2.

That's just a rough pattern I noticed. Maybe it's enough for your purposes. If you have more data points, you can do some fancier math and be more precise about it.

But then again, I just noticed that your x-axis points (100, 120, 150) are not uniformly spaced. Is that deliberate? A uniform spacing would be 100, 120, 140. (I still think the growth rate is about 1.2 per 20 units of x-axis).

posted by PercussivePaul at 11:01 AM on November 16, 2007
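Those successive ratios are easy to check mechanically; here is a minimal Python sketch of the arithmetic above (the values are the three datasets from the question):

```python
# Successive ratios within each dataset: each value divided by the previous one.
datasets = [
    [35, 42, 51],
    [41, 49, 60],
    [47, 56, 69],
]

for data in datasets:
    ratios = [round(b / a, 3) for a, b in zip(data, data[1:])]
    print(ratios)  # every ratio lands near 1.2
```

This reproduces the six quotients listed above: [1.2, 1.214], [1.195, 1.224], [1.191, 1.232].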


Response by poster: Paul, I think you're onto the solution, thank you so much! Way simpler than I thought it was.

And yes, the data is not even - it's what I have to work with, just those three points. Thanks!

posted by jesirose at 11:12 AM on November 16, 2007


On second thought, I'm going to approximate your data points for 100,120,140 by doing a quick linear interpolation.

100,120,150 --> 100, 120, 140 (approx)

35,42,51 --> 35, 42, 48

41,49,60 --> 41, 49, 56

47,56,69 --> 47, 56, 64

Now we can see the datasets are all very close to linear. They grow by a roughly constant amount per 20 units, ranging from 6 to 9 depending on the dataset. If you were to graph them, they would look like straight lines.

More information on what specifically you're trying to do might help here.

posted by PercussivePaul at 11:16 AM on November 16, 2007
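The quick interpolation used above can be written out explicitly. A minimal Python sketch (the helper name `interpolate` is just for illustration), estimating each dataset's value at x = 140 from the points at 120 and 150:

```python
def interpolate(x0, y0, x1, y1, x):
    """Linearly interpolate y at x along the segment from (x0, y0) to (x1, y1)."""
    return y0 + (y1 - y0) * (x - x0) / (x1 - x0)

# Approximate each dataset at x = 140, using the known values at 120 and 150.
for y120, y150 in [(42, 51), (49, 60), (56, 69)]:
    print(round(interpolate(120, y120, 150, y150, 140), 1))
```

This prints 48.0, 56.3, and 64.7; the comment above quotes them rounded to whole numbers (48, 56, 64).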


Best answer: With linear datasets, by the way, you can easily have Excel fit a line to the plot, or you can figure out an equation by hand.

slope = rise / run = (51-35)/(150-100) = 0.32

y = 0.32x + b (equation of line with slope 0.32)

35 = 0.32*100 + b (choose any point on the line to find b)

b = 3 (this is the y-intercept)

so for the first dataset, y= 0.32x + 3

Plop in 100, 120, 150, and you get 35, 41.4, 51: pretty damn close. If you want to know the value at 90 or 160, stick that in for x and calculate y. You can find the equations for the other datasets in the same way.

posted by PercussivePaul at 11:25 AM on November 16, 2007 [1 favorite]
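The same two-point calculation, as a small Python sketch for the first dataset:

```python
# Slope from the two endpoints, intercept from one known point,
# exactly as worked out above for the first dataset.
x0, y0 = 100, 35
x1, y1 = 150, 51

slope = (y1 - y0) / (x1 - x0)   # rise / run = 16 / 50 = 0.32
intercept = y0 - slope * x0     # 35 - 0.32 * 100, about 3

def predict(x):
    return slope * x + intercept

for x in (100, 120, 150, 90, 160):
    print(x, round(predict(x), 1))
```

Plugging in 90 and 160 gives about 31.8 and 54.2 for this dataset.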


Response by poster: So then it would grow 6/20 per unit?

Using the first data set, we'd have (101,35.3), is that correct?

So then the value at 200 should be 35+((6/20) * (200-35)) which comes out to 84.5

Is that right?

posted by jesirose at 11:26 AM on November 16, 2007


Response by poster: It works! Thanks so much Paul!!!

posted by jesirose at 11:30 AM on November 16, 2007


Just to offer another point of view: the datasets are close to linear, but not precisely linear. You can instead fit a quadratic equation to each set. The formulas for doing so are described here, among other places. For the first set, for example, the equation comes out to:

y = -0.001x^2 + 0.57x - 12

You'll find that this works if you plug in the x values. Plugging in 200 gives you 62, though, not 84.5. Whether this analysis is valid, though, depends on what it is you're measuring.

posted by cerebus19 at 11:38 AM on November 16, 2007
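That parabola is the unique quadratic through the first dataset's three points. One way to recover its coefficients is NumPy's polynomial fit (a sketch; the thread itself points at hand formulas, not NumPy):

```python
import numpy as np

# Fit the unique degree-2 polynomial through the first dataset's points.
x = np.array([100.0, 120.0, 150.0])
y = np.array([35.0, 42.0, 51.0])

a, b, c = np.polyfit(x, y, 2)  # with three points, the fit is exact
print(round(a, 4), round(b, 4), round(c, 4))  # -0.001 0.57 -12.0
print(round(np.polyval([a, b, c], 200), 1))   # 62.0
```

This matches the coefficients above, including the prediction of 62 at x = 200.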

You can fit the numbers to a line, or you can fit them to a quadratic curve, or you can fit them to an exponential curve. Three data points per set isn't really enough to say which it should be, given the numbers alone. If you have some theoretical reason, based on what the numbers represent, why it should be one or the other, you can apply the appropriate technique.

posted by DevilsAdvocate at 11:53 AM on November 16, 2007


With only three values for the independent variable, and noisy measurements, you can fit all sorts of different functions to the data. At this point it's probably good to go to the actual thing you're measuring and work out what *kind* of functions would make sense for it (linear? quadratic or higher polynomials? exponential? logarithmic?). Otherwise it's hard to tell if, for example, the small x^2 term in cerebus19's fit is only nonzero because of noise or inaccuracy in the measurements, or if it represents something real about the system you're measuring.

posted by hattifattener at 11:53 AM on November 16, 2007

There is very limited data here to decide on an appropriate fit, but even so I'd consider an exponential fit a very bad candidate. All three data sets show slight downward concavity: the rate of increase on the second interval, between 120 and 150, is less than the rate on the first interval, between 100 and 120. You could model that with an exponential function, but it certainly wouldn't have a growth rate of 1.2 as PercussivePaul suggests! (The nonuniform spacing of the x-coordinates matters, not just a little.)

That shape explains why cerebus19's quadratic has a negative coefficient on the x^2 term, and the fact that I said *slight* downward concavity explains why that coefficient is so small.

In the absence of any compelling reason to do otherwise with this particular problem, I'd advise sticking with a linear fit. If you average the three data sets, you get **y = 0.3789x + 3.26**, which is a very, very good fit to the average (not hard to do when there are only three data points).

posted by Wolfdog at 12:01 PM on November 16, 2007
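Averaging the datasets point-by-point and then least-squares fitting a line can be sketched in plain Python:

```python
# Point-by-point average of the three datasets, then an ordinary
# least-squares line fit to the averaged values.
xs = [100, 120, 150]
datasets = [[35, 42, 51], [41, 49, 60], [47, 56, 69]]

avg = [sum(col) / len(col) for col in zip(*datasets)]  # [41.0, 49.0, 60.0]

mean_x = sum(xs) / len(xs)
mean_y = sum(avg) / len(avg)
# slope = covariance(x, avg) / variance(x)
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, avg)) \
        / sum((x - mean_x) ** 2 for x in xs)
intercept = mean_y - slope * mean_x

print(round(slope, 4), round(intercept, 2))  # about 0.3789 and 3.26
```

Incidentally, the point-by-point averages (41, 49, 60) happen to coincide exactly with the second dataset.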

Yeah, the 1.2 rate would only work if the data points were uniformly spaced; I didn't notice they weren't until I was already done writing the comment. Only when I went back to check did I realize that it makes a big difference, and that a linear fit seems much more appropriate.

posted by PercussivePaul at 12:37 PM on November 16, 2007


Without any expectation for how the data varies, you don't want to make complicated guesses like exponential fits. Also, with three data points, you absolutely do not want to fit a quadratic function (ax^2 + bx + c). Just as you can draw a line through any two points, you can draw a parabola through any three points, so your fit is infinitely "accurate" but useless, because none of the noise cancels out. Besides, if you plot the data, it looks extremely linear. All you need to do is get an equation for the line associated with each set of data and then plug in 90 and 160 for x.

So, how do you find the equations for the lines? Have Excel do a least squares fit for you. The documentation should assist in understanding how to do that, but it won't be hard. Please, though, don't use PercussivePaul's way to fit the equation for a line. The right way to fit a line, least squares, involves finding the line that minimizes the distance between it and your data. The process that he describes is a sloppy estimate that happens to be close because your data is so linear to begin with, but is not the actual best fit line.

posted by Schismatic at 1:32 PM on November 16, 2007


In general situations like this, the magic words are "trend lines", as shown here.

posted by anaelith at 1:33 PM on November 16, 2007


Well, she only has three data points, each of which is accurate to only two figures. Least squares may give you the most correct line given those three points, but any difference between that and the approximation I described is less than simple rounding error. I would argue in favor of simplicity here.

posted by PercussivePaul at 1:38 PM on November 16, 2007


This thread is closed to new comments.
