February 20, 2012 2:09 PM Subscribe

Is it true that it is always better to estimate parameters jointly, even when they are completely unrelated?

I vaguely recall reading about a result in statistics which stated something along the lines of "the joint estimation of two parameters always performs better (i.e. smaller variance) than the separate estimation of each parameter, even if the parameters are completely unrelated".

Is this statement, or some modification of it, true? If so, is there an intuitive explanation? Can anyone point me to a reference?
posted by TheyCallItPeace to Science & Nature (3 answers total) 1 user marked this as a favorite

I am not sure what to change to make the statement make sense, as it is just a vague recollection. My interpretation of the performance of an estimate would lead me to say that the variances being compared would be the variances of each parameter separately under each estimation procedure/approach.

If the performance of linear regression does not change by adding variables, then it does not sound like the foggy result I have in mind.

posted by TheyCallItPeace at 6:03 PM on February 20, 2012
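[A minimal numerical sketch of the R-squared point raised in the thread: adding even a pure-noise regressor never lowers in-sample R-squared. Uses numpy's least squares; all variable names are illustrative.]

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100
x1 = rng.normal(size=n)
y = 2.0 * x1 + rng.normal(size=n)  # y depends only on x1

def r_squared(X, y):
    # In-sample R^2 from an ordinary least-squares fit with intercept.
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    tss = (y - y.mean()) @ (y - y.mean())
    return 1.0 - (resid @ resid) / tss

r2_one = r_squared(x1, y)
noise = rng.normal(size=n)  # a regressor unrelated to y
r2_two = r_squared(np.column_stack([x1, noise]), y)
print(r2_two >= r2_one)  # True: R^2 cannot decrease when a regressor is added
```

Note this is purely in-sample: the extra fitted coefficient soaks up chance correlation with the residuals, which is why the higher R-squared is not evidence the added variable is useful.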

Found it! I was thinking of Stein's example, although it does not appear to lead to a real improvement in the estimate of any given variable.

posted by TheyCallItPeace at 6:08 PM on February 20, 2012 [1 favorite]
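[Stein's example can be illustrated with a short simulation: for three or more unrelated normal means, the James-Stein shrinkage estimator has smaller total squared error than the plain maximum-likelihood estimate, even though no single coordinate is reliably improved. A minimal sketch, assuming one standard-normal observation per mean; the true means below are arbitrary.]

```python
import numpy as np

rng = np.random.default_rng(0)
d, trials = 5, 20000
theta = np.array([1.0, -0.5, 2.0, 0.0, 0.3])  # arbitrary true means

# One N(theta_i, 1) observation per coordinate, repeated over many trials.
X = rng.normal(theta, 1.0, size=(trials, d))

# The maximum-likelihood estimate of theta is just X itself.
mle_err = ((X - theta) ** 2).sum(axis=1)

# James-Stein shrinks every coordinate of X toward the origin jointly.
shrink = 1.0 - (d - 2) / (X ** 2).sum(axis=1, keepdims=True)
js = shrink * X
js_err = ((js - theta) ** 2).sum(axis=1)

print(mle_err.mean())  # close to d = 5, the MLE's total risk
print(js_err.mean())   # strictly smaller average total squared error
```

The improvement is only in the sum of squared errors across all coordinates, which matches the observation above: any individual mean can be estimated worse by shrinkage, yet the combined estimate dominates.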

maybe you're thinking about regression? if you keep adding variables to the regression equation the R-squared can't go down, and depending on the data it can go up. so, by adding a bunch of arbitrary variables you can explain more variation in the response variable, but the result isn't very useful.

posted by cupcake1337 at 3:42 PM on February 20, 2012

This thread is closed to new comments.