Is joint parameter estimation always strictly better than individual parameter estimation? February 20, 2012 2:09 PM

Is it true that it is always better to estimate parameters jointly, even when they are completely unrelated?

I vaguely recall reading about a result in statistics which stated something along the lines of "the joint estimation of two parameters always performs better (i.e. smaller variance) than the separate estimation of each parameter, even if the parameters are completely unrelated".

Is this statement, or some modification of it, true? If so, is there an intuitive explanation? Can anyone point me to a reference?

your statement doesn't make a whole lot of sense on its own. you could construct examples where it's true and where it's not true. it all depends on the question you're asking. what variances are you comparing?

maybe you're thinking about regression? if you keep adding variables to the regression equation the R-squared can't go down, and depending on the data it can go up. so, by adding a bunch of arbitrary variables you can explain more variation in the response variable, but the result isn't very useful. posted by cupcake1337 at 3:42 PM on February 20, 2012
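The R-squared point above is easy to check numerically. Here is a minimal sketch (not from the thread, just an illustration): fit ordinary least squares with and without some extra columns of pure noise, and observe that R-squared never decreases. The variable names and the specific coefficients are made up for the demo.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
x = rng.normal(size=(n, 3))
# response depends only on the three real regressors, plus noise
y = x @ np.array([1.5, -2.0, 0.5]) + rng.normal(size=n)

def r_squared(X, y):
    # append an intercept column and fit ordinary least squares
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1.0 - resid @ resid / np.sum((y - y.mean()) ** 2)

r2_base = r_squared(x, y)
junk = rng.normal(size=(n, 5))  # arbitrary, unrelated regressors
r2_junk = r_squared(np.column_stack([x, junk]), y)
# adding columns can only shrink (or leave unchanged) the residual
# sum of squares, so r2_junk >= r2_base
```

The in-sample fit improves mechanically, which is exactly why this kind of "improvement" isn't useful on its own.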

I am not sure what to change to make the statement make sense, as it is just a vague recollection. My interpretation of the performance of the estimate would lead me to say that the variances being compared would be the variances of each variable separately, under each estimation procedure/approach.

If the performance of linear regression does not change by adding variables, then it does not sound like the foggy result I have in mind. posted by TheyCallItPeace at 6:03 PM on February 20, 2012

Found it! I was thinking of Stein's example, although it does not appear to lead to a real improvement in the estimate of any given variable. posted by TheyCallItPeace at 6:08 PM on February 20, 2012 [1 favorite]
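For anyone who lands here later: Stein's example can be demonstrated with a short simulation. The sketch below (my own illustration, not from the thread) draws one noisy observation of several unrelated means and compares the total squared error of the plain observation against the positive-part James-Stein shrinkage estimate; the joint (shrunken) estimate has lower total risk even though the parameters have nothing to do with each other, which matches the recollection in the question. The dimension and trial count are arbitrary choices for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 10                       # dimension; James-Stein needs d >= 3
theta = rng.normal(size=d)   # true, completely unrelated parameters
n_trials = 20000

mse_mle = 0.0
mse_js = 0.0
for _ in range(n_trials):
    x = theta + rng.normal(size=d)  # one noisy observation per parameter
    # positive-part James-Stein: shrink all coordinates toward zero jointly
    shrink = max(0.0, 1.0 - (d - 2) / np.dot(x, x))
    js = shrink * x
    mse_mle += np.sum((x - theta) ** 2)
    mse_js += np.sum((js - theta) ** 2)

mse_mle /= n_trials  # total risk of estimating each parameter separately
mse_js /= n_trials   # total risk of the joint shrinkage estimate
```

Note the caveat already made in the thread: the improvement is in the *combined* squared error over all coordinates, not a guaranteed improvement for any individual parameter.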

