Uncertainty Analysis for Dummies like me.
January 15, 2010 11:19 AM

Please recommend some good references on the topic of Test Measurement Uncertainty and Uncertainty Analysis.

Hi!

I have a background in EE, and this is a subject that my schooling did not teach me very well (sadly). I've reviewed ASME PTC 19.1-2005 "Test Uncertainty"; some of it was really helpful, but a lot of it flew over my head. None of the examples provided were simple. They were interesting, but overly complicated for what I believe are my needs.

I need a good reference for how to get from Point A - knowing the independent accuracies of my instruments, to Point B - knowing the accuracy of a particular number in a data file.

I feel like I've gotten a grip on correcting systematic errors in my measurement process. For instance, we have standard calibrated pressure transducers of various ranges that we use. Certificates of conformance and calibration traceability are kept to high standards here.

It is the random errors that I have a hard time working with. I just have so many questions that it is difficult to outline them all here. So I'm after a good reference that defines the measurement process, defines how to measure (or calculate) the error of that measurement process, and the standard ways of reporting this error.

If anyone can recommend some good references, I would greatly appreciate it. I have a stack of standards, but I need something at the introductory level before I feel like I can start applying the standards to my practice.

Thank you for your input. I lack mentorship on these subjects, so I must turn to the hive mind.

-nic
posted by nickerbocker to Education (10 answers total) 3 users marked this as a favorite
 
Best answer: This PDF was used in my college physics Electricity and Optics Lab. It's pretty low-level, but should answer your questions with regard to random error. You might be especially interested in the section titled "Combining Unrelated Sources of Error" and the sections on Error Prorogation.

This stuff is pretty difficult to do correctly! If you gave a bit more information about what sorts of transformations you're doing on the collected data to get "a particular number in a data file", I might be able to narrow down which type of error calculation you need to do.
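
To give the flavor of what that "combining unrelated sources" section covers, here's a rough sketch in Python. The numbers and names are made up for illustration, not taken from your setup:

    import math

    # Independent (uncorrelated) error sources combine in quadrature,
    # not by simple addition.  Example: a reading with a 1% instrument
    # error and a 0.5% acquisition error (both invented values).
    instrument_err = 0.01      # fractional error of the sensor
    daq_err        = 0.005     # fractional error of the acquisition system

    combined_err = math.sqrt(instrument_err**2 + daq_err**2)
    print(f"combined fractional error: {combined_err:.4f}")   # ~0.0112, i.e. about 1.1%

    # Propagation through a formula works the same way: if q = x * y and
    # x, y are independent, the fractional errors add in quadrature:
    #   (dq/q)^2 = (dx/x)^2 + (dy/y)^2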
posted by muddgirl at 11:27 AM on January 15, 2010


Propagation. Error Propagation.
posted by muddgirl at 11:32 AM on January 15, 2010


Response by poster: Thanks for the response.

Most of what we measure here is temperature and pressure. So, for instance, a pressure transducer has some reported error of 1%, let's say. I take that into a computerized DAQ system that has some other error % relating to its voltage input range. On top of that, there is always this pesky thing called noise in the system. According to ASME PTC 19.1-2005, I should sample the noise, take the RMS of the noise, and include that as an error. There also seems to be a preference towards analog filtering over digital filtering...as in, you measure the noise after your analog filters but w/o any digital filtering. I don't understand what the difference here is. Also, as my sample rate for measuring the noise goes up, my noise goes up because I'm seeing more stuff. What sample rate should I use to quantify the noise in my system? Is it some interval above the rate that I'm sampling the signal prior to digital filtering?
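
Here's roughly the calculation I think the standard is describing, as best I understand it (Python, with made-up numbers rather than our real instruments; please correct me if this is the wrong idea):

    import statistics

    # Invented example: 1% transducer error and 0.2% DAQ error
    # (both as a fraction of the reading), plus sampled noise.
    reading        = 100.0                  # psi
    transducer_err = 0.01  * reading        # 1.0 psi
    daq_err        = 0.002 * reading        # 0.2 psi

    # Noise samples taken with the process held steady, after the analog
    # filter but before any digital filtering, converted to psi:
    noise_samples = [0.3, -0.2, 0.1, -0.4, 0.25, -0.15, 0.05, -0.1]
    noise_rms = statistics.pstdev(noise_samples)   # RMS scatter about the mean

    # Independent contributions combined in quadrature (root-sum-square):
    total_err = (transducer_err**2 + daq_err**2 + noise_rms**2) ** 0.5
    print(f"reading = {reading} +/- {total_err:.2f} psi")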

Uhhhggg.....this stuff gets so hairy. I wish I had a mentor here to show me the ropes. I've been at this job for almost 4 years now, and I'm it as far as the EE guy goes. I've had to rely on books and references for everything. I need some hand holding here :P.
posted by nickerbocker at 11:36 AM on January 15, 2010


Best answer: This text was my undergrad reference. It is out of print, and that is a shame.
posted by jet_silver at 11:47 AM on January 15, 2010


Response by poster: Thanks for the links, guys. That is what I'm after here. Jet_silver, that book is available at Amazon Marketplace for not too much money. I'm going to pick it up. Thanks.
posted by nickerbocker at 12:01 PM on January 15, 2010


Best answer: Data Reduction and Error Analysis for the Physical Sciences by Philip Bevington is a good book for learning the error propagation and statistics you need for this kind of error analysis. However, it won't help you with the A/D filtering and sample rate issues. My guess regarding the analog vs. digital filters is that the analog filter will introduce noise on its own, while the digital filter will not. I don't understand why increasing your sample rate increases noise levels unless the noise is periodic in some way, such as 60 Hz line noise.
posted by wigner3j at 12:09 PM on January 15, 2010


Response by poster: I would like to know what a good procedure for sampling, recording, and reporting noise would be. As your sample rate goes up, you will see more transients from EMF pickup (probably). The standard I have says to just "sample it," but it is a little more complicated than that, I think...
posted by nickerbocker at 12:48 PM on January 15, 2010


Best answer: This book was what I used as an undergrad. I found it useful for exactly what you seem to want.

Plus, it has a funny picture on the front.
posted by chicago2penn at 1:13 PM on January 15, 2010


Best answer: The references given so far are perfectly good, but aren't going to entirely answer your concerns. The reference above by muddgirl is really nice because it includes the calculus of calculating errors, which one doesn't often see in intro texts. Bevington and Taylor are both quite good for the basic math.

None of these, however, cover some of the important aspects of measurement uncertainty. Expanded uncertainty, for example, is widely used as an uncertainty estimate in place of the standard error because it's more general: the coverage factor can be chosen explicitly from the sample variance and the desired confidence in the result. Likewise, the Type A, holistic approach to measurement uncertainty estimation (evaluating the scatter of repeated measurements of the whole system) is much preferred to the Type B, error-propagation approach taught at most undergraduate levels. If you're not in a course that requires Type B, like an undergrad lab, you really, really want to use a Type A approach for anything real-world.
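
As a rough illustration of the Type A recipe (Python, with invented readings): repeat the measurement, use the scatter to get the standard uncertainty of the mean, then multiply by a coverage factor for the confidence level you want.

    import math
    import statistics

    readings = [101.3, 101.1, 101.4, 101.2, 101.5, 101.2, 101.3, 101.4]  # invented

    mean = statistics.mean(readings)
    s    = statistics.stdev(readings)          # sample standard deviation
    u    = s / math.sqrt(len(readings))        # standard uncertainty of the mean

    # Expanded uncertainty: coverage factor k sets the confidence level.
    # k = 2 is the usual choice for roughly 95% (strictly, k should come
    # from the t-distribution when the sample is small).
    k = 2
    U = k * u
    print(f"result = {mean:.2f} +/- {U:.2f} (k = {k}, ~95% confidence)")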

The "mother document" for uncertainty in industry is "The Guide to the Expression of Uncertainty in Measurement" (GUM), published by the ISO. It can be ordered through the ISO directly. A free PDF version can be had from BPIM. The GUM largely based on work by NIST in the US, and the NIST documents can also be had for free: here.

Now, the GUM is a really technical document, an international standard with all that entails. Lots of people have written commentaries and interpretations of it. Here's a nice plain-language introduction to the concepts: Bell, A Beginner's Guide to Uncertainty of Measurement.

That's a taste. Feel free to mail me if you have specific questions.
posted by bonehead at 1:41 PM on January 15, 2010


Response by poster: Thanks for the additional resources! I'm going to go rummage through our local used book store for these suggestions this weekend.
posted by nickerbocker at 3:35 PM on January 15, 2010

