
Accuracy, Resolution, and Precision

Jay Kickliter Posts: 446
edited 2009-03-04 00:28 in General Discussion
I'm looking to use a tilt-compensated compass in the IMU object I'm working on, mainly because it outputs pre-computed pitch and roll, which would free up some cog RAM by not needing the PASM atan/atan2 routines. My problem is that the datasheet for the compass says it has only 1-degree accuracy. But it also has 0.1-degree resolution and 0.2-degree repeatability.

Can I take it to mean that although it may be off by a degree, that error can be subtracted out, given that it has 0.2-degree repeatability? 1 degree of error is too much for my purposes, but 0.2 degree is fine.

Comments

  • Philldapill Posts: 1,283
    edited 2009-03-01 19:57
    I think it can be explained best as an example...

    If your absolute heading is 57.3 degrees, then your compass may say it's anywhere between 56.3 and 58.3 degrees. It will give a reading in tenths of a degree, but will always be within 0.2 degrees of its other readings at 57.3 degrees. Basically, if it says you are heading 57.7 degrees, then the variance will always be within 0.2 degrees of that.

    Make sense? I've always had a hard time understanding the difference between those three words, myself.

    EDIT: Better put, Accuracy deals with the relationship between the measuring device and the real world. Resolution deals with the relationship between the device and the numerical system by which it outputs readings. Precision deals with the variance between measurements within the device itself.
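
    To put those three terms into one toy model (the numbers here are made up, not from the compass datasheet): a reading can be thought of as the true heading, plus a fixed bias (accuracy), plus a small random wobble (repeatability), rounded to the output step (resolution). A minimal C sketch:

        /* Toy model of one compass reading:
         *   accuracy      -> a fixed bias between the true heading and the sensor
         *   repeatability -> random wobble from one reading to the next
         *   resolution    -> the reported value is rounded to a fixed step
         * The numbers below are illustrative only, not from any datasheet.
         */
        #include <stdio.h>
        #include <stdlib.h>
        #include <math.h>

        #define BIAS_DEG    0.8   /* somewhere within the 1-degree accuracy spec */
        #define WOBBLE_DEG  0.2   /* repeatability band                          */
        #define STEP_DEG    0.1   /* resolution (smallest reported step)         */

        /* one simulated reading of a fixed true heading */
        static double read_compass(double true_deg)
        {
            double wobble = ((double)rand() / RAND_MAX - 0.5) * 2.0 * WOBBLE_DEG;
            double raw    = true_deg + BIAS_DEG + wobble;  /* accuracy + repeatability */
            return round(raw / STEP_DEG) * STEP_DEG;       /* resolution               */
        }

        int main(void)
        {
            double true_deg = 57.3;
            for (int i = 0; i < 5; i++)
                printf("reading %d: %.1f deg (true %.1f)\n", i, read_compass(true_deg), true_deg);
            return 0;
        }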
  • Jay Kickliter Posts: 446
    edited 2009-03-01 21:47
    Thanks. So as long as repeatability is within what you need, and you can develop a baseline reference, then accuracy isn't that important?
  • SRLM Posts: 5,045
    edited 2009-03-01 22:01
    My guess is that accuracy can drift at the rate of repeatability: so, it could read 57.7, then 57.5, then 57.3, and so on. A guess, but to say that your device is always off by the same amount seems like something that could easily be calibrated out by the device itself...
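
    For what it's worth, here's a minimal C sketch of that kind of one-point calibration against a known reference heading; the function names and numbers are placeholders, not from any actual driver. Once the offset is stored and subtracted, whatever error remains is roughly the repeatability (plus any drift), rather than the full 1-degree accuracy figure.

        /* One-point offset calibration: aim the sensor at a heading you trust
         * once, remember the difference, and subtract it from later readings.
         * The headings below are made up for illustration.
         */
        #include <stdio.h>
        #include <math.h>

        static double heading_offset = 0.0;

        /* call once with the sensor aimed at a known reference heading */
        void calibrate_offset(double known_heading_deg, double measured_deg)
        {
            heading_offset = measured_deg - known_heading_deg;
        }

        /* corrected heading, wrapped back into 0..360 */
        double corrected_heading(double measured_deg)
        {
            double h = fmod(measured_deg - heading_offset, 360.0);
            return (h < 0.0) ? h + 360.0 : h;
        }

        int main(void)
        {
            calibrate_offset(90.0, 90.8);               /* sensor read 90.8 while truly facing 90 */
            printf("%.1f\n", corrected_heading(57.9));  /* later reading, bias removed -> 57.1    */
            return 0;
        }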
  • Phil Pilgrim (PhiPi) Posts: 23,514
    edited 2009-03-01 23:06
    Jay,

    You will probably need to calibrate your compass anyway, just to account for the effects of nearby ferrous metals and electrically-induced magnetic fields. Although this will be less of a concern in a balloon payload than in your workshop, say, it's still a factor that needs consideration.

    -Phil
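
    One general software-side way to fold that kind of installation-dependent error into the readings (a generic technique, not specific to any particular compass module) is a small deviation table: rotate the mounted unit to a handful of known headings, record the error at each one, and interpolate a correction for everything in between. A rough C sketch with made-up table values:

        /* Deviation-table correction: error measured at a few known headings,
         * linearly interpolated in between.  Table values are made up.
         */
        #include <stdio.h>
        #include <math.h>

        #define N_POINTS 8
        static const double cal_heading[N_POINTS] = {   0,  45,  90,  135,  180,  225,  270, 315 };
        static const double cal_error[N_POINTS]   = { 0.6, 0.9, 0.4, -0.2, -0.7, -0.5,  0.1, 0.5 };

        double deviation_corrected_heading(double measured)
        {
            measured = fmod(measured, 360.0);
            if (measured < 0.0) measured += 360.0;

            int i = (int)(measured / 45.0) % N_POINTS;      /* segment this heading falls in  */
            int j = (i + 1) % N_POINTS;                     /* next calibration point (wraps) */
            double t   = (measured - cal_heading[i]) / 45.0;
            double err = cal_error[i] + t * (cal_error[j] - cal_error[i]);

            double h = fmod(measured - err, 360.0);
            return (h < 0.0) ? h + 360.0 : h;
        }

        int main(void)
        {
            printf("200.0 -> %.1f\n", deviation_corrected_heading(200.0));
            return 0;
        }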
  • Carl Hayes Posts: 841
    edited 2009-03-02 06:43
    Hi, Jay --

    Accuracy is the degree to which the readings represent the true value of the thing being measured. One-degree accuracy means that the measurement can always be as much as one degree in error inside the device, before it is even presented to the outside world.

    Resolution, also called precision, is the fineness of the gradations in the representation to the outside world. If your device can represent 1.0 degrees, 1.1, 1.2, etc., but not, for example, 1.15 -- in other words, if the reading changes, it always changes by one or more whole tenths -- then the reading it presents can include half a step, or 0.05 degree, as additional error (in addition to the one-degree accuracy error). So the measurement you see can be off by as much as 1.05 degrees because of the (accuracy error) plus the (resolution error). Resolution error is also called quantization error, but only when the presentation is digital. Resolution is simply a measure of the fineness with which it can present any result, whether perfectly accurate or wildly inaccurate.

    Repeatability is just what you'd expect. If the thing measures the same real-world physical quantity two or more times, and the real-world value didn't change at all, 0.2-degree repeatability says that the result presented might still change by 0.2 degrees. This is in addition to the woes from the accuracy and resolution limits.

    Suppose, for example, the device were much less good than it is. Let's say it's accurate within fifteen degrees. However, it always gives you five digits of resolution to the right of the decimal point, even if those five digits are totally meaningless. Then the accuracy is still plus/minus 15 degrees, even though the reading gives you a number to 0.00001 of a degree. It's just a more detailed (more precise, or higher-resolution) lie.

    Now, suppose our imaginary crummy equipment also wobbles around a bit -- say it wobbles around by three degrees all by itself. That means that the lie it tells you will still be very detailed (resolution), but not only wrong (accuracy) but also inconsistent from one measurement to the next (repeatability).

    I guarantee the accuracy, repeatability, and precision of this answer.

    ▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
    -- Carl, nn5i@arrl.net

  • Carl Hayes Posts: 841
    edited 2009-03-04 00:28
    Then, of course, there's also linearity and a quite different thing, differential linearity. Which of these is more important, if either is important at all, depends upon the application.

    ▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
    -- Carl, nn5i@arrl.net
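
    As a rough illustration of the difference (made-up data, not from any datasheet): integral nonlinearity is the worst-case deviation of the transfer curve from a straight line through its endpoints, while differential nonlinearity is the worst-case deviation of each individual step from the ideal step size. A minimal C sketch that computes both from a table of measured values:

        /* Integral vs. differential nonlinearity of a measured transfer curve.
         * INL: worst deviation from the straight line through the endpoints.
         * DNL: worst deviation of each step from the ideal step size.
         * The measured[] values are made up for illustration.
         */
        #include <stdio.h>
        #include <math.h>

        #define N 9
        static const double measured[N] = { 0.0, 0.9, 2.1, 3.0, 4.2, 4.9, 6.1, 7.0, 8.0 };

        int main(void)
        {
            double ideal_step = (measured[N - 1] - measured[0]) / (N - 1);
            double inl = 0.0, dnl = 0.0;

            for (int i = 0; i < N; i++) {
                double line = measured[0] + ideal_step * i;   /* endpoint-fit straight line */
                if (fabs(measured[i] - line) > inl)
                    inl = fabs(measured[i] - line);
                if (i > 0 && fabs((measured[i] - measured[i - 1]) - ideal_step) > dnl)
                    dnl = fabs((measured[i] - measured[i - 1]) - ideal_step);
            }

            printf("INL = %.2f, DNL = %.2f (in output units)\n", inl, dnl);
            return 0;
        }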