Precision vs. Accuracy

Just as a quick note, I thought it would be prudent to explain the difference between precision and accuracy. Accuracy can be expressed as how close the mean of the results comes to the target value. If you aimed to type 30 words per minute and, during a typing session, typed 20 wpm in the first minute, then 25, 35, 40, and 30, the mean is 30 wpm: 20 + 25 + 35 + 40 + 30 = 150 words in 5 minutes, or 30 words per minute.
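
To put that arithmetic in code, here's a minimal sketch in plain Python; the sample and target values are just the numbers from the example above:

```python
# Accuracy as the gap between the mean of the samples and the target value.
samples = [20, 25, 35, 40, 30]   # words per minute, one reading per minute
target = 30                       # the speed we were aiming for

mean = sum(samples) / len(samples)   # (20 + 25 + 35 + 40 + 30) / 5 = 30.0
offset = mean - target               # 0.0 -> perfectly accurate on average

print(f"mean = {mean} wpm, offset from target = {offset} wpm")
```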

Precision, on the other hand, is a measure of variation. If you aimed to type 30 but typed 34, 36, 35, 35, and 35, that's an average of 35, which is not accurate. However, your results didn't vary by a large margin, so they were precise.
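
One way to put a number on "didn't vary by a large margin" is the standard deviation: a small spread means high precision even when the mean misses the target. A minimal sketch, again using the values above:

```python
import statistics

# Precise but not accurate: the mean misses the target, but the spread is small.
samples = [34, 36, 35, 35, 35]   # words per minute
target = 30

mean = statistics.mean(samples)       # 35 -> off target by 5 wpm (poor accuracy)
spread = statistics.pstdev(samples)   # ~0.63 wpm -> very little variation (good precision)

print(f"mean = {mean} wpm, spread = {spread:.2f} wpm")
```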

To show this in a picture, a common illustration for this topic, imagine you're an archer. The following results of aiming for the bullseye are shown with labels:

Overall, precision is often preferred to accuracy when the offset can be compensated for. If a digital sample were being taken, software could compensate for the offset to produce a result that is both accurate and precise. However, in systems where that compensation is difficult, accuracy may be preferred, since simple filtering techniques can be used to average the results. In such a case, both precision and accuracy can be gained, at the cost of being able to perceive rapid changes.
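
To make those two strategies concrete, here is a rough sketch of each, not tied to any particular system; the calibration offset and the filter window size are made-up values for illustration:

```python
from collections import deque

# 1) Precise-but-offset readings: subtract a known calibration offset in software.
calibration_offset = 5.0   # illustrative value, assumed known from calibration
def compensate(reading):
    return reading - calibration_offset

# 2) Accurate-but-noisy readings: smooth them with a simple moving-average filter.
#    A larger window gives a steadier (more precise) output, but reacts more
#    slowly to genuine rapid changes -- the trade-off mentioned above.
window = deque(maxlen=5)
def filtered(reading):
    window.append(reading)
    return sum(window) / len(window)

precise_but_offset = [34, 36, 35, 35, 35]
accurate_but_noisy = [20, 25, 35, 40, 30]

print([compensate(r) for r in precise_but_offset])          # hovers around the target of 30
print([round(filtered(r), 1) for r in accurate_but_noisy])  # converges toward 30, but lags
```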