Precision vs Accuracy

Let us make a few definitions:

precision
- the number of digits available to represent the mantissa.

accuracy
- the maximum error we introduce by truncating the digits. This is half the value of the least significant digit present.
For example, if we store the national debt as

```
4.137e12
```

we can be off by as much as

```
0.0005e12 = 5.000e8
```

or \$500 million.

But the accuracy depends on the value of the exponent. For example, the number of people in this course is

```
8.300e1
```

which is off by at most

```
0.0005e1 = 5.000e-3
```

or only 0.005 people.

In both of these cases we have 4 digits of precision, but vastly different accuracy in the representation.
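The two cases above can be checked with a short sketch. The helper below (a hypothetical function, not from the text) computes the accuracy as half the value of the least significant mantissa digit:

```python
def representation_accuracy(exponent, precision):
    """Half the value of the least significant digit of a decimal
    mantissa of the form d.ddd... with `precision` total digits."""
    # The mantissa has (precision - 1) digits after the decimal point,
    # so its least significant digit has value 10**(exponent - precision + 1).
    return 0.5 * 10 ** (exponent - precision + 1)

# National debt, 4.137e12: 4 digits of precision, exponent 12.
print(representation_accuracy(12, 4))   # 5.000e8, i.e. $500 million

# Class size, 8.300e1: same 4 digits of precision, exponent 1.
print(representation_accuracy(1, 4))    # 5.000e-3, i.e. 0.005 people
```

Same precision in both calls; the exponent alone drives the difference in accuracy.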

So we see that the limited number of digits affects the precision and accuracy with which we can store numbers in the computer. In addition, because of this limitation, performing arithmetic operations can affect the accuracy of the result.
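As an illustrative sketch (not part of the original text), Python's `decimal` module can simulate a machine that keeps only 4 digits of precision, showing how a single divide-then-multiply loses accuracy:

```python
from decimal import Decimal, getcontext

getcontext().prec = 4  # simulate a machine with 4 digits of precision

a = Decimal("1.000") / Decimal("3.000")   # rounds to 0.3333
b = a * Decimal("3.000")                  # gives 0.9999, not 1.000
print(b)                                  # prints "0.9999"
```

The true answer is exactly 1, but the rounding performed during the division carries an error that the multiplication cannot undo.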

For example, consider:

- Division
- Multiplication