Python Decimal module stops adding Decimals to another Decimal once it reaches 1.0

Question:

I am using Python’s decimal module to do some work involving decimals. I have the following code:

from decimal import *
getcontext().prec = 2  # use two decimal places

counter = Decimal(0)
while counter != Decimal(1000.01):
    print(counter)
    counter += Decimal(0.01)

This should print every number from 0 to 1000.00 in increments of 0.01, but for some reason,
the numbers 0.01 to 0.09 have three decimal places (e.g. 0.010 instead of 0.01), and after counter reaches 1.0 (with one decimal place for some reason), it stops increasing at all and remains at 1.0. The output looks something like this:

0
0.010
0.020
0.030
0.040
0.050
0.060
0.070
0.080
0.090
0.10
0.11
0.12
...
0.97
0.98
0.99
1.0
1.0
1.0

(repeats 1.0 forever)

What am I doing wrong here?

Asked By: Justin Cheng


Answers:

Precision sets the total number of significant digits used in calculations, not the number of digits after the decimal point, so to represent 1000.01 you need a precision of at least 6.

Also, use strings to initialize a Decimal: a float argument is already inaccurate for any value that can’t be represented exactly in base 2, and Decimal will faithfully preserve that inaccuracy.

Example:

>>> from decimal import Decimal as d, getcontext
>>> d(0.01)  # don't use float.  It is already inaccurate
Decimal('0.01000000000000000020816681711721685132943093776702880859375')
>>> getcontext().prec  # default precision
28
>>> d(0.01) + d(0.01)
Decimal('0.02000000000000000041633363423')
>>> d('0.01')   # exact!
Decimal('0.01')
>>> getcontext().prec = 6
>>> d(0.01)  # doesn't affect initialization.
Decimal('0.01000000000000000020816681711721685132943093776702880859375')
>>> d(0.01) + d(0.01)  # now the calculation truncates to 6 digits of precision
Decimal('0.0200000')   # note that 2 is the first digit and 00000 are the next 5.
>>> d('0.01') + d('0.01')
Decimal('0.02')

Fixes to OP example:

from decimal import *
getcontext().prec = 6  # significant digits of precision; the default (28) also works

counter = Decimal('0')
while counter != Decimal('1000.01'):
    print(counter)
    counter += Decimal('0.01')
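A quick way to sanity-check the fix is to run the same loop with a smaller bound (1.00 here, an arbitrary choice just to keep it fast) and confirm the counter now passes 1.0 and the loop terminates:

```python
from decimal import Decimal, getcontext

getcontext().prec = 6  # enough digits for every intermediate sum

# Same structure as the fixed loop, with a smaller stopping value.
counter = Decimal('0')
steps = 0
while counter != Decimal('1.00'):
    counter += Decimal('0.01')  # string literal keeps each step exact
    steps += 1

print(counter, steps)  # 1.00 100 -- exactly 100 steps of 0.01, no drift
```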
Answered By: Mark Tolonen