Floating point numbers are represented as the float type in Python. Internally, they are stored as base-two (binary) fractions. As a result, certain fractions cannot be represented exactly, only approximated. For more details, see the Python documentation on floating point arithmetic.
1.1 + 1.1
.1 + .2
As you can see, most of the math operations above behave as you would expect. However, with ".1 + .2", the inaccuracy of floating point arithmetic is apparent. For general applications, this inaccuracy isn't much of a concern. But when you are dealing with money or long-running calculations, the small inconsistencies can become a large problem. In Python, this can be circumvented by using the decimal module. The decimal module handles decimal arithmetic exactly, without the limitations of base-2 fractions. You can use the decimal module like the following:
from decimal import *
Decimal('.1') + Decimal('.2')
Notice that we import everything (*) from the decimal module. We give Decimal a string representation of the number and use arithmetic operators as if Decimal were a number. This yields a Decimal with the correct result.
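Why a string and not a float? A quick sketch of the difference (outputs shown as comments): constructing a Decimal from a float copies the float's binary approximation, so the inaccuracy comes along for the ride.

```python
from decimal import Decimal

# Constructing from a string preserves the exact decimal value.
print(Decimal('.1') + Decimal('.2'))  # 0.3

# Constructing from a float inherits the float's binary approximation.
print(Decimal(0.1))
# 0.1000000000000000055511151231257827021181583404541015625
```

This is why the examples in this article always pass strings to Decimal.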
Apart from overcoming the limitations of floating point arithmetic, the decimal module calculates results to a defined precision (28 significant digits by default) and rounds. Look at the example below. The float operation "2/3" yields 0.6666666666666666, where the repeating decimal is simply cut off; the decimal module instead rounds the final digit automatically, without you having to do so explicitly.
Decimal("2") / Decimal("3")
Don't want 28 digits of precision? You can change that like so:
getcontext().prec = 4
Decimal("2") / Decimal("3")
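Note that getcontext().prec changes the precision for the whole program. If you only need a different precision for part of a calculation, the change can be scoped with localcontext; a minimal sketch:

```python
from decimal import Decimal, localcontext

# Inside the with-block, precision is 10 significant digits;
# the surrounding context is restored when the block exits.
with localcontext() as ctx:
    ctx.prec = 10
    print(Decimal("2") / Decimal("3"))  # 0.6666666667

# Back to the surrounding context's precision here.
print(Decimal("2") / Decimal("3"))
```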
Notice that the data type of our result is a Decimal object. If you want to cleanly output it for the user, or use it as a number in another calculation or conditional, you can convert it like so:
float(Decimal("2") / Decimal("3"))
str(Decimal("2") / Decimal("3"))
int(Decimal("2") / Decimal("3"))
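To close the loop on the money use case mentioned earlier, here is a minimal sketch (the amounts and tax rate are made up for illustration) using quantize, the usual way to round a Decimal to a fixed number of cents:

```python
from decimal import Decimal, ROUND_HALF_UP

subtotal = Decimal("19.99") + Decimal("0.05")  # exact: 20.04
tax = subtotal * Decimal("0.0825")             # exact: 1.653300
# quantize rounds to two decimal places, here with half-up rounding.
total = (subtotal + tax).quantize(Decimal("0.01"), rounding=ROUND_HALF_UP)
print(total)  # 21.69
```

With floats, the same calculation could accumulate binary rounding error; with Decimal, every intermediate value is exact and the rounding happens only where you ask for it.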