I worked briefly at a bank, and one of the guys was trying to write a little JavaScript app that let punters see how much interest they would pay over a given period, without having to go back to the server all the time. Really simple stuff: a bit of dividing over a 3-year period.

JavaScript uses doubles for numbers, following the jolly old IEEE 754 standard (also published as ISO/IEC 60559) for how they behave in calculations.

9/3 was coming out at 2.9999999999 (9999999 …)

and there was no way to fix it.
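For what it's worth, the literal division 9/3 is exact in binary floating point, so the 2.999… almost certainly came from an intermediate value that isn't exactly representable in binary (decimal fractions like 0.1 usually aren't). A minimal sketch of the symptom, not his actual calculation:

```javascript
// 9 and 3 are both exactly representable as doubles,
// so the literal division is exact:
console.log(9 / 3);     // 3

// But decimal fractions like 0.1 and 0.3 are NOT exactly
// representable in binary, so a division that "should" give 3
// can come out just under:
console.log(0.3 / 0.1); // 2.9999999999999996

// The same artifact shows up in simple sums:
console.log(0.1 + 0.2); // 0.30000000000000004
```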

I still don’t know how he made it look right.
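The usual trick (no idea if this is what he actually did) is to keep the raw double for the maths and round only at the presentation layer:

```javascript
const raw = 0.3 / 0.1; // 2.9999999999999996 under the hood

// Round only for display, e.g. to two decimal places:
console.log(raw.toFixed(2)); // "3.00"

// Or snap to a sensible number of significant digits:
console.log(Number(raw.toPrecision(12))); // 3
```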

I tried a quick hack in Java for him: floats came out at 3, while doubles had the same problem. Java and JavaScript both follow IEEE 754, so no surprise there.
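You can see the same float-versus-double behaviour from JavaScript: Math.fround rounds a double to the nearest 32-bit float, and the coarser precision happens to snap the near-miss back to exactly 3. A sketch of the effect, not his Java code:

```javascript
const asDouble = 0.3 / 0.1;            // 2.9999999999999996
const asFloat = Math.fround(asDouble); // nearest 32-bit float

console.log(asDouble); // 2.9999999999999996
console.log(asFloat);  // 3 — float precision can't hold the near-miss,
                       // so it rounds to exactly 3
```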

It still amuses me that more precision meant a less accurate result, at least to a human.