Don't understand why numbers add up weird



var money = [["PENNY", 1.01], ["NICKEL", 2.05], ["DIME", 3.10], ["QUARTER", 4.25], ["ONE", 90.00], ["FIVE", 55.00], ["TEN", 20.00], ["TWENTY", 60.00], ["ONE HUNDRED", 100.00]];

var total = 0;

for (var i = 0; i < money.length; i++) {
	total += money[i][1];
}
console.log(total);

I don’t really understand why the var total ends up being 335.40999999999997 instead of a number rounded to two decimals. I know that I can round the number with Math.round, or round it to as many decimal places as I want. I solved the challenge already, but this remained a mystery. I had the same problem some time ago when I was adding some other numbers in a similar way. What exactly happens here?

It’s because floating-point arithmetic in JavaScript is not exact: numbers in JavaScript are always stored as double-precision (64-bit) floating-point values, and many decimal fractions (like 1.01 or 2.05) have no exact binary representation, so the nearest representable value gets stored instead.
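As a quick illustration (mine, not from the original post), you can ask JavaScript to print more digits than it normally shows; the literals are already stored as approximations before any addition happens:

// Show more digits than the default string conversion does;
// the stored doubles are only the nearest representable values
console.log((1.01).toPrecision(20)); // prints something like 1.0100000000000000089
console.log((2.05).toPrecision(20)); // prints something like 2.0499999999999998224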

If you console.log inside your loop, you can see the culprit is 1.01:

1.01 1.01
2.05 3.0599999999999996
3.1 6.16
4.25 10.41
90 100.41
55 155.41
20 175.41
60 235.41
100 335.40999999999997
335.40999999999997
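For reference, the output above comes from logging the value being added and the running total on each pass, something like this:

var total = 0;
for (var i = 0; i < money.length; i++) {
	total += money[i][1];
	// log the current value and the running total so far
	console.log(money[i][1], total);
}
console.log(total);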

Furthermore, if you do
console.log(2.05 + 1.01 === 3.06) you get false,
but console.log(2.05 + 1.01 === 3.0599999999999996) you get true.
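If you ever need to compare results like these, a common workaround (a sketch with a hypothetical helper name, not part of any library) is to check that the difference is smaller than a tiny tolerance instead of testing exact equality:

// nearlyEqual is an illustrative helper for comparing floats within a tolerance
function nearlyEqual(a, b) {
	return Math.abs(a - b) < 1e-9; // tolerance that is generous for currency-sized numbers
}
console.log(nearlyEqual(2.05 + 1.01, 3.06)); // true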

Quirky? Indeed, welcome to JS. If you really want a more thorough explanation, read here: http://floating-point-gui.de/basic/

If not, just be careful when dealing with floating-point numbers, and always round to exactly what you need, especially when precision is of paramount importance.
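For money in particular, one common way to do that (my sketch, not the official challenge solution) is to work in whole cents so every intermediate sum is an exact integer, converting back to dollars only at the end:

var totalCents = 0;
for (var i = 0; i < money.length; i++) {
	// convert each dollar amount to integer cents before summing
	totalCents += Math.round(money[i][1] * 100);
}
console.log(totalCents / 100); // 335.41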
