Tell us what’s happening:
Hello, I ran into a strange problem. I wrote code that should work, but it failed on test #5, falling short by one penny. The code failed because the interpreter seems to have a rounding problem in the nth place after the decimal point.
To test it, I just put a simple return statement at the very beginning, and here’s the output:
// outputs 1.01 - everything is ok
// outputs 0.010000000000000009 - what?
So because of this rounding problem, the change I compute when I reach pennies is 0.039999999999994872 instead of 0.04.
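For what it's worth, this isn't specific to your code or to any one interpreter: JavaScript numbers are IEEE 754 binary doubles, and most decimal fractions (like 0.1 or 1.11) have no exact binary representation, so tiny errors creep into arithmetic. A minimal sketch of the same effect:

```javascript
// IEEE 754 binary doubles cannot represent most decimal fractions
// exactly, so arithmetic accumulates tiny representation errors.
console.log(0.1 + 0.2);         // 0.30000000000000004
console.log(0.1 + 0.2 === 0.3); // false

// Comparisons on currency values therefore need a tolerance
// (or a different representation; see below).
console.log(Math.abs((0.1 + 0.2) - 0.3) < Number.EPSILON); // true
```

This is the same behavior in every language that uses binary floating point, not just JavaScript.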
I actually “fixed” it by adding .toFixed(2) here and there, but is the interpreter really supposed to behave like that? Is it a bug or a feature?
Link to the challenge: