```javascript
var worfsMoney = 600.90;
var bloodWinePrice = 200.30;
var worfsTotal = bloodWinePrice * 3;

// Outputs: false
console.log(worfsMoney >= worfsTotal);

// Outputs: 600.9000000000001
console.log(worfsTotal);
```
As you can see, simply checking whether Worf has enough money fails because his total doesn't come to exactly $600.90. But why is that? JavaScript numbers are binary floating-point values, and a decimal fraction like 200.30 has no exact binary representation, so a tiny rounding error creeps in as soon as we multiply.
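You can see the same rounding error in isolation with the classic floating-point example (the 0.1 + 0.2 values below are the textbook illustration, not part of Worf's tab):

```javascript
// Decimal fractions like 0.1, 0.2, and 200.30 have no exact binary
// representation, so arithmetic on them picks up tiny errors.
console.log(0.1 + 0.2);          // 0.30000000000000004
console.log(0.1 + 0.2 === 0.3);  // false

// The same thing happens to Worf's total:
console.log(200.30 * 3);         // 600.9000000000001
```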
Your second thought might be to do something like this:
```javascript
// Outputs: true
console.log(worfsMoney.toFixed(2) >= worfsTotal.toFixed(2));
```
But even though we get the correct result here, using toFixed means that we are comparing formatted strings rather than actual numbers. What we really want is a way to keep an accurate numeric representation of our money.
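To see why comparing the strings that toFixed returns is risky, consider amounts of different lengths (these numbers are made up purely to show the failure mode):

```javascript
// >= on two strings compares them character by character,
// which has nothing to do with numeric order.
console.log((9).toFixed(2));                     // "9.00"
console.log((10).toFixed(2));                    // "10.00"
console.log((9).toFixed(2) >= (10).toFixed(2));  // true, because "9" > "1"
```

The earlier comparison only works because both formatted strings happen to be identical. A more reliable approach is to do the arithmetic with integers in the first place, treating cents as the basic unit of currency: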
```javascript
var worfsMoney = 60090;
var bloodWinePrice = 20030;
var worfsTotal = bloodWinePrice * 3;

// Outputs: true
console.log(worfsMoney >= worfsTotal);

// Outputs: 60090
console.log(worfsTotal);
```
Because integers in this range are represented exactly (JavaScript can represent every integer up to Number.MAX_SAFE_INTEGER, which is 2^53 - 1 and far larger than any realistic bar tab), simply shifting our perspective to treat cents as the basic unit of currency means we can avoid the rounding problems of floating-point comparisons.
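In practice, prices usually arrive as dollar amounts, so you need to convert them to cents before comparing. Here is a minimal sketch of that conversion (toCents is just an illustrative helper name, not a built-in):

```javascript
// Convert a dollar amount to integer cents.
// Math.round cleans up any floating-point noise introduced by the * 100.
function toCents(dollars) {
  return Math.round(dollars * 100);
}

var worfsMoney = toCents(600.90);      // 60090
var worfsTotal = toCents(200.30) * 3;  // 60090

// Outputs: true
console.log(worfsMoney >= worfsTotal);
```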
Thanks for reading!