Floating-Point Inaccuracy in the Computer World

This is a famous problem in computing. It originates in the design of the CPU, which uses binary as its essential unit. Let's look at the problem:

var a = 0.2 + 0.4;


In most browsers, the value comes out as 0.6000000000000001. This is very annoying when we want to display a value with one decimal place. There are a few well-known ways to deal with this issue; we focus on two here:
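To make the symptom concrete, here is a quick check you can run in Node.js or a browser console (the exact digits assume IEEE 754 double precision, which JavaScript numbers use):

```javascript
// 0.2 and 0.4 cannot be represented exactly in binary,
// so their sum picks up a tiny rounding error.
const a = 0.2 + 0.4;

console.log(a);         // 0.6000000000000001
console.log(a === 0.6); // false
```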

1. Math.round(a * 10) / 10

2. parseFloat(a.toFixed(1))
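Both methods produce the expected 0.6; a minimal sketch of each, using the same `a` as above:

```javascript
const a = 0.2 + 0.4; // 0.6000000000000001

// Method 1: scale up, round to the nearest integer, scale back down.
const m1 = Math.round(a * 10) / 10;

// Method 2: format to one decimal place as a string, then parse it back.
const m2 = parseFloat(a.toFixed(1));

console.log(m1, m2); // 0.6 0.6
```

Note that method 1 stays in pure arithmetic, while method 2 takes a round trip through a string, which is why one would expect it to be slower.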

I believe you will say that method 1 has higher performance. But can you say how many times faster method 1 is than method 2?


Please see the result here: