JavaScript does not define separate numeric types such as int, long, or float. JavaScript numbers are always stored as double-precision (64-bit) floating point values, following the international IEEE 754 standard.

Integers are exact up to 15 digits (more precisely, up to Number.MAX_SAFE_INTEGER, which is 2^53 - 1 = 9007199254740991). A number carries at most about 17 significant digits, and floating point arithmetic is not always 100% accurate. This can lead to some surprising results, so beware!

Example

console.log(0.1 + 0.2 === 0.3); // false
console.log(0.1 + 0.2); // 0.30000000000000004

console.log(9999999999999999 === 10000000000000000); // true
console.log(9999999999999999); // 10000000000000000
console.log(999999999999999); // 999999999999999
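
Two common ways to work around these quirks, sketched below: compare floats with a small tolerance (Number.EPSILON) instead of strict equality, and use Number.isSafeInteger to detect when an integer is too large to be represented exactly. The nearlyEqual helper is an illustrative name, not a built-in.

```javascript
// Compare floats with a tolerance instead of === (nearlyEqual is our own helper)
function nearlyEqual(a, b, epsilon = Number.EPSILON) {
  return Math.abs(a - b) < epsilon;
}

console.log(nearlyEqual(0.1 + 0.2, 0.3)); // true

// Or avoid fractions entirely by working in integer units (e.g. cents):
// 10 + 20 cents is exact, and dividing once at the end stays accurate here
console.log((10 + 20) / 100 === 0.3); // true

// Detect silent precision loss in large integers
console.log(Number.isSafeInteger(999999999999999));  // true  (15 digits)
console.log(Number.isSafeInteger(9999999999999999)); // false (16 digits)
```

For exact arithmetic on integers beyond the safe range, BigInt is the usual escape hatch (e.g. 9999999999999999n).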