I'm curious: I've been programming JavaScript for a few years already, but I sometimes get confused when I see the following variable declarations (of course, these could be any other numbers as well):
var exampleOne = 0.5;
var exampleTwo = .5;
What is the difference between these two, if there is any? Is there some sort of hidden benefit that I clearly don't understand?
3 Comments
- Some people find the first version easier to read, but there's no difference to the computer. – Barmar, Feb 18, 2014 at 1:52
- Relevant section in the spec: es5.github.io/#x7.8.3 (DecimalLiteral) – Felix Kling, Feb 18, 2014 at 1:52
- The difference is one character. – Ja͢ck, Feb 18, 2014 at 1:54
2 Answers
To quote the specification:
0.5
matches the rule DecimalLiteral :: DecimalIntegerLiteral . DecimalDigits which is evaluated as (MV means mathematical value):
The MV of DecimalLiteral :: DecimalIntegerLiteral . DecimalDigits is the MV of DecimalIntegerLiteral plus (the MV of DecimalDigits times 10⁻ⁿ), where n is the number of characters in DecimalDigits.
.5
matches the rule DecimalLiteral :: . DecimalDigits which is evaluated as
The MV of DecimalLiteral :: . DecimalDigits is the MV of DecimalDigits times 10⁻ⁿ, where n is the number of characters in DecimalDigits.
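Working the rules through by hand: in 0.5, the DecimalIntegerLiteral is 0 and DecimalDigits is 5, so the MV is 0 + 5 × 10⁻¹ = 0.5; in .5, the MV is simply 5 × 10⁻¹ = 0.5.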
So you can see that the only difference is that the value of the digits preceding the . are added to the final value. And adding 0 to a value doesn't change the value.
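A quick check in a console confirms this (a minimal sketch; the variable names are mine, not from the spec):
// Both literal forms parse to the same mathematical value.
var fromBoth = 0.5;       // MV: 0 + 5 × 10⁻¹
var fromDigitsOnly = .5;  // MV: 5 × 10⁻¹
console.log(fromBoth === fromDigitsOnly); // true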
There is no difference.
The Numeric Literals are parsed equivalently - that is, both 0.5 and .5 (as would .50) represent the same number. (Unlike most other languages, JavaScript has only one kind of number.)
I prefer to always include the [optional] leading 0 before the decimal.
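To illustrate that single number type (a minimal sketch of my own, not from the answer): integer-looking and fractional-looking literals all produce the same double-precision number value:
var x = .5;
var y = 0.50;
console.log(typeof x, typeof y); // "number" "number"
console.log(x === y);            // true
console.log(1 === 1.0);          // true - there is no separate integer type
console.log(x.toString());       // "0.5" - engines print the canonical form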