Another floating point question


Problem description


I have read most of the posts on here regarding floating point, and I understand the basic underlying issue that using IEEE 754 (and just by the nature of storing numbers in binary) certain fractions cannot be represented. I am trying to figure out the following: If both Python and JavaScript use the IEEE 754 standard, why is it that executing the following in Python

.1 + .1

Results in 0.20000000000000001 (which is to be expected)

Whereas in JavaScript (at least in Chrome and Firefox) the answer is .2

However, performing

.1 + .2

In both languages results in 0.30000000000000004

In addition, executing var a = 0.3; in JavaScript and printing a results in 0.3

Whereas doing a = 0.3 in Python results in 0.29999999999999999

I would like to understand the reason for this difference in behavior.

In addition, many of the posts on SO link to a JavaScript port of Java's BigDecimal, but the link is dead. Does anyone have a copy?

Solution

doing a = 0.3 in Python results in 0.29999999999999999

Not quite -- watch:

>>> a = 0.3
>>> print a
0.3
>>> a
0.29999999999999999

As you see, printing a does show 0.3 -- because by default print rounds to 6 or 7 decimal digits, while typing an expression (here a is a single-variable expression) at the prompt shows the result with over twice as many digits (thus revealing floating point's intrinsic limitations).
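The session above is Python 2 output. As a side note (an addition, not part of the original answer): since Python 3.1 and 2.7, `repr` instead picks the shortest string that round-trips back to the same float, which is essentially JavaScript's display rule, so modern Python matches the Chrome/Firefox results from the question. A minimal Python 3 sketch; the extra digits can still be revealed with explicit formatting:

```python
# Python 3: repr() picks the shortest string that round-trips,
# matching JavaScript's display behavior for these values.
print(repr(0.1 + 0.1))   # '0.2' -- the sum is exactly the double nearest 0.2
print(repr(0.1 + 0.2))   # '0.30000000000000004' -- the sum is NOT the double nearest 0.3

# Asking for 17 significant digits reproduces the old Python 2 style output:
print(format(0.3, '.17g'))        # 0.29999999999999999
print(format(0.1 + 0.1, '.17g'))  # 0.20000000000000001
```

So the Python/JavaScript difference in the question is purely a difference in default display rounding, not in the underlying IEEE 754 arithmetic, which is identical in both languages.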

JavaScript may have slightly different rounding rules about how to display numbers, and the exact details of that rounding are enough to explain the differences you observe. Note, for example (in Chrome's JavaScript console):

> (1 + .1) * 1000000000
  1100000000
> (1 + .1) * 100000000000000
  110000000000000.02

See? If you manage to see more digits, the anomalies (which are inevitably there) become visible too.
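Regarding the dead BigDecimal link in the question: one alternative (an addition here, not from the original answer) is Python's standard-library `decimal` module, which provides BigDecimal-style exact decimal arithmetic, and whose constructor can also expose the exact value a binary double actually stores. A short sketch:

```python
from decimal import Decimal

# Constructing a Decimal from a float exposes the exact value the
# binary double stores -- the source of all the anomalies above:
print(Decimal(0.3))
# 0.299999999999999988897769753748434595763683319091796875

# Constructing from strings keeps the arithmetic exactly decimal,
# much like Java's BigDecimal:
print(Decimal('0.1') + Decimal('0.2'))                    # 0.3
print(Decimal('0.1') + Decimal('0.2') == Decimal('0.3'))  # True
```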
