Spark decimal type precision loss


Problem description


I'm doing some testing of spark decimal types for currency measures and am seeing some odd precision results when I set the scale and precision as shown below. I want to be sure that I won't have any data loss during calculations but the example below is not reassuring of that. Can anyone tell me why this is happening with spark sql? Currently on version 2.3.0

val sql = """select cast(cast(3 as decimal(38,14)) / cast(9 as decimal(38,14)) as decimal(38,14)) val"""
spark.sql(sql).show

This returns

+----------------+
|             val|
+----------------+
|0.33333300000000|
+----------------+
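The truncation happens before the outer cast. Dividing two `decimal(38,14)` values would, under Spark's Hive-derived typing rules, require a result type of `decimal(91,53)`, which overflows the 38-digit cap; with `spark.sql.decimalOperations.allowPrecisionLoss=true` (the default) Spark then shrinks the scale down to a minimum of 6 fractional digits, giving `decimal(38,6)` — hence `0.333333` padded with zeros after the cast back. The following is a sketch of that adjustment in Python; the formulas and constants mirror what Spark's `DecimalPrecision` rule is described as doing in SPARK-27089, and are an illustration rather than Spark's actual code:

```python
# Sketch of Spark's result-type inference for decimal division
# (hedged: formulas per the Hive-derived rules discussed in SPARK-27089).
MAX_PRECISION = 38        # hard cap on decimal precision in Spark SQL
MIN_ADJUSTED_SCALE = 6    # minimum scale kept when precision overflows

def divide_result_type(p1, s1, p2, s2):
    """Result (precision, scale) of decimal(p1,s1) / decimal(p2,s2)
    with allowPrecisionLoss=true (the default)."""
    scale = max(6, s1 + p2 + 1)
    precision = p1 - s1 + s2 + scale
    if precision <= MAX_PRECISION:
        return precision, scale
    # Overflow: keep all integer digits, shrink the scale,
    # but never below MIN_ADJUSTED_SCALE.
    int_digits = precision - scale
    adjusted_scale = max(MAX_PRECISION - int_digits,
                         min(scale, MIN_ADJUSTED_SCALE))
    return MAX_PRECISION, adjusted_scale

# The case from the question: decimal(38,14) / decimal(38,14)
print(divide_result_type(38, 14, 38, 14))  # (38, 6)
```

So the intermediate quotient is `0.333333` as a `decimal(38,6)`, and the final cast to `decimal(38,14)` can only zero-pad it.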

Accepted answer


This is a currently open issue; see SPARK-27089. The suggested workaround is to adjust the setting below. I validated that the SQL statement works as expected with this setting set to false.

spark.sql.decimalOperations.allowPrecisionLoss=false
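The effect of the flag can be mimicked with Python's `decimal` module (an analogy, not Spark itself): with precision loss allowed, the intermediate quotient is cut to 6 fractional digits before the cast back, whereas with the flag off the full 14-digit scale survives:

```python
from decimal import Decimal

three, nine = Decimal(3), Decimal(9)

# allowPrecisionLoss=true (default): intermediate truncated to scale 6,
# then zero-padded by the outer cast to decimal(38,14).
lossy = (three / nine).quantize(Decimal("0.000001"))
print(lossy)    # 0.333333

# allowPrecisionLoss=false: all 14 fractional digits are retained.
exact14 = (three / nine).quantize(Decimal("0.00000000000001"))
print(exact14)  # 0.33333333333333
```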

