Spark decimal type precision loss

Question

I'm doing some testing of Spark decimal types for currency measures and am seeing some odd precision results when I set the scale and precision as shown below. I want to be sure that I won't have any data loss during calculations, but the example below is not reassuring. Can anyone tell me why this is happening in Spark SQL? I'm currently on version 2.3.0.

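// Cast both operands to decimal(38,14), divide, then cast the quotient back to decimal(38,14)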
val sql = """select cast(cast(3 as decimal(38,14)) / cast(9 as decimal(38,14)) as decimal(38,14)) val"""
spark.sql(sql).show

Returns

+----------------+
|             val|
+----------------+
|0.33333300000000|
+----------------+

Answer

This is a currently open issue; see SPARK-27089. The suggested workaround is to adjust the setting below. I validated that the SQL statement works as expected with this setting set to false. The precision is lost because, with the default allowPrecisionLoss=true, Spark reduces the scale of a decimal division result so that the type stays within 38 digits of precision: here the quotient is truncated to decimal(38,6), and the outer cast back to decimal(38,14) merely pads it with zeros.

spark.sql.decimalOperations.allowPrecisionLoss=false
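
A minimal sketch of the diagnosis and the workaround, assuming a spark-shell session where the SparkSession is available as spark (the schema and result shown in the comments are what I'd expect Spark 2.3 to produce with these settings):

// With the default allowPrecisionLoss=true, Spark truncates the scale of the
// quotient so that the result type stays within 38 digits of precision:
spark.sql("select cast(3 as decimal(38,14)) / cast(9 as decimal(38,14)) as q").printSchema
// root
//  |-- q: decimal(38,6) (nullable = true)

// Workaround: disable precision loss for decimal operations, then re-run the query.
spark.conf.set("spark.sql.decimalOperations.allowPrecisionLoss", "false")

val sql = """select cast(cast(3 as decimal(38,14)) / cast(9 as decimal(38,14)) as decimal(38,14)) val"""
spark.sql(sql).show
// +----------------+
// |             val|
// +----------------+
// |0.33333333333333|
// +----------------+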
