How to set the precision and scale of decimal return type in Spark UDF?

Question

Here is my sample code. I am expecting decimal(16,4) as return type from the UDF, but it is decimal(38,18).

Is there any better solution?

I am NOT expecting the answer "cast(price as decimal(16,4))", as I have some other business logic in my UDF other than just casting.

Thanks.

import scala.util.Try
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.udf
import org.apache.spark.sql.types.Decimal
val spark = SparkSession.builder().master("local[*]").appName("Test").getOrCreate()
import spark.implicits._

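// Parse the price string into a Decimal; Try(...).toOption yields None (null in the DataFrame) when parsing fails.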
val stringToDecimal = udf((s:String, precision:Int, scale: Int) => {
  Try(Decimal(BigDecimal(s), precision, scale)).toOption
})

spark.udf.register("stringToDecimal", stringToDecimal)

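// Sample input; "2,56" uses a comma as the decimal separator, so BigDecimal parsing fails and that row becomes null.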
val inDf = Seq(
  ("1", "864.412"),
  ("2", "1.600"),
  ("3", "2,56")).toDF("id", "price")

val outDf = inDf.selectExpr("id", "stringToDecimal(price, 16, 4) as price")
outDf.printSchema()
outDf.show()

------------------output----------------
root
  |-- id: string (nullable = true)
  |-- price: decimal(38,18) (nullable = true)

+---+--------------------+
| id|               price|
+---+--------------------+
|  1|864.4120000000000...|
|  2|1.600000000000000000|
|  3|                null|
+---+--------------------+

Answer

Spark maps a UDF's Decimal (or BigDecimal) return type to decimal(38,18), regardless of the precision and scale you pass when constructing the Decimal inside the UDF. You need an explicit cast on the resulting column, for example:

$"price".cast(DataTypes.createDecimalType(32,2))
