DATEDIFF in Spark SQL


Problem Description

I am new to Spark SQL. We are migrating data from SQL Server to Databricks, and I am using Spark SQL. Can you please suggest how to achieve the functionality of the date functions below in Spark SQL? I can see that datediff in Spark SQL only gives days.

DATEDIFF(YEAR,StartDate,EndDate)
DATEDIFF(Month,StartDate,EndDate) 
DATEDIFF(Quarter,StartDate,EndDate)

Recommended Answer

As you have mentioned, Spark SQL does support DATEDIFF, but for days only. I would also be careful, as the parameters appear to be in the opposite order for Spark, i.e.

--SQL Server
DATEDIFF ( datepart , startdate , enddate )

--Spark
DATEDIFF ( enddate , startdate )
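
For example, a minimal sketch using hypothetical literal dates (not from the original question) to show the reversed argument order, with the end date passed first:

-- Spark SQL: the end date comes first
SELECT DATEDIFF('2021-03-15', '2021-01-01') AS diff_days;   -- 73

-- SQL Server equivalent, for comparison:
-- DATEDIFF(DAY, '2021-01-01', '2021-03-15')                -- 73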

Spark does, however, support a similar function called months_between, which you could use in place of DATEDIFF(MONTH, ...). This function also returns a decimal amount, so you can optionally cast it to INT for functionality similar to the SQL Server version.

SELECT startDate, endDate, 
  DATEDIFF( endDate, startDate ) AS diff_days,
  CAST( months_between( endDate, startDate ) AS INT ) AS diff_months      
FROM yourTable
ORDER BY 1;

还有 yearquarter 函数分别用于确定日期的年份和季度.您可以简单地减去年份,但季度会更棘手.可能您必须做数学"或最终使用日历表.

There are also year and quarter functions for determining the year and quarter of a date respectively. You could simply minus the years but quarters would be more tricky. It may be you have to 'do the math' or end up using a calendar table.
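
As a rough sketch of the 'do the math' approach (assuming the same hypothetical yourTable with startDate and endDate columns, and that you want SQL Server's boundary-counting semantics for YEAR and QUARTER rather than exact elapsed time):

SELECT startDate, endDate,
  -- years: count of year boundaries crossed, like SQL Server DATEDIFF(YEAR, ...)
  year(endDate) - year(startDate) AS diff_years,
  -- quarters: year difference expressed in quarters, plus the quarter offset within the year
  (year(endDate) - year(startDate)) * 4
    + (quarter(endDate) - quarter(startDate)) AS diff_quarters
FROM yourTable
ORDER BY 1;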
