Pyspark: How to add ten days to existing date column
Question
I have a dataframe in Pyspark with a date column called "report_date".
I want to create a new column called "report_date_10" that adds 10 days to the original report_date column.
Here is the code I tried:
df_dc["report_date_10"] = df_dc["report_date"] + timedelta(days=10)
This is the error I get:
AttributeError: 'datetime.timedelta' object has no attribute '_get_object_id'
Help!
Answer
It seems you are using the pandas syntax for adding a column. For Spark, you need to use withColumn to add a new column, and for adding days to a date there is the built-in date_add function:
import pyspark.sql.functions as F
df_dc = spark.createDataFrame([['2018-05-30']], ['report_date'])
df_dc.withColumn('report_date_10', F.date_add(df_dc['report_date'], 10)).show()
+-----------+--------------+
|report_date|report_date_10|
+-----------+--------------+
| 2018-05-30| 2018-06-09|
+-----------+--------------+
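As for the original error: timedelta arithmetic works fine on plain Python date objects, but df_dc["report_date"] is a Spark Column, so the addition tries to treat the timedelta as another column expression and fails. A minimal plain-Python sketch of what the original code was attempting (standard library only, no Spark):

```python
from datetime import date, timedelta

# Plain Python dates support timedelta arithmetic directly;
# a Spark Column does not, which is what triggers the AttributeError.
report_date = date(2018, 5, 30)
report_date_10 = report_date + timedelta(days=10)
print(report_date_10)  # 2018-06-09
```

This is the same 10-day shift that date_add performs, but evaluated per Python object rather than as a Spark column expression.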