How to insert a custom function within For loop in pyspark?
Question
I am facing a challenge in Spark within Azure Databricks. I have a dataset as follows:
+------------------+----------+-------------------+---------------+
| OpptyHeaderID| OpptyID| Date|BaseAmountMonth|
+------------------+----------+-------------------+---------------+
|0067000000i6ONPAA2|OP-0164615|2014-07-27 00:00:00| 4375.800000|
|0065w0000215k5kAAA|OP-0218055|2020-12-23 00:00:00| 4975.000000|
+------------------+----------+-------------------+---------------+
Now I need to use a loop to append rows to this dataframe. I want to replicate the function below in pyspark.
Result = ()
for i in (1:12)
{
    select a.OpptyHeaderID
          ,a.OpptyID
          ,dateadd(MONTH, i, a.Date) as Date
          ,a.BaseAmountMonth
    from FinalOut a
    Result = Result.Append()
    print(i)
}
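Outside Spark, the intent of the loop above can be sketched in plain Python. This is only an illustration of the row-expansion logic, not Spark code: the table is represented as a list of dicts, and the `add_months` helper mimics the day-clamping behaviour of SQL's `dateadd(MONTH, ...)` / Spark's `add_months`:

```python
import calendar
from datetime import datetime

def add_months(d, n):
    # Shift d by n months, clamping the day to the target month's length
    # (mirrors dateadd(MONTH, ...) / Spark SQL add_months semantics).
    y, m = divmod(d.month - 1 + n, 12)
    y, m = d.year + y, m + 1
    return d.replace(year=y, month=m, day=min(d.day, calendar.monthrange(y, m)[1]))

final_out = [
    {"OpptyHeaderID": "0067000000i6ONPAA2", "OpptyID": "OP-0164615",
     "Date": datetime(2014, 7, 27), "BaseAmountMonth": 4375.8},
    {"OpptyHeaderID": "0065w0000215k5kAAA", "OpptyID": "OP-0218055",
     "Date": datetime(2020, 12, 23), "BaseAmountMonth": 4975.0},
]

# One copy of each row per month offset 0..11 -> a rolling 12 months.
result = [
    {**row, "Date": add_months(row["Date"], i)}
    for row in final_out
    for i in range(12)
]

print(len(result))  # 24 rows: 12 per original row
```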
The date in each appended row must be the succeeding month (rolling 12 months). It should look like this:
+------------------+----------+-------------------+---------------+
| OpptyHeaderID| OpptyID| Date|BaseAmountMonth|
+------------------+----------+-------------------+---------------+
|0067000000i6ONPAA2|OP-0164615|2014-07-27 00:00:00| 4375.800000|
|0067000000i6ONPAA2|OP-0164615|2014-08-27 00:00:00| 4375.800000|
|0067000000i6ONPAA2|OP-0164615|2014-09-27 00:00:00| 4375.800000|
.
.
.
|0067000000i6ONPAA2|OP-0164615|2015-06-27 00:00:00| 4375.800000|
|0065w0000215k5kAAA|OP-0218055|2020-12-23 00:00:00| 4975.000000|
|0065w0000215k5kAAA|OP-0218055|2021-01-23 00:00:00| 4975.000000|
|0065w0000215k5kAAA|OP-0218055|2021-02-23 00:00:00| 4975.000000|
.
.
.
|0065w0000215k5kAAA|OP-0218055|2021-11-23 00:00:00| 4975.000000|
+------------------+----------+-------------------+---------------+
How can I make the interval length dynamic, based on another field?
+------------------+----------+-------------------+---------------+--------+
| OpptyHeaderID| OpptyID| Date|BaseAmountMonth|Interval|
+------------------+----------+-------------------+---------------+--------+
|0067000000i6ONPAA2|OP-0164615|2014-07-27 00:00:00| 4375.800000| 12|
|0065w0000215k5kAAA|OP-0218055|2020-12-23 00:00:00| 4975.000000| 7|
+------------------+----------+-------------------+---------------+--------+
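The same plain-Python sketch extends naturally to a per-row interval: read each row's `Interval` instead of looping a fixed 12 times. Again this is only an illustration of the logic (not Spark code), with `add_months` mimicking Spark's day-clamping month shift:

```python
import calendar
from datetime import datetime

def add_months(d, n):
    # Month shift with the day clamped to the target month's length,
    # mirroring Spark SQL's add_months semantics.
    y, m = divmod(d.month - 1 + n, 12)
    y, m = d.year + y, m + 1
    return d.replace(year=y, month=m, day=min(d.day, calendar.monthrange(y, m)[1]))

rows = [
    {"OpptyID": "OP-0164615", "Date": datetime(2014, 7, 27), "Interval": 12},
    {"OpptyID": "OP-0218055", "Date": datetime(2020, 12, 23), "Interval": 7},
]

# Each row expands to Interval copies, with month offsets 0 .. Interval-1.
result = [
    {**row, "Date": add_months(row["Date"], i)}
    for row in rows
    for i in range(row["Interval"])
]

print(len(result))  # 12 + 7 = 19 rows
```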
Answer
You can explode a sequence of timestamps:
import pyspark.sql.functions as F

df2 = df.withColumn(
    'Date',
    F.expr("""
        explode(
            sequence(
                timestamp(Date),
                add_months(timestamp(Date), `Interval` - 1),
                interval 1 month
            )
        )
    """)
)
df2.show(99)
+------------------+----------+-------------------+---------------+--------+
| OpptyHeaderID| OpptyID| Date|BaseAmountMonth|Interval|
+------------------+----------+-------------------+---------------+--------+
|0067000000i6ONPAA2|OP-0164615|2014-07-27 00:00:00| 4375.800000| 12|
|0067000000i6ONPAA2|OP-0164615|2014-08-27 00:00:00| 4375.800000| 12|
|0067000000i6ONPAA2|OP-0164615|2014-09-27 00:00:00| 4375.800000| 12|
|0067000000i6ONPAA2|OP-0164615|2014-10-27 00:00:00| 4375.800000| 12|
|0067000000i6ONPAA2|OP-0164615|2014-11-27 00:00:00| 4375.800000| 12|
|0067000000i6ONPAA2|OP-0164615|2014-12-27 00:00:00| 4375.800000| 12|
|0067000000i6ONPAA2|OP-0164615|2015-01-27 00:00:00| 4375.800000| 12|
|0067000000i6ONPAA2|OP-0164615|2015-02-27 00:00:00| 4375.800000| 12|
|0067000000i6ONPAA2|OP-0164615|2015-03-27 00:00:00| 4375.800000| 12|
|0067000000i6ONPAA2|OP-0164615|2015-04-27 00:00:00| 4375.800000| 12|
|0067000000i6ONPAA2|OP-0164615|2015-05-27 00:00:00| 4375.800000| 12|
|0067000000i6ONPAA2|OP-0164615|2015-06-27 00:00:00| 4375.800000| 12|
|0065w0000215k5kAAA|OP-0218055|2020-12-23 00:00:00| 4975.000000| 7|
|0065w0000215k5kAAA|OP-0218055|2021-01-23 00:00:00| 4975.000000| 7|
|0065w0000215k5kAAA|OP-0218055|2021-02-23 00:00:00| 4975.000000| 7|
|0065w0000215k5kAAA|OP-0218055|2021-03-23 00:00:00| 4975.000000| 7|
|0065w0000215k5kAAA|OP-0218055|2021-04-23 00:00:00| 4975.000000| 7|
|0065w0000215k5kAAA|OP-0218055|2021-05-23 00:00:00| 4975.000000| 7|
|0065w0000215k5kAAA|OP-0218055|2021-06-23 00:00:00| 4975.000000| 7|
+------------------+----------+-------------------+---------------+--------+
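One edge case worth checking if your dates can fall at month end: month arithmetic such as `add_months` clamps the day when the target month is shorter, so a row dated on the 31st will not keep day 31 in every generated month. A quick plain-Python illustration of that clamping (the helper only mimics what Spark's `add_months` does; it is not Spark code):

```python
import calendar
from datetime import date

def add_months(d, n):
    # Shift d by n months; clamp the day to the target month's length,
    # which is how Spark SQL's add_months behaves.
    y, m = divmod(d.month - 1 + n, 12)
    y, m = d.year + y, m + 1
    return date(y, m, min(d.day, calendar.monthrange(y, m)[1]))

print(add_months(date(2021, 1, 31), 1))  # 2021-02-28: day clamped
print(add_months(date(2021, 1, 31), 2))  # 2021-03-31: day preserved
```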