Select random rows from PySpark dataframe


Question

I want to select n random rows (without replacement) from a PySpark dataframe (preferably in the form of a new PySpark dataframe). What is the best way to do this?

Following is an example of a dataframe with ten rows.

+-----+-------------------+-----+
| name|          timestamp|value|
+-----+-------------------+-----+
|name1|2019-01-17 00:00:00|11.23|
|name2|2019-01-17 00:00:00|14.57|
|name3|2019-01-10 00:00:00| 2.21|
|name4|2019-01-10 00:00:00| 8.76|
|name5|2019-01-17 00:00:00|18.71|
|name5|2019-01-10 00:00:00|17.78|
|name4|2019-01-10 00:00:00| 5.52|
|name3|2019-01-10 00:00:00| 9.91|
|name1|2019-01-17 00:00:00| 1.16|
|name2|2019-01-17 00:00:00| 12.0|
+-----+-------------------+-----+

The dataframe above was generated with the following code:

from pyspark.sql import Row

# A Row "template" with the three column names; each call produces one row.
df_Stats = Row("name", "timestamp", "value")

df_stat_lst = [
    df_Stats("name1", "2019-01-17 00:00:00", 11.23),
    df_Stats("name2", "2019-01-17 00:00:00", 14.57),
    df_Stats("name3", "2019-01-10 00:00:00", 2.21),
    df_Stats("name4", "2019-01-10 00:00:00", 8.76),
    df_Stats("name5", "2019-01-17 00:00:00", 18.71),
    df_Stats("name5", "2019-01-10 00:00:00", 17.78),
    df_Stats("name4", "2019-01-10 00:00:00", 5.52),
    df_Stats("name3", "2019-01-10 00:00:00", 9.91),
    df_Stats("name1", "2019-01-17 00:00:00", 1.16),
    df_Stats("name2", "2019-01-17 00:00:00", 12.0),
]
df = spark.createDataFrame(df_stat_lst)

Answer

There is a sample method on a pyspark.sql.DataFrame. The docs here should be helpful.

Usage:

df.sample(withReplacement=False, fraction=desired_fraction)
