Get a specific row from a Spark DataFrame
Question
Is there any alternative in Scala Spark DataFrames for R's df[100, c("column")]? I want to select a specific row from a column of a Spark DataFrame, for example the 100th row, as in the equivalent R code above.
Recommended answer
首先,您必须了解 DataFrames
是分布式的,这意味着您无法在典型的程序 方式,您必须先进行分析.虽然,你问的是 Scala
我建议你阅读 Pyspark 文档,因为它的示例比任何其他文档都多.
Firstly, you must understand that DataFrames
are distributed, that means you can't access them in a typical procedural way, you must do an analysis first. Although, you are asking about Scala
I suggest you to read the Pyspark Documentation, because it has more examples than any of the other documentations.
However, continuing with my explanation, I will use some methods from the RDD API, because every DataFrame has an RDD as an attribute. Please see my example below, and notice how I take the 2nd record.
df = sqlContext.createDataFrame([("a", 1), ("b", 2), ("c", 3)], ["letter", "name"])
myIndex = 1
values = (df.rdd.zipWithIndex()                     # pair each Row with its index: (Row, i)
          .filter(lambda pair: pair[1] == myIndex)  # keep only the row at position myIndex
          .map(lambda pair: tuple(pair[0]))         # drop the index, keep the row's values
          .collect())
print(values[0])
# ('b', 2)
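Conceptually, the zipWithIndex/filter/map chain is just indexed filtering. As a rough, non-distributed sketch in plain Python (the helper names here are illustrative, not part of the Spark API):

```python
# A local analogue of the zipWithIndex/filter/map chain above.
# zip_with_index and pick_by_index are illustrative names, not Spark API.

def zip_with_index(rows):
    """Pair each record with its position, like RDD.zipWithIndex()."""
    return [(row, i) for i, row in enumerate(rows)]

def pick_by_index(rows, my_index):
    """Keep only the record at my_index, then drop the index."""
    indexed = zip_with_index(rows)
    return [row for row, i in indexed if i == my_index]

rows = [("a", 1), ("b", 2), ("c", 3)]
print(pick_by_index(rows, 1)[0])
# ('b', 2)
```

In Spark the same filtering happens in parallel across partitions, which is why the index has to be attached to each record first rather than computed by position at access time.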
Hopefully, someone can give another solution with fewer steps.