View RDD contents in Python Spark?
Question
Running a simple app in pyspark.
from operator import add

f = sc.textFile("README.md")
wc = f.flatMap(lambda x: x.split(' ')).map(lambda x: (x, 1)).reduceByKey(add)
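As a sanity check on the pipeline above, the same flatMap → map → reduceByKey word-count logic can be sketched in plain Python, with no Spark needed (the sample lines are hypothetical; `sorted`/`groupby`/`reduce` stand in for Spark's shuffle and per-key reduction):

```python
from operator import add
from functools import reduce
from itertools import groupby

# Hypothetical stand-in for the lines of README.md
lines = ["to be or", "not to be"]

# flatMap: split every line into words, producing one flat list
words = [w for line in lines for w in line.split(' ')]
# map: pair each word with a count of 1
pairs = [(w, 1) for w in words]
# reduceByKey(add): group the pairs by word, then sum the 1s per key
counts = {word: reduce(add, (n for _, n in grp))
          for word, grp in groupby(sorted(pairs), key=lambda p: p[0])}
```

Here `counts` maps each word to its frequency; "to" and "be" each appear twice in the sample input.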
I want to view the RDD contents using the foreach action:
wc.foreach(print)
This throws a syntax error:
SyntaxError: invalid syntax
What am I missing?
Accepted answer
This error is because print isn't a function in Python 2.6.
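The distinction is easy to see outside Spark: in Python 2, print is a statement, so it cannot appear where an expression is expected, whereas in Python 3 (or after the __future__ import) it is an ordinary first-class function. A minimal illustration, no Spark required:

```python
# Runs under Python 3, where print is a built-in function object.
# Under plain Python 2 the assignment below is itself a
# SyntaxError, because `print` is a statement, not an expression.
handler = print
assert callable(handler)          # print can be passed around like any function
handler("hello from a function")  # equivalent to print("hello from a function")
```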
You can either define a helper UDF that performs the print, or use the __future__ library to treat print as a function:
>>> from operator import add
>>> f = sc.textFile("README.md")
>>> def g(x):
... print x
...
>>> wc.foreach(g)
or
>>> from __future__ import print_function
>>> wc.foreach(print)
However, I think it would be better to use collect() to bring the RDD contents back to the driver, because foreach executes on the worker nodes and the output may not necessarily appear in your driver / shell (it probably will in local mode, but not when running on a cluster).
>>> for x in wc.collect():
... print x