How can I include a python package with Hadoop streaming job?
Problem description
I am trying to include a Python package (NLTK) with a Hadoop streaming job, but am not sure how to do this without including every file manually via the CLI argument -file.
Edit: One solution would be to install this package on all the slave nodes, but I don't currently have that option.
Recommended answer
I would zip up the package into a .tar.gz or a .zip and pass the entire tarball or archive in a -file option to your hadoop command. I've done this in the past with Perl, but not Python.
That said, I would think this would still work for you if you use Python's zipimport (http://docs.python.org/library/zipimport.html), which allows you to import modules directly from a zip.