hadoop, python, subprocess failed with code 127
Problem description

I'm trying to run a very simple task with MapReduce.
mapper.py:
#!/usr/bin/env python
import sys

for line in sys.stdin:
    print line
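As a side note, `print line` re-emits the newline that is already at the end of each input line, so every record comes out double-spaced. A minimal identity-mapper sketch that avoids this (the function name is mine, and the body also runs unchanged under Python 3):

```python
import sys

def identity_mapper(instream, outstream):
    # Each line read from the input stream already ends in "\n", so
    # write() passes it through exactly once; "print line" would
    # append a second newline after it.
    for line in instream:
        outstream.write(line)

# In the real mapper script you would wire it up as:
#   identity_mapper(sys.stdin, sys.stdout)
```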
My txt file:
qwerty
asdfgh
zxc
Command line used to run the job:
hadoop jar /usr/lib/hadoop-0.20-mapreduce/contrib/streaming/hadoop-streaming-2.6.0-mr1-cdh5.8.0.jar \
-input /user/cloudera/In/test.txt \
-output /user/cloudera/test \
-mapper /home/cloudera/Documents/map.py \
-file /home/cloudera/Documents/map.py
Error:
INFO mapreduce.Job: Task Id : attempt_1490617885665_0008_m_000001_0, Status : FAILED
Error: java.lang.RuntimeException: PipeMapRed.waitOutputThreads(): subprocess failed with code 127
at org.apache.hadoop.streaming.PipeMapRed.waitOutputThreads(PipeMapRed.java:325)
at org.apache.hadoop.streaming.PipeMapRed.mapRedFinished(PipeMapRed.java:538)
at org.apache.hadoop.streaming.PipeMapper.close(PipeMapper.java:130)
at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:61)
at org.apache.hadoop.streaming.PipeMapRunner.run(PipeMapRunner.java:34)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:453)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1693)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
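For context, exit status 127 is the POSIX shell's "command not found" code, so the streaming subprocess never even got as far as running the mapper's interpreter. A minimal sketch (the command name is made up):

```shell
# Running a nonexistent command makes the shell exit with status 127 --
# the same status Hadoop streaming reports when the mapper's interpreter
# cannot be found.
sh -c 'no_such_command_xyz' 2>/dev/null
echo $?   # prints 127
```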
How do I fix this and run the code?

When I run it locally with cat /home/cloudera/Documents/test.txt | python /home/cloudera/Documents/map.py it works fine.
!!!!! UPDATE
Something was wrong with my *.py file. I copied the file from the GitHub repo for 'tom white hadoop book' and everything works fine.

But I can't understand the reason. It is not the permissions or the charset (if I'm not mistaken). What else could it be?
Answer
I faced the same issue.
Issue: When the Python file is created in a Windows environment, the newline character is CRLF. My Hadoop runs on Linux, which expects LF as the newline character.

Solution: After changing CRLF to LF, the step ran successfully.
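A sketch of the conversion with plain POSIX tools (dos2unix also works where installed; the map.py below is a throwaway copy created just for the demonstration, not the questioner's actual file):

```shell
# Create a mapper with Windows (CRLF) line endings to reproduce the problem.
printf '#!/usr/bin/env python\r\nimport sys\r\n' > map.py

# Strip the carriage returns so the shebang no longer ends in "\r",
# then restore the execute bit that Hadoop streaming needs.
tr -d '\r' < map.py > map_unix.py && mv map_unix.py map.py
chmod +x map.py

head -1 map.py   # prints: #!/usr/bin/env python
```

With a CRLF file the kernel looks for an interpreter literally named `python\r`, fails to find it, and the shell reports exit code 127.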