Can we cascade multiple MapReduce jobs in Hadoop Streaming (lang: Python)

Problem Description

I am using Python and have to implement the following scenario using Hadoop Streaming:

a) Map1 -> Reduce1 -> Map2 -> Reduce2
b) I don't want to store intermediate files.
c) I don't want to install packages like Cascading, Yelp, or Oozie; I have kept them as a last resort.

I have already gone through similar discussions on SO and elsewhere, but could not find an answer with respect to Python. Can you please suggest an approach?

Recommended Answer

b) I don't want to store intermediate files.

c) I don't want to install packages like Cascading, Yelp, Oozie.

Any reason why? Depending on your answer, a better solution could be suggested.

Intermediate files cannot be avoided, because the output of one Hadoop job cannot be streamed directly as input to the next job. Create a driver script like this:

run streaming job1
if job1 is not successful, then exit
run streaming job2
if job2 is successful, then remove the output of job1, else exit
run streaming job3
if job3 is successful, then remove the output of job2, else exit
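Since the question is about Python, here is a minimal sketch of such a driver written in Python using subprocess. The mapper/reducer script names (map1.py, reduce1.py, ...), the HDFS paths, and the location of the streaming jar are all assumptions; adjust them to your cluster:

    import subprocess
    import sys

    # Path to the Hadoop Streaming jar -- varies by installation (assumption).
    STREAMING_JAR = "/usr/lib/hadoop-mapreduce/hadoop-streaming.jar"

    def run_streaming_job(mapper, reducer, input_path, output_path):
        """Run one Hadoop Streaming job; return True on success."""
        cmd = [
            "hadoop", "jar", STREAMING_JAR,
            "-mapper", mapper,
            "-reducer", reducer,
            "-input", input_path,
            "-output", output_path,
            "-file", mapper,
            "-file", reducer,
        ]
        return subprocess.call(cmd) == 0

    def remove_hdfs_path(path):
        """Delete an intermediate output directory from HDFS."""
        subprocess.call(["hadoop", "fs", "-rm", "-r", path])

    # Job 1: Map1 -> Reduce1
    if not run_streaming_job("map1.py", "reduce1.py", "/data/input", "/data/tmp1"):
        sys.exit("job1 failed")

    # Job 2: Map2 -> Reduce2, reading job1's output
    if run_streaming_job("map2.py", "reduce2.py", "/data/tmp1", "/data/output"):
        remove_hdfs_path("/data/tmp1")  # job1's output is no longer needed
    else:
        sys.exit("job2 failed")

The intermediate output still lands on HDFS briefly, but it is deleted as soon as the next job has consumed it, which is as close as Hadoop Streaming gets to avoiding intermediate files without extra frameworks.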
