How is Hadoop-3.0.0's compatibility with older versions of Hive, Pig, Sqoop and Spark?


Question

We are currently using hadoop-2.8.0 on a 10-node cluster and are planning to upgrade to the latest hadoop-3.0.0.

I want to know whether there will be any issues if we use hadoop-3.0.0 with older versions of Spark and other components such as Hive, Pig and Sqoop.

Recommended answer

The latest Hive release does not support Hadoop 3.0 yet. It seems that Hive may be built on top of Spark or other computing engines in the future.
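Before committing to the upgrade, one quick check you can run from the existing Spark installation is to print the Hadoop client version that your Spark build ships with. This is only a minimal sketch using the standard spark-shell and Hadoop's public VersionInfo API; it tells you what Spark currently links against, not whether every feature will behave correctly on a Hadoop 3.0.0 cluster.

    // Run inside spark-shell on the existing cluster.
    // Prints the Spark version and the Hadoop client version bundled with this Spark build.
    import org.apache.hadoop.util.VersionInfo

    println(s"Spark version:  ${spark.version}")
    println(s"Hadoop client:  ${VersionInfo.getVersion}")

If this reports a 2.x Hadoop client, running that Spark release against a Hadoop 3.0.0 cluster is not an officially tested combination, so the same compatibility caution applies as for Hive, Pig and Sqoop.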
