Spark not installed on EMR cluster


Problem description

I have been using Spark on an EMR cluster for a few weeks now without problems - the setup was with the AMI 3.8.0 and Spark 1.3.1, and I passed '-x' as an argument to Spark (without this it didn't seem to be installed).

I want to upgrade to a more recent version of Spark and today spun up a cluster with the emr-4.1.0 AMI, containing Spark 1.5.0. When the cluster is up it claims to have successfully installed Spark (at least on the cluster management page on AWS) but when I ssh into 'hadoop@[IP address]' I don't see anything in the 'hadoop' directory, where in the previous version Spark was installed (I've also tried with other applications and had the same result, and tried to ssh in as ec2-user but Spark is also not installed there). When I spin up the cluster with the emr-4.1.0 AMI I don't have the option to pass the '-x' argument to Spark, and I'm wondering if there is something I'm missing.

Does anyone know what I'm doing wrong here?

Many thanks.
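For reference, the launch described above can be sketched with the AWS CLI, where Spark is requested through the `--applications` list on a release-label cluster; the key pair name, instance type, and count below are placeholder assumptions, not values from the question:

```shell
# Request an emr-4.1.0 cluster with Spark as an installed application.
# 'MyKeyPair' and the instance settings are placeholders; adjust to your account.
aws emr create-cluster \
  --release-label emr-4.1.0 \
  --applications Name=Spark \
  --ec2-attributes KeyName=MyKeyPair \
  --instance-type m3.xlarge \
  --instance-count 3 \
  --use-default-roles
```

On release-label AMIs the application list takes the place of per-application bootstrap arguments such as the old '-x' flag.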

Recommended answer

This was actually solved, rather trivially.

In the previous AMI all of the paths to Spark and other applications were soft links available in the hadoop folder. In the newer AMI these have been removed but the applications are still installed and can be accessed by 'spark-shell' (for example) at the command line.
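The check above can be run after ssh-ing into the master node; the paths below are the usual emr-4.x locations, assuming a standard release-label install of Spark:

```shell
# On an emr-4.x node the launchers are on the PATH rather than
# symlinked under the hadoop home directory as in the 3.x AMIs.
which spark-shell       # typically resolves to /usr/bin/spark-shell
ls /usr/lib/spark       # usual install directory on 4.x release labels
ls /etc/spark/conf      # usual configuration directory
spark-submit --version  # confirms the installed Spark version
```

If `which spark-shell` returns a path, Spark is installed even though nothing Spark-related appears under the hadoop user's home directory.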
