How do I submit a Spark jar to an EMR cluster?
Question
I have used the online AWS console to launch my cluster with Apache Spark. I have built a fat jar from my Spark app and uploaded it to an S3 bucket.

When I try to submit it as a Step with a Custom JAR, the process fails.
Any pointers would be greatly appreciated.
Answer
Use EMR bootstrap to install Spark, and submit the job as described in the documentation: https://github.com/awslabs/emr-bootstrap-actions/blob/master/spark/examples/spark-submit-via-step.md
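The linked guide targets older EMR AMI releases that needed a bootstrap action to install Spark. On EMR release 4.x and later, Spark is available as a built-in application, and a jar in S3 can be submitted directly as a Spark step with the AWS CLI. A minimal sketch (the cluster ID, bucket path, and main class below are placeholders you would replace with your own values):

```shell
# Submit the fat jar from S3 as a Spark step on a running EMR cluster.
# j-XXXXXXXXXXXXX, s3://my-bucket/my-app.jar, and com.example.MyApp are
# placeholders -- substitute your cluster ID, jar location, and main class.
aws emr add-steps \
  --cluster-id j-XXXXXXXXXXXXX \
  --steps Type=Spark,Name="My Spark app",ActionOnFailure=CONTINUE,\
Args=[--deploy-mode,cluster,--class,com.example.MyApp,s3://my-bucket/my-app.jar]
```

Using `Type=Spark` (rather than a Custom JAR step) tells EMR to wrap the arguments in a `spark-submit` call on the cluster, which is usually why a Custom JAR step fails for a Spark application.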