Submit Spark job is not loading spark-cloudant:2.0.0-s_2_11 package


Problem description

I am submitting a Spark job using the command below:

./spark-submit --packages cloudant-labs:spark-cloudant:2.0.0-s_2.11 --class spark.cloudant.connecter.cloudantconnecter --master local[*] /opt/demo/sparkScripts/ScoredJob/sparkcloudantconnecter.jar

But it seems the "spark-cloudant" package is not loading. It might be failing because the artifact is located at the Spark Packages repository (https://dl.bintray.com/spark-packages/maven/).

So, what change is needed in the command to load the package?

Full details of this package are shown here: https://mvnrepository.com/artifact/cloudant-labs/spark-cloudant/2.0.0-s_2.11

Answer

It seems the repository needs to be added when loading the package with the spark-submit command:

--repositories <repository-url>

So the new command becomes:

./spark-submit --packages cloudant-labs:spark-cloudant:2.0.0-s_2.11 --repositories https://dl.bintray.com/spark-packages/maven/ --class spark.cloudant.connecter.cloudantconnecter --master local[*] /opt/demo/sparkScripts/ScoredJob/sparkcloudantconnecter.jar
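For reference, the same dependency resolution can also be expressed as configuration properties rather than the shortcut flags (a sketch, not from the original answer; `spark.jars.packages` and `spark.jars.repositories` are the standard Spark property names, though support for `spark.jars.repositories` depends on your Spark version, and the class name and jar path are the ones from the question):

```shell
# Equivalent invocation using --conf properties instead of
# the --packages / --repositories shortcut flags.
./spark-submit \
  --conf spark.jars.packages=cloudant-labs:spark-cloudant:2.0.0-s_2.11 \
  --conf spark.jars.repositories=https://dl.bintray.com/spark-packages/maven/ \
  --class spark.cloudant.connecter.cloudantconnecter \
  --master local[*] \
  /opt/demo/sparkScripts/ScoredJob/sparkcloudantconnecter.jar
```

Either form tells Spark's internal Ivy resolver to search the extra repository for the `cloudant-labs:spark-cloudant` coordinates before the job starts.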

