Error initializing SparkContext: A master URL must be set in your configuration

This article explains how to resolve the error "Error initializing SparkContext: A master URL must be set in your configuration".

Problem description

I used this code:
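(The snippet itself is not shown here. As a rough, hypothetical reconstruction, only the object name PCA is taken from the stack trace below and everything else is assumed; the error appears whenever the SparkConf handed to SparkContext never gets a master URL:)

import org.apache.spark.{SparkConf, SparkContext}

object PCA {
  def main(args: Array[String]): Unit = {
    // No setMaster(...) in code and no --master on the command line,
    // so SparkContext has no idea where to run
    val conf = new SparkConf().setAppName("PCA Example")
    val sc = new SparkContext(conf) // throws "A master URL must be set in your configuration"
    // ... PCA computation would go here ...
    sc.stop()
  }
}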

My error is:

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties

17/02/03 20:39:24 INFO SparkContext: Running Spark version 2.1.0

17/02/03 20:39:25 WARN NativeCodeLoader: Unable to load native-hadoop 
library for your platform... using builtin-java classes where applicable

17/02/03 20:39:25 WARN SparkConf: Detected deprecated memory fraction 
settings: [spark.storage.memoryFraction]. As of Spark 1.6, execution and  
storage memory management are unified. All memory fractions used in the old 
model are now deprecated and no longer read. If you wish to use the old 
memory management, you may explicitly enable `spark.memory.useLegacyMode` 
(not recommended).

17/02/03 20:39:25 ERROR SparkContext: Error initializing SparkContext.

org.apache.spark.SparkException: A master URL must be set in your 
configuration
at org.apache.spark.SparkContext.<init>(SparkContext.scala:379)
at PCA$.main(PCA.scala:26)
at PCA.main(PCA.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:144)

17/02/03 20:39:25 INFO SparkContext: Successfully stopped SparkContext
Exception in thread "main" org.apache.spark.SparkException: A master URL must be set in your configuration
at org.apache.spark.SparkContext.<init>(SparkContext.scala:379)
at PCA$.main(PCA.scala:26)
at PCA.main(PCA.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:144)

Process finished with exit code 1

Recommended answer

If you are running Spark in standalone mode, set the master URL in code:

val conf = new SparkConf().setMaster("spark://master") //missing 

Or you can pass the master URL as a parameter when submitting the job:

spark-submit --master spark://master:7077

If you are running Spark locally, then:

val conf = new SparkConf().setMaster("local[2]") //missing 

Or you can pass the master URL when submitting the job:

spark-submit --master local
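When the job is launched from an IDE (as the com.intellij.rt frames in the trace suggest) but should still run unchanged on a cluster, one option is to hardcode a local master only as a fallback. A sketch, assuming the standard SparkConf API:

import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf().setAppName("PCA Example")
// Use local mode only if nothing was supplied via --master or spark.master,
// so the same jar still works when submitted to a real cluster
conf.setIfMissing("spark.master", "local[*]")
val sc = new SparkContext(conf)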

If you are running Spark on YARN, then:

spark-submit --master yarn
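In the YARN case the master is normally not hardcoded in the application at all; spark-submit supplies it. A fuller command might look like this (the class name is taken from the stack trace; the jar path and deploy mode are placeholders):

spark-submit --master yarn --deploy-mode cluster --class PCA path/to/your-app.jar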

