" ./bin/spark-shell“不适用于ubuntu 14.04上的Hadoop 2.6+的预建版本Spark 1.6 [英] " ./bin/spark-shell " Not working with Pre-built version of Spark 1.6 with Hadoop 2.6+ on ubuntu 14.04


Problem description

I freshly downloaded the pre-built version of Spark 1.6 with Hadoop 2.6+ onto the desktop of an Ubuntu 14.04 machine.

I navigated to the Spark directory and started the Spark shell as per the Quick Start Spark Link given below, using

./bin/spark-shell

I am receiving the following errors. I saw a similar question asked for Mac OSX here.

ashwin@Console:~/Desktop/spark-1.6.0-bin-hadoop2.6$ ./bin/spark-shell
log4j:WARN No appenders could be found for logger (org.apache.hadoop.metrics2.lib.MutableMetricsFactory).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Using Spark's repl log4j profile: org/apache/spark/log4j-defaults-repl.properties
To adjust logging level use sc.setLogLevel("INFO")
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.6.0
      /_/

Using Scala version 2.10.5 (OpenJDK 64-Bit Server VM, Java 1.7.0_91)
Type in expressions to have them evaluated.
Type :help for more information.
16/01/05 12:36:25 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/05 12:36:25 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/05 12:36:25 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/05 12:36:25 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/05 12:36:25 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/05 12:36:25 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/05 12:36:25 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/05 12:36:25 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/05 12:36:25 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/05 12:36:25 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/05 12:36:25 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/05 12:36:25 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/05 12:36:25 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/05 12:36:25 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/05 12:36:25 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/05 12:36:25 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/05 12:36:25 ERROR SparkContext: Error initializing SparkContext.
java.net.BindException: Cannot assign requested address: Service 'sparkDriver' failed after 16 retries!
    at sun.nio.ch.Net.bind0(Native Method)
    at sun.nio.ch.Net.bind(Net.java:463)
    at sun.nio.ch.Net.bind(Net.java:455)
    at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
    at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
    at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:125)
    at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:485)
    at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1089)
    at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:430)
    at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:415)
    at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:903)
    at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:198)
    at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:348)
    at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:357)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357)
    at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
    at java.lang.Thread.run(Thread.java:745)
java.net.BindException: Cannot assign requested address: Service 'sparkDriver' failed after 16 retries!
    at sun.nio.ch.Net.bind0(Native Method)
    at sun.nio.ch.Net.bind(Net.java:463)
    at sun.nio.ch.Net.bind(Net.java:455)
    at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
    at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
    at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:125)
    at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:485)
    at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1089)
    at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:430)
    at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:415)
    at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:903)
    at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:198)
    at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:348)
    at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:357)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357)
    at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
    at java.lang.Thread.run(Thread.java:745)

java.lang.NullPointerException
    at org.apache.spark.sql.SQLContext$.createListenerAndUI(SQLContext.scala:1367)
    at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:101)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    at org.apache.spark.repl.SparkILoop.createSQLContext(SparkILoop.scala:1028)
    at $iwC$$iwC.<init>(<console>:15)
    at $iwC.<init>(<console>:24)
    at <init>(<console>:26)
    at .<init>(<console>:30)
    at .<clinit>(<console>)
    at .<init>(<console>:7)
    at .<clinit>(<console>)
    at $print(<console>)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
    at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1346)
    at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
    at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
    at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
    at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
    at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
    at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
    at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:132)
    at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:124)
    at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
    at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:124)
    at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
    at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
    at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:159)
    at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
    at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:108)
    at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
    at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
    at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
    at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
    at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
    at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
    at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
    at org.apache.spark.repl.Main$.main(Main.scala:31)
    at org.apache.spark.repl.Main.main(Main.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

<console>:16: error: not found: value sqlContext
         import sqlContext.implicits._
                ^
<console>:16: error: not found: value sqlContext
         import sqlContext.sql
                ^

Any help?

Solution

A likely cause of your problem is that you are trying to bind to an illegal IP address. In $SPARK_HOME/conf/spark-env.sh there is a variable named SPARK_LOCAL_IP. If it is set, make sure it really refers to the machine you are running the Spark shell on, or try commenting it out. If it is not set, you can try setting it to, e.g., 127.0.0.1.
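
A minimal sketch of that edit, assuming the default spark-env.sh location; SPARK_LOCAL_IP is a standard Spark environment variable, but the 192.168.1.50 address below is only a hypothetical example of a stale leftover value:

# $SPARK_HOME/conf/spark-env.sh (shell syntax)
# If a stale or unreachable address like this is set, comment it out:
# export SPARK_LOCAL_IP=192.168.1.50   # hypothetical leftover value
# Otherwise, bind the driver to an address that exists on this machine:
export SPARK_LOCAL_IP=127.0.0.1

You can also test the idea without editing the file by overriding the variable for a single session, e.g. SPARK_LOCAL_IP=127.0.0.1 ./bin/spark-shell; if the shell then starts cleanly, the bind address was the culprit.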
