Error on building stand-alone Spark 1.3.1

Problem description

I'm trying to install Spark 1.3.1 (sbt version 0.13.7):

wget http://d3kbcqa49mib13.cloudfront.net/spark-1.3.1.tgz
tar xvf spark-1.3.1.tgz
cd spark-1.3.1
sudo sbt assembly

But I get the following error:

sbt.ResolveException: unresolved dependency: org.apache.spark#spark-network-common_2.10;1.3.1: configuration not public in org.apache.spark#spark-network-common_2.10;1.3.1: 'test'. It was required from org.apache.spark#spark-network-shuffle_2.10;1.3.1 test
    at sbt.IvyActions$.sbt$IvyActions$$resolve(IvyActions.scala:278)
    at sbt.IvyActions$$anonfun$updateEither$1.apply(IvyActions.scala:175)
    at sbt.IvyActions$$anonfun$updateEither$1.apply(IvyActions.scala:157)
    at sbt.IvySbt$Module$$anonfun$withModule$1.apply(Ivy.scala:151)
    at sbt.IvySbt$Module$$anonfun$withModule$1.apply(Ivy.scala:151)
    at sbt.IvySbt$$anonfun$withIvy$1.apply(Ivy.scala:128)
    at sbt.IvySbt.sbt$IvySbt$$action$1(Ivy.scala:56)
    at sbt.IvySbt$$anon$4.call(Ivy.scala:64)
    at xsbt.boot.Locks$GlobalLock.withChannel$1(Locks.scala:93)
    at xsbt.boot.Locks$GlobalLock.xsbt$boot$Locks$GlobalLock$$withChannelRetries$1(Locks.scala:78)
    at xsbt.boot.Locks$GlobalLock$$anonfun$withFileLock$1.apply(Locks.scala:97)
    at xsbt.boot.Using$.withResource(Using.scala:10)
    at xsbt.boot.Using$.apply(Using.scala:9)
    at xsbt.boot.Locks$GlobalLock.ignoringDeadlockAvoided(Locks.scala:58)
    at xsbt.boot.Locks$GlobalLock.withLock(Locks.scala:48)
    at xsbt.boot.Locks$.apply0(Locks.scala:31)
    at xsbt.boot.Locks$.apply(Locks.scala:28)
    at sbt.IvySbt.withDefaultLogger(Ivy.scala:64)
    at sbt.IvySbt.withIvy(Ivy.scala:123)
    at sbt.IvySbt.withIvy(Ivy.scala:120)
    at sbt.IvySbt$Module.withModule(Ivy.scala:151)
    at sbt.IvyActions$.updateEither(IvyActions.scala:157)
    at sbt.Classpaths$$anonfun$sbt$Classpaths$$work$1$1.apply(Defaults.scala:1318)
    at sbt.Classpaths$$anonfun$sbt$Classpaths$$work$1$1.apply(Defaults.scala:1315)
    at sbt.Classpaths$$anonfun$doWork$1$1$$anonfun$85.apply(Defaults.scala:1345)
    at sbt.Classpaths$$anonfun$doWork$1$1$$anonfun$85.apply(Defaults.scala:1343)
    at sbt.Tracked$$anonfun$lastOutput$1.apply(Tracked.scala:35)
    at sbt.Classpaths$$anonfun$doWork$1$1.apply(Defaults.scala:1348)
    at sbt.Classpaths$$anonfun$doWork$1$1.apply(Defaults.scala:1342)
    at sbt.Tracked$$anonfun$inputChanged$1.apply(Tracked.scala:45)
    at sbt.Classpaths$.cachedUpdate(Defaults.scala:1360)
    at sbt.Classpaths$$anonfun$updateTask$1.apply(Defaults.scala:1300)
    at sbt.Classpaths$$anonfun$updateTask$1.apply(Defaults.scala:1275)
    at scala.Function1$$anonfun$compose$1.apply(Function1.scala:47)
    at sbt.$tilde$greater$$anonfun$$u2219$1.apply(TypeFunctions.scala:40)
    at sbt.std.Transform$$anon$4.work(System.scala:63)
    at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:226)
    at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:226)
    at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:17)
    at sbt.Execute.work(Execute.scala:235)
    at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:226)
    at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:226)
    at sbt.ConcurrentRestrictions$$anon$4$$anonfun$1.apply(ConcurrentRestrictions.scala:159)
    at sbt.CompletionService$$anon$2.call(CompletionService.scala:28)
    at java.util.concurrent.FutureTask.run(FutureTask.java:262)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
    at java.util.concurrent.FutureTask.run(FutureTask.java:262)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)
[error] (network-shuffle/*:update) sbt.ResolveException: unresolved dependency: org.apache.spark#spark-network-common_2.10;1.3.1: configuration not public in org.apache.spark#spark-network-common_2.10;1.3.1: 'test'. It was required from org.apache.spark#spark-network-shuffle_2.10;1.3.1 test
[error] Total time: 2 s, completed May 25, 2016 11:49:46 PM

I have also already deleted the .ivy2 folder from my home directory.

What is going wrong?

Recommended answer

As @Tristan commented, I modified the file project/build.properties (inside the extracted spark-1.3.1 directory) to use sbt 0.13.10, and it worked.
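
For reference, a minimal sketch of the change, assuming the file contains the usual sbt.version property and you are on Linux (GNU sed); any text editor works just as well:

cd spark-1.3.1
# project/build.properties currently pins the sbt version, e.g.:
#   sbt.version=0.13.7
# Point it at 0.13.10 instead:
sed -i 's/^sbt.version=.*/sbt.version=0.13.10/' project/build.properties
# Then rebuild:
sbt assembly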
