Is it possible to use json4s 3.2.11 with Spark 1.3.0?


Question


Spark has a dependency on json4s 3.2.10, but this version has several bugs and I need to use 3.2.11. I added json4s-native 3.2.11 dependency to build.sbt and everything compiled fine. But when I spark-submit my JAR it provides me with 3.2.10.

build.sbt

import sbt.Keys._

name := "sparkapp"

version := "1.0"

scalaVersion := "2.10.4"

libraryDependencies += "org.apache.spark" %% "spark-core"  % "1.3.0" % "provided"

libraryDependencies += "org.json4s" %% "json4s-native" % "3.2.11"

plugins.sbt

logLevel := Level.Warn

resolvers += Resolver.url("artifactory", url("http://scalasbt.artifactoryonline.com/scalasbt/sbt-plugin-releases"))(Resolver.ivyStylePatterns)

addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.13.0")

App1.scala

import org.apache.spark.{Logging, SparkConf, SparkContext}

object App1 extends Logging {
  def main(args: Array[String]) = {
    val conf = new SparkConf().setAppName("App1")
    val sc = new SparkContext(conf)
    println(s"json4s version: ${org.json4s.BuildInfo.version.toString}")
  }
}
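To see which json4s jar actually wins on the classpath at runtime, a small diagnostic along these lines can help (a sketch; note that `getCodeSource` may return null for classes loaded by the bootstrap class loader):

```scala
object ClasspathCheck {
  // Returns the location (jar or directory) a class was loaded from.
  def jarOf(cls: Class[_]): String =
    cls.getProtectionDomain.getCodeSource.getLocation.toString

  def main(args: Array[String]): Unit = {
    // Shows whether Spark's bundled json4s 3.2.10 (the Spark assembly jar)
    // or your own json4s-native 3.2.11 jar is being used.
    println(jarOf(classOf[org.json4s.Formats]))
  }
}
```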

sbt 0.13.7 + sbt-assembly 0.13.0, Scala 2.10.4


Is there a way to force 3.2.11 version usage?

Answer


I asked the same question on the Spark User mailing list and got two answers on how to make it work:



  1. Use the spark.driver.userClassPathFirst=true and spark.executor.userClassPathFirst=true options. However, this works only in Spark 1.3 and will probably require other modifications, such as excluding Scala classes from your build.
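For the first approach, the two options can be passed on the spark-submit command line (the flag names are as documented for Spark 1.3, where they were still experimental; the jar path assumes sbt-assembly's default output name for this build):

```shell
spark-submit \
  --class App1 \
  --conf spark.driver.userClassPathFirst=true \
  --conf spark.executor.userClassPathFirst=true \
  target/scala-2.10/sparkapp-assembly-1.0.jar
```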

  2. Rebuild Spark with json4s version 3.2.11 (you can change it in core/pom.xml).
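For the second approach, the change is a one-line version bump in Spark's core/pom.xml. The snippet below is a sketch from memory of the Spark 1.3 source tree (Spark core depends on json4s-jackson), so verify the exact artifactId against your checkout before rebuilding:

```xml
<!-- core/pom.xml: bump Spark's own json4s dependency -->
<dependency>
  <groupId>org.json4s</groupId>
  <artifactId>json4s-jackson_${scala.binary.version}</artifactId>
  <version>3.2.11</version>  <!-- was 3.2.10 -->
</dependency>
```

After editing, rebuild Spark with your usual Maven build (e.g. `mvn -DskipTests clean package`).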

Both work fine; I preferred the second one.
