Is it possible to use json4s 3.2.11 with Spark 1.3.0?


Question


Spark has a dependency on json4s 3.2.10, but this version has several bugs and I need to use 3.2.11. I added the json4s-native 3.2.11 dependency to build.sbt and everything compiled fine. But when I spark-submit my JAR, it provides me with 3.2.10.

build.sbt

import sbt.Keys._

name := "sparkapp"

version := "1.0"

scalaVersion := "2.10.4"

libraryDependencies += "org.apache.spark" %% "spark-core"  % "1.3.0" % "provided"

libraryDependencies += "org.json4s" %% "json4s-native" % "3.2.11"

plugins.sbt


logLevel := Level.Warn

resolvers += Resolver.url("artifactory", url("http://scalasbt.artifactoryonline.com/scalasbt/sbt-plugin-releases"))(Resolver.ivyStylePatterns)

addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.13.0")

App1.scala

import org.apache.spark.rdd.RDD
import org.apache.spark.{Logging, SparkConf, SparkContext}
import org.apache.spark.SparkContext._

object App1 extends Logging {
  def main(args: Array[String]) = {
    val conf = new SparkConf().setAppName("App1")
    val sc = new SparkContext(conf)
    println(s"json4s version: ${org.json4s.BuildInfo.version.toString}")
  }
}
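As a side note (my own addition, not from the original post): printing `org.json4s.BuildInfo.version` tells you which version won, but plain JVM reflection can also tell you *which jar* a class was loaded from, which is often the faster way to debug this kind of classpath conflict. A minimal sketch using only standard JVM APIs; in a real job you would pass a json4s class such as `classOf[org.json4s.Formats]` instead of the JDK class used here:

```scala
// Diagnostic sketch: find the jar (or directory) a class was loaded from.
// Uses only standard JVM reflection, nothing Spark-specific.
object WhichJar {
  def locationOf(cls: Class[_]): String = {
    val src = cls.getProtectionDomain.getCodeSource
    // Classes loaded by the bootstrap classloader have no CodeSource.
    if (src == null) "(bootstrap classloader)" else src.getLocation.toString
  }

  def main(args: Array[String]): Unit = {
    // java.lang.String comes from the bootstrap classloader...
    println(locationOf(classOf[String]))
    // ...while application classes report the jar/directory they came from.
    println(locationOf(WhichJar.getClass))
  }
}
```

Running the equivalent inside the Spark job with a json4s class would show whether Spark's bundled 3.2.10 jar or your own 3.2.11 jar is first on the classpath.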


sbt 0.13.7 + sbt-assembly 0.13.0, Scala 2.10.4


Is there a way to force 3.2.11 version usage?

Answer


We ran into a problem similar to the one Necro describes, but downgrading from 3.2.11 to 3.2.10 when building the assembly jar did not resolve it. We ended up solving it (using Spark 1.3.1) by shading the 3.2.11 version in the job assembly jar:

assemblyShadeRules in assembly := Seq(
  ShadeRule.rename("org.json4s.**" -> "shaded.json4s.@1").inAll
)
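One caveat worth flagging (my addition, not from the original answer): as far as I can tell, `ShadeRule` support landed in the sbt-assembly 0.14.x line, so the 0.13.0 plugin shown in the question's plugins.sbt would need an upgrade for this setting to compile. A sketch of the corresponding plugins.sbt change, with the exact 0.14.x version chosen here being an assumption:

```scala
// plugins.sbt — assumed: shading (assemblyShadeRules / ShadeRule)
// requires sbt-assembly 0.14.0 or later, not the 0.13.0 used above.
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.1")
```

With the rule in place, your job code still writes `import org.json4s._` as usual; the rename to `shaded.json4s.*` is applied to the bytecode at assembly time, so the shaded 3.2.11 classes can no longer collide with the 3.2.10 classes Spark ships on its own classpath.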
