Using Scala 2.12 with Spark 2.x


Question

In the Spark 2.1 docs it is mentioned that:

Spark runs on Java 7+, Python 2.6+/3.4+ and R 3.1+. For the Scala API, Spark 2.1.0 uses Scala 2.11. You will need to use a compatible Scala version (2.11.x).

In the Scala 2.12 release news it is also mentioned that:

Although Scala 2.11 and 2.12 are mostly source compatible to facilitate cross-building, they are not binary compatible. This allows us to keep improving the Scala compiler and standard library.

But when I build an uber jar (using Scala 2.12) and run it on Spark 2.1, everything works just fine.
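
A build of the kind described above might look roughly like the following. This is a hypothetical sketch, not the actual project: the names and exact versions are assumptions. The relevant detail is that Spark 2.1 publishes no _2.12 artifacts, so the Spark dependency has to be pinned to the _2.11 jars while the application code is compiled with Scala 2.12, i.e. the two Scala binary versions end up mixed.

    // build.sbt -- hypothetical sketch of an uber-jar build mixing Scala 2.12 with Spark 2.1
    name := "my-spark-app"              // assumed project name
    scalaVersion := "2.12.1"            // application code compiled with Scala 2.12

    // Spark 2.1.0 is only published for Scala 2.11, so the artifact suffix is pinned
    // explicitly with `%` instead of the usual `%%` cross-version convention.
    libraryDependencies ++= Seq(
      "org.apache.spark" % "spark-core_2.11" % "2.1.0" % "provided",
      "org.apache.spark" % "spark-sql_2.11"  % "2.1.0" % "provided"
    )

    // project/plugins.sbt -- sbt-assembly produces the uber jar submitted to the cluster
    // addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.5")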

I know it is not an official source, but on the 47 Degrees blog they mention that Spark 2.1 does support Scala 2.12.

How can one explain these (conflicting?) pieces of information?

Solution

Spark does not support Scala 2.12. You can follow SPARK-14220 (Build and test Spark against Scala 2.12) for the up-to-date status.

Update: Spark 2.4 added experimental Scala 2.12 support.
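
For reference, a minimal sbt sketch of the two supported pairings, assuming the standard %% cross-version convention (the patch versions are illustrative; only the Scala binary version matters for artifact resolution): Scala 2.11 for Spark 2.1, and Scala 2.12 for the experimental support added in Spark 2.4.

    // build.sbt -- Spark 2.1.x is published for Scala 2.11 only
    scalaVersion := "2.11.12"
    libraryDependencies +=
      "org.apache.spark" %% "spark-core" % "2.1.0" % "provided"   // resolves to spark-core_2.11

    // build.sbt -- Spark 2.4.x also publishes experimental Scala 2.12 artifacts
    // scalaVersion := "2.12.8"
    // libraryDependencies +=
    //   "org.apache.spark" %% "spark-core" % "2.4.0" % "provided" // resolves to spark-core_2.12

With %%, sbt appends the Scala binary version to the artifact name, so a scalaVersion for which Spark publishes no artifacts fails at dependency resolution instead of surfacing later as a runtime binary-compatibility problem.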

