Spark 2.0 missing spark implicits
Problem description
Using Spark 2.0, I'm seeing that it is possible to turn a DataFrame of Rows into a DataFrame of case classes. When I try to do so, I'm greeted with a message stating to import spark.implicits._. The issue I have is that IntelliJ isn't recognizing that as a valid import statement; I'm wondering if it has moved and the message hasn't been updated, or if I don't have the correct packages in my build settings. Here is my build.sbt:
libraryDependencies ++= Seq(
  "org.mongodb.spark" % "mongo-spark-connector_2.11" % "2.0.0-rc0",
  "org.apache.spark" % "spark-core_2.11" % "2.0.0",
  "org.apache.spark" % "spark-sql_2.11" % "2.0.0"
)
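For context, a minimal sketch of the conversion being attempted; the case class Person and the sample data are hypothetical stand-ins, and the import spark.implicits._ line is the one IntelliJ flags:

import org.apache.spark.sql.{DataFrame, SparkSession}

// Case class defined at the top level so Spark can derive an Encoder for it
case class Person(name: String, age: Long)

object Example {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("example").master("local[*]").getOrCreate()

    // A DataFrame of generic Rows
    val df: DataFrame = spark.createDataFrame(Seq(Person("Ann", 30), Person("Bob", 25)))

    // Converting Rows to a typed Dataset needs an implicit Encoder[Person],
    // which is what the "import spark.implicits._" message refers to
    import spark.implicits._
    val people = df.as[Person]
    people.show()
  }
}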
Recommended answer
There is no package called spark.implicits.

spark here refers to the SparkSession. If you are inside the REPL, the session is already defined as spark, so you can just type:
import spark.implicits._
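In the spark-shell, for instance, the import brings the Encoder instances and the toDF/toDS syntax into scope (Person is an illustrative case class):

scala> import spark.implicits._
scala> case class Person(name: String, age: Long)
scala> val ds = Seq(Person("Ann", 30)).toDS()  // toDS is added by spark.implicits._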
If you have defined your own SparkSession somewhere in your code, then adjust it accordingly:
import org.apache.spark.sql.SparkSession

val mySpark = SparkSession
  .builder()
  .appName("Spark SQL basic example")
  .config("spark.some.config.option", "some-value")
  .getOrCreate()

// For implicit conversions like converting RDDs to DataFrames
import mySpark.implicits._
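With the implicits from your own session in scope, the Row-to-case-class conversion from the question works via .as[T]; Person and the sample DataFrame below are illustrative:

// Define case classes at the top level (not inside a method),
// so Spark can derive an Encoder for them
case class Person(name: String, age: Long)

val df = mySpark.createDataFrame(Seq(Person("Ann", 30)))
val people = df.as[Person]  // Dataset[Person], enabled by import mySpark.implicits._

Note that the import is from a value (the SparkSession instance), not from a package, which is why an IDE may not resolve it until the session variable's type is known.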