Spark 2.0 missing spark implicits
Problem description
Using Spark 2.0, I'm seeing that it is possible to turn a DataFrame of rows into a DataFrame of case classes. When I try to do so, I'm greeted with a message stating to import spark.implicits._. The issue I have is that IntelliJ isn't recognizing that as a valid import statement; I'm wondering if it has moved and the message hasn't been updated, or if I don't have the correct packages in my build settings. Here is my build.sbt:
libraryDependencies ++= Seq(
  "org.mongodb.spark" % "mongo-spark-connector_2.11" % "2.0.0-rc0",
  "org.apache.spark" % "spark-core_2.11" % "2.0.0",
  "org.apache.spark" % "spark-sql_2.11" % "2.0.0"
)
Recommended answer
There is no package called spark.implicits.
Here spark refers to a SparkSession. If you are inside the REPL, the session is already defined as spark, so you can just type:
import spark.implicits._
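As a minimal sketch of what this import enables (assuming you are in spark-shell, where spark: SparkSession is predefined; the column names here are illustrative), it brings the implicit encoders and the toDF/toDS conversions into scope:

```scala
// Inside spark-shell, `spark: SparkSession` already exists
import spark.implicits._

// These conversions only compile once the implicits are imported
val df = Seq((1, "a"), (2, "b")).toDF("id", "label")
val ds = Seq(1, 2, 3).toDS()
```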
If you have defined your own SparkSession somewhere in your code, adjust the import accordingly:
import org.apache.spark.sql.SparkSession

val mySpark = SparkSession
  .builder()
  .appName("Spark SQL basic example")
  .config("spark.some.config.option", "some-value")
  .getOrCreate()

// For implicit conversions like converting RDDs to DataFrames
import mySpark.implicits._
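To tie this back to the original goal of converting a DataFrame of rows into case classes: once the session's implicits are in scope, as[T] performs the conversion. A hedged sketch (the Person case class, app name, and column names are illustrative assumptions, not from the original post):

```scala
import org.apache.spark.sql.SparkSession

// Define the case class at top level so Spark can derive an encoder for it
case class Person(name: String, age: Long)

val spark = SparkSession.builder().appName("implicits-example").getOrCreate()
import spark.implicits._

// Column names must match the case class field names for `as[Person]` to work
val df = Seq(("Ann", 30L), ("Bob", 25L)).toDF("name", "age")
val people = df.as[Person]  // Dataset[Person]
```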