Scala on Spark: Cannot resolve symbol
Question
Hi,

I have some Scala code that I am trying to submit to HDInsight (Spark), and it is broken at this line:
val landDF = parseRDD(spark.Context.textFile(datapath)).map(parseLand).toDF().cache()
Using IntelliJ for Scala, it complains: "Cannot resolve symbol parseRDD".
What am I missing? A dependency or something? Do I need to add something to the sbt file?
Thanks,
Conor
Answer
Try adding the following import statement to your code and see whether it resolves the issue:
import spark.sqlContext.implicits._
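Beyond the implicits import (which `.toDF()` needs), "Cannot resolve symbol parseRDD" usually means the helper function itself, along with the case class it maps lines into, was never defined in scope. Here is a minimal, Spark-free sketch of the shape those missing pieces might take; the `Land` schema, the CSV layout, and the function bodies are all assumptions for illustration. In the real job you would replace the `Seq` with `sc.textFile(datapath)` so `parseRDD` takes and returns an `RDD[String]`, then call `.toDF()` after the implicits import.

```scala
// Assumed schema for illustration: each line is a CSV of (id, name, acres).
case class Land(id: Int, name: String, acres: Double)

// parseLand: turn one CSV line into a Land row.
def parseLand(line: String): Land = {
  val fields = line.split(",")
  Land(fields(0).trim.toInt, fields(1).trim, fields(2).trim.toDouble)
}

// parseRDD stand-in: drop the header and blank lines before parsing.
// In the real Spark job this would operate on an RDD[String] instead of a Seq.
def parseRDD(lines: Seq[String]): Seq[String] =
  lines.filter(l => l.nonEmpty && !l.startsWith("id,"))

val raw  = Seq("id,name,acres", "1, north field, 12.5", "2, south field, 8.0")
val land = parseRDD(raw).map(parseLand)
// land: Seq(Land(1, "north field", 12.5), Land(2, "south field", 8.0))
```

With these definitions (or your project's real equivalents) in the same scope as the `landDF` line, IntelliJ should be able to resolve the symbol.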
You may also want to check Run a Spark Scala application on an HDInsight Spark cluster.