Spark - "sbt package" - "value $ is not a member of StringContext" - Missing Scala plugin?
Problem Description
When running "sbt package" from the command line for a small Spark Scala application, I'm getting the "value $ is not a member of StringContext" compilation error on the following line of code:
val joined = ordered.join(empLogins, $"login" === $"username", "inner")
.orderBy($"count".desc)
.select("login", "count")
IntelliJ 13.1 is giving me the same error message. The same .scala source code compiles without any issue in Eclipse 4.4.2, and it also builds fine with Maven in a separate Maven project from the command line.
It looks like sbt doesn't recognize the $ sign because I'm missing some plugin in my project/plugins.sbt file or some setting in my build.sbt file.
Are you familiar with this issue? Any pointers will be appreciated. I can provide build.sbt and/or project/plugins.sbt if needed.
Recommended Answer
You need to make sure you import sqlContext.implicits._ (see https://spark.apache.org/docs/1.3.0/api/scala/index.html#org.apache.spark.sql.SQLContext$implicits$).
That brings the implicit class StringToColumn into scope, which is documented as:

Converts $"col name" into a Column.
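A minimal sketch of the fix in context (assuming Spark 1.3-era APIs; the sample data and the object name JoinExample are made up for illustration, standing in for the question's empLogins and ordered DataFrames):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

object JoinExample {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("JoinExample").setMaster("local[*]"))
    val sqlContext = new SQLContext(sc)

    // The crucial line: brings StringToColumn (and toDF) into scope,
    // enabling the $"..." column syntax below.
    import sqlContext.implicits._

    // Hypothetical sample data standing in for the question's DataFrames.
    val empLogins = sc.parallelize(Seq(("alice", "Alice A."), ("bob", "Bob B.")))
      .toDF("username", "name")
    val ordered = sc.parallelize(Seq(("alice", 3L), ("bob", 1L)))
      .toDF("login", "count")

    // With the implicits in scope, $"login" now compiles to a Column.
    val joined = ordered.join(empLogins, $"login" === $"username", "inner")
      .orderBy($"count".desc)
      .select("login", "count")

    joined.show()
    sc.stop()
  }
}
```

Note that the $ interpolator lives in the spark-sql module, so if the import itself fails to resolve, check that build.sbt declares that dependency as well (for example, something like libraryDependencies += "org.apache.spark" %% "spark-sql" % "1.3.0", with the version matching your Spark installation).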