Scala case class ignoring import in the Spark shell
Question
I hope there is an obvious answer to this question!
I've just upgraded to Spark v2.0 and have an odd problem with the spark-shell (Scala 2.11 build).
If I enter the following minimal Scala,
import java.sql.Timestamp
case class Crime(caseNumber: String, date: Timestamp, description: String, detail: String, arrest: Boolean)
I get the following error message:
<console>:11: error: not found: type Timestamp
If I use the Java Timestamp class elsewhere, e.g. in a function, then no errors are generated (as you would expect, because of the import).
If I fully qualify the type and use java.sql.Timestamp in the case class, it works!
Am I missing something obvious?
Accepted answer
The Timestamp import is simply not in scope when the case class declaration is compiled on its own line in the shell. To fix this you can compile the import and the definition together with :paste:
:paste
import java.sql.Timestamp
case class Crime(caseNumber: String, date: Timestamp, description: String, detail: String, arrest: Boolean)
or fully qualify the type:
case class Crime(caseNumber: String, date: java.sql.Timestamp, description: String, detail: String, arrest: Boolean)
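The fully qualified form also works outside the shell, since the declaration no longer depends on any surrounding import being in scope. A minimal sketch (the sample field values below are made up for illustration):

```scala
// Fully qualifying java.sql.Timestamp makes the declaration
// self-contained: no import needs to be visible at this point.
case class Crime(
  caseNumber: String,
  date: java.sql.Timestamp,
  description: String,
  detail: String,
  arrest: Boolean
)

// Hypothetical sample record; Timestamp.valueOf parses "yyyy-mm-dd hh:mm:ss".
val c = Crime(
  "HX123456",
  java.sql.Timestamp.valueOf("2016-07-01 09:30:00"),
  "THEFT",
  "POCKET-PICKING",
  arrest = false
)
```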