Zeppelin with Spark interpreter ignores imports declared outside of class/function definition
Problem description
I'm trying to use some Scala code in Zeppelin 0.8.0 with the Spark interpreter:
%spark
import scala.beans.BeanProperty
class Node(@BeanProperty val parent: Option[Node]) {
}
But the import does not seem to be taken into account:
import scala.beans.BeanProperty
<console>:14: error: not found: type BeanProperty
@BeanProperty val parent: Option[Node]) {
^
EDIT: I found out that the following code works:
class Node(@scala.beans.BeanProperty val parent: Option[Node]) {
}
This also works fine:
def loadCsv(CSVPATH: String): DataFrame = {
  import org.apache.spark.sql.types._
  // [...] some code
  val schema = StructType(
    firstRow.map(s => StructField(s, StringType))
  )
  // [...] some code again
}
So I guess everything works fine if the import is placed inside the braces of a definition, or if the class is referenced directly by its fully qualified path.to.package.Class name when used.
QUESTION: How do I import outside of a class/function definition?
Accepted answer
Importing by path.to.package.Class works well in Zeppelin. You can try it by importing and using java.sql.Date:
import java.sql.Date
val date = Date.valueOf("2019-01-01")
The problem is related to the Zeppelin context. If you try the following code snippet in Zeppelin, you will see that it works fine:
object TestImport {
  import scala.beans.BeanProperty

  class Node(@BeanProperty val parent: Option[Node]) {}
}

val testObj = new TestImport.Node(None)
testObj.getParent
// prints Option[Node] = None
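The same object-wrapping trick extends naturally to mutable fields. As a minimal sketch (using a hypothetical BeanDemo object, and a var so that @BeanProperty generates both a getter and a setter):

```scala
// Wrapping the import and the class together in an object keeps the
// import in scope when the interpreter wraps each statement separately.
object BeanDemo {
  import scala.beans.BeanProperty

  // var + @BeanProperty generates both getParent and setParent
  class Node(@BeanProperty var parent: Option[Node])
}

val root = new BeanDemo.Node(None)
val child = new BeanDemo.Node(Some(root))

child.getParent // Some(root)
child.setParent(None)
child.getParent // None
```

Because the import lives inside BeanDemo, every definition in that object sees it, while the paragraph's top-level statements only ever reference the fully qualified BeanDemo.Node.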
Hope it helps!