Scalatest and Spark giving "java.io.NotSerializableException: org.scalatest.Assertions$AssertionsHelper"

Problem Description

I’m testing a Spark Streaming application with the help of "com.holdenkarau.spark-testing-base" and scalatest.

import com.holdenkarau.spark.testing.StreamingSuiteBase
import org.apache.spark.rdd.RDD
import org.scalatest.{ BeforeAndAfter, FunSuite }

class Test extends FunSuite with BeforeAndAfter with StreamingSuiteBase {

  var delim: String = ","

  before {
    System.clearProperty("spark.driver.port")
   }

  test("This Fails") {

    val source = scala.io.Source.fromURL(getClass.getResource("/some_logs.csv"))
    val input = source.getLines.toList

    val rowRDDOut = Calculator.do(sc.parallelize(input))   //Returns DataFrame

    val report: RDD[String] = rowRDDOut.map(row => new String(row.getAs[String](0) + delim + row.getAs[String](1)))

    source.close
  }
}

I get a serialization exception for the field 'delim':

org.apache.spark.SparkException: Task not serializable
[info]   at org.apache.spark.util.ClosureCleaner$.ensureSerializable(ClosureCleaner.scala:304)
[info]   at org.apache.spark.util.ClosureCleaner$.org$apache$spark$util$ClosureCleaner$$clean(ClosureCleaner.scala:294)
[info]   at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:122)
[info]   at org.apache.spark.SparkContext.clean(SparkContext.scala:2055)
[info]   at org.apache.spark.rdd.RDD$$anonfun$map$1.apply(RDD.scala:324)
[info]   at org.apache.spark.rdd.RDD$$anonfun$map$1.apply(RDD.scala:323)
[info]   at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
[info]   at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:111)
[info]   at org.apache.spark.rdd.RDD.withScope(RDD.scala:316)
[info]   at org.apache.spark.rdd.RDD.map(RDD.scala:323)
[info]   ...
[info]   Cause: java.io.NotSerializableException: org.scalatest.Assertions$AssertionsHelper
[info] Serialization stack:
[info]  - object not serializable (class: org.scalatest.Assertions$AssertionsHelper, value: org.scalatest.Assertions$AssertionsHelper@78b339fa)
[info]  - field (class: org.scalatest.FunSuite, name: assertionsHelper, type: class org.scalatest.Assertions$AssertionsHelper)

If I replace 'delim' with the string value inline, it works fine.

val report: RDD[String] = rowRDDOut.map(row => new String(row.getAs[String](0) + "," + row.getAs[String](1)))

What’s the difference between the first and the second version?

Thanks in advance!

Recommended Answer

The problem is not the type of delim (String); it's delim itself.

Try not to define variables outside your test() methods. If you define delim inside your test, it should work:

test("This Fails") {
   val delim = ","
   ...
}
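
Filling in the elision above, a complete version of the fix might look like the sketch below. The parallelized pairs and the map are stand-ins for the question's CSV input and Calculator.do, which are not shown here:

test("This Works") {
  // delim lives inside the test body, so the closure below captures
  // only this String, never the enclosing (non-serializable) suite.
  val delim = ","

  // Stand-in input: the original test reads /some_logs.csv and runs it
  // through the asker's Calculator helper.
  val rows = sc.parallelize(Seq(("a", "b"), ("c", "d")))

  val report: RDD[String] = rows.map { case (first, second) => first + delim + second }

  assert(report.collect().toSeq == Seq("a,b", "c,d"))
}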

Now, you may ask why. When you reference delim from the outer scope, Spark has to serialize the enclosing instance of class Test along with the closure. That object contains a reference to org.scalatest.Assertions$AssertionsHelper, which is not Serializable (see your stack trace).
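
If the value really has to stay a field of the suite, a common Spark workaround (my addition, not part of the original answer) is to copy it into a local val before using it in a closure, so the closure captures only the String:

test("This Also Works") {
  // Copy the suite-level field into a local val. The closure below then
  // serializes only this String, not the whole test class. rowRDDOut is
  // hypothetical here, standing in for the DataFrame produced in the question.
  val localDelim = delim

  val report: RDD[String] =
    rowRDDOut.map(row => row.getAs[String](0) + localDelim + row.getAs[String](1))
}

Either way the rule of thumb is the same: closures shipped to executors should reference only serializable, locally scoped values.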
