Scala Object with ArrayBuffer - Deserialize issue using Gson


Question

I have a top-level Scala class, something like the one below:

FinalOutput.scala:

class FinalOutput extends Serializable {

  @BeanProperty
  var userId: String = _

  @BeanProperty
  var tenantId: String = _

  @BeanProperty
  @SerializedName("type")
  var dataType: String = _

  @BeanProperty
  var data: FinalData = _

  @BeanProperty
  var userCreatedDate: String = _
}

FinalData.scala:

class FinalData extends Serializable {

  @BeanProperty
  var list1: ArrayBuffer[DataType1] = _

  @BeanProperty
  var list2: ArrayBuffer[DataType2] = _

  @BeanProperty
  var list3: ArrayBuffer[DataType3] = _

  @BeanProperty
  var list4: ArrayBuffer[DataType4] = _

  ....
  ....

  @BeanProperty
  var list15: ArrayBuffer[DataType15] = _

  @BeanProperty
  var userName: String = _
}

All DataType* classes extend BaseBean.
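
For illustration only, a stripped-down sketch of what BaseBean and one of the DataType* classes look like (the real classes have more fields; the field names below are placeholders, not the actual ones):

import scala.beans.BeanProperty

class BaseBean extends Serializable {

  @BeanProperty
  var id: String = _      // placeholder field, not from the real class
}

class DataType1 extends BaseBean {

  @BeanProperty
  var value: String = _   // placeholder field, not from the real class
}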

I have used the following to serialize the Scala object into a JSON string.

ArrayBufferSerializer.scala

class ArrayBufferSerializer[T: ClassTag] extends JsonSerializer[ArrayBuffer[T]] {
  override def serialize(src: ArrayBuffer[T], typeOfSrc: Type, context: JsonSerializationContext): JsonElement = {
    context.serialize(src.toArray[Any])
  }
}

and then serializing it into a string using this:

val gson = new GsonBuilder().registerTypeAdapter(classOf[ArrayBuffer[FinalData]], new ArrayBufferSerializer[FinalData]()).serializeNulls.create
val data = gson.toJson(row)

Now I want to deserialize the JSON string back into a FinalOutput object, so I have created an ArrayBufferDeSerializer, something like this:

class ArrayBufferDeSerializer[T: ClassTag] extends JsonDeserializer[ArrayBuffer[T]] {
  override def deserialize(json: JsonElement, typeOfT: Type, context: JsonDeserializationContext): ArrayBuffer[T] = {
    context.deserialize(json, typeOfT)
  }
}

and then I call the following to deserialize:

val gson = new GsonBuilder().registerTypeAdapter(classOf[ArrayBuffer[FinalData]], new ArrayBufferSerializer[FinalData]()).serializeNulls.create
gson.fromJson(row, classOf[FinalLevelOneSmsOutput])

I am getting the following error:

Exception in thread "main" org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 2.0 failed 1 times, most recent failure: Lost task 0.0 in stage 2.0 (TID 3, localhost, executor driver): java.lang.StackOverflowError
    at com.google.gson.internal.bind.TypeAdapters$29.read(TypeAdapters.java:720)
    at com.google.gson.internal.bind.TypeAdapters$29.read(TypeAdapters.java:743)
    at com.google.gson.internal.bind.TypeAdapters$29.read(TypeAdapters.java:735)
    at com.google.gson.internal.bind.TypeAdapters$29.read(TypeAdapters.java:718)
    at com.google.gson.internal.Streams.parse(Streams.java:48)
    at com.google.gson.TreeTypeAdapter.read(TreeTypeAdapter.java:54)
    at com.google.gson.Gson.fromJson(Gson.java:861)
    at com.google.gson.Gson.fromJson(Gson.java:926)
    at com.google.gson.Gson$1.deserialize(Gson.java:131)
    at com.cv.util.ArrayBufferDeSerializer.deserialize(ArrayBufferDeSerializer.scala:15)
    at com.cv.util.ArrayBufferDeSerializer.deserialize(ArrayBufferDeSerializer.scala:13)
    at com.google.gson.TreeTypeAdapter.read(TreeTypeAdapter.java:58)

Solution

Your deserializer does nothing but delegate the deserialization back to the context with the same arguments (the same json and the same type), which causes the context to call the deserializer again - this creates an infinite loop and produces the StackOverflowError.
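
The cycle is easy to reproduce in isolation. Here is a minimal, self-contained sketch (the Box class, the LoopDemo object and the JSON literal are made up for the demonstration, they are not part of the question); running it fails with the same StackOverflowError:

import java.lang.reflect.Type
import com.google.gson.{GsonBuilder, JsonDeserializationContext, JsonDeserializer, JsonElement}

case class Box(value: Int)

object LoopDemo extends App {
  // A deserializer that only delegates back to the context with the same json and type.
  val looping = new JsonDeserializer[Box] {
    override def deserialize(json: JsonElement, typeOfT: Type, context: JsonDeserializationContext): Box =
      context.deserialize(json, typeOfT)   // same type => Gson routes right back to this deserializer
  }

  val gson = new GsonBuilder().registerTypeAdapter(classOf[Box], looping).create()
  gson.fromJson("""{"value": 1}""", classOf[Box])   // throws java.lang.StackOverflowError
}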

The deserializer has to be improved - since we've serialized ArrayBuffers into "simple" arrays, we have to deserialize them accordingly. Here's one way to do it:

import java.lang.reflect.Type
import java.util

import com.google.gson.{JsonDeserializationContext, JsonDeserializer, JsonElement}
import com.google.gson.reflect.TypeToken

import scala.collection.mutable.ArrayBuffer
import scala.reflect.ClassTag

class ArrayBufferDeSerializer[T: ClassTag] extends JsonDeserializer[ArrayBuffer[T]] {
  override def deserialize(json: JsonElement, typeOfT: Type, context: JsonDeserializationContext): ArrayBuffer[T] = {
    // since we've serialized ArrayBuffers as plain JSON arrays, we first deserialize the
    // input into a java.util.ArrayList (note: an explicit TypeToken, not the incoming typeOfT):
    val array: util.ArrayList[T] = context.deserialize(json, new TypeToken[util.ArrayList[T]]() {}.getType)

    // then we convert it back into an ArrayBuffer:
    import scala.collection.JavaConverters._
    ArrayBuffer[T](array.asScala: _*)
  }
}
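
As a usage sketch (assuming the classes shown above; row stands for the FinalOutput instance being serialized, as in the question), register both the serializer and this deserializer on the same GsonBuilder - Gson accepts separate serializer and deserializer registrations for the same type - and the round trip no longer overflows the stack:

import com.google.gson.GsonBuilder
import scala.collection.mutable.ArrayBuffer

// Register both adapters for the same ArrayBuffer type that was used for serialization.
val gson = new GsonBuilder()
  .registerTypeAdapter(classOf[ArrayBuffer[FinalData]], new ArrayBufferSerializer[FinalData]())
  .registerTypeAdapter(classOf[ArrayBuffer[FinalData]], new ArrayBufferDeSerializer[FinalData]())
  .serializeNulls
  .create

val json = gson.toJson(row)                          // row: FinalOutput, as in the question
val back = gson.fromJson(json, classOf[FinalOutput]) // deserializes without the StackOverflowError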
