Cannot have circular references in bean class, but got the circular reference of class class org.apache.avro.Schema


Problem description

I want to convert a JavaRDD containing Avro objects (e.g. objects of MyAvroClsass) to a DataFrame in Java Spark. I am getting the error below:

Cannot have circular references in bean class, but got the circular reference of class class org.apache.avro.Schema

Code:

// RDD of Avro-generated objects (the original snippet declared JavaRDD<Row>, which does not match the bean-class overload)
JavaRDD<MyAvroClsass> test;
// The bean-class overload reflects on MyAvroClsass and trips over the self-referential org.apache.avro.Schema returned by getSchema()
Dataset<Row> outputDF = sparksession.createDataFrame(test.rdd(), MyAvroClsass.class);

Recommended answer

This is related to: Infinite recursion in createDataFrame for avro types.

There is work being done in the spark-avro project to address this issue; see https://github.com/databricks/spark-avro/pull/217 and https://github.com/databricks/spark-avro/pull/216

Once this is merged, there should be a function to convert an RDD of Avro objects into a Dataset (a Dataset of Rows is equivalent to a DataFrame) without the circular-reference issue caused by the getSchema() function in the generated class.
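
Until those changes land, a common workaround is to bypass the bean encoder entirely: map each Avro record's fields into a Row by hand and pass an explicit StructType to createDataFrame, so Spark never reflects on getSchema(). The sketch below is a minimal example under that assumption; the field names and the getId()/getName() accessors on MyAvroClsass are placeholders to be replaced with the real fields of your generated class.

import java.util.Arrays;

import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.RowFactory;
import org.apache.spark.sql.SparkSession;
import org.apache.spark.sql.types.DataTypes;
import org.apache.spark.sql.types.StructType;

public class AvroRddToDataFrame {

    // Build the schema by hand instead of letting Spark reflect on the bean;
    // "id" and "name" are placeholder fields standing in for MyAvroClsass's real schema.
    private static final StructType SCHEMA = DataTypes.createStructType(Arrays.asList(
            DataTypes.createStructField("id", DataTypes.LongType, false),
            DataTypes.createStructField("name", DataTypes.StringType, true)));

    public static Dataset<Row> toDataFrame(SparkSession spark, JavaRDD<MyAvroClsass> avroRdd) {
        // Copy each Avro record into a plain Row; Avro string fields are CharSequence,
        // so convert them with toString() before handing them to Spark.
        JavaRDD<Row> rows = avroRdd.map(record ->
                RowFactory.create(record.getId(), record.getName().toString()));

        // This overload takes the explicit schema, so Spark never inspects the
        // generated class and the circular reference through org.apache.avro.Schema never appears.
        return spark.createDataFrame(rows, SCHEMA);
    }
}

The same approach works for nested Avro types as long as the StructType mirrors the record structure field by field.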
