Cannot be cast to org.apache.spark.serializer.Serializer


Problem description

I am trying to solve a Spark serialization issue with HashMaps using Java. I am referring to the link Save Spark Dataframe into Elasticsearch - Can't handle type exception.

Now I am facing the following issue:

java.lang.ClassCastException: com.spark.util.umf.MyKryoRegistrator cannot be cast to org.apache.spark.serializer.Serializer
    at org.apache.spark.SparkEnv$.create(SparkEnv.scala:259)
    at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:163)
    at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:267)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:270)
    at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:61)
    at com.spark.util.umf.MyMain.main(MyMain.java:46)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:480)
15/10/16 01:47:22 INFO yarn.ApplicationMaster: Final app status:
FAILED, exitCode: 15, (reason: User class threw exception:
com.spark.util.umf.MyKryoRegistrator cannot be cast to
org.apache.spark.serializer.Serializer)

I create my Kryo registrator as follows:

import java.io.Serializable;
import org.apache.spark.serializer.KryoRegistrator;
import com.esotericsoftware.kryo.Kryo;

public class MyKryoRegistrator implements KryoRegistrator, Serializable {
    @Override
    public void registerClasses(Kryo kryo) {
        // Product POJO associated to a product Row from the DataFrame            
        kryo.register(MyRecord.class); 
    }
}

Main method:

public static void main(String args[]){

    SparkConf sConf= new SparkConf().setAppName("SparkTestJob");
    sConf.set( "spark.driver.allowMultipleContexts", "true");
    //Kryo kryo = new Kryo();
    //kryo.setDefaultSerializer(MyRecord.class);
    //my.registerClasses(kryo);
    sConf.set("spark.serializer","com.spark.util.umf.MyKryoRegistrator");

    [...]
}

Recommended answer

Based on the answer I provided in the link you mentioned in your question, you can see that I defined both parameters:

spark.serializer
spark.kryo.registrator

So you have to set both parameters.

If you set the registrator without setting the serializer, the Kryo serializer won't be used. The ClassCastException above comes from passing the registrator class to spark.serializer: Spark expects that property to name a subclass of org.apache.spark.serializer.Serializer, which a KryoRegistrator is not.
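A corrected main method would therefore look like the sketch below. It keeps the class names from the question (MyMain, MyKryoRegistrator) and sets spark.serializer to Spark's own KryoSerializer, with the registrator configured separately via spark.kryo.registrator. This is a configuration sketch; it needs Spark on the classpath to run.

```java
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class MyMain {
    public static void main(String[] args) {
        SparkConf sConf = new SparkConf().setAppName("SparkTestJob");

        // spark.serializer must name a subclass of
        // org.apache.spark.serializer.Serializer -- use Spark's KryoSerializer,
        // not the registrator class.
        sConf.set("spark.serializer", "org.apache.spark.serializer.KryoSerializer");

        // The registrator is a separate property; it implements
        // KryoRegistrator, not Serializer.
        sConf.set("spark.kryo.registrator", "com.spark.util.umf.MyKryoRegistrator");

        JavaSparkContext sc = new JavaSparkContext(sConf);

        // [...] job logic goes here

        sc.stop();
    }
}
```

With both properties set, Spark instantiates KryoSerializer for serialization and calls MyKryoRegistrator.registerClasses() to register MyRecord, avoiding the ClassCastException.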
