Why do I get "unread block data" - IllegalStateException?


Problem description

I have just the following:

JavaPairRDD<ImmutableBytesWritable, Result> dataRDD = jsc
            .newAPIHadoopRDD(
                    hbase_conf,
                    TableInputFormat.class,
                    org.apache.hadoop.hbase.io.ImmutableBytesWritable.class,
                    org.apache.hadoop.hbase.client.Result.class);

sparkConf.log().info("Count of data = "+String.valueOf(dataRDD.count()));

And I get this exception:

Exception in thread "main" org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 4 times, most recent failure: Lost task 0.3 in stage 0.0 (TID 3, server-name): java.lang.IllegalStateException: unread block data
    java.io.ObjectInputStream$BlockDataInputStream.setBlockDataMode(ObjectInputStream.java:2394)


Answer


You can find some hints here: https://issues.apache.org/jira/browse/SPARK-1867 Some people fixed this by replacing the hadoop-client libs with hadoop-common. Your case seems similar to mine, where the executors needed the proper HBase jars: I added hbase/hbase-0.98.12/lib/* to spark.executor.extraClassPath and the exception went away. Here's another way to set these jars: https://groups.google.com/forum/#!topic/spark-users/gXSfbjauAjo
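As a sketch of that fix (the HBase install path below is an assumption; point it at your own HBase lib directory), the classpath entries can go into conf/spark-defaults.conf so every job picks them up:

```properties
# spark-defaults.conf: make the HBase client jars visible to the executors
# (/opt/hbase/hbase-0.98.12 is an example path, not from the original post)
spark.executor.extraClassPath  /opt/hbase/hbase-0.98.12/lib/*

# If the driver also deserializes HBase types (e.g. collecting Results),
# it needs the same jars on its classpath:
spark.driver.extraClassPath    /opt/hbase/hbase-0.98.12/lib/*
```

The same keys can be passed per job with spark-submit, e.g. --conf spark.executor.extraClassPath='/opt/hbase/hbase-0.98.12/lib/*'. The key point is that "unread block data" during deserialization usually means the executor JVM is missing (or has a conflicting version of) a class that the serialized record refers to, here the HBase Result/ImmutableBytesWritable classes.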

