Using CoGroupByKey with custom type ends up in a Coder error

Question

I want to join two PCollections (each from a different input) by following the steps described in the "Joins with CoGroupByKey" section here: https://cloud.google.com/dataflow/model/group-by-key

In my case, I want to join GeoIP's "block" information with its "location" information, so I defined Block and Location as custom classes and then wrote the following:

final TupleTag<Block> t1 = new TupleTag<Block>();
final TupleTag<Location> t2 = new TupleTag<Location>();
PCollection<KV<Long, CoGbkResult>> coGbkResultColl = KeyedPCollectionTuple.of(t1, kvGeoNameIDBlock)
        .and(t2, kvGeoNameIDLocation).apply(CoGroupByKey.<Long>create());

The key has a Long value. I thought that was all there was to it, but when I run mvn compile, it outputs the following error:

[ERROR] Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.4.0:java (default-cli) on project xxxx: An exception occured while executing the Java class. null: InvocationTargetException: Unable to return a default Coder for Extract GeoNameID-Block KV/ParMultiDo(ExtractGeoNameIDBlock).out0 [PCollection]. Correct one of the following root causes:
[ERROR]   No Coder has been manually specified;  you may do so using .setCoder().
[ERROR]   Inferring a Coder from the CoderRegistry failed: Cannot provide coder for parameterized type org.apache.beam.sdk.values.KV<java.lang.Long, com.xxx.platform.geoip2.Block>: Unable to provide a Coder for com.xxx.platform.geoip2.Block.
[ERROR]   Building a Coder using a registered CoderProvider failed.
[ERROR]   See suppressed exceptions for detailed failures.
[ERROR]   Using the default output Coder from the producing PTransform failed: Cannot provide coder for parameterized type org.apache.beam.sdk.values.KV<java.lang.Long, com.xxx.platform.geoip2.Block>: Unable to provide a Coder for com.xxx.platform.geoip2.Block.

The DoFn that produces the error is ExtractGeoNameIDBlock, which simply creates a key-value pair of the join key and the element itself.

// ExtractGeoNameIDBlock creates KV collection while reading from block CSV
static class ExtractGeoNameIDBlock extends DoFn<String, KV<Long, Block>> {
  private static final long serialVersionUID = 1L;

  @ProcessElement
  public void processElement(ProcessContext c) throws Exception {
    String line = c.element();

    if (!line.startsWith("network,")) { // skip the header line
      Block b = new Block();
      b.loadFromCsvLine(line);

      if (b.getGeonameId() != null) {
        c.output(KV.of(b.getGeonameId(), b));
      }
    }
  }
}

loadFromCsvLine just parses a CSV line, converts the fields to their corresponding types, and assigns them to the object's private fields.
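
Roughly, it looks like this — a simplified sketch rather than the exact code (the helper name parseLongOrNull is illustrative, and the column order just follows the GeoIP2 blocks CSV layout):

// Simplified sketch of loadFromCsvLine; the real code handles more fields.
public void loadFromCsvLine(String line) {
  String[] fields = line.split(",", -1); // -1 keeps trailing empty fields
  this.network = fields[0];
  this.geonameId = parseLongOrNull(fields[1]);
  this.registeredCountryGeonameId = parseLongOrNull(fields[2]);
  this.representedCountryGeonameId = parseLongOrNull(fields[3]);
}

private static Long parseLongOrNull(String s) {
  return (s == null || s.isEmpty()) ? null : Long.valueOf(s);
}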

So it looks like I need to set a coder for my custom class to make this work. I found a document about coders, but I'm still not sure how to implement mine: https://cloud.google.com/dataflow/model/data-encoding

Is there any real example that I can follow to create a custom coder for my custom class?

[Update 13:02 09/26/2017] I added

CoderRegistry cr = p.getCoderRegistry();
cr.registerCoderForClass(Block.class, AvroCoder.of(Block.class));

Then I got this error:

 java.lang.NullPointerException: in com.xxx.platform.geoip2.Block in long null of long in field representedCountryGeonameId of com.xxx.platform.geoip2.Block

[Update 14:05 09/26/2017] I changed the implementation like this:

@DefaultCoder(AvroCoder.class)
public class Block {
    private static final Logger LOG = LoggerFactory.getLogger(Block.class);

    @Nullable
    public String network;
    @Nullable
    public Long registeredCountryGeonameId;
    // ...

(I set @Nullable on all the properties.)

But I still get this error:

(22eeaf3dfb26f8cc): java.lang.RuntimeException: org.apache.beam.sdk.coders.CoderException: cannot encode a null Long
    at com.google.cloud.dataflow.worker.SimpleParDoFn$1.output(SimpleParDoFn.java:191)
    at org.apache.beam.runners.core.SimpleDoFnRunner.outputWindowedValue(SimpleDoFnRunner.java:211)
    at org.apache.beam.runners.core.SimpleDoFnRunner.access$700(SimpleDoFnRunner.java:66)
    at org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:436)
    at org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:424)
    at org.apache.beam.sdk.transforms.join.CoGroupByKey$ConstructUnionTableFn.processElement(CoGroupByKey.java:185)
Caused by: org.apache.beam.sdk.coders.CoderException: cannot encode a null Long
    at org.apache.beam.sdk.coders.VarLongCoder.encode(VarLongCoder.java:51)
    at org.apache.beam.sdk.coders.VarLongCoder.encode(VarLongCoder.java:35)
    at org.apache.beam.sdk.coders.Coder.encode(Coder.java:135)
    at com.google.cloud.dataflow.worker.ShuffleSink$ShuffleSinkWriter.encodeToChunk(ShuffleSink.java:320)
    at com.google.cloud.dataflow.worker.ShuffleSink$ShuffleSinkWriter.add(ShuffleSink.java:216)
    at com.google.cloud.dataflow.worker.ShuffleSink$ShuffleSinkWriter.add(ShuffleSink.java:178)
    at com.google.cloud.dataflow.worker.util.common.worker.WriteOperation.process(WriteOperation.java:80)
    at com.google.cloud.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:52)
    at com.google.cloud.dataflow.worker.ReifyTimestampAndWindowsParDoFnFactory$ReifyTimestampAndWindowsParDoFn.processElement(ReifyTimestampAndWindowsParDoFnFactory.java:68)
    at com.google.cloud.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:48)
    at com.google.cloud.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:52)
    at com.google.cloud.dataflow.worker.SimpleParDoFn$1.output(SimpleParDoFn.java:183)
    at org.apache.beam.runners.core.SimpleDoFnRunner.outputWindowedValue(SimpleDoFnRunner.java:211)
    at org.apache.beam.runners.core.SimpleDoFnRunner.access$700(SimpleDoFnRunner.java:66)
    at org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:436)
    at org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:424)
    at org.apache.beam.sdk.transforms.join.CoGroupByKey$ConstructUnionTableFn.processElement(CoGroupByKey.java:185)
    at org.apache.beam.sdk.transforms.join.CoGroupByKey$ConstructUnionTableFn$DoFnInvoker.invokeProcessElement(Unknown Source)
    at org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:177)
    at org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:141)
    at com.google.cloud.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:233)
    at com.google.cloud.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:48)
    at com.google.cloud.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:52)
    at com.google.cloud.dataflow.worker.SimpleParDoFn$1.output(SimpleParDoFn.java:183)
    at org.apache.beam.runners.core.SimpleDoFnRunner.outputWindowedValue(SimpleDoFnRunner.java:211)
    at org.apache.beam.runners.core.SimpleDoFnRunner.access$700(SimpleDoFnRunner.java:66)
    at org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:436)
    at org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:424)
    at com.bandainamcoent.platform.GeoIpPopulateTable$ExtractGeoNameIDBlock.processElement(GeoIpPopulateTable.java:79)
    at com.bandainamcoent.platform.GeoIpPopulateTable$ExtractGeoNameIDBlock$DoFnInvoker.invokeProcessElement(Unknown Source)
    at org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:177)
    at org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:141)
    at com.google.cloud.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:233)
    at com.google.cloud.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:48)
    at com.google.cloud.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:52)
    at com.google.cloud.dataflow.worker.util.common.worker.ReadOperation.runReadLoop(ReadOperation.java:187)
    at com.google.cloud.dataflow.worker.util.common.worker.ReadOperation.start(ReadOperation.java:148)
    at com.google.cloud.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:68)
    at com.google.cloud.dataflow.worker.DataflowWorker.executeWork(DataflowWorker.java:336)
    at com.google.cloud.dataflow.worker.DataflowWorker.doWork(DataflowWorker.java:294)
    at com.google.cloud.dataflow.worker.DataflowWorker.getAndPerformWork(DataflowWorker.java:244)
    at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:135)
    at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:115)
    at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:102)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)

Thanks.

Answer

It looks like your custom class Block doesn't have a coder specified. You can create your own Coder, or use one of the general-purpose ones such as AvroCoder. You should also register it with the CoderRegistry so the pipeline knows how to encode Blocks.
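
For example, a minimal hand-written coder for Block could delegate to Beam's built-in coders, wrapping each field coder in NullableCoder so that null fields encode cleanly. This is only a sketch assuming the Block fields shown in the question; the remaining fields would be handled the same way:

import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import org.apache.beam.sdk.coders.CustomCoder;
import org.apache.beam.sdk.coders.NullableCoder;
import org.apache.beam.sdk.coders.StringUtf8Coder;
import org.apache.beam.sdk.coders.VarLongCoder;

public class BlockCoder extends CustomCoder<Block> {
  // Wrap the field coders in NullableCoder so null values are legal.
  private static final NullableCoder<String> STRING_CODER =
      NullableCoder.of(StringUtf8Coder.of());
  private static final NullableCoder<Long> LONG_CODER =
      NullableCoder.of(VarLongCoder.of());

  @Override
  public void encode(Block value, OutputStream outStream) throws IOException {
    STRING_CODER.encode(value.network, outStream);
    LONG_CODER.encode(value.registeredCountryGeonameId, outStream);
    // ... encode the remaining fields, always in the same order
  }

  @Override
  public Block decode(InputStream inStream) throws IOException {
    Block b = new Block();
    b.network = STRING_CODER.decode(inStream);
    b.registeredCountryGeonameId = LONG_CODER.decode(inStream);
    // ... decode the remaining fields, in the same order as encode
    return b;
  }
}

You can then either register it, e.g. cr.registerCoderForClass(Block.class, new BlockCoder()), or set it explicitly on the intermediate PCollection with kvGeoNameIDBlock.setCoder(KvCoder.of(VarLongCoder.of(), new BlockCoder())).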
