Cartesian product of two DStreams in Spark


Problem description

How can I compute the product of two DStreams in Spark Streaming, analogous to cartesian(RDD&lt;U&gt;), which, when called on datasets of types T and U, returns a dataset of (T, U) pairs (all pairs of elements)?

One workaround is to key both streams on the same constant and use join, as shown below, but this does not seem like a good solution.

    // Key every element of each stream with the same constant key...
    JavaPairDStream<Integer, String> xx = DStream_A.mapToPair(s -> new Tuple2<>(1, s));

    JavaPairDStream<Integer, String> yy = DStream_B.mapToPair(e -> new Tuple2<>(1, e));

    // ...so that join emits every (a, b) combination under that single key.
    JavaPairDStream<Integer, Tuple2<String, String>> DStream_A_product_B = xx.join(yy);

Is there a better solution? Or how can I use the cartesian method of RDD here?

Recommended answer

I found the answer:

JavaPairDStream<String, String> cartes = DStream_A.transformWithToPair(DStream_B,
    new Function3<JavaRDD<String>, JavaRDD<String>, Time, JavaPairRDD<String, String>>() {
        @Override
        public JavaPairRDD<String, String> call(JavaRDD<String> rddA, JavaRDD<String> rddB, Time v3) throws Exception {
            // For each batch interval, take the RDD backing each stream and compute their cartesian product.
            return rddA.cartesian(rddB);
        }
    });
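
As a follow-up, here is a minimal self-contained sketch of the same transformWithToPair pattern written with a Java 8 lambda. The socketTextStream sources, host/port values, batch interval, and class name are placeholder assumptions for illustration; any two JavaDStream&lt;String&gt; inputs would work the same way.

    import org.apache.spark.SparkConf;
    import org.apache.spark.streaming.Durations;
    import org.apache.spark.streaming.api.java.JavaDStream;
    import org.apache.spark.streaming.api.java.JavaPairDStream;
    import org.apache.spark.streaming.api.java.JavaStreamingContext;

    public class DStreamCartesianExample {
        public static void main(String[] args) throws InterruptedException {
            SparkConf conf = new SparkConf().setMaster("local[2]").setAppName("DStreamCartesian");
            JavaStreamingContext jssc = new JavaStreamingContext(conf, Durations.seconds(5));

            // Placeholder sources; in practice the two streams can come from any receiver.
            JavaDStream<String> streamA = jssc.socketTextStream("localhost", 9998);
            JavaDStream<String> streamB = jssc.socketTextStream("localhost", 9999);

            // For each batch, pair the RDD of streamA with the RDD of streamB and take their cartesian product.
            JavaPairDStream<String, String> cartesian =
                    streamA.transformWithToPair(streamB, (rddA, rddB, time) -> rddA.cartesian(rddB));

            cartesian.print();

            jssc.start();
            jssc.awaitTermination();
        }
    }

Keep in mind that each batch produces |A| × |B| output pairs, so the cartesian product can become expensive quickly for large batch sizes.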
