Spark Unique pair in cartesian product
Problem description
I have this:
In [1]:a = sc.parallelize([a,b,c])
In [2]:a.cartesian(a).collect()
Out[3]: [(a, a), (a, b), (a, c), (b, a), (c, a), (b, b), (b, c), (c, b), (c, c)]
I want the following result:
In [1]:a = sc.parallelize([a,b,c])
In [2]:a.cartesianMoreInteligent(a).collect()
Out[3]: [(a, a), (a, b), (a, c), (b, b), (b, c), (c, c)]
Because my calculation returns a symmetric matrix (correlation), I only need the unique pairs. What is the best way to achieve this? (No loop.) Here a, b, and c can be anything, even tuples.
Answer
Not sure about the Python syntax, but in Scala you could write:
a.cartesian(a).filter{ case (a,b) => a <= b }.collect()
My guess is that in Python it would be something like:
a.cartesian(a).filter(lambda pair: pair[0] <= pair[1]).collect()
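(Note that in PySpark the filter function receives the pair as a single tuple argument, not as two separate arguments.) The idea behind the filter is that keeping only pairs (x, y) with x <= y selects exactly one representative from each symmetric pair. A minimal sketch of the same logic on a plain Python list, without Spark, using hypothetical sample data:

```python
from itertools import combinations_with_replacement

# Sample data standing in for the RDD's elements (an assumption for
# illustration; the question allows arbitrary comparable values).
data = [1, 2, 3]

# Cartesian product filtered to x <= y: one representative per
# unordered pair, including the diagonal (x, x).
pairs = [(x, y) for x in data for y in data if x <= y]
print(pairs)  # [(1, 1), (1, 2), (1, 3), (2, 2), (2, 3), (3, 3)]

# Equivalent to combinations with replacement over the sorted data.
assert pairs == list(combinations_with_replacement(data, 2))
```

This approach assumes the elements are orderable (tuples compare lexicographically in Python, so they work too). If the data can contain duplicate values, the `<=` filter would keep both symmetric copies of a duplicated pair; pairing each element with a unique index first (e.g. via `zipWithIndex`) and comparing indices avoids that.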