Array Intersection in Spark SQL


Problem Description

I have a table with an array-type column named writer, which has values like array[value1, value2], array[value2, value3], etc.

I am doing a self join to get the rows whose arrays share common values. I tried:

sqlContext.sql("SELECT R2.writer FROM table R1 JOIN table R2 ON R1.id != R2.id WHERE ARRAY_INTERSECTION(R1.writer, R2.writer)[0] is not null ")

sqlContext.sql("SELECT R2.writer FROM table R1 JOIN table R2 ON R1.id != R2.id WHERE ARRAY_INTERSECT(R1.writer, R2.writer)[0] is not null ")

but both queries raised the same exception:

Exception in thread "main" org.apache.spark.sql.AnalysisException: Undefined function: 'ARRAY_INTERSECT'. This function is neither a registered temporary function nor a permanent function registered in the database 'default'.; line 1 pos 80

Probably Spark SQL does not support ARRAY_INTERSECTION and ARRAY_INTERSECT. How can I achieve my goal in Spark SQL?

Recommended Answer

You will need a UDF:

import org.apache.spark.sql.functions.udf

// Register a SQL-callable function that returns the elements common to both arrays
spark.udf.register("array_intersect",
  (xs: Seq[String], ys: Seq[String]) => xs.intersect(ys))
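
Since org.apache.spark.sql.functions.udf is already imported, the same lambda can also be wrapped as a column-level function for the DataFrame API. This is a hypothetical sketch, not part of the original answer; arrayIntersectUdf and the column names are illustrative:

import org.apache.spark.sql.functions.{udf, col}

// Column-level variant of the same intersection logic
val arrayIntersectUdf = udf((xs: Seq[String], ys: Seq[String]) => xs.intersect(ys))
// e.g. df.withColumn("common", arrayIntersectUdf(col("writer"), col("other_writer")))

The SQL examples below use the name registered with spark.udf.register.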

and then check whether the intersection is empty:

scala> spark.sql("SELECT size(array_intersect(array('1', '2'), array('3', '4'))) = 0").show
+-----------------------------------------+
|(size(UDF(array(1, 2), array(3, 4))) = 0)|
+-----------------------------------------+
|                                     true|
+-----------------------------------------+


scala> spark.sql("SELECT size(array_intersect(array('1', '2'), array('1', '4'))) = 0").show
+-----------------------------------------+
|(size(UDF(array(1, 2), array(1, 4))) = 0)|
+-----------------------------------------+
|                                    false|
+-----------------------------------------+
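
Putting it together with the original query: a minimal sketch, assuming the question's table is registered under the name table with columns id and writer, and that the UDF above has been registered as array_intersect. The self-join then keeps only the pairs whose writer arrays actually overlap:

spark.sql("""
  SELECT R2.writer
  FROM table R1 JOIN table R2 ON R1.id != R2.id
  -- keep only pairs whose writer arrays share at least one element
  WHERE size(array_intersect(R1.writer, R2.writer)) > 0
""").show()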
