How to implement Spark SQL pagination query
Problem description
Does anyone know how to do pagination in Spark SQL queries?

I need to use Spark SQL but don't know how to do pagination.
Tried:
select * from person limit 10, 10
Recommended answer
It has been 6 years, so I don't know whether it was possible back then, but today I would add a sequential id to the result and select the records between offset and offset + limit.
In a pure Spark SQL query, for offset 10 and limit 10, it would look something like this:
WITH count_person AS (
    SELECT *, monotonically_increasing_id() AS count FROM person
)
-- ids start at 0, so the page for offset 10 and limit 10 is [10, 20)
SELECT * FROM count_person WHERE count >= 10 AND count < 20
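One caveat worth adding here (my note, based on the documented behavior of the function): monotonically_increasing_id() guarantees unique, monotonically increasing ids, but not consecutive ones; the values jump between partitions, so the bounds above only match exact row positions when the data sits in a single partition. A sketch of a gap-free alternative uses ROW_NUMBER() over an explicit ordering column (the id column on person is hypothetical here):

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("pagination").getOrCreate()

offset, limit = 10, 10

# ROW_NUMBER() assigns consecutive 1-based positions, so the page
# bounds are exact regardless of partitioning; 'id' is a hypothetical
# ordering column on the person table.
page = spark.sql(f"""
    WITH numbered_person AS (
        SELECT *, ROW_NUMBER() OVER (ORDER BY id) AS rn FROM person
    )
    SELECT * FROM numbered_person
    WHERE rn > {offset} AND rn <= {offset + limit}
""")
page.show()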
In PySpark it would be very similar:
import pyspark.sql.functions as F

offset = 10
limit = 10

# Stamp each row with a generated id, then keep the rows whose id
# falls inside the requested page: [offset, offset + limit).
df = df.withColumn('_id', F.monotonically_increasing_id())
df = df.where((F.col('_id') >= offset) & (F.col('_id') < offset + limit))
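To make that snippet runnable end to end, here is a minimal sketch of my own (a local SparkSession and a generated stand-in for the person data) that pages through every row; coalesce(1) is what keeps the generated ids consecutive for this small demo:

from pyspark.sql import SparkSession
import pyspark.sql.functions as F

spark = SparkSession.builder.appName("pagination-demo").getOrCreate()

# Toy stand-in for the person table; coalesce(1) forces a single
# partition so monotonically_increasing_id() yields 0, 1, 2, ...
df = spark.range(35).coalesce(1)
df = df.withColumn('_id', F.monotonically_increasing_id())

limit = 10
total = df.count()
offset = 0
while offset < total:
    # Each page is the half-open id interval [offset, offset + limit).
    page = df.where((F.col('_id') >= offset) & (F.col('_id') < offset + limit))
    page.show()
    offset += limit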
It is flexible and fast enough even for a large volume of data.
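A side note from me: if I recall the version correctly, Spark 3.4 and later also support a native OFFSET clause in SQL, so on a recent cluster plain paging syntax works directly (the id ordering column is again hypothetical):

# Requires Spark 3.4+; ORDER BY makes the page deterministic.
page = spark.sql("SELECT * FROM person ORDER BY id LIMIT 10 OFFSET 10")
page.show()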