Is there a way to write a PySpark dataframe to Azure Cache for Redis?


Question

I have a PySpark dataframe with 2 columns. I created an Azure Cache for Redis instance. I would like to write the PySpark dataframe to Redis, with the first column of the dataframe as the key and the second column as the value. How can I do this in Azure?

Answer

You need to leverage the spark-redis library (https://github.com/RedisLabs/spark-redis), along with its associated jars (which ones depend on the Spark and Scala versions you are using).

In my case I installed 3 jars on a Spark cluster (Scala 2.12) running the latest Spark:

  1. spark-redis_2.12-2.6.0.jar
  2. commons-pool2-2.10.0.jar
  3. jedis-3.6.0.jar
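
If you would rather not install the jars by hand, an alternative is to resolve the library from Maven Central when the session starts. A minimal sketch, assuming the Scala 2.12 build of version 2.6.0 (the Maven coordinate below is an assumption matching the jars listed above; commons-pool2 and jedis come in as transitive dependencies):

from pyspark.sql import SparkSession

# Sketch: resolve spark-redis (plus its transitive dependencies,
# commons-pool2 and jedis) from Maven Central at session startup.
spark = (
    SparkSession.builder
    .appName("redis-write")
    .config("spark.jars.packages", "com.redislabs:spark-redis_2.12:2.6.0")
    .getOrCreate()
)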

Along with that, set the configuration for connecting to Redis:

spark.redis.auth PASSWORD
spark.redis.port 6379
spark.redis.host xxxx.xxx.cache.windows.net
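
The same settings can also be supplied programmatically when building the SparkSession. A minimal sketch, using the placeholder host and password values from above:

from pyspark.sql import SparkSession

# Sketch: pass the Redis connection settings on the session builder.
# The host and password are placeholders; substitute your cache's values.
spark = (
    SparkSession.builder
    .appName("redis-write")
    .config("spark.redis.host", "xxxx.xxx.cache.windows.net")
    .config("spark.redis.port", "6379")
    .config("spark.redis.auth", "PASSWORD")
    .getOrCreate()
)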

Make sure your Azure Cache for Redis instance is running Redis 4.0; the library might have issues with 6.0. Sample code to push the data:

from pyspark.sql.types import StructType, StructField, StringType

# Build a small sample dataframe; "id" will serve as the Redis key column.
schema = StructType([
    StructField("id", StringType(), True),
    StructField("colA", StringType(), True),
    StructField("colB", StringType(), True)
])

data = [
    ['1', '8', '2'],
    ['2', '5', '3'],
    ['3', '3', '1'],
    ['4', '7', '2']
]
df = spark.createDataFrame(data, schema=schema)
df.show()

# Write each row to Redis as a hash stored under the key "mytable:<id>".
(
    df
    .write
    .format("org.apache.spark.sql.redis")
    .option("table", "mytable")
    .option("key.column", "id")
    .save()
)
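
To verify the write, the same data source can read the table back. A short sketch (not in the original answer, but using the same format and options):

# Sketch: read the rows back from Redis to confirm the write succeeded.
df_read = (
    spark.read
    .format("org.apache.spark.sql.redis")
    .option("table", "mytable")
    .option("key.column", "id")
    .load()
)
df_read.show()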
