Save pandas data frame as CSV to a gcloud storage bucket


Problem description

from pyspark.sql import SparkSession
import pandas as pd
import numpy as np

APP_NAME = "DataFrameToCSV"

spark = SparkSession\
    .builder\
    .appName(APP_NAME)\
    .config("spark.sql.crossJoin.enabled", "true")\
    .getOrCreate()

group_ids = [1,1,1,1,1,1,1,2,2,2,2,2,2,2]

dates = ["2016-04-01","2016-04-01","2016-04-01","2016-04-20","2016-04-20","2016-04-28","2016-04-28","2016-04-05","2016-04-05","2016-04-05","2016-04-05","2016-04-20","2016-04-20","2016-04-29"]

event = [0,1,1,0,1,0,1,0,0,1,0,0,0,0]

# Stack the three lists column-wise and build the pandas DataFrame
dataFrameArr = np.column_stack((group_ids, dates, event))

df = pd.DataFrame(dataFrameArr, columns=["group_ids", "dates", "event"])


The above Python code is to be run on a Spark cluster on gcloud Dataproc. I would like to save the pandas DataFrame as a CSV file in a gcloud storage bucket at gs://mybucket/csv_data/

How do I do this?

Recommended answer


You can also solve this with Dask: convert your pandas DataFrame to a Dask DataFrame, which can then be written as CSV directly to Cloud Storage.

import dask.dataframe as dd

# df is the pandas DataFrame built above

# Wrap the pandas DataFrame in a Dask DataFrame
ddf = dd.from_pandas(df, npartitions=1, sort=True)

# token='cloud' tells gcsfs to use the VM's default service account
# credentials, which is appropriate on a Dataproc cluster
ddf.to_csv('gs://YOUR_BUCKET/ddf-*.csv', index=False, sep=',', header=False,
           storage_options={'token': 'cloud'})
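
With npartitions=1 this writes a single part file: the * in the output path is replaced by the partition index, so the result here is ddf-0.csv. A larger npartitions value would write the parts in parallel as ddf-0.csv, ddf-1.csv, and so on.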

The storage_options argument is optional.
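
For a DataFrame this small, the Dask detour is not strictly required: recent pandas versions can write gs:// paths directly when the gcsfs package is installed. A minimal sketch, assuming gcsfs is available on the cluster (the df.csv filename is just an example):

# df is the pandas DataFrame built in the question.
# pandas hands gs:// paths off to gcsfs, so gcsfs must be installed
# (pip install gcsfs); on Dataproc the VM's default service account
# is used for authentication.
df.to_csv('gs://mybucket/csv_data/df.csv', index=False)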
