Spark to parse backslash escaped comma in CSV files that are not enclosed by quotes


Problem description

It seems Spark is not able to handle escaped characters in CSV files whose fields are not enclosed by quotes, for example:

Name,Age,Address,Salary
Luke,24,Mountain View\,CA,100

I am using pyspark, and the following code apparently won't work with the comma inside the Address field:

df = spark.read.csv(fname, schema=given_schema,
                    sep=',', quote='', mode="FAILFAST")
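For reference, the failure mode can be reproduced without Spark at all. This is a minimal pure-Python sketch (the sample row is taken from the question) showing that a naive split on ',' yields five fields instead of four, while a parser that understands a backslash escape character yields the intended four:

```python
import csv

row = r"Luke,24,Mountain View\,CA,100"

# Naive splitting treats the escaped comma as a separator: 5 fields.
naive = row.split(",")
print(len(naive))  # 5

# A reader configured with escapechar='\\' keeps the comma inside the field.
parsed = next(csv.reader([row], escapechar="\\"))
print(parsed)  # ['Luke', '24', 'Mountain View,CA', '100']
```

This is what the Spark CSV reader is not doing here: with quoting disabled, it has no way to tell an escaped comma from a real delimiter.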

Any suggestions?

Recommended answer

Could you try using an RDD first, reformatting it, and then creating a DataFrame over it?

import csv

df = sc.textFile(PATH_TO_FILE) \
    .map(lambda x: x.replace("\\,", "|")) \
    .mapPartitions(lambda lines: csv.reader(lines, delimiter=',')) \
    .filter(lambda line: line[0] != 'Name') \
    .toDF(['Name', 'Age', 'Address', 'Salary'])
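The replace-then-parse idea in the RDD pipeline above can be sketched in plain Python. One refinement worth noting (not in the original answer, and assuming '|' never occurs in the real data) is that the placeholder can be swapped back to a comma after the split, so the Address field keeps its original value:

```python
import csv

lines = [
    "Name,Age,Address,Salary",
    r"Luke,24,Mountain View\,CA,100",
]

def parse(line):
    # Step 1: hide the escaped comma behind a placeholder that is
    # assumed not to occur elsewhere in the data.
    masked = line.replace("\\,", "|")
    # Step 2: split on the real delimiter.
    fields = next(csv.reader([masked], delimiter=","))
    # Step 3: restore the original comma inside each field.
    return [f.replace("|", ",") for f in fields]

rows = [parse(l) for l in lines if not l.startswith("Name")]
print(rows)  # [['Luke', '24', 'Mountain View,CA', '100']]
```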

This is how your DataFrame looks now:

>>> df.show();
+----+---+----------------+------+
|Name|Age|         Address|Salary|
+----+---+----------------+------+
|Luke| 24|Mountain View|CA|   100|
+----+---+----------------+------+

I replaced the escaped comma ("\,") in the Address column with "|" and then split the data on the delimiter ','. I'm not sure how well it matches your requirements, but it works.

