Loading local file in sc.textFile

Problem description

I'm trying to load a local file as below:

File = sc.textFile('file:///D:/Python/files/tit.csv')
File.count()

Full traceback:

IllegalArgumentException                  Traceback (most recent call last)
<ipython-input-72-a84ae28a29dc> in <module>()
----> 1 File.count()

/databricks/spark/python/pyspark/rdd.pyc in count(self)
   1002         3
   1003         """
-> 1004         return self.mapPartitions(lambda i: [sum(1 for _ in i)]).sum()
   1005 
   1006     def stats(self):

/databricks/spark/python/pyspark/rdd.pyc in sum(self)
    993         6.0
    994         """
--> 995         return self.mapPartitions(lambda x: [sum(x)]).fold(0, operator.add)
    996 
    997     def count(self):

/databricks/spark/python/pyspark/rdd.pyc in fold(self, zeroValue, op)
    867         # zeroValue provided to each partition is unique from the one provided
    868         # to the final reduce call
--> 869         vals = self.mapPartitions(func).collect()
    870         return reduce(op, vals, zeroValue)
    871 

/databricks/spark/python/pyspark/rdd.pyc in collect(self)
    769         """
    770         with SCCallSiteSync(self.context) as css:
--> 771             port = self.ctx._jvm.PythonRDD.collectAndServe(self._jrdd.rdd())
    772         return list(_load_from_socket(port, self._jrdd_deserializer))
    773 

/databricks/spark/python/lib/py4j-0.9-src.zip/py4j/java_gateway.py in __call__(self, *args)
    811         answer = self.gateway_client.send_command(command)
    812         return_value = get_return_value(
--> 813             answer, self.gateway_client, self.target_id, self.name)
    814 
    815         for temp_arg in temp_args:

/databricks/spark/python/pyspark/sql/utils.pyc in deco(*a, **kw)
     51                 raise AnalysisException(s.split(': ', 1)[1], stackTrace)
     52             if s.startswith('java.lang.IllegalArgumentException: '):
---> 53                 raise IllegalArgumentException(s.split(': ', 1)[1], stackTrace)
     54             raise
     55     return deco

IllegalArgumentException: u'java.net.URISyntaxException: Expected scheme-specific part at index 2: D:'

What's wrong? I followed the usual approach, e.g. load a local file to spark using sc.textFile() or How to load local file in sc.textFile, instead of HDFS. These examples are for Scala, but I assumed the same approach works in Python.

But:

val File = 'D:\\\Python\\files\\tit.csv'


SyntaxError: invalid syntax
  File "<ipython-input-132-2a3878e0290d>", line 1
    val File = 'D:\\\Python\\files\\tit.csv'
           ^
SyntaxError: invalid syntax

Recommended answer

Update: There seems to be an issue with ":" in Hadoop...

filenames with ':' colon throws java.lang.IllegalArgumentException

https://issues.apache.org/jira/browse/HDFS-13

Path should handle all characters

https://issues.apache.org/jira/browse/HADOOP-3257
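
Since both tickets point at Hadoop's Path parser choking on the drive-letter colon, one way to sidestep it entirely is to read the file with plain Python and hand the lines to Spark. This is a minimal sketch, not part of the original answer; it assumes the file fits in driver memory and that sc is an already-created SparkContext.

# Hedged workaround sketch: bypass Hadoop's path handling by reading the
# CSV with plain Python, then parallelizing the lines into an RDD.
with open(r"D:\Python\files\tit.csv") as f:
    lines = f.read().splitlines()

rdd = sc.parallelize(lines)
print(rdd.count())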

In this Q&A someone managed to overcome it with Spark 2.0:

Spark 2.0: Relative path in absolute URI (spark-warehouse)
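
In that Q&A the workaround was reportedly to set spark.sql.warehouse.dir to an explicit file: URI when building the session. A minimal sketch, assuming Spark 2.x; the warehouse directory below is only illustrative.

from pyspark.sql import SparkSession

# Spark 2.x sketch: give the warehouse an explicit file: URI so Spark does
# not have to parse a Windows-style default path.
spark = (SparkSession.builder
         .appName("local-csv")
         .config("spark.sql.warehouse.dir", "file:///D:/tmp/spark-warehouse")
         .getOrCreate())
sc = spark.sparkContext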

This question has a couple of issues:

1) Python access to local files on Windows

File = sc.textFile('file:///D:/Python/files/tit.csv')
File.count()

Could you please try:

import os
inputfile = sc.textFile(os.path.normpath("file://D:/Python/files/tit.csv"))
inputfile.count()

os.path.normpath(path)

Normalize a pathname by collapsing redundant separators and up-level references so that A//B, A/B/, A/./B and A/foo/../B all become A/B. This string manipulation may change the meaning of a path that contains symbolic links. On Windows, it converts forward slashes to backward slashes. To normalize case, use normcase().

https://docs.python.org/2/library/os.path.html#os.path.normpath
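
For illustration only (not part of the original answer), the collapsing behaviour described in the docs looks like this; on Windows the results would use backslashes instead:

import os

# Redundant separators and up-level references are collapsed as documented.
print(os.path.normpath("A//B"))        # A/B
print(os.path.normpath("A/./B"))       # A/B
print(os.path.normpath("A/foo/../B"))  # A/B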

For the path in question, the output is:

>>> os.path.normpath("file://D:/Python/files/tit.csv")
'file:\\D:\\Python\\files\\tit.csv'

2) Scala code tested in Python:

val File = 'D:\\\Python\\files\\tit.csv'
SyntaxError: invalid syntax

This code doesn't run in Python because it is Scala code.
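
For reference (a minimal sketch, not part of the original answer), the Python equivalent of that line just drops the val keyword and uses a raw string or doubled backslashes for the Windows path:

# Python equivalent of the Scala `val` assignment: no `val` keyword, and a
# raw string avoids doubling every backslash.
file_path = r"D:\Python\files\tit.csv"   # or 'D:\\Python\\files\\tit.csv'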
