How to resolve proxy error in reading & writing to HDFS using Python?


Problem description


I have an HDFS cluster that I want to read from and write to using a Python script.

import requests
import json
import os
import kerberos
import sys

# "namenode" is expected to hold a comma-separated list of namenode URLs,
# e.g. "https://nn1:50470,https://nn2:50470"
node = os.getenv("namenode").split(",")
print(node)

local_file_path = sys.argv[1]
remote_file_path = sys.argv[2]
read_or_write = sys.argv[3]
print(local_file_path, remote_file_path)

def check_node_status(node):
    # Query each namenode's JMX endpoint and stop at the active one
    # (assumes at least one namenode reports State == "active").
    for name in node:
        print(name)
        request = requests.get("%s/jmx?qry=Hadoop:service=NameNode,name=NameNodeStatus" % name,
                               verify=False).json()
        status = request["beans"][0]["State"]
        if status == "active":
            nnhost = request["beans"][0]["HostAndPort"]
            splitaddr = nnhost.split(":")
            nnaddress = splitaddr[0]
            print(nnaddress)
            break
    return status, name, nnaddress

def kerberos_auth(nnaddress):
    # Run the GSSAPI handshake and build the Negotiate authorization header.
    __, krb_context = kerberos.authGSSClientInit("HTTP@%s" % nnaddress)
    kerberos.authGSSClientStep(krb_context, "")
    negotiate_details = kerberos.authGSSClientResponse(krb_context)
    headers = {"Authorization": "Negotiate " + negotiate_details,
               "Content-Type": "application/binary"}
    return headers

def kerberos_hdfs_upload(status, name, headers):
    print("running upload function")
    if status == "active":
        print("if function")
        with open(local_file_path, 'rb') as f:
            data = f.read()
        write_req = requests.put("%s/webhdfs/v1%s?op=CREATE&overwrite=true" % (name, remote_file_path),
                                 headers=headers,
                                 verify=False,
                                 allow_redirects=True,
                                 data=data)
        print(write_req.text)

def kerberos_hdfs_read(status, name, headers):
    if status == "active":
        read = requests.get("%s/webhdfs/v1%s?op=OPEN" % (name, remote_file_path),
                            headers=headers,
                            verify=False,
                            allow_redirects=True)
        if read.status_code == 200:
            with open(local_file_path, 'wb') as data:
                data.write(read.content)
        else:
            print(read.content)

status, name, nnaddress = check_node_status(node)
headers = kerberos_auth(nnaddress)
if read_or_write == "write":
    kerberos_hdfs_upload(status, name, headers)
elif read_or_write == "read":
    print("fun")
    kerberos_hdfs_read(status, name, headers)


The code works on my own machine, which is not behind any proxy. But when I run it on the office machine, which is behind a proxy, it gives the following proxy error:

$ python3 python_hdfs.py ./1.png /user/testuser/2018-02-07_1.png write
['https://<servername>:50470', 'https:// <servername>:50470']
./1.png /user/testuser/2018-02-07_1.png
https://<servername>:50470
Traceback (most recent call last):
  File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 555, in urlopen
    self._prepare_proxy(conn)
  File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 753, in _prepare_proxy
    conn.connect()
  File "/usr/lib/python3/dist-packages/urllib3/connection.py", line 230, in connect
    self._tunnel()
  File "/usr/lib/python3.5/http/client.py", line 832, in _tunnel
    message.strip()))
OSError: Tunnel connection failed: 504 Unknown Host

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3/dist-packages/requests/adapters.py", line 376, in send
    timeout=timeout
  File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 610, in urlopen
    _stacktrace=sys.exc_info()[2])
  File "/usr/lib/python3/dist-packages/urllib3/util/retry.py", line 273, in increment
    raise MaxRetryError(_pool, url, error or ResponseError(cause))
requests.packages.urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='<servername>', port=50470): Max retries exceeded with url: /jmx?qry=Hadoop:service=NameNode,name=NameNodeStatus (Caused by ProxyError('Cannot connect to proxy.', OSError('Tunnel connection failed: 504 Unknown Host',)))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "python_hdfs.py", line 68, in <module>
    status, name, nnaddress= check_node_status(node)
  File "python_hdfs.py", line 23, in check_node_status
    verify=False).json()
  File "/usr/lib/python3/dist-packages/requests/api.py", line 67, in get
    return request('get', url, params=params, **kwargs)
  File "/usr/lib/python3/dist-packages/requests/api.py", line 53, in request
    return session.request(method=method, url=url, **kwargs)
  File "/usr/lib/python3/dist-packages/requests/sessions.py", line 468, in request
    resp = self.send(prep, **send_kwargs)
  File "/usr/lib/python3/dist-packages/requests/sessions.py", line 576, in send
    r = adapter.send(request, **kwargs)
  File "/usr/lib/python3/dist-packages/requests/adapters.py", line 437, in send
    raise ConnectionError(e, request=request)
requests.exceptions.ConnectionError: HTTPSConnectionPool(host='<server_name>', port=50470): Max retries exceeded with url: /jmx?qry=Hadoop:service=NameNode,name=NameNodeStatus (Caused by ProxyError('Cannot connect to proxy.', OSError('Tunnel connection failed: 504 Unknown Host',)))


I tried giving proxy info in the code, like so:

proxies = {
"http": "<proxy_username>:<proxy_password>@<proxy_IP>:<proxy_port>",
"https": "<proxy_username>:<proxy_password>@<proxy_IP>:<proxy_port>",
}

node = os.getenv("namenode").split(",")
print(node)

local_file_path = sys.argv[1]
remote_file_path = sys.argv[2]
read_or_write = sys.argv[3]
print(local_file_path, remote_file_path)

def check_node_status(node):
    for name in node:
        print(name)
        request = requests.get("%s/jmx?qry=Hadoop:service=NameNode,name=NameNodeStatus" % name,
                               proxies=proxies,
                               verify=False).json()
        status = request["beans"][0]["State"]
        if status == "active":
            nnhost = request["beans"][0]["HostAndPort"]
            splitaddr = nnhost.split(":")
            nnaddress = splitaddr[0]
            print(nnaddress)
            break
    return status, name, nnaddress
### Rest of the code is the same

Now I get the following error:

$ python3 python_hdfs.py ./1.png /user/testuser/2018-02-07_1.png write
['https://<servername>:50470', 'https:// <servername>:50470']
./1.png /user/testuser/2018-02-07_1.png
https://<servername>:50470
Traceback (most recent call last):
  File "python_hdfs.py", line 73, in <module>
    status, name, nnaddress= check_node_status(node)
  File "python_hdfs.py", line 28, in check_node_status
    verify=False).json()
  File "/usr/lib/python3/dist-packages/requests/api.py", line 67, in get
    return request('get', url, params=params, **kwargs)
  File "/usr/lib/python3/dist-packages/requests/api.py", line 53, in request
    return session.request(method=method, url=url, **kwargs)
  File "/usr/lib/python3/dist-packages/requests/sessions.py", line 468, in request
    resp = self.send(prep, **send_kwargs)
  File "/usr/lib/python3/dist-packages/requests/sessions.py", line 576, in send
    r = adapter.send(request, **kwargs)
  File "/usr/lib/python3/dist-packages/requests/adapters.py", line 343, in send
    conn = self.get_connection(request.url, proxies)
  File "/usr/lib/python3/dist-packages/requests/adapters.py", line 254, in get_connection
    proxy_manager = self.proxy_manager_for(proxy)
  File "/usr/lib/python3/dist-packages/requests/adapters.py", line 160, in proxy_manager_for
    **proxy_kwargs)
  File "/usr/lib/python3/dist-packages/urllib3/poolmanager.py", line 281, in proxy_from_url
    return ProxyManager(proxy_url=url, **kw)
  File "/usr/lib/python3/dist-packages/urllib3/poolmanager.py", line 232, in __init__
    raise ProxySchemeUnknown(proxy.scheme)
requests.packages.urllib3.exceptions.ProxySchemeUnknown: Not supported proxy scheme <proxy_username>
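The ProxySchemeUnknown: Not supported proxy scheme <proxy_username> line points at the cause: the proxy URLs above carry no scheme, so urllib3 parses everything before the first ":" (i.e. the username) as the scheme. A minimal sketch of the same dict with explicit schemes (the angle-bracket values are placeholders, as above; an HTTPS target is normally still tunnelled through an http:// proxy URL):

proxies = {
    "http":  "http://<proxy_username>:<proxy_password>@<proxy_IP>:<proxy_port>",
    "https": "http://<proxy_username>:<proxy_password>@<proxy_IP>:<proxy_port>",
}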


So, my question is: do I need to set up the proxy in Kerberos for this to work? If so, how? I am not too familiar with Kerberos. I run kinit before running the Python code in order to enter the Kerberos realm; that runs fine and connects to the appropriate HDFS servers without the proxy. So I don't know why this error occurs when reading from or writing to the same HDFS servers. Any help is appreciated.
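For context, Kerberos and HTTP proxying are independent layers: the Negotiate header authenticates the request to the namenode, while the proxy only tunnels the TCP connection, so nothing has to be configured "in Kerberos" for the proxy. A sketch assuming the optional requests-kerberos package (not used in the script above) shows the two being passed separately:

import requests
from requests_kerberos import HTTPKerberosAuth, OPTIONAL

# hypothetical placeholders, as elsewhere in this question
proxies = {"https": "http://<proxy_IP>:<proxy_port>"}
r = requests.get("https://<servername>:50470/jmx?qry=Hadoop:service=NameNode,name=NameNodeStatus",
                 auth=HTTPKerberosAuth(mutual_authentication=OPTIONAL),  # GSSAPI handshake
                 proxies=proxies,  # proxy settings, independent of the auth
                 verify=False)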


I also have the proxy set up in /etc/apt/apt.conf like so:

Acquire::http::proxy  "http://<proxy_username>:<proxy_password>@<proxy_IP>:<proxy_port>/";
Acquire::https::proxy "https://<proxy_username>:<proxy_password>@<proxy_IP>:<proxy_port>/";

I also tried the following:

$ export http_proxy="http://<user>:<pass>@<proxy>:<port>"
$ export HTTP_PROXY="http://<user>:<pass>@<proxy>:<port>"

$ export https_proxy="http://<user>:<pass>@<proxy>:<port>"
$ export HTTPS_PROXY="http://<user>:<pass>@<proxy>:<port>"

import os

proxy = 'http://<user>:<pass>@<proxy>:<port>'

os.environ['http_proxy'] = proxy 
os.environ['HTTP_PROXY'] = proxy
os.environ['https_proxy'] = proxy
os.environ['HTTPS_PROXY'] = proxy

#rest of the code is same


But the error persists.
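Note that the underlying failure, Tunnel connection failed: 504 Unknown Host, means the proxy itself cannot resolve the internal namenode hostname, so routing this traffic through the proxy will fail no matter how the credentials are supplied. A minimal sketch of bypassing the proxy for the namenodes only (hostnames are placeholders, and this assumes the namenodes are reachable directly):

import os
import requests

# Option 1: exclude the internal hosts from proxying; requests honours
# no_proxy/NO_PROXY from the environment.
os.environ["no_proxy"] = "<servername>"

# Option 2: ignore all environment proxy settings for this session
# (this also disables .netrc lookup).
session = requests.Session()
session.trust_env = False  # do not pick up http_proxy/https_proxy at all
r = session.get("https://<servername>:50470/jmx?qry=Hadoop:service=NameNode,name=NameNodeStatus",
                verify=False)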


UPDATE: I have also tried the following.


  1. Somebody suggested that we already have a proxy set up in /etc/apt/apt.conf to connect to the web, but that maybe we don't need a proxy to connect to HDFS. So I tried commenting out the proxies in /etc/apt/apt.conf and unsetting the proxy environment variables:


$ env | grep proxy
http_proxy=http://hfli:Test6969@192.168.44.217:8080
https_proxy=https://hfli:Test6969@192.168.44.217:8080
$ unset http_proxy
$ unset https_proxy
$ env | grep proxy
$


Then I ran the Python script again, (i) without defining proxies in the script, and (ii) with the proxies defined in the script. I got the same original proxy error in both cases.


  2. I found the following Java program that supposedly gives access to run Java programs on the HDFS:

import com.sun.security.auth.callback.TextCallbackHandler;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import java.io.BufferedReader;
import java.io.InputStreamReader;
import javax.security.auth.Subject;
import javax.security.auth.login.LoginContext;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.security.UserGroupInformation;

public class HDFS_RW_Secure {
    public static void main(String[] args) throws Exception {
        System.setProperty("java.security.auth.login.config", "/tmp/sc3_temp/hadoop_kdc.txt");
        System.setProperty("java.security.krb5.conf", "/tmp/sc3_temp/hadoop_krb.txt");

        Configuration hadoopConf = new Configuration();
        // this example uses password login; you can change it to keytab login
        LoginContext lc;
        Subject subject;
        lc = new LoginContext("JaasSample", new TextCallbackHandler());
        lc.login();
        System.out.println("login");

        subject = lc.getSubject();
        UserGroupInformation.setConfiguration(hadoopConf);
        UserGroupInformation ugi = UserGroupInformation.getUGIFromSubject(subject);
        UserGroupInformation.setLoginUser(ugi);

        Path pt = new Path("hdfs://edhcluster" + args[0]);

        FileSystem fs = FileSystem.get(hadoopConf);

        // write
        FSDataOutputStream fin = fs.create(pt);
        fin.writeUTF("Hello!");
        fin.close();

        // read back
        BufferedReader br = new BufferedReader(new InputStreamReader(fs.open(pt)));
        String line;
        line = br.readLine();
        while (line != null) {
            System.out.println(line);
            line = br.readLine();
        }
        fs.close();
        System.out.println("This is the end.");
    }
}


We need to take its jar file, HDFS.jar, and run the following shell script so that the Java program can be run against HDFS.

nano run.sh
# contents of the run.sh file:
/tmp/sc3_temp/jre1.8.0_161/bin/java -Djavax.net.ssl.trustStore=/tmp/sc3_temp/cacerts -Djavax.net.ssl.trustStorePassword=changeit -jar /tmp/sc3_temp/HDFS.jar $1


So I can run this shell script with a path under /user/testuser as the argument to run the Java program against HDFS:

./run.sh /user/testuser/test2

which gives the following output:

Debug is  true storeKey false useTicketCache false useKeyTab false doNotPrompt false ticketCache is null isInitiator true KeyTab is null refreshKrb5Config is false principal is null tryFirstPass is false useFirstPass is false storePass is false clearPass is false
Kerberos username [testuser]: testuser
Kerberos password for testuser: 
        [Krb5LoginModule] user entered username: testuser

principal is testuser@KRB.REALM
Commit Succeeded 

login
2018-02-08 14:09:30,020 WARN  [main] util.NativeCodeLoader (NativeCodeLoader.java:<clinit>(62)) - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

Hello!
This is the end.


So that's working, I suppose. But how do I write an equivalent shell script to run the Python code?

Recommended answer


I found the solution. It turns out I was looking in the wrong place: the user account had been set up wrongly. I tried something simpler, like downloading a webpage onto the server, and noticed that the page was being downloaded but could not be saved for lack of permissions. So I explored a bit more and found that when the user account was created, it was not assigned proper ownership. Once I assigned the proper owner to the user account, the proxy error was gone. (Sigh, so much time wasted.)
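Purely as a diagnostic sketch (not part of the original fix), one quick way to spot this kind of account misconfiguration is to check who actually owns the account's home directory:

import os
import pwd
import stat

def report_ownership(path):
    # Print the owner and permission bits of a path, to spot a home
    # directory that was created under the wrong owner.
    st = os.stat(path)
    owner = pwd.getpwuid(st.st_uid).pw_name
    print("%s is owned by %s, mode %s" % (path, owner, oct(stat.S_IMODE(st.st_mode))))

report_ownership(os.path.expanduser("~"))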


I have written about it in more detail here.

