Jupyter Notebook error while using PySpark kernel: The code failed because of a fatal error: Error sending http request


Question

I am using Jupyter Notebook's PySpark kernel. I have successfully selected the PySpark kernel, but I keep getting the error below:

The code failed because of a fatal error: Error sending http request and maximum retry encountered. Some things to try:

a) Make sure Spark has enough available resources for Jupyter to create a Spark context.

b) Contact your Jupyter administrator to make sure the Spark magics library is configured correctly.

c) Restart the kernel.

Here is the log as well:

2019-10-10 13:37:43,741 DEBUG   SparkMagics Initialized spark magics.
2019-10-10 13:37:43,742 INFO    EventsHandler   InstanceId: 32a21583-6879-4ad5-88bf-e07af0b09387,EventName: notebookLoaded,Timestamp: 2019-10-10 10:37:43.742475
2019-10-10 13:37:43,744 DEBUG   python_jupyter_kernel   Loaded magics.
2019-10-10 13:37:43,744 DEBUG   python_jupyter_kernel   Changed language.
2019-10-10 13:37:44,356 DEBUG   python_jupyter_kernel   Registered auto viz.
2019-10-10 13:37:45,440 INFO    EventsHandler   InstanceId: 32a21583-6879-4ad5-88bf-e07af0b09387,EventName: notebookSessionCreationStart,Timestamp: 2019-10-10 10:37:45.440323,SessionGuid: d230b1f3-6bb1-4a66-bde1-7a73a14d7939,LivyKind: pyspark
2019-10-10 13:37:49,591 ERROR   ReliableHttpClient  Request to 'http://localhost:8998/sessions' failed with 'HTTPConnectionPool(host='localhost', port=8998): Max retries exceeded with url: /sessions (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x0000013184159808>: Failed to establish a new connection: [WinError 10061] No connection could be made because the target machine actively refused it'))'
2019-10-10 13:37:49,591 INFO    EventsHandler   InstanceId: 32a21583-6879-4ad5-88bf-e07af0b09387,EventName: notebookSessionCreationEnd,Timestamp: 2019-10-10 10:37:49.591650,SessionGuid: d230b1f3-6bb1-4a66-bde1-7a73a14d7939,LivyKind: pyspark,SessionId: -1,Status: not_started,Success: False,ExceptionType: HttpClientException,ExceptionMessage: Error sending http request and maximum retry encountered.
2019-10-10 13:37:49,591 ERROR   SparkMagics Error creating session: Error sending http request and maximum retry encountered.

Note that I am trying to configure this on Windows. Thanks a lot.

Answer

If you are trying to connect your Jupyter Notebook to a Spark server through Livy (e.g. an AWS Glue development endpoint), you have to replace "localhost" with the Spark server's IP address in ~/.sparkmagic/config.json.

As described here: https://aws.amazon.com/blogs/machine-learning/build-amazon-sagemaker-notebooks-backed-by-spark-in-amazon-emr/
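
For reference, the value to change is the "url" field under the kernel credential sections of ~/.sparkmagic/config.json. Below is a minimal sketch assuming the standard sparkmagic config layout; 203.0.113.10 is a placeholder for your actual Livy/Spark server address (Livy listens on port 8998 by default):

{
  "kernel_python_credentials": {
    "username": "",
    "password": "",
    "url": "http://203.0.113.10:8998",
    "auth": "None"
  },
  "kernel_scala_credentials": {
    "username": "",
    "password": "",
    "url": "http://203.0.113.10:8998",
    "auth": "None"
  }
}

After editing the file, restart the Jupyter kernel so sparkmagic picks up the new endpoint.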
