Integrate PySpark with Jupyter Notebook

Problem Description

I'm following this site to install Jupyter Notebook and PySpark, and to integrate the two.

When I needed to create the "Jupyter profile", I read that "Jupyter profiles" no longer exist, so I continued by executing the following lines:

$ mkdir -p ~/.ipython/kernels/pyspark

$ touch ~/.ipython/kernels/pyspark/kernel.json

I opened kernel.json and wrote the following:

{
 "display_name": "pySpark",
 "language": "python",
 "argv": [
  "/usr/bin/python",
  "-m",
  "IPython.kernel",
  "-f",
  "{connection_file}"
 ],
 "env": {
  "SPARK_HOME": "/usr/local/Cellar/spark-2.0.0-bin-hadoop2.7",
  "PYTHONPATH": "/usr/local/Cellar/spark-2.0.0-bin-hadoop2.7/python:/usr/local/Cellar/spark-2.0.0-bin-hadoop2.7/python/lib/py4j-0.10.1-src.zip",
  "PYTHONSTARTUP": "/usr/local/Cellar/spark-2.0.0-bin-hadoop2.7/python/pyspark/shell.py",
  "PYSPARK_SUBMIT_ARGS": "pyspark-shell"
 }
}

The Spark paths are correct.
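
One way to double-check that, and the interpreter named in argv, is a small script like the sketch below (the paths are copied from the kernel.json above; the last line tests whether /usr/bin/python can import IPython, which the traceback further down suggests it cannot):

  # Sanity-check the paths from kernel.json and the kernel's interpreter.
  import os, subprocess

  spark_home = "/usr/local/Cellar/spark-2.0.0-bin-hadoop2.7"
  print(os.path.isdir(spark_home))                                # SPARK_HOME exists?
  print(os.path.isfile(spark_home + "/python/pyspark/shell.py"))  # PYTHONSTARTUP exists?
  # A non-zero exit code here means /usr/bin/python cannot import IPython:
  print(subprocess.call(["/usr/bin/python", "-c", "import IPython"]))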

But when I run jupyter console --kernel pyspark, I get this output:

MacBook:~ Agus$ jupyter console --kernel pyspark
/usr/bin/python: No module named IPython
Traceback (most recent call last):
  File "/usr/local/bin/jupyter-console", line 11, in <module>
    sys.exit(main())
  File "/usr/local/lib/python2.7/site-packages/jupyter_core/application.py", line 267, in launch_instance
    return super(JupyterApp, cls).launch_instance(argv=argv, **kwargs)
  File "/usr/local/lib/python2.7/site-packages/traitlets/config/application.py", line 595, in launch_instance
    app.initialize(argv)
  File "<decorator-gen-113>", line 2, in initialize
  File "/usr/local/lib/python2.7/site-packages/traitlets/config/application.py", line 74, in catch_config_error
    return method(app, *args, **kwargs)
  File "/usr/local/lib/python2.7/site-packages/jupyter_console/app.py", line 137, in initialize
    self.init_shell()
  File "/usr/local/lib/python2.7/site-packages/jupyter_console/app.py", line 110, in init_shell
    client=self.kernel_client,
  File "/usr/local/lib/python2.7/site-packages/traitlets/config/configurable.py", line 412, in instance
    inst = cls(*args, **kwargs)
  File "/usr/local/lib/python2.7/site-packages/jupyter_console/ptshell.py", line 251, in __init__
    self.init_kernel_info()
  File "/usr/local/lib/python2.7/site-packages/jupyter_console/ptshell.py", line 305, in init_kernel_info
    raise RuntimeError("Kernel didn't respond to kernel_info_request")
RuntimeError: Kernel didn't respond to kernel_info_request

Recommended Answer

There are many ways to integrate PySpark with Jupyter Notebook.

1. Install Apache Toree:

  pip install jupyter
  pip install toree
  jupyter toree install --spark_home=path/to/your/spark_directory --interpreters=PySpark

You can verify the installation with

 jupyter kernelspec list

and you will get an entry for the Toree PySpark kernel:

  apache_toree_pyspark    /home/pauli/.local/share/jupyter/kernels/apache_toree_pyspark
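
With the kernel registered, you can point the same jupyter console invocation from the question at it (assuming the kernel name matches the listing above):

  jupyter console --kernel apache_toree_pyspark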

Afterwards, if you want, you can install other interpreters such as SparkR, Scala, and SQL:

 jupyter toree install --interpreters=Scala,SparkR,SQL

2. Add these lines to your ~/.bashrc:

  export SPARK_HOME=/path/to/spark-2.2.0
  export PATH="$PATH:$SPARK_HOME/bin"    
  export PYSPARK_DRIVER_PYTHON=jupyter
  export PYSPARK_DRIVER_PYTHON_OPTS="notebook"

Type pyspark in a terminal and it will open a Jupyter Notebook with a SparkContext already initialized.
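
For example, a first cell might look like this minimal sketch (sc, and on Spark 2.x also spark, are created for you by the PySpark shell, so nothing needs to be imported):

  # `sc` (SparkContext) already exists in this notebook.
  print(sc.version)

  # Tiny smoke test on the pre-initialized context:
  rdd = sc.parallelize(range(10))
  print(rdd.sum())  # 45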

3. Install PySpark as a Python package only:

  pip install pyspark

Now you can import pyspark like any other Python package.
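
A minimal sketch of what that looks like, assuming a local master and a throwaway app name of your choosing:

  # With the pip-installed pyspark package you build the session yourself,
  # instead of relying on a shell-provided `sc`.
  from pyspark.sql import SparkSession

  spark = SparkSession.builder \
      .master("local[*]") \
      .appName("notebook-test") \
      .getOrCreate()

  spark.createDataFrame([(1, "a"), (2, "b")], ["id", "letter"]).show()
  spark.stop()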
