How to load jar dependencies in IPython Notebook
Question
This page inspired me to try out spark-csv for reading .csv files in PySpark. I found a couple of posts, such as this, describing how to use spark-csv. But I am not able to initialize the ipython instance by including either the .jar file or the package extension in the start-up, which can be done through spark-shell.
That is, instead of
ipython notebook --profile=pyspark
I tried
ipython notebook --profile=pyspark --packages com.databricks:spark-csv_2.10:1.0.3
but it is not supported. Please advise.
Answer
You can simply pass it in the PYSPARK_SUBMIT_ARGS variable. For example:
export PACKAGES="com.databricks:spark-csv_2.11:1.3.0"
export PYSPARK_SUBMIT_ARGS="--packages ${PACKAGES} pyspark-shell"
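With the variable exported, the notebook can then be launched exactly as in the question, and the package is resolved at start-up:

ipython notebook --profile=pyspark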
These properties can also be set dynamically in your code before the SparkContext / SparkSession and the corresponding JVM have been started:
import os

packages = "com.databricks:spark-csv_2.11:1.3.0"
os.environ["PYSPARK_SUBMIT_ARGS"] = (
    "--packages {0} pyspark-shell".format(packages)
)
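Since PYSPARK_SUBMIT_ARGS is only read when the JVM launches, the context must be created after the variable is set. A minimal sketch of the full ordering (the master URL, app name, and file path are placeholder assumptions, not part of the original answer):

import os

# Must run before any SparkContext is created.
packages = "com.databricks:spark-csv_2.11:1.3.0"
os.environ["PYSPARK_SUBMIT_ARGS"] = (
    "--packages {0} pyspark-shell".format(packages)
)

from pyspark import SparkContext
from pyspark.sql import SQLContext

sc = SparkContext("local[*]", "csv-example")  # JVM starts here and picks up --packages
sqlContext = SQLContext(sc)

# Read a CSV file through spark-csv; "example.csv" is a hypothetical path.
df = sqlContext.read.format("com.databricks.spark.csv") \
    .option("header", "true") \
    .load("example.csv")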