What is the right way to edit spark-env.sh before running spark-shell?
Question
I am running Spark on my local Windows machine. I am able to start the Spark shell successfully.
I want to edit the spark-env.sh file residing in the conf/ folder. What is the right way to add values to the spark-env.sh file?
E.g., if I want to add a value to the SPARK_EXECUTOR_MEMORY variable, how do I do it? I am getting confused between the different answers that are available: 1. SPARK_EXECUTOR_MEMORY="2G" 2. export SPARK_EXECUTOR_MEMORY="2G"
Answer
spark-env.sh is a regular bash script intended for Unix, so on a Windows installation it will never get picked up.
On Windows, you'll instead need a spark-env.cmd file in the conf directory, using the following syntax:
set SPARK_EXECUTOR_MEMORY=2G
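A minimal conf\spark-env.cmd might look like the sketch below. The memory values are examples, not required settings, and SPARK_DRIVER_MEMORY is shown only as a second illustrative variable:

```batch
@echo off
rem conf\spark-env.cmd -- picked up by Spark's Windows launch scripts.
rem Use "set", not "export"; no quotes are needed around the value.
set SPARK_EXECUTOR_MEMORY=2G
set SPARK_DRIVER_MEMORY=1G
```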
On Unix, the file will be called spark-env.sh, and you will need to prepend each of your properties with export (e.g. export SPARK_EXECUTOR_MEMORY=2G).
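For comparison, a minimal conf/spark-env.sh for Unix could look like this sketch (again, the values are examples to adapt to your machine, and SPARK_DRIVER_MEMORY is an extra illustrative variable):

```shell
# conf/spark-env.sh -- sourced by Spark's Unix launch scripts.
# Each property must be exported so the launched JVM processes can see it.
export SPARK_EXECUTOR_MEMORY=2G
export SPARK_DRIVER_MEMORY=1G
```

Without the export keyword the variable would only be set in the sourcing shell and would not reach the child processes Spark starts.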