ENTRYPOINT with environment variables is not accepting new params
Question
We are creating a simple Dockerfile; the last line of that file is:
ENTRYPOINT ["sh", "-c", "spark-submit --master $SPARK_MASTER script.py"]
The script.py is a simple PySpark app (not important for this discussion); it receives some parameters that we are trying to pass using the docker command as follows:
docker run --rm my_spark_app_image --param1 something --param2 something_else
But script.py is not getting any parameters, i.e. the container executed:
spark-submit --master $SPARK_MASTER script.py
The expected behaviour is that the container executes:
spark-submit --master $SPARK_MASTER script.py --param1 something --param2 something_else
What am I doing wrong?
Accepted answer
The /bin/sh -c only takes one argument, the script to run. Everything after that argument is a shell variable, $0, $1, etc., that can be parsed by the script. While you could do this with the /bin/sh -c syntax, it's awkward and won't grow with you in the future.
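To see why the original ENTRYPOINT drops the parameters, here is a quick experiment (plain POSIX sh behavior, nothing Docker-specific). The arguments after the command string are not appended to it; the first one becomes $0 and the rest become $1, $2, etc.:

```shell
# Everything after the -c command string is a positional parameter,
# not part of the command itself:
sh -c 'echo "script=$0 first=$1 second=$2"' script.py --param1 something
# prints: script=script.py first=--param1 second=something
```

Since the command string `spark-submit --master $SPARK_MASTER script.py` never references $1, $2, etc., the arguments from `docker run` are silently ignored.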
Rather than trying to parse the variables there, I'd move this into an entrypoint.sh that you include in your image:
#!/bin/sh
exec spark-submit --master $SPARK_MASTER script.py "$@"
And then change the Dockerfile to define:
COPY entrypoint.sh /entrypoint.sh
ENTRYPOINT ["/entrypoint.sh"]
The exec syntax replaces the shell script in PID 1 with the spark-submit process, which allows signals to be passed through. The "$@" will pass through any arguments from docker run, with each arg quoted in case you have spaces in the parameters. And since it's run by a shell script, the $SPARK_MASTER will be expanded.
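The quoting detail can be verified on its own (a stand-alone demo, not the actual entrypoint.sh): "$@" forwards each argument intact, even when an argument contains spaces.

```shell
# Simulate arguments arriving from `docker run`:
set -- --param1 "two words" --param2 something_else
# "$@" expands to each argument as a separate, unsplit word:
for arg in "$@"; do
  printf 'arg: %s\n' "$arg"
done
# prints four lines; "two words" stays a single argument.
# An unquoted $@ would re-split it into "two" and "words".
```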