Can spark's driver memory be set to something other than a number of gigabytes?
Question
I am launching pyspark and I can supply the driver-memory parameter via the command line to specify the maximum memory usage by the driver. In Spark's online documentation, they often just use values like 1g or 2g as examples, but I am not sure whether it is legal to use 3300m or 4500m as the value.
I think this parameter is related to the JVM's Xmx parameter, which must be a multiple of 1024m, which is part of why I am confused.
Does spark's driver memory parameter properly handle something other than a number of gigabytes?
Answer
Yes, it works. Based on the documentation and my previous experience, you can also set the driver memory in megabytes, e.g. --driver-memory 512m or --driver-memory 3300m.
See: http://spark.apache.org/docs/latest/configuration.html
Properties that specify a byte size should be configured with a unit of size. The following formats are accepted:
1b (bytes)
1k or 1kb (kibibytes = 1024 bytes)
1m or 1mb (mebibytes = 1024 kibibytes)
1g or 1gb (gibibytes = 1024 mebibytes)
1t or 1tb (tebibytes = 1024 gibibytes)
1p or 1pb (pebibytes = 1024 tebibytes)
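To make the unit table above concrete, here is a small illustrative sketch in Python that parses size strings the same way. This is not Spark's actual parser (Spark does this on the JVM side); it is just a re-implementation of the documented format for demonstration:

```python
import re

# Byte multipliers matching the documented Spark size units
# (binary units: 1k = 1024 bytes, 1m = 1024k, and so on).
UNITS = {
    "b": 1,
    "k": 1024, "kb": 1024,
    "m": 1024**2, "mb": 1024**2,
    "g": 1024**3, "gb": 1024**3,
    "t": 1024**4, "tb": 1024**4,
    "p": 1024**5, "pb": 1024**5,
}

def parse_byte_size(s: str) -> int:
    """Return the number of bytes described by a size string like '512m' or '2gb'."""
    match = re.fullmatch(r"(\d+)\s*([a-z]+)?", s.strip().lower())
    if not match:
        raise ValueError(f"invalid size string: {s!r}")
    value, unit = match.groups()
    if unit is None:
        unit = "b"  # assumption for this sketch: a bare number means bytes
    if unit not in UNITS:
        raise ValueError(f"unknown size unit: {unit!r}")
    return int(value) * UNITS[unit]

print(parse_byte_size("3300m"))  # 3460300800
print(parse_byte_size("2g"))     # 2147483648
```

Note that 3300m from the question parses fine and is simply 3300 mebibytes, so values between whole gigabytes are legal.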