Spark - UbuntuVM - insufficient memory for the Java Runtime Environment


Problem description

I'm trying to install Spark 1.5.1 on an Ubuntu 14.04 VM. After un-tarring the file, I changed into the extracted directory and ran "./bin/pyspark", which should fire up the pyspark shell. But I got an error message as follows:

[" OpenJDK 64位服务器VM警告:INFO:os :: commit_memory(0x00000000c5550000,715849728,0)失败; 错误=无法分配内存"(errno = 12)没有足够的空间 以便Java Runtime Environment继续存储.

[ OpenJDK 64-Bit Server VM warning: INFO: os::commit_memory(0x00000000c5550000, 715849728, 0) failed; error='Cannot allocate memory' (errno=12) There is insufficient memory for the Java Runtime Environment to continue.

Native memory allocation (malloc) failed to allocate 715849728 bytes for committing reserved memory.

An error report file with more information is saved as: /home/datascience/spark-1.5.1-bin-hadoop2.6/hs_err_pid2750.log ]

Could anyone please give me some directions to sort out the problem?

Recommended answer

We need to set the JVM memory in the conf/spark-defaults.conf file to a value your machine can accommodate. Since the error occurs while the pyspark shell (the driver JVM) is starting up, it is spark.driver.memory that needs to be lowered here; spark.executor.memory is configured the same way if executors later hit the same limit. For example,

usr1@host:~/spark-1.6.1$ cp conf/spark-defaults.conf.template conf/spark-defaults.conf
usr1@host:~/spark-1.6.1$ nano conf/spark-defaults.conf
spark.driver.memory              512m
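
Alternatively, the same value can be passed on the command line when launching the shell, which avoids editing the file at all; a minimal example, assuming the stock launcher scripts (--driver-memory is the command-line counterpart of spark.driver.memory):

usr1@host:~/spark-1.6.1$ ./bin/pyspark --driver-memory 512m

Command-line flags take precedence over values in spark-defaults.conf, so this is a convenient way to test a value before committing it to the file.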

For more information, refer to the official documentation: http://spark.apache.org/docs/latest/configuration.html
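
To choose a value the VM can actually satisfy, it may also help to check how much memory is free and to skim the error-report file named in the log; a quick check, assuming a standard Ubuntu system:

free -m                     # total/used/free memory in MB
head -n 30 /home/datascience/spark-1.5.1-bin-hadoop2.6/hs_err_pid2750.log

If free reports less available memory than the JVM tried to commit (715849728 bytes, roughly 683 MB, in the error above), either lower spark.driver.memory further or give the VM more RAM.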
