Java memory issue while executing sbt package in Spark


Problem description

Can you please suggest a solution for the issue below?

hduser@hduser-VirtualBox:/usr/local/spark1/project$ sbt package
OpenJDK 64-Bit Server VM warning: INFO: os::commit_memory(0x00000000a8000000, 1073741824, 0) failed; error='Cannot allocate memory' (errno=12)
#

hduser@hduser-VirtualBox:/usr/local/spark1/project$ java -version
java version "1.7.0_65"
OpenJDK Runtime Environment (IcedTea 2.5.3) (7u71-2.5.3-0ubuntu0.14.04.1)
OpenJDK 64-Bit Server VM (build 24.65-b04, mixed mode)

Recommended answer

Looks like you're trying to run with quite a large Java heap: the 1073741824 in the error is the number of bytes the JVM tried to reserve, i.e. 1 GB. I'd start by reducing that. If you really do need that much, you may be in trouble: it looks as though your machine simply doesn't have enough free RAM to allocate it for you.
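As a minimal sketch of how you might cap the heap, assuming the stock sbt launcher script (which reads the SBT_OPTS environment variable; the -mem flag is also accepted by many launcher versions, so check yours):

    # Ask the sbt JVM for a 512 MB heap instead of the 1 GB it tried to reserve
    export SBT_OPTS="-Xmx512m"
    sbt package

    # Alternatively, many sbt launcher scripts take a -mem flag (value in MB):
    sbt -mem 512 package

If even a smaller heap cannot be allocated, the other lever is giving the VirtualBox guest more memory or enabling swap in it.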
