Apache Spark - Memory Exception Error - IntelliJ settings

Problem description

When I try and run a test that uses Apache Spark I encounter the following exception:

    Exception encountered when invoking run on a nested suite - System memory 259522560 must be at least 4.718592E8. Please use a larger heap size.
    java.lang.IllegalArgumentException: System memory 259522560 must be at least 4.718592E8. Please use a larger heap size.

I can circumvent the error by changing the VM options in the run configuration so that it has: -Xms128m -Xmx512m -XX:MaxPermSize=300m -ea, as found in

http://apache-spark-user-list.1001560.n3.nabble.com/spark-1-6-Issue-td25893.html
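
For reference, Spark also exposes a spark.testing.memory setting that overrides the system-memory value this check reads, so the error can be avoided without enlarging the heap. A minimal sketch, assuming a local-mode SparkContext created for the test (the object name, app name and the 512 MB figure are just placeholders):

    import org.apache.spark.{SparkConf, SparkContext}

    // Sketch only: spark.testing.memory overrides the "system memory" value
    // that Spark's memory manager checks, so the ~4.7e8-byte minimum can be
    // met without raising -Xmx. The 512 MB value below is an arbitrary example.
    object SparkTestingMemorySketch {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf()
          .setMaster("local[2]")
          .setAppName("spark-testing-memory-sketch")
          .set("spark.testing.memory", (512 * 1024 * 1024).toString)

        val sc = new SparkContext(conf)
        try {
          // trivial job just to confirm the context starts
          println(sc.parallelize(1 to 10).sum())
        } finally {
          sc.stop()
        }
      }
    }

Because this lives in the test code rather than in a run configuration, it applies however the test is launched.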

But I don't want to have to change the VM options for each test; I'd like it to be global of sorts. Having tried various options I find myself here hoping that someone may help.

I've reinstalled IDEA 15 and updated. In addition I'm running a 64-bit JDK, have updated JAVA_HOME, and am using idea64.exe.

I've also updated the vmoptions file to include the values from above, so that it reads:

    -Xms3g
    -Xmx3g
    -XX:MaxPermSize=350m
    -XX:ReservedCodeCacheSize=240m
    -XX:+UseConcMarkSweepGC
    -XX:SoftRefLRUPolicyMSPerMB=50
    -ea
    -Dsun.io.useCanonCaches=false
    -Djava.net.preferIPv4Stack=true
    -XX:+HeapDumpOnOutOfMemoryError
    -XX:-OmitStackTraceInFastThrow

I'm not great at understanding the options, so there could possibly be a conflict, but besides that - I've no idea what else I can do to make this %^$%^$&*ing test work without manually updating the config within IDEA.

Any help appreciated, thanks.

Answer

In IntelliJ, you can create a Default configuration for a specific type of (test) configuration, and each new configuration of that type will then automatically inherit those settings. (The idea64.exe.vmoptions file only configures the IDE's own JVM; tests run in a separate JVM whose options come from the Run/Debug configuration, which is why editing that file didn't help.)

For example, if you want this to be applied to all JUnit tests, go to Run/Debug Configurations --> Choose Defaults --> Choose JUnit, and set the VM options as you like:
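
For instance, the options from the question can be entered in the VM options field of the JUnit defaults:

    -Xms128m -Xmx512m -XX:MaxPermSize=300m -ea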

Save the changes (via Apply or OK), and the next time you run a JUnit test it will have these settings automatically.

Note:

  • This can be applied to any configuration type (e.g. ScalaTest), not only to JUnit
  • If you already have some existing configurations, they will not inherit changes made to the defaults, so you should delete them and let IntelliJ recreate them (the next time you hit Run or Ctrl+Shift+F10 in the test class)
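
If the same tests are also run from an sbt build rather than from IDEA, a roughly equivalent, build-level way to make the options global would be something like the following sketch (assuming an sbt build; forking is required for the options to reach the test JVM):

    // build.sbt (sketch, assuming an sbt build)
    // Forked tests run in their own JVM, so javaOptions actually reach them.
    Test / fork := true
    Test / javaOptions ++= Seq("-Xms128m", "-Xmx512m", "-XX:MaxPermSize=300m", "-ea")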
