java.lang.OutOfMemoryError: Java heap space with hive


Problem description


I am using Hadoop 1.1.2 with Hive 0.9.0 and NetBeans, but I get the error below and cannot solve the problem. Please help. Code:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class Hive_test {

    // HiveServer1 JDBC driver class
    private static String driverName = "org.apache.hadoop.hive.jdbc.HiveDriver";

    @SuppressWarnings("CallToThreadDumpStack")
    public static void main(String[] args) throws SQLException {
        try {
            Class.forName(driverName);
        } catch (ClassNotFoundException e) {
            e.printStackTrace();
            System.exit(1);
        }
        System.out.println("commencer la connexion");
        Connection con = DriverManager.getConnection("jdbc:hive://localhost:10000/default", "", " ");
        Statement stmt = con.createStatement();
        ResultSet res = stmt.executeQuery("select * from STATE");
        while (res.next()) {
            System.out.println(String.valueOf(res.getInt(1)) + "\t" + res.getString(2));
            System.out.println("sql terminer");
        }
    }
}

Error below:

commencer la connexion
Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
    at org.apache.thrift.protocol.TBinaryProtocol.readStringBody(TBinaryProtocol.java:353)
    at org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:215)
    at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:69)
    at org.apache.hadoop.hive.service.ThriftHive$Client.recv_execute(ThriftHive.java:116)
    at org.apache.hadoop.hive.service.ThriftHive$Client.execute(ThriftHive.java:103)
    at org.apache.hadoop.hive.jdbc.HiveStatement.executeQuery(HiveStatement.java:192)
    at org.apache.hadoop.hive.jdbc.HiveStatement.execute(HiveStatement.java:132)
    at org.apache.hadoop.hive.jdbc.HiveConnection.configureConnection(HiveConnection.java:132)
    at org.apache.hadoop.hive.jdbc.HiveConnection.<init>(HiveConnection.java:122)
    at org.apache.hadoop.hive.jdbc.HiveDriver.connect(HiveDriver.java:106)
    at java.sql.DriverManager.getConnection(DriverManager.java:571)
    at java.sql.DriverManager.getConnection(DriverManager.java:215)
    at hive.Hive_test.main(Hive_test.java:22)

Solution

You can set the container heapsize in Hive and resolve this error:

Most tools that operate on top of the Hadoop MapReduce framework provide ways to tune these Hadoop-level settings for their jobs. There are multiple ways to do this in Hive; three of them are shown here:

1) Pass it directly via the Hive command line:

hive -hiveconf mapreduce.map.memory.mb=4096 -hiveconf mapreduce.reduce.memory.mb=5120 -e "select count(*) from test_table;"

2) Set the ENV variable before invoking Hive:

export HIVE_OPTS="-hiveconf mapreduce.map.memory.mb=4096 -hiveconf mapreduce.reduce.memory.mb=5120"

3) Use the "set" command within the Hive CLI:

hive> set mapreduce.map.memory.mb=4096;
hive> set mapreduce.reduce.memory.mb=5120;
hive> select count(*) from test_table;
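
Since the question issues its query through JDBC rather than the Hive CLI, here is a minimal sketch of applying the same settings from Java, assuming the Hive JDBC driver forwards SET statements to the server the way the CLI's "set" command does (method 3 above). The property values mirror the CLI examples, and test_table is the same placeholder table; adjust both for your cluster.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HiveHeapSettingsExample {

    public static void main(String[] args) throws Exception {
        // Same driver and connection details as in the question; adjust as needed.
        Class.forName("org.apache.hadoop.hive.jdbc.HiveDriver");
        Connection con = DriverManager.getConnection("jdbc:hive://localhost:10000/default", "", "");
        Statement stmt = con.createStatement();

        // Assumption: SET statements sent over this connection configure the session,
        // equivalent to "set ..." in the Hive CLI (method 3 above).
        stmt.execute("set mapreduce.map.memory.mb=4096");
        stmt.execute("set mapreduce.reduce.memory.mb=5120");

        // Run the query with the enlarged container memory settings.
        ResultSet res = stmt.executeQuery("select count(*) from test_table");
        while (res.next()) {
            System.out.println(res.getLong(1));
        }
        res.close();
        stmt.close();
        con.close();
    }
}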
