Hive gives SemanticException [Error 10014] when running my UDF


Question

I have a Hive UDF that does a GeoIP lookup:

import java.io.IOException;

import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.Text;

import com.maxmind.geoip2.exception.GeoIp2Exception;

public class App extends UDF {

    public static Text evaluate(Text inputFieldName, Text option,
            Text databaseFileName) {

        String inputField, fieldOption, dbFileName, result = null;

        inputField = inputFieldName.toString();
        fieldOption = option.toString();
        dbFileName = databaseFileName.toString();

        // ExtractData (not shown) wraps the MaxMind GeoIP2 database lookup
        ExtractData eed = new ExtractData();
        try {
            result = eed.ExtractDB(inputField, fieldOption, dbFileName);
        } catch (IOException e) {
            e.printStackTrace();
        } catch (GeoIp2Exception e) {
            e.printStackTrace();
        }
        return new Text(result);
    }
}

Then I built a jar from this and ran the following in the Hive CLI:

add jar /location_of_jar/MyUDF.jar;
add file /user/riyan/GeoIP2-Enterprise.mmdb;
create temporary function samplefunction as 'com.package.name.App';
select samplefunction('172.73.14.54','country_name','/user/riyan/GeoIP2-Enterprise.mmdb') AS result;

I'm passing the location of the GeoIP2-Enterprise.mmdb database to the UDF. It works fine on my local system, but when I build a jar out of it and run it in the CLI, it gives me an error saying:

FAILED: SemanticException [Error 10014]: Line 1:7 Wrong arguments ''/user/riyan/GeoIP2-Enterprise.mmdb'': org.apache.hadoop.hive.ql.metadata.HiveException: Unable to execute method public static org.apache.hadoop.io.Text com.package.name.App.evaluate(org.apache.hadoop.io.Text,org.apache.hadoop.io.Text,org.apache.hadoop.io.Text)  on object com.package.name.App@1777c0e2 of class com.package.name.App with arguments {172.73.14.54:org.apache.hadoop.io.Text, country_name:org.apache.hadoop.io.Text, /user/riyan/GeoIP.mmdb:org.apache.hadoop.io.Text} of size 3

I also tried changing the parameters from Text to String, and that gave me the same exception. Can someone tell me what I'm doing wrong? Thanks.

Edit: adding the following

I ran it in Hive debug mode and got this:

 FAILED: SemanticException [Error 10014]: Line 1:7 Wrong arguments ''./GeoIP2-Enterprise.mmdb'': org.apache.hadoop.hive.ql.metadata.HiveException: Unable to execute method public java.lang.String com.package.name.App.evaluate(java.lang.String,java.lang.String,java.lang.String)  on object com.package.name.App@ of class com.package.name.App with arguments {172.73.14.54:java.lang.String, countryCode:java.lang.String, ./GeoIP2-Enterprise.mmdb:java.lang.String} of size 3
17/04/18 11:02:30 [main]: ERROR ql.Driver: FAILED: SemanticException [Error 10014]: Line 1:7 Wrong arguments ''./GeoIP2-Enterprise.mmdb'': org.apache.hadoop.hive.ql.metadata.HiveException: Unable to execute method public java.lang.String com.bankofamerica.gisds.App.evaluate(java.lang.String,java.lang.String,java.lang.String)  on object com.package.name.App@418d85cb of class com.package.name.App with arguments {172.73.14.54:java.lang.String, countryCode:java.lang.String, ./GeoIP2-Enterprise.mmdb:java.lang.String} of size 3
org.apache.hadoop.hive.ql.parse.SemanticException: Line 1:7 Wrong arguments ''./GeoIP2-Enterprise.mmdb'': org.apache.hadoop.hive.ql.metadata.HiveException: Unable to execute method public java.lang.String com.package.name.App.evaluate(java.lang.String,java.lang.String,java.lang.String)  on object com.package.name.App@418d85cb of class com.package.name.App with arguments {172.73.14.54:java.lang.String, countryCode:java.lang.String, ./GeoIP2-Enterprise.mmdb:java.lang.String} of size 3
        at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.process(TypeCheckProcFactory.java:1184)
        at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:90)
        at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatchAndReturn(DefaultGraphWalker.java:94)
        at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:78)
        at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:132)
        at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:109)
        at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory.genExprNode(TypeCheckProcFactory.java:193)
        at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory.genExprNode(TypeCheckProcFactory.java:146)
        at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genAllExprNodeDesc(SemanticAnalyzer.java:10422)
        at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:10378)
        at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSelectPlan(SemanticAnalyzer.java:3771)
        at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSelectPlan(SemanticAnalyzer.java:3550)
        at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPostGroupByBodyPlan(SemanticAnalyzer.java:8830)
        at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:8785)
        at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:9652)
        at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:9545)
        at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genOPTree(SemanticAnalyzer.java:10018)
        at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:10029)
        at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:9909)
        at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:223)
        at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:488)
        at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1274)
        at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1391)
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1203)
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1193)
        at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:220)
        at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:172)
        at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:383)
        at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:775)
        at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:693)
        at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:628)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: org.apache.hadoop.hive.ql.exec.UDFArgumentException: org.apache.hadoop.hive.ql.metadata.HiveException: Unable to execute method public java.lang.String com.package.name.App.evaluate(java.lang.String,java.lang.String,java.lang.String)  on object com.package.name.App@418d85cb of class com.package.name.App with arguments {172.73.14.54:java.lang.String, countryCode:java.lang.String, ./GeoIP2-Enterprise.mmdb:java.lang.String} of size 3
        at org.apache.hadoop.hive.ql.udf.generic.GenericUDF.initializeAndFoldConstants(GenericUDF.java:171)
        at org.apache.hadoop.hive.ql.plan.ExprNodeGenericFuncDesc.newInstance(ExprNodeGenericFuncDesc.java:233)
        at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.getXpathOrFuncExprNodeDesc(TypeCheckProcFactory.java:959)
        at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.process(TypeCheckProcFactory.java:1176)
        ... 36 more
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Unable to execute method public java.lang.String com.package.name.App.evaluate(java.lang.String,java.lang.String,java.lang.String)  on object com.package.name.App@418d85cb of class com.package.name.App with arguments {172.73.14.54:java.lang.String, countryCode:java.lang.String, ./GeoIP2-Enterprise.mmdb:java.lang.String} of size 3
        at org.apache.hadoop.hive.ql.exec.FunctionRegistry.invoke(FunctionRegistry.java:978)
        at org.apache.hadoop.hive.ql.udf.generic.GenericUDFBridge.evaluate(GenericUDFBridge.java:182)
        at org.apache.hadoop.hive.ql.udf.generic.GenericUDF.initializeAndFoldConstants(GenericUDF.java:168)
        ... 39 more
Caused by: java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.hive.ql.exec.FunctionRegistry.invoke(FunctionRegistry.java:954)
        ... 41 more
Caused by: java.lang.NoSuchMethodError: com.fasterxml.jackson.databind.node.ObjectNode.<init>(Lcom/fasterxml/jackson/databind/node/JsonNodeFactory;Ljava/util/Map;)V
        at com.maxmind.db.Decoder.decodeMap(Decoder.java:285)
        at com.maxmind.db.Decoder.decodeByType(Decoder.java:154)
        at com.maxmind.db.Decoder.decode(Decoder.java:147)
        at com.maxmind.db.Decoder.decodeMap(Decoder.java:281)
        at com.maxmind.db.Decoder.decodeByType(Decoder.java:154)
        at com.maxmind.db.Decoder.decode(Decoder.java:147)
        at com.maxmind.db.Decoder.decode(Decoder.java:87)
        at com.maxmind.db.Reader.<init>(Reader.java:132)
        at com.maxmind.db.Reader.<init>(Reader.java:116)
        at com.maxmind.geoip2.DatabaseReader.<init>(DatabaseReader.java:35)
        at com.maxmind.geoip2.DatabaseReader.<init>(DatabaseReader.java:23)
        at com.maxmind.geoip2.DatabaseReader$Builder.build(DatabaseReader.java:129)
        at com.bankofamerica.gisds.ExtractEnterpriseData.ExtractEnterpriseDB(ExtractEnterpriseData.java:27)
        at com.package.name.App.evaluate(App.java:73)
        ... 46 more

Answer

Based on your answer, it looks like you are missing some dependencies in your JAR file. How are you compiling the project that contains the UDF?

This one is probably missing from the Hive classpath:

<dependency>
    <groupId>com.fasterxml.jackson.core</groupId>
    <artifactId>jackson-databind</artifactId>
    <version>2.1.4</version>
</dependency>

As a workaround, you could try building it as a jar with dependencies (not a good practice for this case), but at least we will know whether that is your problem:

<build>
  <plugins>
    <plugin>
      <artifactId>maven-assembly-plugin</artifactId>
      <configuration>
        <archive>
          <manifest>
          </manifest>
        </archive>
        <descriptorRefs>
          <descriptorRef>jar-with-dependencies</descriptorRef>
        </descriptorRefs>
      </configuration>
    </plugin>
  </plugins>
</build>
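
If you rebuild with this (for example via mvn clean package assembly:single), the assembly plugin should produce an extra artifact whose name ends in -jar-with-dependencies.jar. As a minimal sketch, assuming that is what the build produced, you would then use it in the Hive CLI in place of the original jar:

-- the jar name below is an assumption; use whatever file the assembly build actually wrote to target/
add jar /location_of_jar/MyUDF-jar-with-dependencies.jar;
add file /user/riyan/GeoIP2-Enterprise.mmdb;
create temporary function samplefunction as 'com.package.name.App';
select samplefunction('172.73.14.54','country_name','/user/riyan/GeoIP2-Enterprise.mmdb') AS result;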

The other option is to add this dependency to the Hive classpath and try again:

https://mvnrepository.com/artifact/com.fasterxml.jackson.core/jackson-databind/2.1.4
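
As a minimal sketch, assuming you download jackson-databind-2.1.4.jar from that page to a local path (the path below is hypothetical), you can add it to the session classpath from the Hive CLI before registering the UDF:

-- hypothetical local path to the downloaded jar
add jar /location_of_jar/jackson-databind-2.1.4.jar;
-- confirm the resource was added to the session classpath
list jars;

Note that if your Hadoop/Hive distribution already ships an older jackson-databind on its own classpath, that copy may still take precedence; in that case the jar-with-dependencies workaround above is the more reliable check.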
