WSO2 BAM Hive NoSuchObjectException Error


Problem description

I have configured BAM 2.4.0 as indicated in the document Monitoring using WSO2 BAM. I am using MySQL.

When I try to run the drop-table script as explained in the section "Changing the statistics database" from the BAM Management Console, I get this error. Any ideas?

[2014-05-08 11:01:19,948] ERROR {hive.ql.metadata.Hive} -  NoSuchObjectException(message:default.APIFaultSummaryData table not found)
        at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler$17.run(HiveMetaStore.java:1222)
        at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler$17.run(HiveMetaStore.java:1217)
        at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.executeWithRetry(HiveMetaStore.java:360)
        at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.get_table(HiveMetaStore.java:1217)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getTable(HiveMetaStoreClient.java:734)
        at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:901)
        at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:843)
        at org.apache.hadoop.hive.ql.exec.DDLTask.dropTable(DDLTask.java:3127)
        at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:250)
        at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:129)
        at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:62)
        at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1351)
        at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1126)
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:934)
        at org.apache.hadoop.hive.service.HiveServer$HiveServerHandler.execute(HiveServer.java:201)
        at org.apache.hadoop.hive.jdbc.HiveStatement.executeQuery(HiveStatement.java:187)
        at org.wso2.carbon.analytics.hive.impl.HiveExecutorServiceImpl$ScriptCallable.executeHiveQuery(HiveExecutorServiceImpl.java:569)
        at org.wso2.carbon.analytics.hive.impl.HiveExecutorServiceImpl$ScriptCallable.call(HiveExecutorServiceImpl.java:282)
        at org.wso2.carbon.analytics.hive.impl.HiveExecutorServiceImpl$ScriptCallable.call(HiveExecutorServiceImpl.java:189)
        at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
        at java.util.concurrent.FutureTask.run(FutureTask.java:166)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:724)
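
For reference, the stack trace above fails inside DDLTask.dropTable, i.e. while executing a DROP TABLE statement. A minimal sketch of that kind of statement, with the table name taken from the error message (hypothetical excerpt; the full script in the BAM documentation covers more summary tables):

    -- Hypothetical excerpt: only the table named in the error above is shown.
    -- A plain DROP TABLE first looks the table up in the Hive metastore,
    -- which is what raises NoSuchObjectException when the table does not exist yet.
    DROP TABLE APIFaultSummaryData;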

Updated error

[2014-05-08 13:58:00,004]  INFO {org.wso2.carbon.analytics.hive.task.HiveScriptExecutorTask} -  Running script executor task for script am_stats_analyzer_503. [Thu May 08 13:58:00 CEST 2014]
Hive history file=/u01/app/wso2bam-2.4.0/tmp/hive/wso2-querylogs/hive_job_log_root_201405081356_596525563.txt
OK
OK
Total MapReduce jobs = 1
Launching Job 1 out of 1
Number of reduce tasks not specified. Estimated from input data size: 1
In order to change the average load for a reducer (in bytes):
  set hive.exec.reducers.bytes.per.reducer=<number>
In order to limit the maximum number of reducers:
  set hive.exec.reducers.max=<number>
In order to set a constant number of reducers:
  set mapred.reduce.tasks=<number>
[main] DEBUG org.apache.hadoop.hive.conf.HiveConf  - Using hive-site.xml found on CLASSPATH at /u01/app/wso2bam-2.4.0/repository/conf/advanced/hive-site.xml
log4j:WARN No appenders could be found for logger (org.apache.axiom.util.stax.dialect.StAXDialectDetector).
log4j:WARN Please initialize the log4j system properly.
Execution log at: /u01/app/wso2bam-2.4.0/repository/logs//wso2carbon.log
[2014-05-08 13:58:02,423]  WARN {org.apache.hadoop.mapred.JobClient} -  Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
[2014-05-08 13:58:03,779] ERROR {org.wso2.carbon.bam.notification.task.NotificationDispatchTask} -  Error executing notification dispatch task: Cannot borrow client for TCP,1.33.33.127:7612,TCP,1.33.33.127:7712
org.wso2.carbon.databridge.agent.thrift.exception.AgentException: Cannot borrow client for TCP,1.33.33.127:7612,TCP,1.33.33.127:7712
        at org.wso2.carbon.databridge.agent.thrift.internal.publisher.authenticator.AgentAuthenticator.connect(AgentAuthenticator.java:58)
        at org.wso2.carbon.databridge.agent.thrift.DataPublisher.start(DataPublisher.java:273)
        at org.wso2.carbon.databridge.agent.thrift.DataPublisher.<init>(DataPublisher.java:211)
        at org.wso2.carbon.bam.notification.task.NotificationDispatchTask.initPublisherKS(NotificationDispatchTask.java:103)
        at org.wso2.carbon.bam.notification.task.NotificationDispatchTask.execute(NotificationDispatchTask.java:188)
        at org.wso2.carbon.ntask.core.impl.TaskQuartzJobAdapter.execute(TaskQuartzJobAdapter.java:67)
        at org.quartz.core.JobRunShell.run(JobRunShell.java:213)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
        at java.util.concurrent.FutureTask.run(FutureTask.java:262)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:744)
Caused by: org.apache.thrift.transport.TTransportException: Could not connect to 1.33.33.127 on port 7712
        at org.apache.thrift.transport.TSSLTransportFactory.createClient(TSSLTransportFactory.java:212)
        at org.apache.thrift.transport.TSSLTransportFactory.getClientSocket(TSSLTransportFactory.java:166)
        at org.wso2.carbon.databridge.agent.thrift.internal.pool.client.secure.SecureClientPoolFactory.makeObject(SecureClientPoolFactory.java:90)
        at org.wso2.carbon.databridge.agent.thrift.internal.pool.client.secure.SecureClientPoolFactory.makeObject(SecureClientPoolFactory.java:48)
        at org.apache.commons.pool.impl.GenericKeyedObjectPool.borrowObject(GenericKeyedObjectPool.java:1212)
        at org.wso2.carbon.databridge.agent.thrift.internal.publisher.authenticator.AgentAuthenticator.connect(AgentAuthenticator.java:50)
        ... 11 more
Caused by: java.net.ConnectException: Connection timed out
        at java.net.PlainSocketImpl.socketConnect(Native Method)
        at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:339)
        at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:200)
        at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:182)
        at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
        at java.net.Socket.connect(Socket.java:579)
        at sun.security.ssl.SSLSocketImpl.connect(SSLSocketImpl.java:618)
        at sun.security.ssl.SSLSocketImpl.<init>(SSLSocketImpl.java:407)
        at sun.security.ssl.SSLSocketFactoryImpl.createSocket(SSLSocketFactoryImpl.java:88)
        at org.apache.thrift.transport.TSSLTransportFactory.createClient(TSSLTransportFactory.java:208)
        ... 16 more

[Image uploaded]

Answer

If this is your first time configuring BAM with API Manager, you probably have not run any Hive queries yet, so there are no tables to drop. In that case the error is harmless; continue with your testing and check whether the rest of the setup works.
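
If you would rather the script not log this error at all, Hive's DROP TABLE accepts an IF EXISTS clause, so the drop section can be made idempotent. A minimal sketch, using only the table name from the question's error message (a hypothetical excerpt, not the full script):

    -- Sketch: guard the drop so a missing table is not treated as a failure.
    -- Some older Hive releases may still log the metastore exception even
    -- with IF EXISTS, but the statement itself completes successfully.
    DROP TABLE IF EXISTS APIFaultSummaryData;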
