Apache Beam - org.apache.beam.sdk.util.UserCodeException: java.sql.SQLException: Cannot create PoolableConnectionFactory (Method not supported)

Problem description

I am trying to connect to a Hive instance installed on a cloud instance using Apache Beam on Dataflow. When I run this, I get the exception below. It happens when I access this database through Apache Beam. I have seen many related questions, but none of them are about Apache Beam or Google Dataflow.

(c9ec8fdbe9d1719a): java.lang.RuntimeException: org.apache.beam.sdk.util.UserCodeException: java.sql.SQLException: Cannot create PoolableConnectionFactory (Method not supported)
at com.google.cloud.dataflow.worker.runners.worker.MapTaskExecutorFactory$3.typedApply(MapTaskExecutorFactory.java:289)
at com.google.cloud.dataflow.worker.runners.worker.MapTaskExecutorFactory$3.typedApply(MapTaskExecutorFactory.java:261)
at com.google.cloud.dataflow.worker.graph.Networks$TypeSafeNodeFunction.apply(Networks.java:55)
at com.google.cloud.dataflow.worker.graph.Networks$TypeSafeNodeFunction.apply(Networks.java:43)
at com.google.cloud.dataflow.worker.graph.Networks.replaceDirectedNetworkNodes(Networks.java:78)
at com.google.cloud.dataflow.worker.runners.worker.MapTaskExecutorFactory.create(MapTaskExecutorFactory.java:152)
at com.google.cloud.dataflow.worker.runners.worker.DataflowWorker.doWork(DataflowWorker.java:272)
at com.google.cloud.dataflow.worker.runners.worker.DataflowWorker.getAndPerformWork(DataflowWorker.java:244)
at com.google.cloud.dataflow.worker.runners.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:125)
at com.google.cloud.dataflow.worker.runners.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:105)
at com.google.cloud.dataflow.worker.runners.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:92)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)

Caused by: org.apache.beam.sdk.util.UserCodeException: java.sql.SQLException: Cannot create PoolableConnectionFactory (Method not supported)
at org.apache.beam.sdk.util.UserCodeException.wrap(UserCodeException.java:36)
at org.apache.beam.sdk.io.jdbc.JdbcIO$Read$ReadFn$auxiliary$8CR0LcYI.invokeSetup(Unknown Source)
at com.google.cloud.dataflow.worker.runners.worker.DoFnInstanceManagers$ConcurrentQueueInstanceManager.deserializeCopy(DoFnInstanceManagers.java:65)
at com.google.cloud.dataflow.worker.runners.worker.DoFnInstanceManagers$ConcurrentQueueInstanceManager.peek(DoFnInstanceManagers.java:47)
at com.google.cloud.dataflow.worker.runners.worker.UserParDoFnFactory.create(UserParDoFnFactory.java:100)
at com.google.cloud.dataflow.worker.runners.worker.DefaultParDoFnFactory.create(DefaultParDoFnFactory.java:70)
at com.google.cloud.dataflow.worker.runners.worker.MapTaskExecutorFactory.createParDoOperation(MapTaskExecutorFactory.java:365)
at com.google.cloud.dataflow.worker.runners.worker.MapTaskExecutorFactory$3.typedApply(MapTaskExecutorFactory.java:278)
... 14 more
Caused by: java.sql.SQLException: Cannot create PoolableConnectionFactory (Method not supported)
at org.apache.commons.dbcp2.BasicDataSource.createPoolableConnectionFactory(BasicDataSource.java:2294)
at org.apache.commons.dbcp2.BasicDataSource.createDataSource(BasicDataSource.java:2039)
at org.apache.commons.dbcp2.BasicDataSource.getConnection(BasicDataSource.java:1533)
at org.apache.beam.sdk.io.jdbc.JdbcIO$Read$ReadFn.setup(JdbcIO.java:377)
Caused by: java.sql.SQLException: Method not supported
at org.apache.hive.jdbc.HiveConnection.isValid(HiveConnection.java:898)
at org.apache.commons.dbcp2.DelegatingConnection.isValid(DelegatingConnection.java:918)
at org.apache.commons.dbcp2.PoolableConnection.validate(PoolableConnection.java:283)
at org.apache.commons.dbcp2.PoolableConnectionFactory.validateConnection(PoolableConnectionFactory.java:357)
at org.apache.commons.dbcp2.BasicDataSource.validateConnectionFactory(BasicDataSource.java:2307)
at org.apache.commons.dbcp2.BasicDataSource.createPoolableConnectionFactory(BasicDataSource.java:2290)
at org.apache.commons.dbcp2.BasicDataSource.createDataSource(BasicDataSource.java:2039)
at org.apache.commons.dbcp2.BasicDataSource.getConnection(BasicDataSource.java:1533)
at org.apache.beam.sdk.io.jdbc.JdbcIO$Read$ReadFn.setup(JdbcIO.java:377)
at org.apache.beam.sdk.io.jdbc.JdbcIO$Read$ReadFn$auxiliary$8CR0LcYI.invokeSetup(Unknown Source)
at com.google.cloud.dataflow.worker.runners.worker.DoFnInstanceManagers$ConcurrentQueueInstanceManager.deserializeCopy(DoFnInstanceManagers.java:65)
at com.google.cloud.dataflow.worker.runners.worker.DoFnInstanceManagers$ConcurrentQueueInstanceManager.peek(DoFnInstanceManagers.java:47)
at com.google.cloud.dataflow.worker.runners.worker.UserParDoFnFactory.create(UserParDoFnFactory.java:100)
at com.google.cloud.dataflow.worker.runners.worker.DefaultParDoFnFactory.create(DefaultParDoFnFactory.java:70)
at com.google.cloud.dataflow.worker.runners.worker.MapTaskExecutorFactory.createParDoOperation(MapTaskExecutorFactory.java:365)
at com.google.cloud.dataflow.worker.runners.worker.MapTaskExecutorFactory$3.typedApply(MapTaskExecutorFactory.java:278)
at com.google.cloud.dataflow.worker.runners.worker.MapTaskExecutorFactory$3.typedApply(MapTaskExecutorFactory.java:261)
at com.google.cloud.dataflow.worker.graph.Networks$TypeSafeNodeFunction.apply(Networks.java:55)
at com.google.cloud.dataflow.worker.graph.Networks$TypeSafeNodeFunction.apply(Networks.java:43)
at com.google.cloud.dataflow.worker.graph.Networks.replaceDirectedNetworkNodes(Networks.java:78)
at com.google.cloud.dataflow.worker.runners.worker.MapTaskExecutorFactory.create(MapTaskExecutorFactory.java:152)
at com.google.cloud.dataflow.worker.runners.worker.DataflowWorker.doWork(DataflowWorker.java:272)
at com.google.cloud.dataflow.worker.runners.worker.DataflowWorker.getAndPerformWork(DataflowWorker.java:244)
at com.google.cloud.dataflow.worker.runners.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:125)
at com.google.cloud.dataflow.worker.runners.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:105)
at com.google.cloud.dataflow.worker.runners.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:92)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)

With the same connection string and driver files, I can connect to this instance from a normal Java JDBC program.
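
For reference, here is a minimal sketch of that plain JDBC program (the host, credentials, and exact query are placeholders taken from the question). DriverManager hands the connection back directly, with no pooling and no validation, so the driver's missing isValid implementation is never exercised:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class PlainHiveJdbc {
    public static void main(String[] args) throws Exception {
        // Register the same Hive JDBC driver class used in the pipeline.
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        // Placeholder host and credentials, as in the question.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:hive2://<external IP of compute instance>:10000/dbtest", "username", "password");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("select c_customer_id from dbtest.customer")) {
            while (rs.next()) {
                System.out.println(rs.getString("c_customer_id"));
            }
        }
    }
}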

I have been debugging this for a while now and cannot find a solution. Can anyone please share any ideas?

Please see the code snippet that connects to Hive below:

PCollection<Customer> collection = dataflowPipeline.apply(JdbcIO.<Customer>read()
        .withDataSourceConfiguration(JdbcIO.DataSourceConfiguration
                .create("org.apache.hive.jdbc.HiveDriver", "jdbc:hive2://<external IP of compute instance>:10000/dbtest")
                .withUsername("username").withPassword("password"))
        .withQuery(
                "select c_customer_id, c_first_name, c_last_name, c_preferred_cust_flag, c_birth_day from dbtest.customer")
        .withRowMapper(new JdbcIO.RowMapper<Customer>() {
            @Override
            public Customer mapRow(ResultSet resultSet) throws Exception {
                // Map each Hive row onto a Customer POJO.
                Customer customer = new Customer();
                customer.setC_customer_id(resultSet.getString("c_customer_id"));
                customer.setC_first_name(resultSet.getString("c_first_name"));
                customer.setC_last_name(resultSet.getString("c_last_name"));
                customer.setC_preferred_cust_flag(resultSet.getString("c_preferred_cust_flag"));
                customer.setC_birth_day(resultSet.getInt("c_birth_day"));
                return customer;
            }
        }).withCoder(AvroCoder.of(Customer.class)));


Answer

Apache DBCP BasicDataSource uses the isValid method to validate a connection, and that method was not implemented by old versions of the Hive JDBC driver; see JDBC to hive connection fails on invalid operation isValid().
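
You can see this directly by calling isValid yourself on a connection obtained from the driver. A minimal sketch (host and credentials are placeholders from the question):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;

public class IsValidCheck {
    public static void main(String[] args) throws Exception {
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        try (Connection conn = DriverManager.getConnection(
                "jdbc:hive2://<external IP of compute instance>:10000/dbtest", "username", "password")) {
            try {
                // Old Hive JDBC drivers throw java.sql.SQLException:
                // "Method not supported" here, as in the stack trace above.
                System.out.println("isValid returned: " + conn.isValid(5));
            } catch (SQLException e) {
                System.out.println("isValid not supported: " + e.getMessage());
            }
        }
    }
}

With a driver that has the fix, this should print true instead of the "Method not supported" message.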

However, the method is implemented in versions after Hive 2.1.0: https://github.com/apache/hive/commit/2d2ab0942482a6ce1523dd9dd0f4094865e93b28

Can you use a newer version of the Hive JDBC driver?
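
If upgrading the driver is not an option, one possible workaround at the DBCP level is a validation query: DBCP2 only falls back to Connection.isValid when no validation query is configured. This is a standalone sketch only; the JdbcIO.DataSourceConfiguration in the stack trace above builds its BasicDataSource internally and does not appear to expose this setting, and whether your Hive version accepts "SELECT 1" as a validation query is an assumption:

import java.sql.Connection;
import org.apache.commons.dbcp2.BasicDataSource;

public class HiveDbcpValidationSketch {
    public static void main(String[] args) throws Exception {
        BasicDataSource ds = new BasicDataSource();
        ds.setDriverClassName("org.apache.hive.jdbc.HiveDriver");
        // Placeholder URL and credentials, as in the question.
        ds.setUrl("jdbc:hive2://<external IP of compute instance>:10000/dbtest");
        ds.setUsername("username");
        ds.setPassword("password");
        // With a validation query configured, DBCP2 validates connections by
        // executing this query rather than calling Connection.isValid, which
        // the old Hive driver does not implement.
        ds.setValidationQuery("SELECT 1");
        try (Connection conn = ds.getConnection()) {
            System.out.println("Pooled connection obtained: " + !conn.isClosed());
        }
    }
}

If this sketch gets past createPoolableConnectionFactory, the pooling layer itself is fine and the problem is confined to the driver's isValid support.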
