Not able to write spark dataset to database using jdbc
Problem description
I need to write my Spark dataset to an Oracle database table. I am using the dataset write method with append mode, but I get an AnalysisException when the Spark job is triggered on the cluster with the spark2-submit command.
I have read the JSON file, flattened it, and set it into a dataset as abcDataset.
Spark version - 2
Database - Oracle
JDBC driver - oracle.jdbc.driver.OracleDriver
Programming language - Java
Dataset<Row> abcDataset = dataframe.select(col("abc") /* ...and other columns */);
Properties dbProperties = new Properties();
InputStream is = SparkReader.class.getClassLoader().getResourceAsStream("dbProperties.yaml");
dbProperties.load(is);
String jdbcUrl = dbProperties.getProperty("jdbcUrl");
dbProperties.put("driver","oracle.jdbc.driver.OracleDriver");
String where = "USER123.PERSON";
abcDataset.write().format("org.apache.spark.sql.execution.datasources.jdbc.DefaultSource").option("driver", "oracle.jdbc.driver.OracleDriver").mode("append").jdbc(jdbcUrl, where, dbProperties);
Expected - to write into the database, but getting the error below -
org.apache.spark.sql.AnalysisException: Multiple sources found for jdbc (org.apache.spark.sql.execution.datasources.jdbc.JdbcRelationProvider, org.apache.spark.sql.execution.datasources.jdbc.DefaultSource), please specify the fully qualified class name.;
at org.apache.spark.sql.execution.datasources.DataSource$.lookupDataSource(DataSource.scala:670)
Do we need to set any additional property in the spark-submit command, since I am running this on a cluster, or is any step missing?
Answer
You need to use either abcDataset.write().jdbc(...) or abcDataset.write().format("jdbc") when you are writing via JDBC from Spark to an RDBMS, rather than passing the internal DefaultSource class name to format(). Mixing the explicit DefaultSource format with the .jdbc(...) call leaves Spark with two candidate providers for the "jdbc" source, which is the ambiguity the AnalysisException reports.
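A minimal sketch of the two corrected forms, assuming jdbcUrl, dbProperties, and abcDataset are already set up as in the question; the table name "USER123.PERSON" is taken from the question's code.

```java
import java.util.Properties;

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SaveMode;

public class JdbcWriteExample {

    static void writeToOracle(Dataset<Row> abcDataset, String jdbcUrl, Properties dbProperties) {
        // Option 1: the convenience method. It selects the JDBC source
        // internally, so no format(...) call is needed.
        abcDataset.write()
                .mode(SaveMode.Append)
                .jdbc(jdbcUrl, "USER123.PERSON", dbProperties);
    }

    static void writeToOracleGeneric(Dataset<Row> abcDataset, String jdbcUrl) {
        // Option 2: the generic form with the short source name "jdbc"
        // and explicit options, ending in save().
        abcDataset.write()
                .format("jdbc")
                .mode(SaveMode.Append)
                .option("url", jdbcUrl)
                .option("dbtable", "USER123.PERSON")
                .option("driver", "oracle.jdbc.driver.OracleDriver")
                .save();
    }
}
```

Either form is sufficient on its own; the key point is not to combine the fully qualified DefaultSource class name with the .jdbc(...) shortcut.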