How to write to JDBC source with SparkR 1.6.0?


Problem description

With SparkR 1.6.0 I can read from a JDBC source with the following code:

library(magrittr)  # provides the %>% pipe used below

jdbc_url <- "jdbc:mysql://localhost:3306/dashboard?user=<username>&password=<password>"

df <- sqlContext %>%
  loadDF(source  = "jdbc",
         url     = jdbc_url,
         driver  = "com.mysql.jdbc.Driver",
         dbtable = "db.table_name")

But after performing a calculation, when I try to write the data back to the database I hit a roadblock, as attempting...

write.df(df      = df,
         path    = "NULL",
         source  = "jdbc",
         url     = jdbc_url, 
         driver  = "com.mysql.jdbc.Driver",
         dbtable = "db.table_name",
         mode    = "append")

...returns...

ERROR RBackendHandler: save on 55 failed
Error in invokeJava(isStatic = FALSE, objId$id, methodName, ...) : 
  java.lang.RuntimeException: org.apache.spark.sql.execution.datasources.jdbc.DefaultSource does not allow create table as select.
    at scala.sys.package$.error(package.scala:27)
    at org.apache.spark.sql.execution.datasources.ResolvedDataSource$.apply(ResolvedDataSource.scala:259)
    at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:148)
    at org.apache.spark.sql.DataFrame.save(DataFrame.scala:2066)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.api.r.RBackendHandler.handleMethodCall(RBackendHandler.scala:141)
    at org.apache.spark.api.r.RBackendHandler.channelRead0(RBackendHandler.scala:86)
    at org.apache.spark.api.r.RBackendHandler.channelRead0(RBackendHandler.scala:38)
    at io.netty.channel.SimpleChannelIn

Looking around the web I found this, which tells me that a patch for this error was included as of version 2.0.0; as of that version we also get the functions read.jdbc and write.jdbc.
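For contrast, the write.jdbc function mentioned above would make this a one-liner on SparkR >= 2.0.0 (not usable on 1.6.0; the table name, URL, and credentials below are placeholders carried over from the question):

```r
# SparkR >= 2.0.0 only -- extra named arguments become JDBC connection properties
write.jdbc(df,
           url       = "jdbc:mysql://localhost:3306/dashboard",
           tableName = "db.table_name",
           mode      = "append",
           user      = "<username>",
           password  = "<password>")
```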

For this question, though, assume I'm stuck with SparkR v1.6.0. Is there a way to write to JDBC sources (i.e. is there a workaround that would allow me to use DataFrameWriter.jdbc() from SparkR)?

Answer

The short answer is no: the JDBC write method was not supported by SparkR until version 2.0.0.
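That said, a workaround sketch is possible on 1.6.0 by going through SparkR's internal JVM bridge. Note that SparkR:::callJMethod and SparkR:::newJObject are private, unsupported APIs that may change between releases, and this assumes the MySQL JDBC driver JAR is on Spark's classpath:

```r
# Build a java.util.Properties holding the connection credentials
props <- SparkR:::newJObject("java.util.Properties")
SparkR:::callJMethod(props, "setProperty", "user", "<username>")
SparkR:::callJMethod(props, "setProperty", "password", "<password>")

# Reach the underlying Java DataFrameWriter via the DataFrame's sdf slot
# and call DataFrameWriter.jdbc(url, table, connectionProperties) directly
writer <- SparkR:::callJMethod(df@sdf, "write")
writer <- SparkR:::callJMethod(writer, "mode", "append")
SparkR:::callJMethod(writer, "jdbc",
                     "jdbc:mysql://localhost:3306/dashboard",
                     "db.table_name",
                     props)
```

This sidesteps the save() code path that raises the "does not allow create table as select" error, because DataFrameWriter.jdbc() did exist on the JVM side in Spark 1.6; only the SparkR wrapper was missing.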
