SQLITE_ERROR: Connection is closed when connecting from Spark via JDBC to SQLite database


Problem description

I am using Apache Spark 1.5.1 and trying to connect to a local SQLite database named clinton.db. Creating a data frame from a table of the database works fine but when I do some operations on the created object, I get the error below which says "SQL error or missing database (Connection is closed)". Funny thing is that I get the result of the operation nevertheless. Any idea what I can do to solve the problem, i.e., avoid the error?

Start command for spark-shell:

../spark/bin/spark-shell --master local[8] --jars ../libraries/sqlite-jdbc-3.8.11.1.jar --classpath ../libraries/sqlite-jdbc-3.8.11.1.jar

Reading from the database:

val emails = sqlContext.read.format("jdbc").options(Map("url" -> "jdbc:sqlite:../data/clinton.sqlite", "dbtable" -> "Emails")).load()

Simple count (fails):

emails.count

Error:

15/09/30 09:06:39 WARN JDBCRDD: Exception closing statement
java.sql.SQLException: [SQLITE_ERROR] SQL error or missing database (Connection is closed)
    at org.sqlite.core.DB.newSQLException(DB.java:890)
    at org.sqlite.core.CoreStatement.internalClose(CoreStatement.java:109)
    at org.sqlite.jdbc3.JDBC3Statement.close(JDBC3Statement.java:35)
    at org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$$anon$1.org$apache$spark$sql$execution$datasources$jdbc$JDBCRDD$$anon$$close(JDBCRDD.scala:454)
    at org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$$anon$1$$anonfun$8.apply(JDBCRDD.scala:358)
    at org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$$anon$1$$anonfun$8.apply(JDBCRDD.scala:358)
    at org.apache.spark.TaskContextImpl$$anon$1.onTaskCompletion(TaskContextImpl.scala:60)
    at org.apache.spark.TaskContextImpl$$anonfun$markTaskCompleted$1.apply(TaskContextImpl.scala:79)
    at org.apache.spark.TaskContextImpl$$anonfun$markTaskCompleted$1.apply(TaskContextImpl.scala:77)
    at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
    at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
    at org.apache.spark.TaskContextImpl.markTaskCompleted(TaskContextImpl.scala:77)
    at org.apache.spark.scheduler.Task.run(Task.scala:90)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
res1: Long = 7945

Recommended answer

I got the same error today, and the important line is just before the exception:

15/11/30 12:13:02 INFO jdbc.JDBCRDD: closed connection

15/11/30 12:13:02 WARN jdbc.JDBCRDD: Exception closing statement
java.sql.SQLException: [SQLITE_ERROR] SQL error or missing database (Connection is closed)
    at org.sqlite.core.DB.newSQLException(DB.java:890)
    at org.sqlite.core.CoreStatement.internalClose(CoreStatement.java:109)
    at org.sqlite.jdbc3.JDBC3Statement.close(JDBC3Statement.java:35)
    at org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$$anon$1.org$apache$spark$sql$execution$datasources$jdbc$JDBCRDD$$anon$$close(JDBCRDD.scala:454)

So Spark successfully closes the JDBC connection, and then fails to close the JDBC statement.
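To make that failure mode concrete, here is a minimal standalone sketch (my own illustration, not code from the question or answer) of the driver behaviour the stack trace points at: with the sqlite-jdbc driver used here, closing a Statement whose Connection has already been closed raises this same SQLException.

// Illustrative sketch only: reproduces the "Connection is closed" error outside Spark.
// Assumes sqlite-jdbc is on the classpath and the database path from the question exists.
import java.sql.DriverManager

val conn = DriverManager.getConnection("jdbc:sqlite:../data/clinton.sqlite")
val stmt = conn.createStatement()
val rs = stmt.executeQuery("SELECT COUNT(*) FROM Emails")
if (rs.next()) println(rs.getLong(1))   // the result is still produced, just like emails.count

conn.close()   // the connection is already closed by the first close() call ...
stmt.close()   // ... so closing the statement afterwards throws
               // java.sql.SQLException: [SQLITE_ERROR] SQL error or missing database (Connection is closed)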

Looking at the source, close() is called twice:

Line 358 (org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD, Spark 1.5.1)

context.addTaskCompletionListener{ context => close() }

Line 469:

override def hasNext: Boolean = {
  if (!finished) {
    if (!gotNext) {
      nextValue = getNext()
      if (finished) {
        close()
      }
      gotNext = true
    }
  }
  !finished
}

If you look at the close() method (line 443)

def close() {
  if (closed) return

you can see that it checks the variable closed, but that value is never set to true.
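For completeness, the kind of guard that would make the second call harmless could look roughly like this. This is my own sketch of the idea, not the actual Spark patch; it assumes the rs, stmt and conn variables surrounding close() in JDBCRDD, and the real method also wraps each individual close in its own try/catch and logs warnings.

// Sketch only: mark the iterator closed on the first call so the
// task-completion listener's second call becomes a no-op.
def close() {
  if (closed) return
  try {
    if (null != rs) rs.close()
    if (null != stmt) stmt.close()
    if (null != conn) conn.close()
  } finally {
    closed = true   // the assignment that is missing in Spark 1.5.1
  }
}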

If I see it correctly, this bug is still present in master. I have filed a bug report.


  • Source: JDBCRDD.scala (line numbers differ slightly): https://github.com/apache/spark/blob/branch-1.5/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/jdbc/JDBCRDD.scala
