Getting Many Errors when starting Spark-Shell

This article describes the many errors that can appear when starting Spark-Shell and how to resolve them.

Problem Description

I downloaded Spark using "brew install apache-spark". When I start spark-shell, I get tons of errors. When I try creating a Spark session:

val spark = SparkSession.builder().appName("Spark Postgresql Example").getOrCreate()

I get the following errors:

Unable to open a test connection to the given database. JDBC url = jdbc:derby:;databaseName=metastore_db;create=true, username = APP. Terminating connection pool (set lazyInit to true if you expect to start your database after your app). Original Exception: ------
java.sql.SQLException: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@73a116d, see the next exception for details.

Caused by: ERROR XJ040: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@73a116d, see the next exception for details.

org.datanucleus.exceptions.NucleusDataStoreException: Unable to open a test connection to the given database. JDBC url = jdbc:derby:;databaseName=metastore_db;create=true, username = APP. Terminating connection pool (set lazyInit to true if you expect to start your database after your app). Original Exception: ------
java.sql.SQLException: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@73a116d, see the next exception for details.

Caused by: java.sql.SQLException: Unable to open a test connection to the given database. JDBC url = jdbc:derby:;databaseName=metastore_db;create=true, username = APP. Terminating connection pool (set lazyInit to true if you expect to start your database after your app). Original Exception: ------
java.sql.SQLException: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@73a116d, see the next exception for details.

Nested Throwables StackTrace:
java.sql.SQLException: Unable to open a test connection to the given database. JDBC url = jdbc:derby:;databaseName=metastore_db;create=true, username = APP. Terminating connection pool (set lazyInit to true if you expect to start your database after your app). Original Exception: ------
java.sql.SQLException: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@73a116d, see the next exception for details.

17/07/18 13:12:35 WARN HiveMetaStore: Retrying creating default database after error: Unable to open a test connection to the given database. JDBC url = jdbc:derby:;databaseName=metastore_db;create=true, username = APP. Terminating connection pool (set lazyInit to true if you expect to start your database after your app). Original Exception: ------
java.sql.SQLException: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@73a116d, see the next exception for details.

javax.jdo.JDOFatalDataStoreException: Unable to open a test connection to the given database. JDBC url = jdbc:derby:;databaseName=metastore_db;create=true, username = APP. Terminating connection pool (set lazyInit to true if you expect to start your database after your app). Original Exception: ------
java.sql.SQLException: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@73a116d, see the next exception for details.

17/07/18 13:12:35 ERROR Schema: Failed initialising database.
Unable to open a test connection to the given database. JDBC url = jdbc:derby:;databaseName=metastore_db;create=true, username = APP. Terminating connection pool (set lazyInit to true if you expect to start your database after your app). Original Exception: ------
java.sql.SQLException: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@73a116d, see the next exception for details.

And many more...

 scala> import spark.implicits._
 <console>:18: error: not found: value spark
   import spark.implicits._
          ^

Recommended Answer

This error appears when spark-shell does not exit gracefully and a new session then invokes spark-shell. Try restarting spark-shell.

If it still happens, you can try the following to create a session:

var sparkSession = org.apache.spark.sql.SparkSession.builder.getOrCreate()  // reuses the active session if one exists
var sparkContext = sparkSession.sparkContext
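
As a quick sanity check (this snippet is not part of the original answer; the app name is simply taken from the question), you can confirm the recovered session is usable before re-running the import that failed:

import org.apache.spark.sql.SparkSession

// Reuse or create a session; the app name is just the one from the question.
val sparkSession = SparkSession.builder.appName("Spark Postgresql Example").getOrCreate()

println(sparkSession.version)    // prints the Spark version if the session started cleanly
sparkSession.range(5).show()     // runs a trivial query through the session

import sparkSession.implicits._  // equivalent to the spark.implicits._ import that failed in the question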

You can also try removing metastore_db/dbex.lck; this should fix your problem.
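
The lock file lives under the metastore_db directory created where spark-shell was launched, so a simple rm metastore_db/dbex.lck from that directory does the job. Purely as an illustration (not from the original answer), the same cleanup from the Scala REPL might look like this; db.lck is the second lock file Derby keeps alongside dbex.lck:

import java.nio.file.{Files, Paths}

// Delete stale Derby lock files left behind by a spark-shell that did not exit cleanly.
// Run this from the directory in which spark-shell was previously started.
Seq("metastore_db/dbex.lck", "metastore_db/db.lck").foreach { lock =>
  Files.deleteIfExists(Paths.get(lock))
}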

You can also configure hive-site.xml in {SPARK_HOME}/conf. The context automatically creates a metastore called metastore_db and a folder called warehouse in the current directory, so fixing any permission issues in the directory from which you are launching spark-shell can also solve the problem.
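
If you would rather not edit hive-site.xml, one rough alternative (a sketch assuming /tmp/spark-warehouse is writable; the path is a placeholder, not from the original answer) is to point the warehouse directory somewhere you control when building the session. Note that this only moves the warehouse folder; the Derby metastore_db is still created in the current working directory unless you also change the JDO connection URL in hive-site.xml.

import org.apache.spark.sql.SparkSession

// Sketch: spark.sql.warehouse.dir controls where the "warehouse" folder is created.
// /tmp/spark-warehouse is a placeholder; use any directory your user can write to.
val spark = SparkSession.builder
  .appName("Spark Postgresql Example")
  .config("spark.sql.warehouse.dir", "/tmp/spark-warehouse")
  .getOrCreate()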

That concludes this article on the many errors that appear when starting Spark-Shell; hopefully the recommended answer is helpful.
