Spark on embedded mode - user/hive/warehouse not found
Question
I'm using Apache Spark in embedded local mode. I have all the dependencies included in my pom.xml and in the same version (spark-core_2.10, spark-sql_2.10, and spark-hive_2.10).
I just want to run a HiveQL query to create a table (stored as Parquet).
Running the following (rather simple) code:
import java.io.IOException;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.hive.HiveContext;

public class App {
    public static void main(String[] args) throws IOException, ClassNotFoundException {
        SparkConf sparkConf = new SparkConf()
                .setAppName("JavaSparkSQL")
                .setMaster("local[2]")
                .set("spark.executor.memory", "1g");
        JavaSparkContext ctx = new JavaSparkContext(sparkConf);
        HiveContext sqlContext = new HiveContext(ctx.sc());

        String createQuery = "CREATE TABLE IF NOT EXISTS Test (id int, name string) STORED AS PARQUET";
        sqlContext.sql(createQuery);
    }
}
...returns the following exception:
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:file:/user/hive/warehouse/test is not a directory or unable to create one)
I can see the metastore_db folder created in the root of the project.
I searched around and the solutions I found didn't help; most of them were not for the embedded mode.
- One solution was to check the permissions; I'm using the same user for everything.
- Another solution was to create the folder manually in HDFS; I did, and I can navigate to /user/hive/warehouse/test.
- Another solution was to set the metastore location manually by adding:
sqlContext.sql("SET hive.metastore.warehouse.dir=hdfs://localhost:9000/user/hive/warehouse");
I'm running out of ideas right now; can someone provide any other suggestions?
Answer
Because you're running in local embedded mode, HDFS is not being considered. This is why the error says file:/user/hive/warehouse/test rather than hdfs://localhost:9000/user/hive/warehouse/test. It expects /user/hive/warehouse/test to exist on your local machine. Try creating it locally.
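Creating the warehouse path ahead of time can be done with plain Java NIO. This is a minimal sketch, not part of the original answer: it builds the directory chain under the JVM temp directory so it runs without special permissions; on the real machine you would point it at /user/hive/warehouse/test instead.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class CreateWarehouseDir {
    public static void main(String[] args) throws IOException {
        // On the actual machine this would be Paths.get("/user/hive/warehouse/test");
        // the temp-dir prefix here just keeps the sketch runnable anywhere.
        Path warehouse = Paths.get(System.getProperty("java.io.tmpdir"),
                "user", "hive", "warehouse", "test");
        // createDirectories creates all missing parents and is a no-op
        // if the directory already exists.
        Files.createDirectories(warehouse);
        System.out.println(Files.isDirectory(warehouse));
    }
}
```

Running this prints true once the directory exists, after which the DDLTask should no longer fail on the missing local path.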