Unable to write data on Hive using Spark
Problem description
I am using Spark 1.6 and creating a HiveContext from the SparkContext. When I save data into Hive, it fails with the error shown below. I am using a Cloudera VM: Hive runs inside the VM, Spark runs on my system, and I can reach the VM by its IP address. I have started the Thrift server and HiveServer2 on the VM, and I use the Thrift server URI for hive.metastore.uris:
import org.apache.spark.sql.SaveMode
import org.apache.spark.sql.hive.HiveContext

val hiveContext = new HiveContext(sc)
// Point the context at the metastore Thrift service on the VM
hiveContext.setConf("hive.metastore.uris", "thrift://IP:9083")
............
............
// Append the DataFrame into the existing Hive table "test"
df.write.mode(SaveMode.Append).insertInto("test")
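As an aside, a quick way to confirm that the metastore connection itself works is to query it before writing (a minimal sketch, not from the original post):

// Lists tables through the metastore; fails fast if thrift://IP:9083 is unreachable
hiveContext.sql("SHOW TABLES").show()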
I get the following error:
FAILED: SemanticException java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
Recommended answer
Probably hive-site.xml is not available inside the Spark conf folder, so Spark cannot find the metastore configuration. I have added the details below.
Add hive-site.xml to the Spark configuration folder by creating a symlink that points to hive-site.xml in the Hive configuration folder:
sudo ln -s /usr/lib/hive/conf/hive-site.xml /usr/lib/spark/conf/hive-site.xml
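For reference, the key entry Spark needs from that file is the metastore URI (a minimal sketch; as in the question, thrift://IP:9083 stands in for the VM's actual address):

<configuration>
  <property>
    <name>hive.metastore.uris</name>
    <value>thrift://IP:9083</value>
  </property>
</configuration>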
After the above steps, restarting spark-shell should help.
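If editing /usr/lib/spark/conf is not possible, putting the Hive configuration directory on the driver classpath at launch should also let Spark pick up hive-site.xml (an alternative sketch, assuming the same paths as above):

spark-shell --driver-class-path /usr/lib/hive/conf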