Can't submit a Livy request for a jar file stored locally on the head nodes


Problem description


I'm trying to reference the documentation below to make a livy request to an HDI 3.6 cluster and run a jar file that is stored locally on the head nodes. 

https://docs.microsoft.com/en-us/azure/hdinsight/spark/apache-spark-livy-rest-interface


Has anyone been able to do this? The documentation claims it can be done, but the configuration that needs to change (livy.file.local-dir-whitelist) does not appear in my Spark configurations in Ambari. I'd like to be able to submit a body like the one below, where the file is not referenced with a wasb URL. I can currently start an uber jar from a wasb path, but we are trying to speed up job startup by avoiding the overhead of copying it to the temporary Livy staging directory.

 { "file":"/home/livy/SparkSimpleApp.jar", "className":"com.microsoft.spark.example.WasbIOTest" }
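For context, a body like the one above is posted to Livy's /batches endpoint. The sketch below shows how such a request is typically assembled and submitted from Python; the cluster name, credentials, and header values are placeholders, not details from the question:

```python
import json

# Request body from the question: run a jar from a local path on the
# head node. This only works once that directory has been whitelisted
# via livy.file.local-dir-whitelist in Livy's configuration.
payload = {
    "file": "/home/livy/SparkSimpleApp.jar",
    "className": "com.microsoft.spark.example.WasbIOTest",
}

# Livy expects an X-Requested-By header on POST requests when its CSRF
# protection is enabled; the value itself is arbitrary.
headers = {"Content-Type": "application/json", "X-Requested-By": "admin"}

body = json.dumps(payload)
print(body)

# Submitting against an HDI cluster would look roughly like this
# (CLUSTERNAME and PASSWORD are hypothetical placeholders):
#
#   import requests
#   resp = requests.post(
#       "https://CLUSTERNAME.azurehdinsight.net/livy/batches",
#       data=body, headers=headers, auth=("admin", "PASSWORD"))
#   print(resp.json())  # batch id and state
```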

Thanks for your help!
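As an aside, the whitelist the question refers to lives in Livy's own configuration file (livy.conf) rather than in Ambari's Spark configs, which may be why it does not show up there. A hedged sketch of the setting; the file path and whitelisted directory are assumptions for illustration:

```
# /etc/livy/conf/livy.conf (exact path may vary by HDI version)
# Directories from which local files may be referenced in Livy requests:
livy.file.local-dir-whitelist = /home/livy/
```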

Answer

Hi,


I have noticed that you have provided feedback on the same document, and I would ask you to continue the discussion on the GitHub issue that has been opened:

https://github.com/MicrosoftDocs/azure-docs/issues/29933

Hope this helps.
