Deploy Apache Spark application from another application in Java, best practice

Question

I am a new user of Spark. I have a web service that allows a user to request that the server perform a complex data analysis by reading from a database and pushing the results back to the database. I have moved those analyses into various Spark applications. Currently I use spark-submit to deploy these applications.

However, I am curious: when my web server (written in Java) receives a user request, what is considered the "best practice" way to initiate the corresponding Spark application? Spark's documentation seems to suggest using spark-submit, but I would rather not pipe the command out to a terminal to perform this action. I saw an alternative, Spark-JobServer, which provides a RESTful interface to do exactly this, but my Spark applications are written in either Java or R, which do not seem to interface well with Spark-JobServer.

Is there another best practice for kicking off a Spark application from a web server (in Java) and waiting for a status result indicating whether the job succeeded or failed?

Any ideas of what other people are doing to accomplish this would be very helpful! Thanks!

Answer

We are using Spark Job-Server and it works fine with Java as well: just build a jar of the Java code and wrap it with a thin Scala class so it conforms to the Spark Job-Server job interface.
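For the specific concern in the question (launching from Java without piping a command to a terminal), Spark itself also ships a programmatic launcher, `org.apache.spark.launcher.SparkLauncher`, which starts an application as a child process and reports its state back through a handle. A minimal sketch, assuming the `spark-launcher` artifact is on the classpath; the Spark home, jar path, main class, and master URL below are placeholders:

```java
import java.util.concurrent.CountDownLatch;

import org.apache.spark.launcher.SparkAppHandle;
import org.apache.spark.launcher.SparkLauncher;

public class LaunchFromWebServer {
    public static void main(String[] args) throws Exception {
        CountDownLatch done = new CountDownLatch(1);

        // Launch the application in-process instead of shelling out to spark-submit.
        SparkAppHandle handle = new SparkLauncher()
                .setSparkHome("/opt/spark")               // placeholder path
                .setAppResource("/jobs/analysis.jar")     // placeholder jar
                .setMainClass("com.example.AnalysisJob")  // placeholder class
                .setMaster("spark://master:7077")         // placeholder master URL
                .startApplication(new SparkAppHandle.Listener() {
                    @Override
                    public void stateChanged(SparkAppHandle h) {
                        // Terminal states include FINISHED, FAILED, and KILLED.
                        if (h.getState().isFinal()) {
                            done.countDown();
                        }
                    }

                    @Override
                    public void infoChanged(SparkAppHandle h) {
                        // Called when e.g. the application ID becomes available.
                    }
                });

        done.await();  // block until the job reaches a terminal state
        System.out.println("Job ended in state: " + handle.getState());
    }
}
```

In a real web server you would of course not block the request thread; the listener callback answers the "wait for a status result" part of the question, and the handle also exposes `stop()` and `kill()` if the user cancels the request.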
