Multiple SparkSessions in single JVM

Question

I have a query regarding creating multiple Spark sessions in one JVM. I have read that creating multiple contexts was not recommended in earlier versions of Spark. Is this true of the SparkSession in Spark 2.0 as well?
I am thinking of making a call to a web service or a servlet from the UI; the service would create a Spark session, perform some operation, and return the result. This would result in a Spark session being created for every request from the client side. Is this practice recommended?
Say I have a method like:

public void runSpark() throws Exception {
SparkSession spark = SparkSession
.builder()
.master("spark://<masterURL>")
.appName("JavaWordCount")
.getOrCreate();
and so on....
If I put this method in a web service, will there be any JVM issues? As such, I am able to invoke this method multiple times from a main method, but I am not sure if this is good practice.
Answer
It is not supported and won't be. SPARK-2243 is resolved as Won't Fix.
If you need multiple contexts, there are different projects which can help you (Mist, Livy).
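Note that within one JVM, a second call to getOrCreate() does not create a second session; it returns the one that already exists. If you only need per-request isolation of SQL configuration and temporary views, SparkSession.newSession() provides that over the same shared SparkContext. A minimal sketch, assuming Spark 2.x on the classpath and using a local master in place of the spark://<masterURL> from the question:

```java
import org.apache.spark.sql.SparkSession;

public class SessionReuse {
    public static void main(String[] args) {
        SparkSession first = SparkSession.builder()
                .master("local[*]")          // stand-in for spark://<masterURL>
                .appName("JavaWordCount")
                .getOrCreate();

        // A second getOrCreate() in the same JVM returns the SAME session.
        SparkSession second = SparkSession.builder().getOrCreate();
        System.out.println(first == second); // true

        // newSession() shares the underlying SparkContext but isolates
        // SQL configuration and temporary views -- one option for
        // per-request state inside a single long-lived service.
        SparkSession isolated = first.newSession();
        System.out.println(first.sparkContext() == isolated.sparkContext()); // true

        first.stop();
    }
}
```

So a web service would typically hold one long-lived SparkSession (or hand the work off to a job server such as Livy) rather than building a fresh session per request.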