Debugging Spark Applications


Problem description



I am trying to debug a Spark Application on a cluster using a master and several worker nodes. I have been successful at setting up the master node and worker nodes using Spark standalone cluster manager. I downloaded the spark folder with binaries and use the following commands to setup worker and master nodes. These commands are executed from the spark directory.

command for launching master

./sbin/start-master.sh

command for launching worker node

./bin/spark-class org.apache.spark.deploy.worker.Worker master-URL

command for submitting application

./bin/spark-submit --class Application --master URL ~/app.jar

Now, I would like to understand the flow of control through the Spark source code on the worker nodes when I submit my application (I just want to use one of the given examples that uses reduce()). I am assuming I should set up Spark in Eclipse. The Eclipse setup link on the Apache Spark website seems to be broken. I would appreciate some guidance on setting up Spark and Eclipse to enable stepping through Spark source code on the worker nodes.

Thanks!

Solution

It's important to distinguish between debugging the driver program and debugging one of the executors. They require different options passed to spark-submit.

For debugging the driver you can add the following to your spark-submit command. Then set your remote debugger to connect to the node you launched your driver program on.

--driver-java-options -agentlib:jdwp=transport=dt_socket,server=y,suspend=y,address=5005
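Putting it together, a full submit command for driver debugging might look like the sketch below. The class name Application, the jar path ~/app.jar, and the master URL are placeholders taken from the question, not a real deployment:

```shell
# Hedged sketch: Application, ~/app.jar, and the master URL are placeholders.
# server=y,suspend=y makes the driver JVM pause at startup until a debugger attaches,
# so you can set breakpoints before any Spark code runs.
./bin/spark-submit \
  --class Application \
  --master spark://master-host:7077 \
  --driver-java-options -agentlib:jdwp=transport=dt_socket,server=y,suspend=y,address=5005 \
  ~/app.jar
```

In Eclipse, the matching debugger side is a "Remote Java Application" debug configuration pointing at the driver host on port 5005.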

In this example port 5005 was specified, but you may need to customize that if something is already running on that port.

Connecting to an executor is similar: add the following options to your spark-submit command.

--num-executors 1 --executor-cores 1 --conf "spark.executor.extraJavaOptions=-agentlib:jdwp=transport=dt_socket,server=n,address=wm1b0-8ab.yourcomputer.org:5005,suspend=n"
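As a complete command, this might look like the following sketch. The hostname is the placeholder address from the answer (replace it with your own workstation's address), and Application and ~/app.jar again come from the question:

```shell
# Hedged sketch: the address host is a placeholder for your own machine.
# server=n means the executor JVM dials *out* to a debugger that is already
# listening on port 5005, rather than waiting for an incoming connection.
./bin/spark-submit \
  --class Application \
  --master yarn-client \
  --num-executors 1 --executor-cores 1 \
  --conf "spark.executor.extraJavaOptions=-agentlib:jdwp=transport=dt_socket,server=n,address=wm1b0-8ab.yourcomputer.org:5005,suspend=n" \
  ~/app.jar
```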

Replace the address with your local computer's address. (It's a good idea to test that you can access it from your spark cluster).

In this case, start your debugger in listening mode, then start your spark program and wait for the executor to attach to your debugger. It's important to set the number of executors to 1, or else multiple executors will all try to connect to your debugger, likely causing problems.
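For the listening-mode setup, any JDWP-capable debugger works. As a quick sanity check outside the IDE, the JDK's jdb tool has a listen mode you can start before submitting:

```shell
# Start jdb listening on port 5005 *before* running spark-submit;
# the executor's server=n JDWP agent will connect out to this socket.
jdb -listen 5005
```

In Eclipse, the equivalent is a "Remote Java Application" configuration with the connection type set to socket-listen rather than socket-attach.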

These examples are for running with the Spark master set to yarn-client, although they may also work when running under Mesos. If you're running in yarn-cluster mode, you may have to set the driver to attach to your debugger rather than attaching your debugger to the driver, since you won't necessarily know in advance which node the driver will be executing on.

