Running custom Java class in PySpark

Question

I'm trying to run a custom HDFS reader class in PySpark. This class is written in Java and I need to access it from PySpark, either from the shell or with spark-submit.

In PySpark, I retrieve the JavaGateway from the SparkContext (sc._gateway).
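For reference, a minimal sketch of getting the JVM view from a live SparkContext (in the PySpark shell `sc` already exists; note that `_gateway` is a private attribute, so this relies on PySpark internals):

from pyspark import SparkContext

sc = SparkContext.getOrCreate()  # the PySpark shell provides `sc` already
jvm = sc._gateway.jvm            # Py4J view of the driver-side JVM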

Say I have a class:

package org.foo.module

public class Foo {

    public int fooMethod() {
        return 1;
    }

}

I've tried to package it into a jar and pass it to pyspark with the --jars option, and then run:

from py4j.java_gateway import java_import

jvm = sc._gateway.jvm
java_import(jvm, "org.foo.module.*")  # import the whole package into the JVM view

foo = jvm.org.foo.module.Foo()

But I get the error:

Py4JError: Trying to call a package.

Can someone help with this? Thanks.

Answer

In PySpark, try the following:

from py4j.java_gateway import java_import

# Importing the fully qualified name makes the class reachable by its simple name
java_import(sc._gateway.jvm, "org.foo.module.Foo")

foo = sc._gateway.jvm.Foo()
foo.fooMethod()  # returns 1

Make sure that you have compiled your Java code into a jar and that the jar is on the driver's classpath as well as the executors' when you submit the Spark job (the "Py4JError: Trying to call a package" above typically means the driver JVM could not load the class, so Py4J fell back to treating the dotted name as a package):

spark-submit --driver-class-path "name_of_your_jar_file.jar" --jars "name_of_your_jar_file.jar" name_of_your_python_file.py
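To tie it together, here is a minimal sketch of what name_of_your_python_file.py might contain. The jar name and the org.foo.module.Foo class come from the question; the app name and surrounding boilerplate are illustrative:

from pyspark import SparkContext
from py4j.java_gateway import java_import

sc = SparkContext(appName="JavaClassFromPySpark")

# Make the class resolvable by its simple name on the JVM view
java_import(sc._gateway.jvm, "org.foo.module.Foo")

foo = sc._gateway.jvm.Foo()
print(foo.fooMethod())  # prints 1

sc.stop()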
