Running custom Java class in PySpark


Question


I'm trying to run a custom HDFS reader class in PySpark. This class is written in Java and I need to access it from PySpark, either from the shell or with spark-submit.


In PySpark, I retrieve the JavaGateway from the SparkContext (sc._gateway).

Say I have a class:

package org.foo.module;

public class Foo {

    public int fooMethod() {
        return 1;
    }

}
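For reference, one minimal way to build such a jar; the source path and jar name below are assumptions, not from the original post:

```shell
# Assumed layout: Foo.java lives under src/org/foo/module/ (matching the package)
javac -d classes src/org/foo/module/Foo.java

# Bundle the compiled classes into a jar (the name is a placeholder)
jar cf foo-module.jar -C classes .
```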


I've tried packaging it into a jar, passing it with the --jars option to pyspark, and then running:

from py4j.java_gateway import java_import

jvm = sc._gateway.jvm
java_import(jvm, "org.foo.module.*")

foo = jvm.org.foo.module.Foo()


But I get the error:

Py4JError: Trying to call a package.


Can someone help with this? Thanks.

Answer

In PySpark, try the following:

from py4j.java_gateway import java_import

# Make the class visible in the gateway's JVM view
java_import(sc._gateway.jvm, "org.foo.module.Foo")

# After java_import, the class is reachable directly off the jvm view
foo = sc._gateway.jvm.Foo()
foo.fooMethod()  # returns 1

Make sure that you have compiled your Java code into a runnable jar and submit the Spark job like so. Putting the jar on the driver's classpath is what makes the class visible to Py4J and avoids the "Trying to call a package" error:

spark-submit --driver-class-path "name_of_your_jar_file.jar" --jars "name_of_your_jar_file.jar" name_of_your_python_file.py
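Since the question also asks about running from the shell, the same flags should work when launching the interactive pyspark shell (the jar name is the same placeholder as in the spark-submit example):

```shell
pyspark --driver-class-path "name_of_your_jar_file.jar" --jars "name_of_your_jar_file.jar"
```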
