Spark doesn't run in Windows anymore


Problem description

I have Windows 10 and I followed this guide to install Spark and make it work on my OS, together with the Jupyter Notebook tool. I used this command to instantiate the master and import the packages I needed for my job: pyspark --packages graphframes:graphframes:0.8.1-spark3.0-s_2.12 --master local[2]
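
For context, a minimal sketch of what such a session can do once the package loads; the toy vertices and edges below are hypothetical, and graphframes is assumed to have been pulled in by the --packages flag above:

    from pyspark.sql import SparkSession
    from graphframes import GraphFrame

    spark = SparkSession.builder.getOrCreate()

    # Hypothetical toy graph: GraphFrame expects an "id" column on the
    # vertices and "src"/"dst" columns on the edges.
    v = spark.createDataFrame([("a", "Alice"), ("b", "Bob")], ["id", "name"])
    e = spark.createDataFrame([("a", "b", "follows")], ["src", "dst", "rel"])

    GraphFrame(v, e).inDegrees.show()  # Bob should show an inDegree of 1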

However, I later figured out that no worker was being instantiated under the aforementioned guide, and my tasks were really slow. Therefore, taking inspiration from this, and since I could not find any other way to connect workers to the cluster manager (it was run by Docker), I tried to set everything up manually with the following command: bin\spark-class org.apache.spark.deploy.master.Master
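
For reference, a hypothetical concrete form of that command (the IP address is made up; 7077 is the standalone master's usual port):

    bin\spark-class org.apache.spark.deploy.master.Master --host 192.168.1.10 --port 7077

On startup the master logs the spark://<host>:<port> URL that workers connect to.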

The master was instantiated correctly, so I continued with the next command: bin\spark-class org.apache.spark.deploy.worker.Worker spark://<master_ip>:<port> --host <IP_ADDR>, which returned the following error:

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
21/04/01 14:14:21 INFO Master: Started daemon with process name: 8168@DESKTOP-A7EPMQG
21/04/01 14:14:21 ERROR SparkUncaughtExceptionHandler: Uncaught exception in thread Thread[main,5,main]
java.lang.ExceptionInInitializerError
        at org.apache.spark.unsafe.array.ByteArrayMethods.<clinit>(ByteArrayMethods.java:54)
        at org.apache.spark.internal.config.package$.<init>(package.scala:1006)
        at org.apache.spark.internal.config.package$.<clinit>(package.scala)
        at org.apache.spark.deploy.master.MasterArguments.<init>(MasterArguments.scala:57)
        at org.apache.spark.deploy.master.Master$.main(Master.scala:1123)
        at org.apache.spark.deploy.master.Master.main(Master.scala)
Caused by: java.lang.reflect.InaccessibleObjectException: Unable to make private java.nio.DirectByteBuffer(long,int) accessible: module java.base does not "opens java.nio" to unnamed module @60015ef5
        at java.base/java.lang.reflect.AccessibleObject.checkCanSetAccessible(AccessibleObject.java:357)
        at java.base/java.lang.reflect.AccessibleObject.checkCanSetAccessible(AccessibleObject.java:297)
        at java.base/java.lang.reflect.Constructor.checkCanSetAccessible(Constructor.java:188)
        at java.base/java.lang.reflect.Constructor.setAccessible(Constructor.java:181)
        at org.apache.spark.unsafe.Platform.<clinit>(Platform.java:56)
        ... 6 more

From that moment on, none of the commands I used to run before worked anymore; they all returned the message you can see above. I guess I messed up some Java stuff, but honestly I do not understand what or where.

My java version is:

java version "16" 2021-03-16
Java(TM) SE Runtime Environment (build 16+36-2231)
Java HotSpot(TM) 64-Bit Server VM (build 16+36-2231, mixed mode, sharing)
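
A quick way to confirm which JDK the shell (and therefore Spark's launch scripts) will resolve, using standard Windows commands:

    rem Version of the java.exe that wins on the PATH:
    java -version
    rem Every java.exe on the PATH, in resolution order:
    where java
    rem Spark's Windows launch scripts prefer this JVM when it is set:
    echo %JAVA_HOME%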

Recommended answer

I got the same error just now; the issue seems to be with the Java version. (Spark 3.x officially supports Java 8 and 11, and as of JDK 16 the JVM strongly encapsulates java.base internals by default, which blocks the reflective access Spark needs - exactly the InaccessibleObjectException in the trace above.)

  1. I installed java, python, spark etc. All the latest versions...!
  2. Followed the steps mentioned in the link below:

https://phoenixnap.com/kb/install-spark-on-windows-10

  3. Ran into the same error as you..!
  4. Downloaded the Java SE 8 version from the Oracle site:

https://www.oracle.com/java/technologies/javase/javase-jdk8-downloads.html

Downloaded jdk-8u281-windows-x64.exe

  5. Reset JAVA_HOME (see the sketch just after this list).
  6. Launched spark-shell - it opened perfectly, without any issues.
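
A minimal sketch of steps 5-6 in a Windows command prompt, assuming the JDK landed in its default install directory and that SPARK_HOME was set while following the guide (both paths are hypothetical - adjust them to your machine):

    rem Point JAVA_HOME at the Java 8 install (hypothetical default path):
    setx JAVA_HOME "C:\Program Files\Java\jdk1.8.0_281"
    rem setx only affects new shells, so open a fresh terminal, then:
    "%SPARK_HOME%\bin\spark-shell"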

FYI: I have neither Java nor Spark experience, so if anyone feels something is wrong, please correct me. It just worked for me, so I am sharing the same solution here.. :)

Thanks, Karen
