Spark doesn't run in Windows anymore

Problem description

I have Windows 10 and I followed this guide to install Spark and make it work on my OS, along with the Jupyter Notebook tool. I used this command to instantiate the master and import the packages I needed for my job: pyspark --packages graphframes:graphframes:0.8.1-spark3.0-s_2.12 --master local[2]
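
For reference, the same configuration can also be set up from a notebook cell instead of the pyspark launcher; the following is a minimal sketch, assuming pyspark is importable in the Jupyter kernel (the app name is just illustrative):

from pyspark.sql import SparkSession

# Equivalent of `pyspark --packages graphframes:graphframes:0.8.1-spark3.0-s_2.12 --master local[2]`
# when the session is created from a Jupyter cell rather than from the launcher.
spark = (
    SparkSession.builder
    .appName("graphframes-job")          # illustrative name
    .master("local[2]")
    .config("spark.jars.packages",
            "graphframes:graphframes:0.8.1-spark3.0-s_2.12")
    .getOrCreate()
)

print(spark.version)  # quick sanity check that the JVM started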

However, I later figured out that no worker was being instantiated according to the aforementioned guide, and my tasks were really slow. Therefore, taking inspiration from this, and since I could not find any other way to connect workers to the cluster manager (because it was run by Docker), I tried to set everything up manually with the following commands: bin\spark-class org.apache.spark.deploy.master.Master

The master was correctly instantiated, so I continued with the next command: bin\spark-class org.apache.spark.deploy.worker.Worker spark://<master_ip>:<port> --host <IP_ADDR>, which returned the following error:

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
21/04/01 14:14:21 INFO Master: Started daemon with process name: 8168@DESKTOP-A7EPMQG
21/04/01 14:14:21 ERROR SparkUncaughtExceptionHandler: Uncaught exception in thread Thread[main,5,main]
java.lang.ExceptionInInitializerError
        at org.apache.spark.unsafe.array.ByteArrayMethods.<clinit>(ByteArrayMethods.java:54)
        at org.apache.spark.internal.config.package$.<init>(package.scala:1006)
        at org.apache.spark.internal.config.package$.<clinit>(package.scala)
        at org.apache.spark.deploy.master.MasterArguments.<init>(MasterArguments.scala:57)
        at org.apache.spark.deploy.master.Master$.main(Master.scala:1123)
        at org.apache.spark.deploy.master.Master.main(Master.scala)
Caused by: java.lang.reflect.InaccessibleObjectException: Unable to make private java.nio.DirectByteBuffer(long,int) accessible: module java.base does not "opens java.nio" to unnamed module @60015ef5
        at java.base/java.lang.reflect.AccessibleObject.checkCanSetAccessible(AccessibleObject.java:357)
        at java.base/java.lang.reflect.AccessibleObject.checkCanSetAccessible(AccessibleObject.java:297)
        at java.base/java.lang.reflect.Constructor.checkCanSetAccessible(Constructor.java:188)
        at java.base/java.lang.reflect.Constructor.setAccessible(Constructor.java:181)
        at org.apache.spark.unsafe.Platform.<clinit>(Platform.java:56)
        ... 6 more

From that moment on, none of the commands I used to run before work anymore; they all return the error shown above. I guess I messed up some Java stuff, but honestly I do not understand what or where.

My Java version is:

java version "16" 2021-03-16
Java(TM) SE Runtime Environment (build 16+36-2231)
Java HotSpot(TM) 64-Bit Server VM (build 16+36-2231, mixed mode, sharing)
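
Spark's launcher scripts use the java under JAVA_HOME when that variable is set, and otherwise fall back to the java on PATH, so a quick diagnostic sketch like the one below (illustrative, run from the same environment as the notebook) can confirm which JVM pyspark and spark-class are actually picking up:

import os
import shutil
import subprocess

# Diagnostic sketch: show which Java the Spark launcher scripts would pick up.
print("JAVA_HOME    =", os.environ.get("JAVA_HOME"))
print("java on PATH =", shutil.which("java"))

# `java -version` writes to stderr, hence capture_output and .stderr below.
result = subprocess.run(["java", "-version"], capture_output=True, text=True)
print(result.stderr.strip())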

Recommended answer

I got the same error just now; the issue seems to be with the Java version.

  1. I installed Java, Python, Spark, etc., all the latest versions...!
  2. Followed the steps mentioned in the link below:

https://phoenixnap.com/kb/install-spark-on-windows-10

  3. Got the same error as you..!
  4. Downloaded the Java SE 8 version from the Oracle website:

https://www.oracle.com/java/technologies/javase/javase-jdk8-downloads.html

Downloaded jdk-8u281-windows-x64.exe

  5. Reset JAVA_HOME (a sketch of this step is shown after this list).
  6. Started spark-shell, and this time it opened perfectly where it had previously failed.
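
A minimal sketch of what step 5 can look like when done from the notebook environment itself rather than from the Windows system settings; the JDK install path below is the jdk-8u281 default and may differ on your machine, and the variables must be set before the first SparkSession is created:

import os
from pyspark.sql import SparkSession

# Point the environment at the JDK 8 install before the JVM is launched.
# The path below is the default for jdk-8u281; adjust it to your install.
os.environ["JAVA_HOME"] = r"C:\Program Files\Java\jdk1.8.0_281"
os.environ["PATH"] = os.path.join(os.environ["JAVA_HOME"], "bin") + os.pathsep + os.environ["PATH"]

spark = SparkSession.builder.master("local[2]").getOrCreate()
print(spark.version)  # should now start without the InaccessibleObjectException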

FYI: I have no Java or Spark experience, so if anyone feels something is wrong, please correct me. It simply worked for me, so I am sharing the same solution here.. :)

Thanks, Karan
