How can spark-shell work without installing Scala beforehand?
Question
I have downloaded Spark 1.2.0 (pre-built for Hadoop 2.4). In its quick start doc, it says:
It is available in either Scala or Python.
What confuses me is that my computer didn't have Scala installed separately beforehand (OS X 10.10), yet when I type spark-shell
it works fine, and the output shows:
Using Scala version 2.10.4 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_25)
as the screenshot shows:
I didn't install any Scala distribution before.
How can spark-shell run without Scala?
Answer
tl;dr The Scala binaries are included in the assembly jar when you build or download Spark.
Under "Downloading" in the Spark Overview you can read what is required to run Spark:
Spark runs on both Windows and UNIX-like systems (e.g. Linux, Mac OS). It’s easy to run locally on one machine — all you need is to have java installed on your system PATH, or the JAVA_HOME environment variable pointing to a Java installation.
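In other words, a JVM is the only hard requirement. On OS X, for example, one might satisfy it with a config fragment like the following (illustrative only; the `/usr/libexec/java_home` helper is standard on OS X, but the resulting path varies by system and Java version):

```shell
# Point JAVA_HOME at an installed JDK and put its binaries on the PATH.
# /usr/libexec/java_home prints the path of the default JDK on OS X.
export JAVA_HOME="$(/usr/libexec/java_home)"
export PATH="$JAVA_HOME/bin:$PATH"
```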
Spark runs on Java 6+ and Python 2.6+. For the Scala API, Spark 1.2.0 uses Scala 2.10. You will need to use a compatible Scala version (2.10.x).
It's highly recommended to read the documentation.
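To see why no separate Scala install is needed: a Spark assembly jar is an ordinary zip archive that bundles the Scala runtime classes (e.g. `scala/Predef.class`) alongside Spark's own classes, so the JVM can load everything from the one jar. The sketch below illustrates this with a toy in-memory "jar" (the entry names are stand-ins); pointing the same scan at a real `spark-assembly-*.jar` would show the bundled `scala/` entries:

```python
# Sketch: a jar is a zip archive; the Spark assembly bundles the Scala
# runtime classes, which is why spark-shell works without a local Scala.
# The jar contents below are fake stand-ins for illustration.
import io
import zipfile

def scala_bundled(jar_bytes: bytes) -> bool:
    """Return True if the archive contains Scala runtime class entries."""
    with zipfile.ZipFile(io.BytesIO(jar_bytes)) as jar:
        return any(name.startswith("scala/") for name in jar.namelist())

# Build a toy "assembly jar" in memory with a Scala class entry inside.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as jar:
    jar.writestr("scala/Predef.class", b"")                   # bundled Scala runtime (fake)
    jar.writestr("org/apache/spark/SparkContext.class", b"")  # Spark class (fake)

print(scala_bundled(buf.getvalue()))  # → True
```

Note that this only covers spark-shell's own bundled REPL; to compile your own Scala applications against the Spark API you would still need a compatible Scala toolchain (2.10.x for Spark 1.2.0), as the quoted documentation says.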