Importing my jar to spark shell
Problem description
I have a simple Scala Maven module which is part of a larger project (I created it as described here: https://www.jetbrains.com/help/idea/2016.2/creating-a-maven-module.html):
package com.myorg.simplr
import [...]
@SerialVersionUID(100L)
case class Simplr() {
//class code
}
I am trying to use this class in the spark shell, so I built a jar file "simplr-1.0.jar" and launched the spark shell with --jars simplr-1.0.jar.
Then, when I try to import it, I get the following:
scala> import com.myorg.simplr.Simplr
<console>:25: error: object myorg is not a member of package com
import com.myorg.simplr.Simplr
^
How can I get the import to work?
I built it with Maven, and here's my pom.xml:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <parent>
        <artifactId>my-parent-project</artifactId>
        <groupId>com.myorg</groupId>
        <version>1.0</version>
    </parent>
    <modelVersion>4.0.0</modelVersion>
    <artifactId>simplr</artifactId>
    <version>1.0</version>
    <dependencies>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_2.10</artifactId>
            <version>1.6.0</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-mllib_2.10</artifactId>
            <version>1.6.0</version>
            <scope>provided</scope>
        </dependency>
    </dependencies>
</project>
Recommended answer
Please check the following points and it should work:
1. Start the spark shell with ./spark-shell --jars jar_path
2. Verify that the jar actually contains the class file under the same package you are importing; open the jar and check.
3. After Spark starts, go to http://localhost:4040/environment/ and check whether your jar appears in the classpath entries.
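For point 2, a jar is just a zip archive, so you can check whether the class file really sits under the expected package path without unpacking it. A minimal sketch (in Python rather than Scala, purely for illustration; the stand-in jar built here is hypothetical, not the real simplr-1.0.jar):

```python
import zipfile

def has_class(jar_path, fqcn):
    """Return True if the jar contains the .class entry for the given fully qualified class name."""
    entry = fqcn.replace(".", "/") + ".class"
    with zipfile.ZipFile(jar_path) as jar:
        return entry in jar.namelist()

# Build a stand-in jar for demonstration; in practice, point has_class at your real jar.
with zipfile.ZipFile("simplr-1.0.jar", "w") as jar:
    jar.writestr("com/myorg/simplr/Simplr.class", b"\xca\xfe\xba\xbe")

print(has_class("simplr-1.0.jar", "com.myorg.simplr.Simplr"))  # True
print(has_class("simplr-1.0.jar", "com.myorg.other.Missing"))  # False
```

If has_class returns False for the class you are importing, the package in the jar does not match the package in your import statement, which produces exactly the "object myorg is not a member of package com" error.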