Error: Could not find or load main class in Scala


Problem description

I have installed the Eclipse Scala plugin and the Eclipse Maven plugin for Scala. I am new to Scala, so I first made sure the environment was working by testing a Scala hello-world project. It works as expected.

But I am facing difficulty while trying to execute the project that I checked out from the company's repository. No matter what I do (clean, build, clean install via Maven, etc.), I get "Error: Could not find or load main class com.company.team.spark.sqlutil.testQuery" when trying to run even a small hello-world program inside the project. My hunch is that Eclipse is unable to create class files for the project due to a POM issue, but I have been unable to nail it down even after several tries. Please help me figure this out.

Version: Eclipse Luna Release (4.4.0), Build id: 20140612-0600

Scala 2.10.6

Scala code - testQuery.scala

package com.company.team.spark.sqlutil

object testQuery {
  def main(args: Array[String]): Unit = {
   print ("Hello")
  }
}
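
As an aside (not part of the original question), the same entry point could equivalently be written with Scala's App trait; either form compiles to the com.company.team.spark.sqlutil.testQuery class named in the error message:

package com.company.team.spark.sqlutil

// Equivalent entry point: the App trait supplies main(), so the object body runs at startup
object testQuery extends App {
  print("Hello")
}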

Below is the POM I used.

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>com.company.team.spark</groupId>
    <artifactId>HomeSpark</artifactId>
    <version>0.0.1-SNAPSHOT</version>
    <packaging>jar</packaging>

    <name>HomeSpark</name>
    <url>http://maven.apache.org</url>

    <properties>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
        <lib.dir>${project.basedir}\lib\</lib.dir>
    </properties>

    <dependencies>
    <!--    <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-core</artifactId>
            <version>1.2.1</version>
        </dependency>
        <dependency>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
            <version>3.8.1</version>
            <scope>system</scope>
            <systemPath>${lib.dir}junit-3.8.1.jar</systemPath>
        </dependency>

        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.10</artifactId>
            <version>2.1.0</version>
            <scope>system</scope>
            <systemPath>${lib.dir}spark-core_2.10-2.1.0.jar</systemPath>
        </dependency>

        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_2.10</artifactId>
            <version>2.1.0</version>
            <scope>system</scope>
            <systemPath>${lib.dir}spark-sql_2.10-2.1.0.jar</systemPath>
        </dependency>

        <dependency>
            <groupId>com.databricks</groupId>
            <artifactId>spark-csv_2.10</artifactId>
            <version>1.5.0</version>
            <scope>system</scope>
            <systemPath>${lib.dir}spark-csv_2.10-1.5.0.jar</systemPath>
        </dependency> -->

        <dependency>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
            <version>3.8.1</version>
            <scope>test</scope>
        </dependency>

        <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-core_2.10 -->
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.10</artifactId>
            <version>2.1.0</version>
        </dependency>

        <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-sql_2.10 -->
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_2.10</artifactId>
            <version>2.1.0</version>
        </dependency>

        <!-- https://mvnrepository.com/artifact/com.databricks/spark-csv_2.10 -->
        <dependency>
            <groupId>com.databricks</groupId>
            <artifactId>spark-csv_2.10</artifactId>
            <version>1.5.0</version>
        </dependency>

        <dependency>
            <groupId>com.amazonaws</groupId>
            <artifactId>aws-java-sdk</artifactId>
            <version>1.9.2</version>
        </dependency>

    </dependencies>



    <build>
        <sourceDirectory>${project.basedir}/src/main/scala</sourceDirectory>
        <testOutputDirectory>${project.build.directory}/test-classes</testOutputDirectory>
        <plugins>
            <plugin>
                <groupId>net.alchim31.maven</groupId>
                <artifactId>scala-maven-plugin</artifactId>
                <version>3.1.3</version>
                <executions>
                    <execution>
                        <goals>
                            <goal>compile</goal>
                            <goal>testCompile</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>
</project>

Link to an image of the project structure

Recommended answer

I was able to resolve the issue after opting for the standalone Scala IDE instead of Eclipse with the Scala IDE plugin integrated.

I also changed the pom.xml to the following:

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.company.fuison</groupId>
  <artifactId>SomeCloud</artifactId>
  <version>0.0.1-SNAPSHOT</version>
  <name>${project.artifactId}</name>
  <description>My wonderfull scala app</description>
  <inceptionYear>2015</inceptionYear>
  <licenses>
    <license>
      <name>My License</name>
      <url>http://....</url>
      <distribution>repo</distribution>
    </license>
  </licenses>

  <properties>
    <maven.compiler.source>1.6</maven.compiler.source>
    <maven.compiler.target>1.6</maven.compiler.target>
    <encoding>UTF-8</encoding>
    <scala.version>2.11.5</scala.version>
    <scala.compat.version>2.11</scala.compat.version>
  </properties>

  <dependencies>
    <dependency>
      <groupId>org.scala-lang</groupId>
      <artifactId>scala-library</artifactId>
      <version>${scala.version}</version>
    </dependency>

    <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-core_2.11 -->
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.11</artifactId>
      <version>2.0.0</version>
    </dependency>

    <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-sql_2.11 -->
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-sql_2.11</artifactId>
      <version>2.0.0</version>
    </dependency>

    <dependency>
      <groupId>com.amazonaws</groupId>
      <artifactId>aws-java-sdk</artifactId>
      <version>1.9.2</version>
    </dependency>

    <!-- https://mvnrepository.com/artifact/com.databricks/spark-csv_2.11 -->
    <dependency>
      <groupId>com.databricks</groupId>
      <artifactId>spark-csv_2.11</artifactId>
      <version>1.5.0</version>
    </dependency>
    <!-- Test -->
    <dependency>
      <groupId>junit</groupId>
      <artifactId>junit</artifactId>
      <version>4.12</version>
      <scope>test</scope>
    </dependency>
    <dependency>
      <groupId>org.specs2</groupId>
      <artifactId>specs2-core_${scala.compat.version}</artifactId>
      <version>2.4.16</version>
      <scope>test</scope>
    </dependency>
    <dependency>
      <groupId>org.scalatest</groupId>
      <artifactId>scalatest_${scala.compat.version}</artifactId>
      <version>2.2.4</version>
      <scope>test</scope>
    </dependency>
    <dependency>
      <groupId>org.specs2</groupId>
      <artifactId>specs2-junit_${scala.compat.version}</artifactId>
      <version>2.4.16</version>
      <scope>test</scope>
    </dependency>

    <dependency>
        <groupId>org.apache.logging.log4j</groupId>
        <artifactId>log4j-api-scala_2.11</artifactId>
        <version>2.8.1</version>
    </dependency>

    <dependency>
        <groupId>org.apache.logging.log4j</groupId>
        <artifactId>log4j-core</artifactId>
        <version>2.8.1</version>
    </dependency>



    <!-- https://mvnrepository.com/artifact/org.scala-lang/scala-library 
    <dependency>
        <groupId>org.scala-lang</groupId>
        <artifactId>scala-library</artifactId>
        <version>2.12.1</version>
    </dependency>
    -->
    <!-- https://mvnrepository.com/artifact/com.typesafe.scala-logging/scala-logging_2.11 -->
    <dependency>
        <groupId>com.typesafe.scala-logging</groupId>
        <artifactId>scala-logging_2.11</artifactId>
        <version>3.5.0</version>
    </dependency>


  </dependencies>

  <build>
    <resources>
            <resource>
                <directory>${project.basedir}/config/log4j</directory>
            </resource>
        </resources>

    <sourceDirectory>src/main/scala</sourceDirectory>
    <testSourceDirectory>src/test/scala</testSourceDirectory>
    <plugins>
      <plugin>
        <!-- see http://davidb.github.com/scala-maven-plugin -->
        <groupId>net.alchim31.maven</groupId>
        <artifactId>scala-maven-plugin</artifactId>
        <version>3.2.0</version>
        <executions>
          <execution>
            <goals>
              <goal>compile</goal>
              <goal>testCompile</goal>
            </goals>
            <configuration>
              <args>

                <arg>-dependencyfile</arg>
                <arg>${project.build.directory}/.scala_dependencies</arg>
              </args>
            </configuration>
          </execution>
        </executions>
      </plugin>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-surefire-plugin</artifactId>
        <version>2.18.1</version>
        <configuration>
          <useFile>false</useFile>
          <disableXmlReport>true</disableXmlReport>
          <!-- If you have classpath issue like NoDefClassError,... -->
          <!-- useManifestOnlyJar>false</useManifestOnlyJar -->
          <includes>
            <include>**/*Test.*</include>
            <include>**/*Suite.*</include>
          </includes>
        </configuration>
      </plugin>
    </plugins>
  </build>
</project>
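
If it helps to sanity-check such a setup outside the IDE, below is a minimal, hypothetical smoke test; the object name BuildSmokeTest and the local[*] master are my assumptions, not part of the original answer. It exercises the Spark 2.0.0 / Scala 2.11 dependencies declared in the POM above:

package com.company.team.spark.sqlutil

import org.apache.spark.sql.SparkSession

// Hypothetical smoke test: verifies that the Spark dependencies resolve and that a
// main class in this package can be found and launched.
object BuildSmokeTest {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .master("local[*]")        // in-process local master; no cluster assumed
      .appName("BuildSmokeTest")
      .getOrCreate()

    // A trivial job: count a small generated range to prove the session works.
    println(s"Spark ${spark.version} is up; range count = ${spark.range(5).count()}")

    spark.stop()
  }
}

If this object runs but your own classes still fail with "Could not find or load main class", the compiled output under target/classes is the usual place to look, since the error means the JVM cannot find a .class file matching the fully qualified name.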

