Can I run Spark unit tests within Eclipse?

Problem description

Recently we moved from using Scalding to Spark. I used Eclipse and the Scala IDE for Eclipse to write code and tests. The tests ran fine with Twitter's JobTest class. Any class using JobTest was automatically available to run as a Scala unit test within Eclipse. I've lost that ability now. The Spark test cases are perfectly runnable using sbt, but the run configuration in Eclipse for these tests lists 'none applicable'.

Is there a way to run Spark unit tests within Eclipse?

Answer

I think this same approach using Java would work in Scala. Basically, just make a SparkContext with the master set to "local", then build and run the unit tests as normal. Be sure to stop the SparkContext when the test is finished.

I have this working for Spark 1.0.0 but not a newer version.

import java.util.Arrays;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.junit.AfterClass;
import org.junit.BeforeClass;
import org.junit.Test;

import static org.junit.Assert.assertEquals;

public class Test123 {
  static JavaSparkContext sparkCtx;

  @BeforeClass
  public static void sparkSetup() {
    // Run Spark in-process against a local master; no cluster required
    SparkConf conf = new SparkConf();
    sparkCtx = new JavaSparkContext("local", "test", conf);
  }

  @AfterClass
  public static void sparkTeardown() {
    // Stop the context so later tests can create their own
    sparkCtx.stop();
  }

  @Test
  public void integrationTest() {
    JavaRDD<String> logRawInput = sparkCtx.parallelize(Arrays.asList(
        "data1",
        "data2",
        "garbage",
        "data3"));
    // The RDD behaves as usual; e.g. count the records
    assertEquals(4, logRawInput.count());
  }
}
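
Since the original question asks about Scala, here is a minimal sketch of the same pattern as a ScalaTest suite. This is my own adaptation, not from the answer: the suite name, the filter test, and the choice of FunSuite with BeforeAndAfterAll are assumptions, written against a ScalaTest 2.x and Spark 1.x-era API.

import org.apache.spark.{SparkConf, SparkContext}
import org.scalatest.{BeforeAndAfterAll, FunSuite}

// Hypothetical suite name; same lifecycle as the Java example above
class SparkLocalSuite extends FunSuite with BeforeAndAfterAll {
  private var sc: SparkContext = _

  override def beforeAll(): Unit = {
    // Run Spark in-process against a local master; no cluster required
    val conf = new SparkConf().setMaster("local").setAppName("test")
    sc = new SparkContext(conf)
  }

  override def afterAll(): Unit = {
    // Stop the context so later suites can create their own
    sc.stop()
  }

  test("filter keeps only data records") {
    val input = sc.parallelize(Seq("data1", "data2", "garbage", "data3"))
    assert(input.filter(_.startsWith("data")).count() === 3L)
  }
}

With the Scala IDE's ScalaTest support installed, a suite like this should be runnable directly from Eclipse, which is what the sbt-only Spark tests in the question were missing.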
