How to use MLlib's linear regression in Apache Spark?


Problem description

I'm new to Apache Spark, and in the MLlib documentation I found a Scala example, but I really don't know Scala. Does anyone know of an example in Java? Thanks! The example code is:

import org.apache.spark.mllib.regression.LinearRegressionWithSGD
import org.apache.spark.mllib.regression.LabeledPoint

// Load and parse the data
val data = sc.textFile("mllib/data/ridge-data/lpsa.data")
val parsedData = data.map { line =>
  val parts = line.split(',')
  LabeledPoint(parts(0).toDouble, parts(1).split(' ').map(x => x.toDouble).toArray)
}

// Building the model
val numIterations = 20
val model = LinearRegressionWithSGD.train(parsedData, numIterations)

// Evaluate model on training examples and compute training error
val valuesAndPreds = parsedData.map { point =>
  val prediction = model.predict(point.features)
  (point.label, prediction)
}
val MSE = valuesAndPreds.map{ case(v, p) => math.pow((v - p), 2) }.reduce(_ + _) / valuesAndPreds.count
println("training Mean Squared Error = " + MSE)

from the MLlib documentation. Thanks!

Recommended answer

As stated in the documentation:

All of MLlib's methods use Java-friendly types, so you can import and call them there the same way you do in Scala. The only caveat is that the methods take Scala RDD objects, while the Spark Java API uses a separate JavaRDD class. You can convert a Java RDD to a Scala one by calling .rdd() on your JavaRDD object.
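For instance, a minimal sketch of that conversion (the variable names are only illustrative; the full implementation below does the same thing):

// parsedData is a JavaRDD<LabeledPoint> built with the Java API;
// .rdd() exposes the underlying org.apache.spark.rdd.RDD that MLlib's train() expects.
RDD<LabeledPoint> scalaRdd = parsedData.rdd();
LinearRegressionModel model = LinearRegressionWithSGD.train(scalaRdd, 20);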

This is not entirely straightforward, since you still have to reproduce the Scala code in Java, but it works (at least in this case).

Having said that, here is a Java implementation:

// Required imports for this snippet:
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.api.java.function.Function;
import org.apache.spark.mllib.linalg.Vectors;
import org.apache.spark.mllib.regression.LabeledPoint;
import org.apache.spark.mllib.regression.LinearRegressionModel;
import org.apache.spark.mllib.regression.LinearRegressionWithSGD;
import scala.Tuple2;

public void linReg() {
    String master = "local";
    SparkConf conf = new SparkConf().setAppName("csvParser").setMaster(
            master);
    JavaSparkContext sc = new JavaSparkContext(conf);
    JavaRDD<String> data = sc.textFile("mllib/data/ridge-data/lpsa.data");
    JavaRDD<LabeledPoint> parseddata = data
            .map(new Function<String, LabeledPoint>() {
                // I see no way of just using a lambda here, hence more verbosity than with Scala
                @Override
                public LabeledPoint call(String line) throws Exception {
                    String[] parts = line.split(",");
                    String[] pointsStr = parts[1].split(" ");
                    double[] points = new double[pointsStr.length];
                    for (int i = 0; i < pointsStr.length; i++)
                        points[i] = Double.valueOf(pointsStr[i]);
                    return new LabeledPoint(Double.valueOf(parts[0]),
                            Vectors.dense(points));
                }
            });

    // Building the model
    int numIterations = 20;
    LinearRegressionModel model = LinearRegressionWithSGD.train(
            parseddata.rdd(), numIterations); // notice the .rdd()

    // Evaluate model on training examples and compute training error
    JavaRDD<Tuple2<Double, Double>> valuesAndPred = parseddata
            .map(point -> new Tuple2<Double, Double>(point.label(), model
                    .predict(point.features())));
    // important point here is the Tuple2 explicit creation.

    double MSE = valuesAndPred.mapToDouble(
            tuple -> Math.pow(tuple._1 - tuple._2, 2)).mean();
    // you can compute the mean with this function, which is much easier
    System.out.println("training Mean Squared Error = "
            + String.valueOf(MSE));
}
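As a side note, the parsing step does not have to use an anonymous Function class. On Java 8, with a Spark version where org.apache.spark.api.java.function.Function is a single-method interface (Spark 1.0+), the same mapping can be written with a lambda. A minimal sketch, under that assumption:

// Sketch only: assumes Java 8 and Spark 1.0+, where Function is a functional interface.
JavaRDD<LabeledPoint> parseddata = data.map(line -> {
    String[] parts = line.split(",");
    String[] pointsStr = parts[1].split(" ");
    double[] points = new double[pointsStr.length];
    for (int i = 0; i < pointsStr.length; i++)
        points[i] = Double.parseDouble(pointsStr[i]);
    return new LabeledPoint(Double.parseDouble(parts[0]), Vectors.dense(points));
});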

It is far from perfect, but I hope it helps you better understand how to adapt the Scala examples in the MLlib documentation to Java.
