Apache Spark: How to create a matrix from a DataFrame?

Problem description

I have a DataFrame in Apache Spark containing an array of integers; the source is a set of images. I ultimately want to do PCA on it, but I am having trouble just creating a matrix from my arrays. How do I create a matrix from an RDD?

> imagerdd = traindf.map(lambda row: map(float, row.image))
> mat = DenseMatrix(numRows=206456, numCols=10, values=imagerdd)

Traceback (most recent call last):
  File "<ipython-input-21-6fdaa8cde069>", line 2, in <module>
    mat = DenseMatrix(numRows=206456, numCols=10, values=imagerdd)
  File "/usr/local/spark/current/python/lib/pyspark.zip/pyspark/mllib/linalg.py", line 815, in __init__
    values = self._convert_to_array(values, np.float64)
  File "/usr/local/spark/current/python/lib/pyspark.zip/pyspark/mllib/linalg.py", line 806, in _convert_to_array
    return np.asarray(array_like, dtype=dtype)
  File "/usr/local/python/conda/lib/python2.7/site-packages/numpy/core/numeric.py", line 462, in asarray
    return array(a, dtype, copy=False, order=order)
TypeError: float() argument must be a string or a number

I'm getting the same error from every possible arrangement I can think of:

imagerdd = traindf.map(lambda row: Vectors.dense(row.image))
imagerdd = traindf.map(lambda row: row.image)
imagerdd = traindf.map(lambda row: np.array(row.image))

If I try

> imagedf = traindf.select("image")
> mat = DenseMatrix(numRows=206456, numCols=10, values=imagedf)

Traceback (most recent call last):
  File "<ipython-input-26-a8cbdad10291>", line 2, in <module>
    mat = DenseMatrix(numRows=206456, numCols=10, values=imagedf)
  File "/usr/local/spark/current/python/lib/pyspark.zip/pyspark/mllib/linalg.py", line 815, in __init__
    values = self._convert_to_array(values, np.float64)
  File "/usr/local/spark/current/python/lib/pyspark.zip/pyspark/mllib/linalg.py", line 806, in _convert_to_array
    return np.asarray(array_like, dtype=dtype)
  File "/usr/local/python/conda/lib/python2.7/site-packages/numpy/core/numeric.py", line 462, in asarray
    return array(a, dtype, copy=False, order=order)
ValueError: setting an array element with a sequence.

Recommended answer

Since you didn't provide an example input, I'll assume it looks more or less like this, where id is a row number and image contains the values.

traindf = sqlContext.createDataFrame([
    (1, [1, 2, 3]),
    (2, [4, 5, 6]),
    (3, (7, 8, 9))
], ("id", "image"))

The first thing you have to understand is that DenseMatrix is a local data structure. To be precise, it is a wrapper around numpy.ndarray. As of now (Spark 1.4.1) there is no distributed equivalent in PySpark MLlib.
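
For a quick, self-contained illustration (a minimal sketch with made-up values), DenseMatrix expects an already materialized, flat sequence of numbers laid out column by column; passing an RDD, a DataFrame or a nested structure as values is exactly what makes _convert_to_array fail in the tracebacks above:

from pyspark.mllib.linalg import DenseMatrix

# values must be a local, flat numeric sequence of length numRows * numCols,
# stored column by column (column-major order)
dm = DenseMatrix(2, 2, [1.0, 2.0, 3.0, 4.0])
print dm.toArray()
## [[ 1.  3.]
##  [ 2.  4.]]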

DenseMatrix takes three mandatory arguments, numRows, numCols and values, where values is a local data structure. In your case you have to collect first:

values = (traindf.
    rdd.
    map(lambda r: (r.id, r.image)). # Extract row id and data
    sortByKey(). # Sort by row id
    flatMap(lambda (id, image): image).
    collect())


ncol = len(traindf.rdd.map(lambda r: r.image).first())  # values per image
nrow = traindf.count()  # number of images

dm = DenseMatrix(nrow, ncol, values)

Finally:

> print dm.toArray()
[[ 1.  4.  7.]
 [ 2.  5.  8.]
 [ 3.  6.  9.]]
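
One caveat: DenseMatrix stores its values in column-major order, while values was collected row by row, so each original row ends up as a column of dm, as the output above shows. For the non-square 206456 x 10 case this layout matters. A minimal sketch of two possible workarounds, reusing the values, nrow and ncol computed above:

import numpy as np
from pyspark.mllib.linalg import DenseMatrix

# Option 1: swap the dimensions so that every collected image becomes one
# column; the result is the transpose of the images-as-rows matrix
dm_t = DenseMatrix(ncol, nrow, values)

# Option 2: if a plain local array is enough, reshape the collected values
# with numpy, keeping one image per row
arr = np.asarray(values, dtype=np.float64).reshape(nrow, ncol)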

Edit:

In Spark 1.5+ you can use mllib.linalg.distributed as follows:

from pyspark.mllib.linalg.distributed import IndexedRow, IndexedRowMatrix

mat = IndexedRowMatrix(traindf.map(lambda row: IndexedRow(*row)))
mat.numRows()
## 4 (largest row index + 1, since the ids 1-3 are used as row indices)
mat.numCols()
## 3

although as of now the API is still too limited to be useful in practice.
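
Since the stated end goal was PCA, one way to avoid building the matrix by hand is pyspark.ml.feature.PCA (also available from Spark 1.5), which works directly on a DataFrame column of vectors. A minimal sketch, assuming the traindf defined above; the features / pca_features column names and k=2 are only illustrative, and on Spark 2.x the Vectors / VectorUDT imports would come from pyspark.ml.linalg instead:

from pyspark.ml.feature import PCA
from pyspark.mllib.linalg import Vectors, VectorUDT
from pyspark.sql.functions import udf

# Convert the array<int> column into a column of MLlib dense vectors
to_vector = udf(lambda xs: Vectors.dense([float(x) for x in xs]), VectorUDT())
vecdf = traindf.withColumn("features", to_vector(traindf.image))

# Fit a PCA model and project every image onto the first k components
pca = PCA(k=2, inputCol="features", outputCol="pca_features")
model = pca.fit(vecdf)
model.transform(vecdf).select("id", "pca_features").show()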
