How to use the Hadoop MapReduce framework for an OpenCL application?


Problem description



I am developing an application in OpenCL whose basic objective is to implement a data mining algorithm on a GPU platform. I want to use the Hadoop Distributed File System and want to execute the application on multiple nodes. I am using the MapReduce framework, and I have divided my basic algorithm into two parts, i.e. 'Map' and 'Reduce'.

I have never worked with Hadoop before, so I have some questions:

1. Do I have to write my application in Java to use Hadoop and the MapReduce framework?
2. I have written kernel functions for map and reduce in OpenCL. Is it possible to use HDFS as the file system for a non-Java GPU-computing application? (Note: I don't want to use JavaCL or Aparapi.)

Solution

HDFS is a file system; you can use the HDFS file system with any language.

HDFS data is distributed over multiple machines, and it is highly available for processing data in GPU computing.

For more information, see Hadoop Streaming.
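Hadoop Streaming communicates with the map and reduce tasks over stdin and stdout, so any native executable, including an OpenCL host program, can act as a mapper or reducer. Below is a minimal sketch of such a mapper in C; run_opencl_kernel is a hypothetical placeholder for the actual OpenCL kernel launch, and the tab-separated key/value output is the convention Hadoop Streaming expects.

    #include <stdio.h>
    #include <string.h>

    /* Hypothetical placeholder for the real OpenCL host code:
     * create the context, copy the record to the device, enqueue
     * the map kernel, and read back the result. */
    static double run_opencl_kernel(const char *record)
    {
        return (double)strlen(record);   /* stand-in for GPU work */
    }

    int main(void)
    {
        char line[4096];

        /* Hadoop Streaming delivers one input record per line on stdin. */
        while (fgets(line, sizeof(line), stdin) != NULL) {
            line[strcspn(line, "\n")] = '\0';   /* strip trailing newline */

            double result = run_opencl_kernel(line);

            /* Emit key<TAB>value on stdout; Hadoop groups pairs by key
             * before they reach the reducer. */
            printf("%s\t%f\n", line, result);
        }
        return 0;
    }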

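To run compiled binaries like this as a streaming job, submit them with the hadoop-streaming jar. The jar path and file names below are illustrative assumptions; the exact location depends on the Hadoop installation, and every node that executes a task needs a GPU and an OpenCL runtime available.

    hadoop jar $HADOOP_HOME/share/hadoop/tools/lib/hadoop-streaming-*.jar \
        -input /user/me/mining_input \
        -output /user/me/mining_output \
        -mapper ./ocl_mapper \
        -reducer ./ocl_reducer \
        -file ocl_mapper \
        -file ocl_reducer

The -file options ship the local executables to every task node, so the mappers and reducers run as ordinary native processes while HDFS takes care of input splitting and output collection.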
