How can I use GPU with Java programming


Problem Description


I have been using CUDA C all these days to access the GPU, but now my guide has asked me to work with Java and the GPU. I searched the Internet and found that Rootbeer is the best option, but I am not able to understand how to run a program using Rootbeer. Can someone tell me the steps for using Rootbeer?

Recommended Answer


Mark Harris from Nvidia gave a nice talk about the future of CUDA at SC14. You can watch it here.


The main thing that may be of interest to you is the part where he talks about programming languages, especially Java. IBM is working on CUDA4J, and there are some nice plans for Java 8 features, especially lambdas, to be used for GPU programming. However, I am not a Java user and I can't answer your question regarding Rootbeer (besides the taste), but maybe CUDA4J will be something that suits you, especially if you know how to write CUDA C and need a solution backed by a company like IBM.
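Since the answer above doesn't cover Rootbeer itself, here is a hedged sketch of its programming model: you wrap each unit of parallel work in a `Kernel` object and hand the whole list to the Rootbeer runtime, which maps one kernel to one GPU thread. The code below is a self-contained CPU analogue for illustration only; the `Kernel` interface here is a local stand-in for the real `org.trifort.rootbeer.runtime.Kernel` (older releases used the `edu.syr.pcpratts.rootbeer.runtime` package), whose `gpuMethod()` you implement, and the sequential loop stands in for `Rootbeer#run`, which would execute the list on the GPU.

```java
import java.util.ArrayList;
import java.util.List;

// CPU analogue of the Rootbeer model: each Kernel instance is one
// unit of parallel work, like one GPU thread. With the real library
// you would implement org.trifort.rootbeer.runtime.Kernel and pass
// the list to a Rootbeer runtime object instead of looping here.
public class KernelDemo {
    // Local stand-in for Rootbeer's Kernel interface.
    interface Kernel { void gpuMethod(); }

    // One kernel squares one element of the shared array.
    static class SquareKernel implements Kernel {
        private final int[] data;
        private final int index;
        SquareKernel(int[] data, int index) {
            this.data = data;
            this.index = index;
        }
        public void gpuMethod() {
            data[index] = data[index] * data[index];
        }
    }

    static int[] squareAll(int[] input) {
        List<Kernel> work = new ArrayList<>();
        for (int i = 0; i < input.length; i++) {
            work.add(new SquareKernel(input, i));
        }
        // With Rootbeer, the runtime would launch these on the GPU;
        // here we just run them sequentially on the CPU.
        for (Kernel k : work) {
            k.gpuMethod();
        }
        return input;
    }

    public static void main(String[] args) {
        int[] out = squareAll(new int[]{1, 2, 3, 4});
        System.out.println(java.util.Arrays.toString(out));
    }
}
```

As for the build steps: per the project's documentation, you compile your classes against the Rootbeer jar and then run the resulting jar through the Rootbeer static compiler (roughly `java -jar Rootbeer.jar myapp.jar myapp-gpu.jar`), which generates and embeds the CUDA code; the exact invocation may differ between releases, so check the README of the version you download.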
