GPU based algorithm on AWS Lambda
Question
I have a function that performs some mathematical operations and needs a system with a 16 GB GPU. The function is not triggered all the time, and the rest of the time the system would sit idle. I came across AWS Lambda. Can I run a GPU-based algorithm on Lambda, so that whenever I need a GPU I get the machine in the cloud? I'd appreciate a short explanation.
Answer
You can't specify the runtime environment for AWS Lambda functions, so no, you can't require the presence of a GPU (in fact, the physical machines AWS puts into its Lambda pool almost certainly don't have one).
Your best bet would be to run the GPU-requiring function as an AWS Batch job on a compute environment configured to use p-type (GPU) instances. The guide here might be helpful.
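One common pattern for this (a sketch, not something the answer spells out) is to keep a lightweight Lambda function as the trigger and have it hand the GPU work off to AWS Batch via boto3's `submit_job`. The queue and job-definition names (`gpu-job-queue`, `gpu-job-def`) and the event fields below are hypothetical placeholders for resources you would create yourself:

```python
# Sketch: a Lambda handler that submits GPU work to AWS Batch.
# "gpu-job-queue" and "gpu-job-def" are hypothetical names; the queue
# would be backed by a compute environment of p-type (GPU) instances,
# and the job definition would point at a container image with the
# GPU code installed.
import json


def build_submit_job_request(event):
    """Build the keyword arguments for boto3's batch.submit_job call."""
    return {
        "jobName": f"gpu-task-{event['task_id']}",
        "jobQueue": "gpu-job-queue",       # queue backed by p-type instances
        "jobDefinition": "gpu-job-def",    # container with the GPU algorithm
        "containerOverrides": {
            "environment": [
                {"name": "INPUT_S3_URI", "value": event["input_uri"]},
            ],
        },
    }


def handler(event, context):
    request = build_submit_job_request(event)
    # In a real deployment you would submit the job here:
    #   import boto3
    #   boto3.client("batch").submit_job(**request)
    return {"statusCode": 200, "body": json.dumps(request["jobName"])}
```

With this split, you pay for the GPU instances only while a Batch job is running; Batch scales the compute environment down to zero between jobs, which matches the "GPU only when needed" requirement in the question.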