Loading native libraries in a Hadoop reducer?


Problem description

I have a native library that I need to load for my reduce method. I added it to the distributed cache, but when I call System.loadLibrary(mylib.so) in my map method I get an error and a failed map task:

Error: no mylib.so in java.library.path

Even though I added it to the distributed cache. Am I missing a step? In my job configuration I call:

DistributedCache.addCacheFile(uri, job.getConfiguration());

where uri is the path to mylib.so on the Hadoop file system.
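
For context, a driver-side sketch under the old org.apache.hadoop.filecache.DistributedCache API. The "#mylib.so" URI fragment and the createSymlink() call are the usual way to expose a cached file in the task's working directory, but they are assumptions here, not something the question states:

    import java.net.URI;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.filecache.DistributedCache;
    import org.apache.hadoop.mapreduce.Job;

    public class Driver {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            Job job = new Job(conf, "native-lib-job");

            // Ship the library to every task node; the "#mylib.so" fragment
            // (an assumption, not from the question) names the symlink that
            // will appear in the task's working directory.
            URI uri = new URI("hdfs:///hadoop/fs/mystuff/libs/mylib.so#mylib.so");
            DistributedCache.addCacheFile(uri, job.getConfiguration());

            // Ask the framework to actually create that symlink.
            DistributedCache.createSymlink(job.getConfiguration());
        }
    }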

This native library depends on many others, all of which live in /hadoop/fs/mystuff/libs on the Hadoop fs. I added them all to the distributed cache, and I even tried loading all of them in my reduce task with System.loadLibrary() calls, but I keep getting the same java.library.path error. I also tried passing the libraries to the -files command-line flag, but I still get the error above.
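
The reduce-side loading the question describes might look roughly like the sketch below; the class name, the key/value types, and the dependency name are illustrative only, not taken from the question:

    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Reducer;

    public class MyReducer extends Reducer<Text, Text, Text, Text> {
        @Override
        protected void setup(Context context) {
            // Load once per task, before any reduce() calls run. Dependencies
            // go first, so they are already mapped into the process when the
            // library that needs them is loaded.
            System.loadLibrary("somedependency"); // hypothetical name
            System.loadLibrary("mylib");
        }
    }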

Solution

Did you try your code standalone, not in map-reduce? As far as I know, System.loadLibrary expects the library name without the ".so" or ".dll" suffix...
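
A minimal demonstration of the naming rule the answer refers to, assuming a Linux task node:

    import java.io.File;

    public class LoadLibraryDemo {
        public static void main(String[] args) {
            // The JVM maps the platform-independent name to a file name itself:
            System.out.println(System.mapLibraryName("mylib")); // "libmylib.so" on Linux

            // Fails: the argument must not carry a ".so" suffix.
            // System.loadLibrary("mylib.so"); // -> UnsatisfiedLinkError

            // Works only if a file named libmylib.so is on java.library.path:
            System.loadLibrary("mylib");

            // Alternative: System.load takes an absolute path, suffix and all,
            // and bypasses java.library.path entirely.
            System.load(new File("mylib.so").getAbsolutePath());
        }
    }

System.load is often the easier route for files shipped through the distributed cache, since their on-disk names rarely follow the lib<name>.so convention.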
