Load Native Shared Libraries in an HBase MapReduce Task
Problem Description
Recently I have been implementing my algorithm in JNI code (using C++). I did that and generated a shared library. Here is my JNI class:
public class VideoFeature {
    // JNI Code Begin
    public native static float Match(byte[] testFileBytes, byte[] tempFileBytes);

    static {
        System.loadLibrary("JVideoFeatureMatch");
    }
    // JNI Code End
}
In the main function, I write:
// MapReduce
Configuration conf = HBaseConfiguration.create();

// DistributedCache shared library
DistributedCache.createSymlink(conf);
// Both of the following ways seem to work:
// DistributedCache.addCacheFile(new URI("/home/danayan/Desktop/libJVideoFeatureMatch.so#JVideoFeatureMatch"), conf);
DistributedCache.addCacheFile(new URI("hdfs://danayan-pc:9000/lib/libJVideoFeatureMatch.so#libJVideoFeatureMatch.so"), conf);
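The `#` fragment in the cache URI is what names the symlink that the framework creates in each task's working directory. A minimal, Hadoop-free sketch showing how `java.net.URI` (standard JDK) splits the HDFS path from the symlink name:

```java
import java.net.URI;

public class CacheUriDemo {
    public static void main(String[] args) throws Exception {
        // Same URI as in the job setup above: an HDFS path, then '#'
        // followed by the symlink name created in the task's working dir.
        URI uri = new URI("hdfs://danayan-pc:9000/lib/libJVideoFeatureMatch.so#libJVideoFeatureMatch.so");

        System.out.println(uri.getPath());     // /lib/libJVideoFeatureMatch.so
        System.out.println(uri.getFragment()); // libJVideoFeatureMatch.so
    }
}
```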
In the map method, the following code works:
public static class MatchMapper extends TableMapper<Text, IntWritable> {
    @Override
    public void map(ImmutableBytesWritable key, Result values, Context context) throws IOException, InterruptedException {
        // Other codes
        Path[] localFiles = DistributedCache.getLocalCacheFiles(context.getConfiguration());
        for (Path temp : localFiles) {
            String path = temp.toString();
            if (path.contains("JVideoFeatureMatch")) {
                System.out.println("JVideoFeatureMatch found!");
            }
        }
    }
}
In other words, it seems that I distributed my shared library via 'DistributedCache' successfully, but I can't load it in the map function.
public static class MatchMapper extends TableMapper<Text, IntWritable> {
    @Override
    public void map(ImmutableBytesWritable key, Result values, Context context) throws IOException, InterruptedException {
        // Other codes
        int score = (int) VideoFeature.Match(testBytes, tempBytes);
    }
}
When I try to call the static function in the JNI class, a 'java.lang.UnsatisfiedLinkError' is thrown:
java.lang.UnsatisfiedLinkError: no libJVideoFeatureMatch in java.library.path.
I have also tried 'System.load()', and I have taken into account the 'lib' prefix and '.so' suffix that Linux uses for shared libraries.
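That naming convention can be checked directly: `System.mapLibraryName` (standard JDK) shows the platform-specific file name that `System.loadLibrary` searches for on `java.library.path`, and calling `loadLibrary` on a machine where the library is not installed reproduces the error above. A small self-contained sketch (output shown for Linux):

```java
public class LinkErrorDemo {
    public static void main(String[] args) {
        // On Linux, loadLibrary("JVideoFeatureMatch") searches
        // java.library.path for "libJVideoFeatureMatch.so".
        System.out.println(System.mapLibraryName("JVideoFeatureMatch"));

        try {
            // Fails when the library is not on java.library.path,
            // producing the same UnsatisfiedLinkError as in the question.
            System.loadLibrary("JVideoFeatureMatch");
        } catch (UnsatisfiedLinkError e) {
            System.out.println(e.getMessage());
        }
    }
}
```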
What's more, I set a JVM argument (removing it makes no difference):
-Djava.library.path=/usr/local/hadoop/lib/native/Linux-amd64-64
And I have successfully loaded the shared library on my local machine by moving it into the 'java.library.path' set above.
I have browsed the pages below:
Issue loading a native library through the DistributedCache
Native Libraries Guide
loading native libraries in hadoop reducer?
I don't know whether I have explained this clearly. If not, please let me know.
Answer

First, copy the library to HDFS:
bin/hadoop fs -copyFromLocal mylib.so.1 /libraries/mylib.so.1
The job launching program should contain the following:
DistributedCache.createSymlink(conf);
DistributedCache.addCacheFile(new URI("hdfs://host:port/libraries/mylib.so.1#mylib.so"), conf);
The MapReduce task can contain:
System.load((new File("mylib.so")).getAbsolutePath());
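Note that `System.load` (unlike `System.loadLibrary`) takes an absolute file path, which is why the answer resolves the symlink name against the task's working directory. A minimal sketch of that resolution, runnable outside any Hadoop task:

```java
import java.io.File;

public class LoadPathDemo {
    public static void main(String[] args) {
        // "mylib.so" is the symlink name from the '#' fragment; the
        // framework creates it in the task's current working directory,
        // so resolving it against that directory yields an absolute path.
        File lib = new File("mylib.so");
        String absolute = lib.getAbsolutePath();
        System.out.println(absolute);

        // In a real map/reduce task you would then call:
        // System.load(absolute);
    }
}
```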
The third point is different from the official documentation
Official documentation: Native Shared Libraries