How to set a system environment variable from a Hadoop Mapper?
Problem description
The problem below the line has been solved, but now I am facing another problem.
I am doing this:
DistributedCache.createSymlink(job.getConfiguration());
DistributedCache.addCacheFile(new URI
("hdfs:/user/hadoop/harsh/libnative1.so"), job.getConfiguration());
and in the mapper:
System.loadLibrary("libnative1.so");
(I also tried System.loadLibrary("libnative1"); and System.loadLibrary("native1");)
But I get this error:
java.lang.UnsatisfiedLinkError: no libnative1.so in java.library.path
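As an aside on this error: System.loadLibrary takes the bare library name, not the file name. The JVM maps the name to the platform file name itself and searches only the directories on java.library.path; to load a file by an explicit path, System.load is the call to use. A minimal sketch of the mapping (the /path/to/ path in the comment is hypothetical):

```java
public class LibNameDemo {
    public static void main(String[] args) {
        // loadLibrary("native1") looks for the mapped file name on
        // java.library.path; on Linux the mapping is "libnative1.so".
        System.out.println(System.mapLibraryName("native1"));

        // To load an exact file instead, bypass java.library.path entirely:
        // System.load("/path/to/libnative1.so");  // hypothetical absolute path
    }
}
```

So System.loadLibrary("libnative1.so") can never succeed: the JVM would look for a file named something like "liblibnative1.so.so".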
I am totally clueless about what I should set java.library.path to. I tried setting it to /home and copied every .so from the distributed cache to /home/, but it still didn't work :(
Any suggestions / solutions please?
--------------------------------------------------------------------------------
I want to set the system environment variable (specifically, LD_LIBRARY_PATH) of the machine where the mapper is running.
I tried:
Runtime run = Runtime.getRuntime();
Process pr = run.exec("export LD_LIBRARY_PATH=/usr/local/:$LD_LIBRARY_PATH");
But it throws an IOException.
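The IOException is expected here: "export" is a shell builtin, not an executable on the PATH, so Runtime.exec cannot launch it. And even if it ran, a child process can never change the environment of the parent JVM; an exec-style call can only set the environment of the child it launches. A minimal sketch of that (child-only) behaviour:

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;

public class EnvDemo {
    public static void main(String[] args) throws Exception {
        // The environment map applies only to the child process we launch,
        // never to the current JVM or to later, unrelated processes.
        ProcessBuilder pb = new ProcessBuilder("sh", "-c", "echo $LD_LIBRARY_PATH");
        pb.environment().put("LD_LIBRARY_PATH", "/usr/local/");
        Process p = pb.start();
        BufferedReader r = new BufferedReader(new InputStreamReader(p.getInputStream()));
        System.out.println(r.readLine()); // the child sees /usr/local/
        p.waitFor();
    }
}
```

This is why the export approach cannot help the mapper JVM itself; the variable has to be in place before the task JVM starts.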
I am also aware of
JobConf.MAPRED_MAP_TASK_ENV
But I am using Hadoop version 0.20.2, which has Job & Configuration instead of JobConf.
I am unable to find any such variable, and this is also not a Hadoop-specific environment variable but a system environment variable.
Any solution/suggestion? Thanks in advance.
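Even without the JobConf constant, the constant only names an underlying string property, which can be set on a plain Configuration. A sketch, assuming the property name "mapred.child.env" (the string behind the JobConf task-env constants in later releases) is honored by your Hadoop build; whether 0.20.2 itself supports it needs to be verified against your distribution:

```java
// Configuration/Job are from org.apache.hadoop; requires the Hadoop jars.
Configuration conf = new Configuration();
// KEY=VALUE pairs, comma-separated, applied to each task's environment.
conf.set("mapred.child.env", "LD_LIBRARY_PATH=/usr/local/");
Job job = new Job(conf, "my-job");
```

If the property is not supported in your version, the remaining options are passing -Djava.library.path via mapred.child.java.opts, or setting the variable cluster-wide as the answer below suggests.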
Recommended answer
Why don't you export this variable on all nodes of the cluster?
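A sketch of that suggestion, assuming a typical 0.20.x install where daemons source conf/hadoop-env.sh (the path, and the need to restart the TaskTrackers afterwards, are assumptions about your setup):

```shell
# On every node of the cluster, e.g. in conf/hadoop-env.sh,
# so task JVMs inherit it; then restart the TaskTrackers.
export LD_LIBRARY_PATH=/usr/local/:$LD_LIBRARY_PATH
```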