'Symbol lookup error' with netlib-java
Question
I am having a bit of trouble running the examples in Spark's MLLib on a machine running Fedora 23. I have built Spark 1.6.2 with the following options per Spark documentation:
build/mvn -Pnetlib-lgpl -Pyarn -Phadoop-2.4 \
-Dhadoop.version=2.4.0 -DskipTests clean package
and upon running the binary classification example:
bin/spark-submit --class org.apache.spark.examples.mllib.BinaryClassification \
examples/target/scala-*/spark-examples-*.jar \
--algorithm LR --regType L2 --regParam 1.0 \
data/mllib/sample_binary_classification_data.txt
I receive the following error:
/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.92-1.b14.fc23.x86_64/jre/bin/java: symbol lookup error: /tmp/jniloader5830472710956533873netlib-native_system-linux-x86_64.so: undefined symbol: cblas_dscal
Errors of this form (symbol lookup error with netlib) are not limited to this particular example. On the other hand, the Elastic Net example (./bin/run-example ml.LinearRegressionWithElasticNetExample) runs without a problem.
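In case it helps others debug the same class of failure: the stub the JVM complains about is an ordinary shared object extracted to /tmp, so standard linker tooling can be pointed at it. A diagnostic sketch (the numeric part of the jniloader filename is regenerated on every run, so the path below, taken from the error above, must be replaced with the one in your own message):

```shell
# Path copied from the error message above; the jniloader temp name changes
# on every run, so substitute the one from your own error output.
SO=/tmp/jniloader5830472710956533873netlib-native_system-linux-x86_64.so

if [ -e "$SO" ]; then
  # Do all of the stub's library dependencies resolve, and to which files?
  ldd "$SO"
  # 'U' in nm -D output marks a symbol the stub imports rather than defines;
  # cblas_dscal listed here means some dependency must export it at runtime.
  nm -D "$SO" | grep -w cblas_dscal
else
  echo "stub not found: $SO (re-run the failing job and copy the fresh path)"
fi
```

If ldd shows every dependency resolving but the symbol is still missing, the library being resolved simply does not export cblas_dscal, which is exactly what the runtime error is reporting.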
I have tried a number of solutions to no avail. For example, I went through some of the advice here: https://datasciencemadesimpler.wordpress.com/tag/blas/, and while I can successfully import from com.github.fommil.netlib.BLAS and LAPACK, the aforementioned symbol lookup error persists.
I have read through the netlib-java documentation at fommil/netlib-java, and have ensured my system has the libblas and liblapack shared object files:
$ ls /usr/lib64 | grep libblas
libblas.so
libblas.so.3
libblas.so.3.5
libblas.so.3.5.0
$ ls /usr/lib64 | grep liblapack
liblapacke.so
liblapacke.so.3
liblapacke.so.3.5
liblapacke.so.3.5.0
liblapack.so
liblapack.so.3
liblapack.so.3.5
liblapack.so.3.5.0
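For what it's worth, having libblas.so.3 present is not by itself enough: the native stub needs the C interface symbol cblas_dscal, and on some distributions the reference libblas exports only the Fortran entry points (dscal_ and friends), with the CBLAS interface shipped in a separate library. A quick check one could run (a sketch; ldconfig/nm availability and output vary by distribution):

```shell
# Ask the dynamic linker's cache which libblas.so.3 it would actually load.
BLAS_PATH=$(ldconfig -p 2>/dev/null | awk '/libblas\.so\.3 /{print $NF; exit}')

if [ -n "$BLAS_PATH" ]; then
  echo "linker resolves libblas.so.3 -> $BLAS_PATH"
  # Reference BLAS builds often export only the Fortran symbols; cblas_dscal
  # missing here is consistent with the runtime error above.
  nm -D "$BLAS_PATH" | grep -w cblas_dscal || echo "cblas_dscal not exported by $BLAS_PATH"
else
  echo "no libblas.so.3 registered with the dynamic linker"
fi
```

If cblas_dscal is not exported, pointing the process at a full implementation such as OpenBLAS (which bundles the CBLAS interface) is one way out.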
The most promising advice I found was here http://fossdev.blogspot.com/2015/12/scala-breeze-blas-lapack-on-linux.html, which suggests including
JAVA_OPTS="-Dcom.github.fommil.netlib.BLAS=com.github.fommil.netlib.NativeRefBLAS"
in the sbt script. So, I appended those options to _COMPILE_JVM_OPTS="..." in the build/mvn script, which also did not resolve the problem.
Finally, a last bit of advice I found online suggested passing the following flags to sbt:
sbt -Dcom.github.fommil.netlib.BLAS=com.github.fommil.netlib.F2jBLAS \
-Dcom.github.fommil.netlib.LAPACK=com.github.fommil.netlib.F2jLAPACK \
-Dcom.github.fommil.netlib.ARPACK=com.github.fommil.netlib.F2jARPACK
and again the issue persists. I am limited to two links in my post, but the advice can be found as the README.md of lildata's 'scaladatascience' repo on github.
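For completeness: in Spark itself, the equivalent of those sbt flags is to hand the system properties to the driver and executor JVMs, for example via conf/spark-defaults.conf. A sketch using Spark's standard extraJavaOptions settings (note that forcing the pure-Java F2j implementations trades native performance for never loading the native stub at all):

```
# conf/spark-defaults.conf -- force the pure-Java netlib implementations so
# the broken native stub is never loaded
spark.driver.extraJavaOptions    -Dcom.github.fommil.netlib.BLAS=com.github.fommil.netlib.F2jBLAS -Dcom.github.fommil.netlib.LAPACK=com.github.fommil.netlib.F2jLAPACK
spark.executor.extraJavaOptions  -Dcom.github.fommil.netlib.BLAS=com.github.fommil.netlib.F2jBLAS -Dcom.github.fommil.netlib.LAPACK=com.github.fommil.netlib.F2jLAPACK
```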
Has anybody suffered this issue and successfully resolved it? Any and all help or advice is deeply appreciated.
Answer
It's been a couple months, but I got back to this problem and was able to get a functioning workaround (posting here in case anybody else has the same issue).
It came down to library precedence; so, by calling:
$ export LD_PRELOAD=/path/to/libopenblas.so
prior to launching Spark, everything works as expected.
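To avoid having to export the variable in every shell session, the same line can go in Spark's environment file, which the launch scripts source on startup. A sketch (/path/to/libopenblas.so is a placeholder, as above; substitute wherever your OpenBLAS shared object actually lives):

```shell
# conf/spark-env.sh -- sourced by Spark's launch scripts before the JVM starts.
# Preloading a full BLAS/LAPACK implementation puts cblas_dscal and friends
# ahead of the reference libblas in symbol resolution.
export LD_PRELOAD=/path/to/libopenblas.so
```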
I figured out the solution after reading:
- https://github.com/fommil/netlib-java/issues/88 (directly addresses this issue)
- JNI "symbol lookup error" in shared library on Linux (similar linking issue, doesn't have to do with Spark but answers are informative with regards to linking)