NiFi java.lang.NoSuchMethodError: org.apache.hadoop.conf.Configuration.reloadExistingConfigurations


Question

I am following this link to set up the NiFi PutHDFS processor to write to Azure Data Lake: Connecting to Azure Data Lake from a NiFi dataflow.

NiFi is running in an HDF 3.1 VM and the NiFi version is 1.5.

We got the jar files mentioned in the above link from an HDInsight (v3.6, which supports Hadoop 2.7) head node. These jars are:

adls2-oauth2-token-provider-1.0.jar

azure-data-lake-store-sdk-2.1.4.jar

hadoop-azure-datalake.jar

jackson-core-2.2.3.jar

okhttp-2.4.0.jar

okio-1.4.0.jar

They were copied to the folder /usr/lib/hdinsight-datalake on the NiFi host of the HDF cluster (we only have one host in the cluster), and the PutHDFS processor is configured exactly as in the link above (see the attached PutHDFS attributes screenshot).

But in the NiFi log we are getting this:


Caused by: java.lang.NoSuchMethodError: org.apache.hadoop.conf.Configuration.reloadExistingConfigurations()V
    at org.apache.hadoop.fs.adl.AdlConfKeys.addDeprecatedKeys(AdlConfKeys.java:112)
    at org.apache.hadoop.fs.adl.AdlFileSystem.<clinit>(AdlFileSystem.java:92)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:348)
    at org.apache.nifi.processors.hadoop.AbstractHadoopProcessor$ExtendedConfiguration.getClassByNameOrNull(AbstractHadoopProcessor.java:490)
    at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:2099)
    at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2193)
    at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2654)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2667)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:370)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:172)
    at org.apache.nifi.processors.hadoop.AbstractHadoopProcessor$1.run(AbstractHadoopProcessor.java:322)
    at org.apache.nifi.processors.hadoop.AbstractHadoopProcessor$1.run(AbstractHadoopProcessor.java:319)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
    at org.apache.nifi.processors.hadoop.AbstractHadoopProcessor.getFileSystemAsUser(AbstractHadoopProcessor.java:319)
    at org.apache.nifi.processors.hadoop.AbstractHadoopProcessor.resetHDFSResources(AbstractHadoopProcessor.java:281)
    at org.apache.nifi.processors.hadoop.AbstractHadoopProcessor.abstractOnScheduled(AbstractHadoopProcessor.java:205)
    ... 16 common frames omitted

The AdlConfKeys class comes from the hadoop-azure-datalake.jar file above. From the above exception, it seems to me that AdlConfKeys is loading an older version of the org.apache.hadoop.conf.Configuration class, one which does not have the reloadExistingConfigurations method. However, we cannot find out where this older class gets loaded from. This HDF 3.1 host has hadoop-common-XXXX.jar in multiple locations; all the ones on a 2.7.x version have an org.apache.hadoop.conf.Configuration containing the reloadExistingConfigurations method, and only the ones on version 2.3 lack it. (I decompiled both the 2.7 and 2.3 jars to find out.)
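To narrow down where a class is actually resolved from, a tiny probe like the following can help (a generic sketch of my own; `WhichJar` is a made-up name, and note that NiFi loads processors through per-NAR classloaders, so what a standalone JVM resolves may differ from what PutHDFS actually sees at runtime):

```java
// WhichJar.java -- print the location the JVM resolves a given class from.
// Run it on the NiFi host with the classpath you want to inspect, e.g.:
//   java -cp <same jars as the processor> WhichJar org.apache.hadoop.conf.Configuration
public class WhichJar {

    // Returns the URL of the .class resource (a jar:file:... or jrt:... URL),
    // or a placeholder when the class is not visible on this classpath.
    public static String locate(String className) {
        String resource = className.replace('.', '/') + ".class";
        java.net.URL url = ClassLoader.getSystemResource(resource);
        return url == null ? "(not found on classpath)" : url.toString();
    }

    public static void main(String[] args) {
        String name = args.length > 0 ? args[0] : "java.lang.String";
        System.out.println(name + " -> " + locate(name));
    }
}
```

The URL printed for a jar-loaded class has the form `jar:file:/path/to/some.jar!/...`, which names the offending jar directly.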

[root@NifiHost /]# find . -name *hadoop-common*

(The output is much longer than what is shown below; I removed some entries for display purposes. Most of them are on 2.7, and only 2 of them are on version 2.3):

./var/lib/nifi/work/nar/extensions/nifi-hadoop-libraries-nar-1.5.0.3.1.0.0-564.nar-unpacked/META-INF/bundled-dependencies/hadoop-common-2.7.3.jar

./var/lib/ambari-agent/cred/lib/hadoop-common-2.7.3.jar

./var/lib/ambari-server/resources.backup/views/work/WORKFLOW_MANAGER{1.0.0}/WEB-INF/lib/hadoop-common-2.7.3.2.6.2.0-205.jar

./var/lib/ambari-server/resources.backup/views/work/HUETOAMBARI_MIGRATION{1.0.0}/WEB-INF/lib/hadoop-common-2.3.0.jar

./var/lib/ambari-server/resources/views/work/HUETOAMBARI_MIGRATION{1.0.0}/WEB-INF/lib/hadoop-common-2.3.0.jar

./var/lib/ambari-server/resources/views/work/HIVE{1.5.0}/WEB-INF/lib/hadoop-common-2.7.3.2.6.4.0-91.jar

./var/lib/ambari-server/resources/views/work/CAPACITY-SCHEDULER{1.0.0}/WEB-INF/lib/hadoop-common-2.7.3.2.6.4.0-91.jar

./var/lib/ambari-server/resources/views/work/TEZ{0.7.0.2.6.2.0-205}/WEB-INF/lib/hadoop-common-2.7.3.2.6.2.0-205.jar

./usr/lib/ambari-server/hadoop-common-2.7.2.jar

./usr/hdf/3.1.0.0-564/nifi/ext/ranger/install/lib/hadoop-common-2.7.3.jar

./usr/hdf/3.0.2.0-76/nifi/ext/ranger/install/lib/hadoop-common-2.7.3.jar

So I really don't know how NiFi managed to find a hadoop-common jar file, or something else containing the Configuration class, that does not have the reloadExistingConfigurations() method. We have not deployed any customized NAR files to NiFi either; everything is pretty much the default of whatever HDF 3.1 ships for NiFi.
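Since the older Configuration could hide inside any jar on the host, one way to audit every hadoop-common jar at once is to open each one and search its compiled Configuration class for the method name (a rough sketch of my own; `JarScan` is a made-up name, and it only does a byte-level search of the class file's constant pool, not real bytecode analysis):

```java
import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.stream.Stream;
import java.util.zip.ZipEntry;
import java.util.zip.ZipFile;

// JarScan: walk a directory tree and report, for every hadoop-common*.jar,
// whether its bundled Configuration class mentions reloadExistingConfigurations.
public class JarScan {
    static final String ENTRY = "org/apache/hadoop/conf/Configuration.class";
    static final String METHOD = "reloadExistingConfigurations";

    // True if the jar contains ENTRY and its bytes mention METHOD
    // (method names appear verbatim in a class file's constant pool).
    public static boolean hasMethod(Path jar) throws IOException {
        try (ZipFile zf = new ZipFile(jar.toFile())) {
            ZipEntry e = zf.getEntry(ENTRY);
            if (e == null) return false;
            try (InputStream in = zf.getInputStream(e)) {
                // ISO-8859-1 maps bytes 1:1 to chars, so contains() is a safe byte search.
                return new String(in.readAllBytes(), StandardCharsets.ISO_8859_1).contains(METHOD);
            }
        }
    }

    public static void main(String[] args) throws IOException {
        // On the NiFi host you would pass "/" (slow) or a narrower root
        // such as /var/lib/nifi; unreadable directories may abort the walk.
        Path root = Paths.get(args.length > 0 ? args[0] : ".");
        try (Stream<Path> paths = Files.walk(root)) {
            paths.filter(p -> p.getFileName() != null
                           && p.getFileName().toString().matches("hadoop-common.*\\.jar"))
                 .forEach(p -> {
                     try {
                         System.out.println((hasMethod(p) ? "HAS     " : "MISSING ") + p);
                     } catch (IOException ex) {
                         System.out.println("unreadable " + p);
                     }
                 });
        }
    }
}
```

Any jar reported as MISSING is a candidate for the pre-2.8 Configuration that triggers the NoSuchMethodError.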

Please advise. I've spent a whole day on this but can't fix the issue. I appreciate your help.

Answer

I think the Azure JARs you are using require a newer version of hadoop-common than the 2.7.3 one that NiFi is using.

If you look at the Configuration class from 2.7.3, there is no reloadExistingConfigurations method.

It appears to have been introduced sometime during 2.8.x.
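A quick way to confirm which behavior a given classpath produces is a reflection probe (again a sketch of my own; `MethodProbe` is a made-up name; on the NiFi host you would probe org.apache.hadoop.conf.Configuration for reloadExistingConfigurations):

```java
// MethodProbe.java -- check whether a class visible on the current classpath
// declares a given public no-argument method.
public class MethodProbe {

    public static String probe(String className, String methodName) {
        try {
            Class.forName(className).getMethod(methodName);
            return "present";
        } catch (ClassNotFoundException e) {
            return "class not found";
        } catch (NoSuchMethodException e) {
            return "method missing";
        }
    }

    public static void main(String[] args) {
        // On the NiFi host:
        //   probe("org.apache.hadoop.conf.Configuration", "reloadExistingConfigurations")
        // "method missing" there would confirm a pre-2.8 hadoop-common wins on the classpath.
        System.out.println(probe("java.lang.String", "trim"));
    }
}
```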
