HDFS write resulting in "CreateSymbolicLink error (1314): A required privilege is not held by the client."


Problem description


I tried to execute a sample MapReduce program from Apache Hadoop and got the exception below while the job was running. I tried hdfs dfs -chmod 777 / but that didn't fix the issue.

15/03/10 13:13:10 WARN mapreduce.JobSubmitter: Hadoop command-line option parsing not performed. Implement the Tool interface and execute your application with
ToolRunner to remedy this.
15/03/10 13:13:10 WARN mapreduce.JobSubmitter: No job jar file set.  User classes may not be found. See Job or Job#setJar(String).
15/03/10 13:13:10 INFO input.FileInputFormat: Total input paths to process : 2
15/03/10 13:13:11 INFO mapreduce.JobSubmitter: number of splits:2
15/03/10 13:13:11 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1425973278169_0001
15/03/10 13:13:12 INFO mapred.YARNRunner: Job jar is not present. Not adding any jar to the list of resources.
15/03/10 13:13:12 INFO impl.YarnClientImpl: Submitted application application_1425973278169_0001
15/03/10 13:13:12 INFO mapreduce.Job: The url to track the job: http://B2ML10803:8088/proxy/application_1425973278169_0001/
15/03/10 13:13:12 INFO mapreduce.Job: Running job: job_1425973278169_0001
15/03/10 13:13:18 INFO mapreduce.Job: Job job_1425973278169_0001 running in uber mode : false
15/03/10 13:13:18 INFO mapreduce.Job:  map 0% reduce 0%
15/03/10 13:13:18 INFO mapreduce.Job: Job job_1425973278169_0001 failed with state FAILED due to: Application application_1425973278169_0001 failed 2 times due to AM Container for appattempt_1425973278169_0001_000002 exited with exitCode: 1
For more detailed output, check application tracking page: http://B2ML10803:8088/proxy/application_1425973278169_0001/ Then, click on links to logs of each attempt.
Diagnostics: Exception from container-launch.
Container id: container_1425973278169_0001_02_000001
Exit code: 1
Exception message: CreateSymbolicLink error (1314): A required privilege is not held by the client.

Stack trace:

ExitCodeException exitCode=1: CreateSymbolicLink error (1314): A required privilege is not held by the client.

    at org.apache.hadoop.util.Shell.runCommand(Shell.java:538)
    at org.apache.hadoop.util.Shell.run(Shell.java:455)
    at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:715)
    at org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:211)
    at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:302)
    at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:82)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)

Shell output:

1 file(s) moved.

Container exited with a non-zero exit code 1
Failing this attempt. Failing the application.
15/03/10 13:13:18 INFO mapreduce.Job: Counters: 0

Solution

Windows 8.1 + Hadoop 2.7.0 (built from source)

  1. Run Command Prompt in admin mode

  2. Execute etc\hadoop\hadoop-env.cmd

  3. Run sbin\start-dfs.cmd

  4. Run sbin\start-yarn.cmd

  5. Now try to run your job
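The admin-mode step matters because, on Windows, the CreateSymbolicLink API requires the SeCreateSymbolicLinkPrivilege, which non-elevated processes do not hold by default, and YARN's NodeManager creates symbolic links when localizing container resources. The steps above can be sketched as the following batch session (a sketch only; the %HADOOP_HOME% location and the example jar invocation are assumptions for illustration, not commands from the answer):

```shell
REM Run everything below from a Command Prompt opened with "Run as administrator".
REM %HADOOP_HOME% is assumed to point at your Hadoop 2.7.0 build directory.
cd /d %HADOOP_HOME%

REM Load the Hadoop environment (JAVA_HOME, HADOOP_CONF_DIR, ...)
call etc\hadoop\hadoop-env.cmd

REM Start HDFS (NameNode + DataNode) and YARN (ResourceManager + NodeManager)
call sbin\start-dfs.cmd
call sbin\start-yarn.cmd

REM Re-submit the job from this same elevated session, e.g. a bundled example:
REM bin\yarn jar share\hadoop\mapreduce\hadoop-mapreduce-examples-2.7.0.jar wordcount /in /out
```

An alternative sometimes suggested is granting the "Create symbolic links" user right to the Hadoop user via Local Security Policy (secpol.msc, under Local Policies, User Rights Assignment), which avoids running the daemons as administrator.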
