Error in Pig while loading data


Problem Description

I am using Ubuntu 12.02 (32-bit) and have installed Hadoop 2.2.0 and Pig 0.12 successfully. Hadoop runs properly on my system.

However, whenever I run this command:

data = load 'atoz.csv' using PigStorage(',') as (aa1:int, bb1:int, cc1:int, dd1:chararray);
dump data;

I get the following error:

ERROR org.apache.hadoop.mapreduce.lib.jobcontrol.JobControl - Error while trying to run jobs. java.lang.IncompatibleClassChangeError: Found interface org.apache.hadoop.mapreduce.JobContext, but class was expected.

Here is the full stack trace:

2014-01-23 10:41:44,998 [main] INFO  org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - 1 map-reduce job(s) waiting for submission.
2014-01-23 10:41:45,000 [Thread-9] INFO  org.apache.hadoop.metrics.jvm.JvmMetrics - Cannot initialize JVM Metrics with processName=JobTracker, sessionId= - already initialized
2014-01-23 10:41:45,001 [Thread-9] ERROR org.apache.hadoop.mapreduce.lib.jobcontrol.JobControl - Error while trying to run jobs.
java.lang.IncompatibleClassChangeError: Found interface org.apache.hadoop.mapreduce.JobContext, but class was expected
    at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputFormat.setupUdfEnvAndStores(PigOutputFormat.java:225)
    at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputFormat.checkOutputSpecs(PigOutputFormat.java:186)
    at org.apache.hadoop.mapreduce.JobSubmitter.checkSpecs(JobSubmitter.java:456)
    at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:342)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1268)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1265)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:1265)
    at org.apache.hadoop.mapreduce.lib.jobcontrol.ControlledJob.submit(ControlledJob.java:335)
    at org.apache.hadoop.mapreduce.lib.jobcontrol.JobControl.run(JobControl.java:240)
    at java.lang.Thread.run(Thread.java:724)
    at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher$1.run(MapReduceLauncher.java:260)
2014-01-23 10:41:45,498 [main] INFO  org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - 0% complete
2014-01-23 10:41:45,502 [main] INFO  org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - job null has failed! Stop running all dependent jobs
2014-01-23 10:41:45,503 [main] INFO  org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - 100% complete
2014-01-23 10:41:45,507 [main] ERROR org.apache.pig.tools.pigstats.SimplePigStats - ERROR 2997: Unable to recreate exception from backend error: Unexpected System Error Occured: java.lang.IncompatibleClassChangeError: Found interface org.apache.hadoop.mapreduce.JobContext, but class was expected
    at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputFormat.setupUdfEnvAndStores(PigOutputFormat.java:225)
    at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputFormat.checkOutputSpecs(PigOutputFormat.java:186)
    at org.apache.hadoop.mapreduce.JobSubmitter.checkSpecs(JobSubmitter.java:456)
    at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:342)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1268)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1265)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:1265)
    at org.apache.hadoop.mapreduce.lib.jobcontrol.ControlledJob.submit(ControlledJob.java:335)
    at org.apache.hadoop.mapreduce.lib.jobcontrol.JobControl.run(JobControl.java:240)
    at java.lang.Thread.run(Thread.java:724)
    at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher$1.run(MapReduceLauncher.java:260)
2014-01-23 10:41:45,507 [main] ERROR org.apache.pig.tools.pigstats.PigStatsUtil - 1 map reduce job(s) failed!
2014-01-23 10:41:45,507 [main] INFO  org.apache.pig.tools.pigstats.SimplePigStats - Detected Local mode. Stats reported below may be incomplete
2014-01-23 10:41:45,508 [main] INFO  org.apache.pig.tools.pigstats.SimplePigStats - Script Statistics:

HadoopVersion    PigVersion    UserId    StartedAt    FinishedAt    Features
2.2.0    0.10.1    hardik    2014-01-23 10:41:44    2014-01-23 10:41:45    UNKNOWN

Failed!

Failed Jobs:
JobId    Alias    Feature    Message    Outputs
N/A    aatoz    MAP_ONLY    Message: Unexpected System Error Occured: java.lang.IncompatibleClassChangeError: Found interface org.apache.hadoop.mapreduce.JobContext, but class was expected
    at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputFormat.setupUdfEnvAndStores(PigOutputFormat.java:225)
    at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputFormat.checkOutputSpecs(PigOutputFormat.java:186)
    at org.apache.hadoop.mapreduce.JobSubmitter.checkSpecs(JobSubmitter.java:456)
    at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:342)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1268)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1265)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:1265)
    at org.apache.hadoop.mapreduce.lib.jobcontrol.ControlledJob.submit(ControlledJob.java:335)
    at org.apache.hadoop.mapreduce.lib.jobcontrol.JobControl.run(JobControl.java:240)
    at java.lang.Thread.run(Thread.java:724)
    at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher$1.run(MapReduceLauncher.java:260)
    file:/tmp/temp1979716161/tmp-189979005,

Input(s):
Failed to read data from "file:///home/hardik/pig10/bin/input/atoz.csv"

Output(s):
Failed to produce result in "file:/tmp/temp1979716161/tmp-189979005"

Job DAG:
null

2014-01-23 10:41:45,509 [main] INFO  org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - Failed!
2014-01-23 10:41:45,510 [main] ERROR org.apache.pig.tools.grunt.Grunt - ERROR 1066: Unable to open iterator for alias aatoz
Details at logfile: /home/hardik/pig10/bin/pig_1390453192689.log

Recommended Answer

Apache Pig 0.12.0 expects an older version of Hadoop by default. You must recompile Pig against Hadoop 2.2.0 and replace the two jars with the newly built pig-0.12.1-SNAPSHOT.jar and pig-0.12.1-SNAPSHOT-withouthadoop.jar.

To recompile, unpack the Pig source archive, go to the directory "pig-0.12.0" and run:

ant clean jar-all -Dhadoopversion=23
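
Putting the fix together end to end, here is a minimal sketch of the rebuild-and-replace workflow. It assumes the unpacked Pig 0.12.0 source tree lives in ~/pig-0.12.0, that the running Pig installation is pointed to by $PIG_HOME, and that the two statements from the question are saved in a script named atoz.pig; all three names are placeholders, so adjust them to your layout:

cd ~/pig-0.12.0
# Build Pig against the Hadoop 2.x ("23") API instead of the default Hadoop 1.x API.
ant clean jar-all -Dhadoopversion=23

# Locate the freshly built jars; the ant build usually leaves them in the source root,
# but the exact location can vary, so find them first.
find . -name "pig-0.12.1-SNAPSHOT*.jar"

# Replace the jars shipped with the installation with the Hadoop-2 builds
# (back up the originals first if you want to be able to roll back).
cp pig-0.12.1-SNAPSHOT.jar "$PIG_HOME/"
cp pig-0.12.1-SNAPSHOT-withouthadoop.jar "$PIG_HOME/"

# Re-run the script in local mode; the IncompatibleClassChangeError should be gone.
pig -x local atoz.pig

The error itself comes from an API change between Hadoop 1, where org.apache.hadoop.mapreduce.JobContext is a class, and Hadoop 2, where it is an interface; once Pig's MapReduce layer is compiled against the Hadoop 2 API, PigOutputFormat no longer trips over that mismatch.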
