Hadoop-2.5.1 + Nutch-2.2.1: Found interface org.apache.hadoop.mapreduce.TaskAttemptContext, but class was expected
Command: ./crawl /urls /mydir xxxxx 2
When I run this command with Hadoop 2.5.1 and Nutch 2.2.1, I get the following error output:
14/10/07 19:58:10 INFO mapreduce.Job: Running job: job_1411692996443_0016
14/10/07 19:58:17 INFO mapreduce.Job: Job job_1411692996443_0016 running in uber mode : false
14/10/07 19:58:17 INFO mapreduce.Job:  map 0% reduce 0%
14/10/07 19:58:21 INFO mapreduce.Job: Task Id : attempt_1411692996443_0016_m_000000_0, Status : FAILED
Error: Found interface org.apache.hadoop.mapreduce.TaskAttemptContext, but class was expected
14/10/07 19:58:26 INFO mapreduce.Job: Task Id : attempt_1411692996443_0016_m_000000_1, Status : FAILED
Error: Found interface org.apache.hadoop.mapreduce.TaskAttemptContext, but class was expected
14/10/07 19:58:31 INFO mapreduce.Job: Task Id : attempt_1411692996443_0016_m_000000_2, Status : FAILED
Error: Found interface org.apache.hadoop.mapreduce.TaskAttemptContext, but class was expected
14/10/07 19:58:36 INFO mapreduce.Job:  map 100% reduce 0%
14/10/07 19:58:36 INFO mapreduce.Job: Job job_1411692996443_0016 failed with state FAILED due to: Task failed task_1411692996443_0016_m_000000
Job failed as tasks failed. failedMaps:1 failedReduces:0
14/10/07 19:58:36 INFO mapreduce.Job: Counters: 12
        Job Counters
                Failed map tasks=4
                Launched map tasks=4
                Other local map tasks=3
                Data-local map tasks=1
                Total time spent by all maps in occupied slots (ms)=11785
                Total time spent by all reduces in occupied slots (ms)=0
                Total time spent by all map tasks (ms)=11785
                Total vcore-seconds taken by all map tasks=11785
                Total megabyte-seconds taken by all map tasks=12067840
        Map-Reduce Framework
                CPU time spent (ms)=0
                Physical memory (bytes) snapshot=0
                Virtual memory (bytes) snapshot=0
14/10/07 19:58:36 ERROR crawl.InjectorJob: InjectorJob: java.lang.RuntimeException: job failed: name=[/mydir]inject /urls, jobId=job_1411692996443_0016
        at org.apache.nutch.util.NutchJob.waitForCompletion(NutchJob.java:55)
        at org.apache.nutch.crawl.InjectorJob.run(InjectorJob.java:233)
        at org.apache.nutch.crawl.InjectorJob.inject(InjectorJob.java:251)
        at org.apache.nutch.crawl.InjectorJob.run(InjectorJob.java:273)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
        at org.apache.nutch.crawl.InjectorJob.main(InjectorJob.java:282)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:483)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
You are probably using Gora (or something else) that was compiled against Hadoop 1 (pulled from the Maven repo?). In Hadoop 1, TaskAttemptContext was a class; in Hadoop 2 it became an interface, so code compiled against one will not link against the other. You can download Gora (0.5?) and build it against Hadoop 2.
This is perhaps only the first problem in a series of problems, so please post back with your next steps.
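A rough sketch of the rebuild described above. The Git tag, version numbers, and the assumption that the locally built Gora is picked up by Nutch's Ivy resolution are all unverified for this setup; adjust to match your environment:

```shell
# Sketch only: rebuild Gora against Hadoop 2, then rebuild Nutch 2.x on top of it.
# Tag name and versions below are assumptions, not verified.
git clone https://github.com/apache/gora.git
cd gora
git checkout apache-gora-0.5          # hypothetical tag for the 0.5 release
mvn clean install -DskipTests         # installs Gora artifacts into the local Maven repo

# Point Nutch at the locally built Gora version by editing the gora
# dependency entries in ivy/ivy.xml, then rebuild the Nutch runtime:
cd ../nutch-2.2.1
ant clean runtime                     # Nutch 2.x builds with Ant + Ivy
```

The key point is that every Hadoop-facing jar on the job's classpath (Gora, its datastore backends, and Nutch itself) must be compiled against the same Hadoop major version; rebuilding only one of them can leave the same IncompatibleClassChangeError in place.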
hadoop solr nutch