Running Hadoop in Local (Standalone) Mode on Linux
After the run completes, you can look under the tmp directory at the hadoop-root and hsperfdata_root directories. hsperfdata_root is actually empty, but hadoop-root contains many temporary directories and files:
[root@www.linuxidc.com /]# ls -l -R tmp/hadoop-root
tmp/hadoop-root:
total 8
drwxr-xr-x 4 root root 4096 09-24 19:43 mapred
tmp/hadoop-root/mapred:
total 16
drwxr-xr-x 4 root root 4096 09-24 19:43 local
drwxr-xr-x 2 root root 4096 09-25 16:32 system
tmp/hadoop-root/mapred/local:
total 16
drwxr-xr-x 2 root root 4096 09-25 16:32 localRunner
drwxr-xr-x 3 root root 4096 09-24 19:43 taskTracker
tmp/hadoop-root/mapred/local/localRunner:
total 8
-rw-r--r-- 1 root root 104 09-25 16:32 split.dta
tmp/hadoop-root/mapred/local/taskTracker:
total 8
drwxr-xr-x 3 root root 4096 09-24 19:43 jobcache
tmp/hadoop-root/mapred/local/taskTracker/jobcache:
total 8
drwxr-xr-x 10 root root 4096 09-25 16:32 job_local_0001
tmp/hadoop-root/mapred/local/taskTracker/jobcache/job_local_0001:
total 64
drwxr-xr-x 2 root root 4096 09-25 16:32 attempt_local_0001_m_000000_0
drwxr-xr-x 2 root root 4096 09-25 16:32 attempt_local_0001_m_000001_0
drwxr-xr-x 2 root root 4096 09-25 16:32 attempt_local_0001_m_000002_0
drwxr-xr-x 2 root root 4096 09-25 16:32 attempt_local_0001_m_000003_0
drwxr-xr-x 2 root root 4096 09-25 16:32 attempt_local_0001_m_000004_0
drwxr-xr-x 2 root root 4096 09-25 16:32 attempt_local_0001_m_000005_0
drwxr-xr-x 2 root root 4096 09-25 16:32 attempt_local_0001_m_000006_0
drwxr-xr-x 2 root root 4096 09-25 16:32 attempt_local_0001_r_000000_0
tmp/hadoop-root/mapred/local/taskTracker/jobcache/job_local_0001/attempt_local_0001_m_000000_0:
total 0
tmp/hadoop-root/mapred/local/taskTracker/jobcache/job_local_0001/attempt_local_0001_m_000001_0:
total 0
tmp/hadoop-root/mapred/local/taskTracker/jobcache/job_local_0001/attempt_local_0001_m_000002_0:
total 0
tmp/hadoop-root/mapred/local/taskTracker/jobcache/job_local_0001/attempt_local_0001_m_000003_0:
total 0
tmp/hadoop-root/mapred/local/taskTracker/jobcache/job_local_0001/attempt_local_0001_m_000004_0:
total 0
tmp/hadoop-root/mapred/local/taskTracker/jobcache/job_local_0001/attempt_local_0001_m_000005_0:
total 0
tmp/hadoop-root/mapred/local/taskTracker/jobcache/job_local_0001/attempt_local_0001_m_000006_0:
total 0
tmp/hadoop-root/mapred/local/taskTracker/jobcache/job_local_0001/attempt_local_0001_r_000000_0:
total 0
tmp/hadoop-root/mapred/system:
total 0

If the job is not run as the root user, many problems show up; I suspect they are permission issues.
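The /tmp/hadoop-root path comes from Hadoop's hadoop.tmp.dir setting, which defaults to /tmp/hadoop-${user.name}; a root-owned tree there is one plausible source of the permission trouble. A minimal sketch of checking and fixing ownership (paths and user name taken from the transcript above; the chown/rm lines are commented out since they are destructive):

```shell
# Inspect the scratch tree left behind by the root-owned run.
# hadoop.tmp.dir defaults to /tmp/hadoop-${user.name}, hence /tmp/hadoop-root.
for d in /tmp/hadoop-root /tmp/hsperfdata_root; do
    if [ -e "$d" ]; then
        ls -ld "$d"      # the third column shows the owner (here: root)
    fi
done
# To let a non-root user run jobs cleanly, either hand the tree over:
#   chown -R shiyanjun /tmp/hadoop-root
# or remove it so Hadoop recreates it with the right owner on the next run:
#   rm -rf /tmp/hadoop-root
```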
Switch from root to the user shiyanjun:
[root@www.linuxidc.com /]# su shiyanjun

Set up key-based authentication and log in to 127.0.0.1 over ssh:
[shiyanjun@www.linuxidc.com hadoop-0.18.0]$ ssh-keygen
Generating public/private rsa key pair.
Enter file in which to save the key (/home/shiyanjun/.ssh/id_rsa):
/home/shiyanjun/.ssh/id_rsa already exists.
Overwrite (y/n)? y
Enter passphrase (empty for no passphrase):
Enter same passphrase again:
Your identification has been saved in /home/shiyanjun/.ssh/id_rsa.
Your public key has been saved in /home/shiyanjun/.ssh/id_rsa.pub.
The key fingerprint is:
76:7d:0c:8c:77:81:6c:eb:d9:7e:b2:d2:87:d0:ac:61 shiyanjun@www.linuxidc.com
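Note that the `ssh localhost` below still prompts for a password, which means the freshly generated public key was never installed. A minimal sketch of the missing step, assuming the default id_rsa paths shown above:

```shell
# Append the new public key to the list of authorized keys so that
# ssh can authenticate with the key instead of a password.
if [ -f ~/.ssh/id_rsa.pub ]; then
    cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
    chmod 700 ~/.ssh                     # sshd ignores keys kept in loosely
    chmod 600 ~/.ssh/authorized_keys     # permissioned directories/files
fi
```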
[shiyanjun@www.linuxidc.com hadoop-0.18.0]$ ssh localhost
shiyanjun@localhost's password:
Last login: Wed Sep 24 16:30:17 2008

Now, with the input data files in place, run the WordCount example:
[shiyanjun@www.linuxidc.com hadoop-0.18.0]$ bin/hadoop jar hadoop-0.18.0-examples.jar wordcount my-input my-output

It always fails with an exception while creating the output directory:
08/09/25 17:14:24 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
08/09/25 17:14:24 INFO mapred.FileInputFormat: Total input paths to process : 7
08/09/25 17:14:24 INFO mapred.FileInputFormat: Total input paths to process : 7
08/09/25 17:14:25 INFO mapred.JobClient: Running job: job_local_0001
08/09/25 17:14:25 INFO mapred.FileInputFormat: Total input paths to process : 7
08/09/25 17:14:25 INFO mapred.FileInputFormat: Total input paths to process : 7
08/09/25 17:14:25 ERROR mapred.LocalJobRunner: Mkdirs failed to create file:/home/shiyanjun/hadoop-0.18.0/my-output/_temporary
08/09/25 17:14:25 WARN mapred.LocalJobRunner: job_local_0001
java.io.IOException: The directory file:/home/shiyanjun/hadoop-0.18.0/my-output/_temporary doesnt exist
at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:148)
java.io.IOException: Job failed!
at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1113)
at org.apache.hadoop.examples.WordCount.run(WordCount.java:149)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
at org.apache.hadoop.examples.WordCount.main(WordCount.java:155)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
at org.apache.hadoop.examples.ExampleDriver.main(ExampleDriver.java:53)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.util.RunJar.main(RunJar.java:155)
at org.apache.hadoop.mapred.JobShell.run(JobShell.java:54)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
at org.apache.hadoop.mapred.JobShell.main(JobShell.java:68)
Unresolved for now...
I have only just started using Linux, so I am not yet fluent with this kind of configuration. If you know how to configure this correctly, please share; thank you.
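A hedged guess at the cause, not a confirmed fix: `Mkdirs failed` in LocalJobRunner usually means the current user cannot write to the parent directory — here, plausibly because the hadoop-0.18.0 tree or the /tmp/hadoop-root scratch directory was created by root earlier. A sketch of what to check (paths taken from the log above; the destructive commands are commented out):

```shell
# Verify that user shiyanjun owns the directories the job must write to.
for d in /home/shiyanjun/hadoop-0.18.0 /tmp/hadoop-root; do
    if [ -e "$d" ]; then
        ls -ld "$d"          # owner column should read shiyanjun, not root
    fi
done
# If anything is owned by root, fix it as root and retry the job:
#   chown -R shiyanjun /home/shiyanjun/hadoop-0.18.0
#   rm -rf /tmp/hadoop-root
#   rm -rf /home/shiyanjun/hadoop-0.18.0/my-output   # output dir must not pre-exist
```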