[root@master ~]# hadoop
Usage: hadoop [--config confdir] COMMAND
where COMMAND is one of:
  namenode -format     format the DFS filesystem
  secondarynamenode    run the DFS secondary namenode
  namenode             run the DFS namenode
  datanode             run a DFS datanode
  dfsadmin             run a DFS admin client
  mradmin              run a Map-Reduce admin client
  fsck                 run a DFS filesystem checking utility
  fs                   run a generic filesystem user client
  balancer             run a cluster balancing utility
  jobtracker           run the MapReduce job Tracker node
  pipes                run a Pipes job
  tasktracker          run a MapReduce task Tracker node
  job                  manipulate MapReduce jobs
  queue                get information regarding JobQueues
  version              print the version
  jar <jar>            run a jar file
  distcp <srcurl> <desturl> copy file or directories recursively
  archive -archiveName NAME <src>* <dest> create a hadoop archive
  daemonlog            get/set the log level for each daemon
 or
  CLASSNAME            run the class named CLASSNAME
Most commands print help when invoked w/o parameters.
[root@slave1 hadoop-0.20.2]# hadoop jar hadoop-0.20.2-examples.jar pi 4 2
Number of Maps = 4
Samples per Map = 2
Wrote input for Map #0
Wrote input for Map #1
Wrote input for Map #2
Wrote input for Map #3
Starting Job
12/05/20 09:45:19 INFO mapred.FileInputFormat: Total input paths to process : 4
12/05/20 09:45:19 INFO mapred.JobClient: Running job: job_201205190417_0005
12/05/20 09:45:20 INFO mapred.JobClient: map 0% reduce 0%
12/05/20 09:45:30 INFO mapred.JobClient: map 50% reduce 0%
12/05/20 09:45:31 INFO mapred.JobClient: map 100% reduce 0%
12/05/20 09:45:45 INFO mapred.JobClient: map 100% reduce 100%
12/05/20 09:45:47 INFO mapred.JobClient: Job complete: job_201205190417_0005
12/05/20 09:45:47 INFO mapred.JobClient: Counters: 18
12/05/20 09:45:47 INFO mapred.JobClient:   Job Counters
12/05/20 09:45:47 INFO mapred.JobClient:     Launched reduce tasks=1
12/05/20 09:45:47 INFO mapred.JobClient:     Launched map tasks=4
12/05/20 09:45:47 INFO mapred.JobClient:     Data-local map tasks=4
12/05/20 09:45:47 INFO mapred.JobClient:   FileSystemCounters
12/05/20 09:45:47 INFO mapred.JobClient:     FILE_BYTES_READ=94
12/05/20 09:45:47 INFO mapred.JobClient:     HDFS_BYTES_READ=472
12/05/20 09:45:47 INFO mapred.JobClient:     FILE_BYTES_WRITTEN=334
12/05/20 09:45:47 INFO mapred.JobClient:     HDFS_BYTES_WRITTEN=215
12/05/20 09:45:47 INFO mapred.JobClient:   Map-Reduce Framework
12/05/20 09:45:47 INFO mapred.JobClient:     Reduce input groups=8
12/05/20 09:45:47 INFO mapred.JobClient:     Combine output records=0
12/05/20 09:45:47 INFO mapred.JobClient:     Map input records=4
12/05/20 09:45:47 INFO mapred.JobClient:     Reduce shuffle bytes=112
12/05/20 09:45:47 INFO mapred.JobClient:     Reduce output records=0
12/05/20 09:45:47 INFO mapred.JobClient:     Spilled Records=16
12/05/20 09:45:47 INFO mapred.JobClient:     Map output bytes=72
12/05/20 09:45:47 INFO mapred.JobClient:     Map input bytes=96
12/05/20 09:45:47 INFO mapred.JobClient:     Combine input records=0
12/05/20 09:45:47 INFO mapred.JobClient:     Map output records=8
12/05/20 09:45:47 INFO mapred.JobClient:     Reduce input records=8
Job Finished in 28.952 seconds
Estimated value of Pi is 3.50000000000000000000
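The estimate is coarse because the job used only 4 maps × 2 samples = 8 points. A minimal standalone Python sketch of the underlying quasi-Monte Carlo method (an assumption based on Hadoop's PiEstimator example, which draws points from a 2-D Halton sequence; the helper names below are illustrative, not Hadoop API) shows how 8 such points yield exactly 3.5:

```python
# Sketch of the pi example's quasi-Monte Carlo estimate (illustrative only).
# Assumption: like Hadoop's PiEstimator, points come from a 2-D Halton
# sequence (bases 2 and 3) and are tested against a circle of radius 0.5
# inscribed in the unit square.

def radical_inverse(i, base):
    """Radical inverse of index i in the given base (one Halton coordinate)."""
    inv, f = 0.0, 1.0 / base
    while i > 0:
        inv += f * (i % base)
        i //= base
        f /= base
    return inv

def estimate_pi(num_maps, samples_per_map):
    """Estimate pi from num_maps * samples_per_map quasi-random points."""
    total = num_maps * samples_per_map
    inside = 0
    for i in range(1, total + 1):
        x = radical_inverse(i, 2)  # x coordinate in [0, 1)
        y = radical_inverse(i, 3)  # y coordinate in [0, 1)
        # Does the point fall inside the circle centered at (0.5, 0.5)?
        if (x - 0.5) ** 2 + (y - 0.5) ** 2 <= 0.25:
            inside += 1
    # Area ratio circle/square = pi/4, so pi is about 4 * inside/total.
    return 4.0 * inside / total

print(estimate_pi(4, 2))  # -> 3.5, matching the job output above
```

With so few samples the result is far from pi; rerunning the job with more maps and samples (for example `hadoop jar hadoop-0.20.2-examples.jar pi 10 1000000`) tightens the estimate considerably.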