Monday, May 2, 2016

Hadoop - Practicing HDFS Basic Commands

notroot@ubuntu:~$ hadoop
Usage: hadoop [--config confdir] COMMAND
where COMMAND is one of:
  namenode -format     format the DFS filesystem
  secondarynamenode    run the DFS secondary namenode
  namenode             run the DFS namenode
  datanode             run a DFS datanode
  dfsadmin             run a DFS admin client
  mradmin              run a Map-Reduce admin client
  fsck                 run a DFS filesystem checking utility
  fs                   run a generic filesystem user client
  balancer             run a cluster balancing utility
  fetchdt              fetch a delegation token from the NameNode
  jobtracker           run the MapReduce job Tracker node
  pipes                run a Pipes job
  tasktracker          run a MapReduce task Tracker node
  historyserver        run job history servers as a standalone daemon
  job                  manipulate MapReduce jobs
  queue                get information regarding JobQueues
  version              print the version
  jar <jar>            run a jar file
  distcp <srcurl> <desturl> copy file or directories recursively
  archive -archiveName NAME -p <parent path> <src>* <dest> create a hadoop archive
  classpath            prints the class path needed to get the
                       Hadoop jar and the required libraries
  daemonlog            get/set the log level for each daemon
 or
  CLASSNAME            run the class named CLASSNAME
Most commands print help when invoked w/o parameters.
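
Most of what follows uses the fs subcommand, the generic file system shell. Its per-command usage can be pulled up from the shell itself, for example:

hadoop fs -help
hadoop fs -help ls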

balancer - runs a cluster balancing utility that redistributes blocks across DataNodes. An administrator can simply press Ctrl-C to stop the rebalancing process. See the Balancer documentation for more details.
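
The balancer also takes a threshold argument, a percentage of disk capacity that defines how far a DataNode may deviate from the average usage before blocks are moved (the default is 10). A sketch of an invocation with a tighter threshold, using an illustrative value of 5:

hadoop balancer -threshold 5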

notroot@ubuntu:~$ hadoop balancer
16/05/03 01:07:11 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8020. Already tried 0 time(s).
16/05/03 01:07:12 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8020. Already tried 1 time(s).
16/05/03 01:07:13 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8020. Already tried 2 time(s).
16/05/03 01:07:14 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8020. Already tried 3 time(s).
16/05/03 01:07:15 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8020. Already tried 4 time(s).
16/05/03 01:07:16 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8020. Already tried 5 time(s).
16/05/03 01:07:17 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8020. Already tried 6 time(s).
16/05/03 01:07:18 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8020. Already tried 7 time(s).
16/05/03 01:07:19 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8020. Already tried 8 time(s).
16/05/03 01:07:20 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8020. Already tried 9 time(s).
Received an IO exception: Call to localhost/127.0.0.1:8020 failed on connection exception: java.net.ConnectException: Connection refused . Exiting...
Balancing took 11.778 seconds

notroot@ubuntu:~$ hadoop version
Hadoop 1.0.3
Subversion https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.0 -r 1335192
Compiled by hortonfo on Tue May  8 20:31:25 UTC 2012
From source with checksum e6b0c1e23dcf76907c5fecb4b832f3be

notroot@ubuntu:~$ hadoop fs -ls hdfs:/
16/05/03 01:08:44 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8020. Already tried 0 time(s).
16/05/03 01:08:45 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8020. Already tried 1 time(s).
16/05/03 01:08:46 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8020. Already tried 2 time(s).
16/05/03 01:08:47 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8020. Already tried 3 time(s).
16/05/03 01:08:48 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8020. Already tried 4 time(s).
16/05/03 01:08:49 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8020. Already tried 5 time(s).
16/05/03 01:08:50 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8020. Already tried 6 time(s).
16/05/03 01:08:51 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8020. Already tried 7 time(s).
16/05/03 01:08:52 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8020. Already tried 8 time(s).
16/05/03 01:08:53 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8020. Already tried 9 time(s).
Bad connection to FS. command aborted. exception: Call to localhost/127.0.0.1:8020 failed on connection exception: java.net.ConnectException: Connection refused
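
Both failures above have the same cause: nothing is listening on localhost:8020, the NameNode RPC address this client is configured to talk to, because the Hadoop daemons have not been started yet. A quick way to confirm that before suspecting the configuration (assuming netstat is installed; the port number comes from the retry messages above):

jps
netstat -tln | grep 8020

If jps shows no NameNode and netstat prints nothing for the port, the daemons simply are not running, which is what start-all.sh below fixes.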

notroot@ubuntu:~$ start-all.sh
starting namenode, logging to /home/notroot/lab/software/hadoop-1.0.3/libexec/../logs/hadoop-notroot-namenode-ubuntu.out
localhost: starting datanode, logging to /home/notroot/lab/software/hadoop-1.0.3/libexec/../logs/hadoop-notroot-datanode-ubuntu.out
localhost: starting secondarynamenode, logging to /home/notroot/lab/software/hadoop-1.0.3/libexec/../logs/hadoop-notroot-secondarynamenode-ubuntu.out
starting jobtracker, logging to /home/notroot/lab/software/hadoop-1.0.3/libexec/../logs/hadoop-notroot-jobtracker-ubuntu.out
localhost: starting tasktracker, logging to /home/notroot/lab/software/hadoop-1.0.3/libexec/../logs/hadoop-notroot-tasktracker-ubuntu.out

notroot@ubuntu:~$ jps
3248 DataNode
3016 NameNode
3479 SecondaryNameNode
3910 Jps
3804 TaskTracker
3560 JobTracker
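
All five daemons of a Hadoop 1.x single-node setup are now running: NameNode, SecondaryNameNode, DataNode, JobTracker and TaskTracker. If one of them is missing from the jps listing, the first place to look is the logs directory printed by start-all.sh, for example the .out file it names (and the matching .log file alongside it):

tail -n 50 /home/notroot/lab/software/hadoop-1.0.3/libexec/../logs/hadoop-notroot-namenode-ubuntu.out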

notroot@ubuntu:~$ hadoop fs -ls hdfs:/
Found 1 items
drwxr-xr-x   - notroot supergroup          0 2016-05-01 07:24 /home


notroot@ubuntu:~$ hadoop fsck /
FSCK started by notroot from /127.0.0.1 for path / at Tue May 03 01:22:23 UTC 2016
.Status: HEALTHY
 Total size:    4 B
 Total dirs:    6
 Total files:   1
 Total blocks (validated):      1 (avg. block size 4 B)
 Minimally replicated blocks:   1 (100.0 %)
 Over-replicated blocks:        0 (0.0 %)
 Under-replicated blocks:       0 (0.0 %)
 Mis-replicated blocks:         0 (0.0 %)
 Default replication factor:    1
 Average block replication:     1.0
 Corrupt blocks:                0
 Missing replicas:              0 (0.0 %)
 Number of data-nodes:          1
 Number of racks:               1
FSCK ended at Tue May 03 01:22:23 UTC 2016 in 10 milliseconds


The filesystem under path '/' is HEALTHY
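
fsck can also report per-file block placement, which becomes useful once real data has been loaded. A more verbose run uses the standard fsck flags:

hadoop fsck / -files -blocks -locations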

notroot@ubuntu:~$ hadoop version
Hadoop 1.0.3
Subversion https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.0 -r 1335192
Compiled by hortonfo on Tue May  8 20:31:25 UTC 2012
From source with checksum e6b0c1e23dcf76907c5fecb4b832f3be

notroot@ubuntu:~$ hadoop balancer
Time Stamp               Iteration#  Bytes Already Moved  Bytes Left To Move  Bytes Being Moved
16/05/03 01:23:35 INFO net.NetworkTopology: Adding a new node: /default-rack/127.0.0.1:50010
16/05/03 01:23:35 INFO balancer.Balancer: 0 over utilized nodes:
16/05/03 01:23:35 INFO balancer.Balancer: 1 under utilized nodes:  127.0.0.1:50010
The cluster is balanced. Exiting...
Balancing took 2.405 seconds

notroot@ubuntu:~$ hadoop balancer
Time Stamp               Iteration#  Bytes Already Moved  Bytes Left To Move  Bytes Being Moved
16/05/03 01:23:50 INFO net.NetworkTopology: Adding a new node: /default-rack/127.0.0.1:50010
16/05/03 01:23:50 INFO balancer.Balancer: 0 over utilized nodes:
16/05/03 01:23:50 INFO balancer.Balancer: 1 under utilized nodes:  127.0.0.1:50010
The cluster is balanced. Exiting...
Balancing took 2.32 seconds
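
With the daemons up the balancer connects fine, but on a cluster with a single DataNode there is nothing to move between nodes, so it reports "The cluster is balanced" and exits almost immediately. The per-DataNode capacity and usage figures it balances against can be inspected with:

hadoop dfsadmin -report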

notroot@ubuntu:~$ hadoop fs -ls /
Found 2 items
drwxr-xr-x   - notroot supergroup          0 2016-05-01 07:24 /home
drwxr-xr-x   - notroot supergroup          0 2016-05-03 01:23 /system

notroot@ubuntu:~$ hadoop fs -ls /default-rack
ls: Cannot access /default-rack: No such file or directory.


notroot@ubuntu:~$ hadoop fs -ls  /default-rack/127.0.0.1:50010
ls: java.net.URISyntaxException: Relative path in absolute URI: 127.0.0.1:50010
Usage: java FsShell [-ls <path>]

The two failed -ls calls above are expected: /default-rack/127.0.0.1:50010 is a network-topology location printed by the balancer (rack plus DataNode address), not an HDFS path, and the colon in it makes FsShell reject the string as a malformed URI.

notroot@ubuntu:~$ hadoop fs -ls hdfs:/
Found 2 items
drwxr-xr-x   - notroot supergroup          0 2016-05-01 07:24 /home
drwxr-xr-x   - notroot supergroup          0 2016-05-03 01:23 /system

notroot@ubuntu:~$ hadoop fs -count hdfs:/
           7            1                  4 hdfs://localhost:8020/
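
The three numbers reported by -count are, in order, the directory count, the file count and the total content size in bytes under the given path, followed by the fully qualified path itself; at this point HDFS holds 7 directories and a single 4-byte file. Quota information can be added to the same report with the -q flag:

hadoop fs -count -q hdfs:/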

notroot@ubuntu:~$ hadoop fs -ls hdfs:/
Found 2 items
drwxr-xr-x   - notroot supergroup          0 2016-05-01 07:24 /home
drwxr-xr-x   - notroot supergroup          0 2016-05-03 01:23 /system

notroot@ubuntu:~$ hadoop fs -ls hdfs:/home
Found 1 items
drwxr-xr-x   - notroot supergroup          0 2016-05-01 07:24 /home/notroot

notroot@ubuntu:~$ hadoop fs -ls hdfs:/home/notroot
Found 1 items
drwxr-xr-x   - notroot supergroup          0 2016-05-01 07:24 /home/notroot/lab

notroot@ubuntu:~$ hadoop fs -ls hdfs:/home/notroot/lab
Found 1 items
drwxr-xr-x   - notroot supergroup          0 2016-05-03 01:22 /home/notroot/lab/mapred

notroot@ubuntu:~$ hadoop fs -ls hdfs:/home/notroot/lab/mapred
Found 1 items
drwx------   - notroot supergroup          0 2016-05-03 01:22 /home/notroot/lab/mapred/system

notroot@ubuntu:~$ hadoop fs -ls hdfs:/home/notroot/lab/mapred/system
Found 1 items
-rw-------   1 notroot supergroup          4 2016-05-03 01:22 /home/notroot/lab/mapred/system/jobtracker.info

notroot@ubuntu:~$ hadoop fs -ls hdfs:/home/notroot/lab/mapred
Found 1 items
drwx------   - notroot supergroup          0 2016-05-03 01:22 /home/notroot/lab/mapred/system

notroot@ubuntu:~$ hadoop fs -ls hdfs:/
Found 2 items
drwxr-xr-x   - notroot supergroup          0 2016-05-01 07:24 /home
drwxr-xr-x   - notroot supergroup          0 2016-05-03 01:23 /system

notroot@ubuntu:~$ hadoop fs -ls hdfs:/system

notroot@ubuntu:~$ ls -lt
total 7608
drwxrwxr-x 2 notroot notroot    4096 May  1 11:14 sanjeev
-rw-rw-r-- 1 notroot notroot 7772980 Apr 26 18:07 latest.tar.gz
drwxrwxr-x 3 notroot notroot    4096 Nov 12  2012 downloads
drwxrwxr-x 7 notroot notroot    4096 Sep 26  2012 lab
drwxrwxr-x 4 notroot notroot    4096 Sep 26  2012 backup

notroot@ubuntu:~$ cd sanjeev/

notroot@ubuntu:~/sanjeev$ ls
sample  sample1

notroot@ubuntu:~/sanjeev$ cd ..

notroot@ubuntu:~$ ls -h -lrt
total 7.5M
drwxrwxr-x 4 notroot notroot 4.0K Sep 26  2012 backup
drwxrwxr-x 7 notroot notroot 4.0K Sep 26  2012 lab
drwxrwxr-x 3 notroot notroot 4.0K Nov 12  2012 downloads
-rw-rw-r-- 1 notroot notroot 7.5M Apr 26 18:07 latest.tar.gz
drwxrwxr-x 2 notroot notroot 4.0K May  1 11:14 sanjeev

notroot@ubuntu:~$ rm latest.tar.gz

notroot@ubuntu:~$ cd lab

notroot@ubuntu:~/lab$ ls
data  hdfs  mapred  programs  software

notroot@ubuntu:~/lab$ cd data/

notroot@ubuntu:~/lab/data$ ls -lrt
total 174832
-rw-rw-r-- 1 notroot notroot 81468050 Jan 10  2012 weblogs
-rw-rw-r-- 1 notroot notroot 88328787 May 25  2012 txns
-rw-rw-r-- 1 notroot notroot   391355 Jun  9  2012 custs
-rw-rw-r-- 1 notroot notroot  8828133 Jun 10  2012 txntab
drwxrwxr-x 2 notroot notroot     4096 Sep  5  2012 images

notroot@ubuntu:~/lab/data$ cd ..

notroot@ubuntu:~/lab$ ls
data  hdfs  mapred  programs  software

notroot@ubuntu:~/lab$ cd ..

notroot@ubuntu:~$ pwd
/home/notroot

notroot@ubuntu:~$ cd sanjeev

notroot@ubuntu:~/sanjeev$ ls
googleroundtable.mp4  sample  sample1

notroot@ubuntu:~/sanjeev$ ls -lt
total 122812
-rw-rw-r-- 1 notroot notroot       175 May  1 11:15 sample1
-rw-rw-r-- 1 notroot notroot       136 May  1 11:09 sample
-rw-rw-r-- 1 notroot notroot 125744560 Apr 14  2014 googleroundtable.mp4

notroot@ubuntu:~/sanjeev$ ls -lt -h
total 120M
-rw-rw-r-- 1 notroot notroot  175 May  1 11:15 sample1
-rw-rw-r-- 1 notroot notroot  136 May  1 11:09 sample
-rw-rw-r-- 1 notroot notroot 120M Apr 14  2014 googleroundtable.mp4

notroot@ubuntu:~/sanjeev$ hadoop dfs -put googleroundtable.mp4 hdfs:/

notroot@ubuntu:~/sanjeev$ hadoop dfs -ls hdfs:/
Found 3 items
-rw-r--r--   1 notroot supergroup  125744560 2016-05-03 01:37 /googleroundtable.mp4
drwxr-xr-x   - notroot supergroup          0 2016-05-01 07:24 /home
drwxr-xr-x   - notroot supergroup          0 2016-05-03 01:23 /system
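
The 120 MB video is now stored in HDFS split into blocks; assuming the default dfs.block.size of 64 MB for Hadoop 1.x, the 125744560-byte file occupies two blocks, each kept with the replication factor of 1 shown in the second column of the listing. The block breakdown for the file can be checked with:

hadoop fsck /googleroundtable.mp4 -files -blocks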

notroot@ubuntu:~/sanjeev$ hadoop dfs -ls hdfs:/home
Found 1 items
drwxr-xr-x   - notroot supergroup          0 2016-05-01 07:24 /home/notroot

notroot@ubuntu:~/sanjeev$ hadoop dfs -mkdir hdfs:/home/sanjeev

notroot@ubuntu:~/sanjeev$ hadoop dfs -mv hdfs:/googleroundtable.mp4 hdfs:/sanjeev/

notroot@ubuntu:~/sanjeev$ hadoop dfs -ls hdfs:/sanjeev
Found 1 items
-rw-r--r--   1 notroot supergroup  125744560 2016-05-03 01:37 /sanjeev

notroot@ubuntu:~/sanjeev$ hadoop dfs -ls hdfs:/
Found 3 items
drwxr-xr-x   - notroot supergroup          0 2016-05-03 01:39 /home
-rw-r--r--   1 notroot supergroup  125744560 2016-05-03 01:37 /sanjeev
drwxr-xr-x   - notroot supergroup          0 2016-05-03 01:23 /system

notroot@ubuntu:~/sanjeev$ hadoop dfs -rm hdfs:/sanjeev
Deleted hdfs://localhost:8020/sanjeev
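
What went wrong above: the -mkdir created hdfs:/home/sanjeev rather than hdfs:/sanjeev, so the -mv had no directory to move into and instead renamed the file itself to /sanjeev. The fix, which the next block walks through again, is to create the directory at the intended path first and only then move the file into it:

hadoop fs -mkdir hdfs:/sanjeev
hadoop fs -mv hdfs:/googleroundtable.mp4 hdfs:/sanjeev/googleroundtable.mp4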

notroot@ubuntu:~/sanjeev$ hadoop dfs -put googleroundtable.mp4 hdfs:/

notroot@ubuntu:~/sanjeev$ hadoop dfs -ls hdfs:/
Found 3 items
-rw-r--r--   1 notroot supergroup  125744560 2016-05-03 01:42 /googleroundtable.mp4
drwxr-xr-x   - notroot supergroup          0 2016-05-03 01:39 /home
drwxr-xr-x   - notroot supergroup          0 2016-05-03 01:23 /system

notroot@ubuntu:~/sanjeev$ hadoop dfs -mkdir hdfs:/sanjeev

notroot@ubuntu:~/sanjeev$ hadoop dfs -ls hdfs:/
Found 4 items
-rw-r--r--   1 notroot supergroup  125744560 2016-05-03 01:42 /googleroundtable.mp4
drwxr-xr-x   - notroot supergroup          0 2016-05-03 01:39 /home
drwxr-xr-x   - notroot supergroup          0 2016-05-03 01:42 /sanjeev
drwxr-xr-x   - notroot supergroup          0 2016-05-03 01:23 /system

notroot@ubuntu:~/sanjeev$ hadoop dfs -mv hdfs:/googleroundtable.mp4 hdfs:/sanjeev/googleroundtable.mp4

notroot@ubuntu:~/sanjeev$ hadoop dfs -ls hdfs:/
Found 3 items
drwxr-xr-x   - notroot supergroup          0 2016-05-03 01:39 /home
drwxr-xr-x   - notroot supergroup          0 2016-05-03 01:43 /sanjeev
drwxr-xr-x   - notroot supergroup          0 2016-05-03 01:23 /system

notroot@ubuntu:~/sanjeev$ hadoop dfs -ls hdfs:/sanjeev
Found 1 items
-rw-r--r--   1 notroot supergroup  125744560 2016-05-03 01:42 /sanjeev/googleroundtable.mp4

notroot@ubuntu:~/sanjeev$ hadoop dfs -cp hdfs:/sanjeev/googleroundtable.mp4 hdfs:/googlemapred.mp4

notroot@ubuntu:~/sanjeev$ hadoop dfs -ls hdfs:/sanjeev
Found 1 items
-rw-r--r--   1 notroot supergroup  125744560 2016-05-03 01:42 /sanjeev/googleroundtable.mp4

notroot@ubuntu:~/sanjeev$ hadoop dfs -ls hdfs:/
Found 4 items
-rw-r--r--   1 notroot supergroup  125744560 2016-05-03 01:44 /googlemapred.mp4
drwxr-xr-x   - notroot supergroup          0 2016-05-03 01:39 /home
drwxr-xr-x   - notroot supergroup          0 2016-05-03 01:43 /sanjeev
drwxr-xr-x   - notroot supergroup          0 2016-05-03 01:23 /system
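
Note that -cp, unlike -mv, writes a complete second physical copy of the data into HDFS, so the 120 MB video now occupies space twice (before replication). The space consumed per path can be checked with the usual shell commands (-dus prints a single summary line, -du one line per child):

hadoop fs -dus hdfs:/
hadoop fs -du hdfs:/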

notroot@ubuntu:~/sanjeev$ hadoop dfs -copyToLocal hdfs:/googlemapred.mp4 .

notroot@ubuntu:~/sanjeev$ ls -lt
total 245612
-rw-rw-r-- 1 notroot notroot 125744560 May  3 01:46 googlemapred.mp4
-rw-rw-r-- 1 notroot notroot       175 May  1 11:15 sample1
-rw-rw-r-- 1 notroot notroot       136 May  1 11:09 sample
-rw-rw-r-- 1 notroot notroot 125744560 Apr 14  2014 googleroundtable.mp4
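
-copyToLocal is essentially the same operation as -get, just as -copyFromLocal (used next) is essentially -put; the copy* variants simply insist that the local side really is the local file system. An equivalent way of pulling the file down, with an illustrative local name:

hadoop fs -get hdfs:/googlemapred.mp4 ./googlemapred-copy.mp4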

notroot@ubuntu:~/sanjeev$ hadoop fs -copyFromLocal sample hdfs:/sanjeev

notroot@ubuntu:~/sanjeev$ hadoop fs -ls hdfs:/sanjeev
Found 2 items
-rw-r--r--   1 notroot supergroup  125744560 2016-05-03 01:42 /sanjeev/googleroundtable.mp4
-rw-r--r--   1 notroot supergroup        136 2016-05-03 01:53 /sanjeev/sample

notroot@ubuntu:~/sanjeev$ hadoop fs -ls hdfs:/sanjeev/sample
Found 1 items
-rw-r--r--   1 notroot supergroup        136 2016-05-03 01:53 /sanjeev/sample

notroot@ubuntu:~/sanjeev$ hadoop fs -cat hdfs:/sanjeev/sample
this is my test file

i m learing hadoop

its workong on vmware

ubantu is on top of vmaware

this is 2nd day class revision..
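
-cat streams the whole file to the terminal, which is fine for a small text file like this one but not for something the size of the video copied earlier. For a quick look at the end of a bigger file, -tail (which prints the last kilobyte of the file in Hadoop 1.x) is the safer choice:

hadoop fs -tail hdfs:/sanjeev/sample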
