There are many software packages, including open source ones, for managing a server farm. It's fun to execute and run jobs on a Hadoop/HBase cluster, but as the cluster grows to more and more servers, it becomes increasingly difficult to manage the farm without sophisticated tooling.
The following simple script can be really useful for running Unix commands on all of the farm servers:
#!/bin/bash
# cluster_run.sh - run a single Unix command on every host in the Hadoop cluster.
# Hosts are read, one per line, from servers.dat in the current directory.
if [ -z "$1" ]; then
    echo "Usage: cluster_run.sh '<unix command>'"
    exit 1
fi
for server in $(cat servers.dat); do
    echo "Running command $1 on server $server ..."
    echo "=================================================================="
    ssh "$server" "$1"
    echo ""
done
Create a file servers.dat with the hostnames of all the servers:
server1
server2
server3
..
serverN
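With the script saved as cluster_run.sh and servers.dat in place, make the script executable first. As a quick sanity check, uptime is a harmless command to try (assuming it is available on all the hosts):
chmod +x cluster_run.sh
./cluster_run.sh 'uptime'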
Now, to see all the Java processes (the Hadoop/HBase daemons) running on the servers, just execute:
cluster_run.sh 'jps'
To see the disk usage:
cluster_run.sh 'df -k'
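Note that any command containing spaces must be wrapped in quotes so the script receives it as a single argument. For example, to check free memory on every node (assuming the hosts run Linux):
cluster_run.sh 'free -m'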
Please ensure that passwordless (key-based) SSH login is enabled from the machine running the script to every host. If not, use the following links to set it up before executing this script:
http://www.rebol.com/docs/ssh-auto-login.html
http://hortonworks.com/kb/generating-ssh-keys-for-passwordless-login/
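In short, the setup boils down to generating a key pair once and copying the public key to each host. A minimal sketch with OpenSSH, reusing servers.dat and assuming ssh-copy-id is available on your machine:
# Generate a key pair once (accept the defaults; leave the passphrase empty for fully silent login)
ssh-keygen -t rsa
# Push the public key to every host in the farm (prompts for each host's password one last time)
for server in $(cat servers.dat); do
    ssh-copy-id "$server"
done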