I was searching for a complete tutorial on installing Hadoop on Mac so I could play around with it. There are resources on installing Hadoop with Homebrew, "the missing package manager" for Mac ;). But I did not want to offload all the configuration burden to it, as I need to learn this from top to bottom. I played with it for a while, and here are the configuration steps I followed.
1) Download the Hadoop binary. I used Hadoop 2.5.1, which is the latest release at the moment.
2) Extract the binary; let's call the extracted location HADOOP_HOME.
3) Add HADOOP_HOME and JAVA_HOME as environment variables on your system. You can add them to ~/.bashrc or ~/.bash_profile with the following entries (change the paths according to your machine's configuration):
export JAVA_HOME=$(/usr/libexec/java_home)
export HADOOP_HOME=/Users/user1/software/hadoop-2.5.1
export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin
Then reload the configuration:
$ source ~/.bash_profile
(Follow the remaining steps if you want to run Hadoop in pseudo-distributed mode. In that mode you will also have to add input files to HDFS and download the output files from HDFS.)
4) Navigate to HADOOP_HOME and change the following configuration files as shown below.
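A minimal pseudo-distributed configuration, following the official Hadoop single-node setup guide, edits two files under etc/hadoop. (The port and replication values below are the defaults from that guide; adjust them if your setup differs.)

etc/hadoop/core-site.xml:

```xml
<configuration>
    <!-- Point the default filesystem at a local HDFS instance -->
    <property>
        <name>fs.defaultFS</name>
        <value>hdfs://localhost:9000</value>
    </property>
</configuration>
```

etc/hadoop/hdfs-site.xml:

```xml
<configuration>
    <!-- A single-node cluster can only keep one replica of each block -->
    <property>
        <name>dfs.replication</name>
        <value>1</value>
    </property>
</configuration>
```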
5) Set up passphraseless SSH.
Now check that you can ssh to the localhost without a passphrase:
$ ssh localhost
If you cannot ssh to localhost without a passphrase, execute the following commands:
$ ssh-keygen -t dsa -P '' -f ~/.ssh/id_dsa
$ cat ~/.ssh/id_dsa.pub >> ~/.ssh/authorized_keys
6) Starting Hadoop in pseudo-distributed mode.
(You may need to enable Remote Login in System Preferences --> Sharing if you have not already enabled it; otherwise you will not be able to log in through ssh.)
Navigate to $HADOOP_HOME.
Format the filesystem:
$ bin/hdfs namenode -format
Start the NameNode daemon and DataNode daemon:
$ sbin/start-dfs.sh
The hadoop daemon log output is written to the $HADOOP_LOG_DIR directory (defaults to $HADOOP_HOME/logs).
Browse the web interface for the NameNode; by default it is available at:
NameNode - http://localhost:50070/
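To verify that everything works end to end, you can stage some input in HDFS, run one of the bundled example jobs, and pull the output back out. This is a sketch following the official single-node guide; the jar path below matches Hadoop 2.5.1 and changes with the version, and "user1" is an example username.

```shell
# Run from $HADOOP_HOME after the daemons are up.

# Create a home directory for your user in HDFS (adjust "user1" to your username):
bin/hdfs dfs -mkdir -p /user/user1

# Copy Hadoop's own config files in as sample input:
bin/hdfs dfs -put etc/hadoop input

# Run the bundled "grep" example job over the input:
bin/hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-2.5.1.jar grep input output 'dfs[a-z.]+'

# Download the output from HDFS and inspect it:
bin/hdfs dfs -get output output
cat output/part-r-00000
```

When you are done, sbin/stop-dfs.sh shuts the daemons down again.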
So good luck with all your MapReduce jobs. :)