Hadoop: Requires Root's Password After Entering "start-all.sh"

How do I start Hadoop without being asked for the local machine's password?

Based on my research, I followed these steps to avoid the problem above:

step 1: ssh-keygen -t rsa -P ""

step 2: cat $HOME/.ssh/id_rsa.pub >> $HOME/.ssh/authorized_keys
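
A note on these two steps: sshd refuses to use authorized_keys when the file or the .ssh directory is group- or world-writable, so if a password prompt still appears afterwards, tightening the permissions (assuming the default paths) is the usual fix:

    chmod 700 ~/.ssh
    chmod 600 ~/.ssh/authorized_keys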

and then I started Hadoop:

amtex@amtex-desktop:~$ start-all.sh

This script is Deprecated. Instead use start-dfs.sh and start-yarn.sh
Starting namenodes on [localhost]
localhost: starting namenode, logging to /home/amtex/Documents/installed/hadoop/logs/hadoop-amtex-namenode-amtex-desktop.out
localhost: starting datanode, logging to /home/amtex/Documents/installed/hadoop/logs/hadoop-amtex-datanode-amtex-desktop.out
Starting secondary namenodes [0.0.0.0]
0.0.0.0: starting secondarynamenode, logging to /home/amtex/Documents/installed/hadoop/logs/hadoop-amtex-secondarynamenode-amtex-desktop.out
starting yarn daemons
starting resourcemanager, logging to /home/amtex/Documents/installed/hadoop/logs/yarn-amtex-resourcemanager-amtex-desktop.out
localhost: starting nodemanager, logging to /home/amtex/Documents/installed/hadoop/logs/yarn-amtex-nodemanager-amtex-desktop.out
amtex@amtex-desktop:~$ start-master.sh
starting org.apache.spark.deploy.master.Master, logging to /home/amtex/Documents/installed/spark/logs/spark-amtex-org.apache.spark.deploy.master.Master-1-amtex-desktop.out
amtex@amtex-desktop:~$ start-slaves.sh
localhost: starting org.apache.spark.deploy.worker.Worker, logging to /home/amtex/Documents/installed/spark/logs/spark-amtex-org.apache.spark.deploy.worker.Worker-1-amtex-desktop.out

amtex@amtex-desktop:~$ jps

21523 Jps
2404 Startup
21029 NodeManager
20581 DataNode
20439 NameNode
20760 SecondaryNameNode
21353 Master
21466 Worker
20911 ResourceManager

Starting the Hadoop processes using start-all.sh still runs into issues.

Your SSH isn't set up properly.

Set up passphraseless SSH

Now check that you can ssh to the localhost without a passphrase:

$ ssh localhost

If you cannot ssh to localhost without a passphrase, execute the
following commands:

$ ssh-keygen -t dsa -P '' -f ~/.ssh/id_dsa
$ cat ~/.ssh/id_dsa.pub >> ~/.ssh/authorized_keys
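
To check key-based login non-interactively, ssh's BatchMode option is useful: it makes the client fail instead of falling back to a password prompt. A sketch, assuming the key was installed as above:

    $ ssh -o BatchMode=yes localhost echo ok

If this prints ok, the passphraseless setup works; a Permission denied error means the key was not picked up.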

Execution

Format a new distributed-filesystem:

$ bin/hadoop namenode -format

Start the Hadoop daemons:

$ bin/start-all.sh

http://hadoop.apache.org/common/docs/r0.17.0/quickstart.html#Setup+passphraseless

or

Refer to Michael Noll's guide for running Hadoop on your machine.
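
Two version caveats on the quickstart commands above: on Hadoop 2.x and later, hadoop namenode -format is superseded by hdfs namenode -format, and start-all.sh itself is deprecated (the log output above says so explicitly) in favor of starting HDFS and YARN separately. A sketch, assuming the stock 2.x layout with the scripts under sbin/:

    $ bin/hdfs namenode -format
    $ sbin/start-dfs.sh
    $ sbin/start-yarn.sh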

What is the password for root@localhost?

It is blank (as in "") unless you set it:

ssh root@localhost asks for root's own password. It looks like you have not set a root password.
To set one, log in as root using sudo -s, then use the passwd command.
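
For example (the # prompt indicates the root shell):

    $ sudo -s
    # passwd
    # exit

Note that on some distributions sshd is configured with PermitRootLogin set to refuse password logins for root, in which case key-based login for root is the way to go.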

After that you should be able to SSH in as root.

  • https://askubuntu.com/questions/9017/how-to-know-my-root-password

Hadoop Permission denied (publickey,password,keyboard-interactive) warning

The problem is that when you SSH to a server (in this case localhost), it tries to authenticate you using your credentials. Because password-less authentication is not configured here, every SSH attempt asks for your password, which is a problem when machines need to communicate with each other over SSH. To set up password-less SSH, the client machine's public key must be added to the server machine's ~/.ssh/authorized_keys file. In this case, both are the same machine.

Long story short, run the following command:

cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
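
Equivalently, OpenSSH ships an ssh-copy-id helper that appends your public key to the target account's authorized_keys and fixes its permissions in one step. A sketch, reusing the asker's user and host:

    ssh-copy-id amtex@localhost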

Unable to start start-dfs.sh in a Hadoop multinode cluster

I think you missed this step: "Add the SSH Public Key to the authorized_keys file on your target hosts".

Just redo the password-less SSH setup correctly. Follow these steps:

  1. Generate public and private SSH keys

    ssh-keygen
  2. Copy the SSH Public Key (id_rsa.pub) to the root account on your
    target hosts

    .ssh/id_rsa
    .ssh/id_rsa.pub
  3. Add the SSH Public Key to the authorized_keys file on your target
    hosts

    cat id_rsa.pub >> authorized_keys
  4. Depending on your version of SSH, you may need to set permissions on
    the .ssh directory (to 700) and the authorized_keys file in that
    directory (to 600) on the target hosts.

    chmod 700 ~/.ssh
    chmod 600 ~/.ssh/authorized_keys
  5. Check the connection:

    ssh root@<remote.target.host>

    where <remote.target.host> has the value of each host name in your cluster.

    If the following warning message displays during your first
    connection: Are you sure you want to continue connecting (yes/no)?

    Enter Yes.
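
Putting the five steps above together for a whole cluster, a minimal sketch (the host names node1, node2, node3 are placeholders for your own):

    # generate a key once on the machine you start the cluster from
    ssh-keygen
    # install it on every target host, then verify password-less login
    for host in node1 node2 node3; do
        ssh-copy-id root@$host
        ssh root@$host true
    done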

Refer: Set Up Password-less SSH

Note: no password will be asked for if your password-less SSH is set up properly.


