Hadoop ssh configuration automation

Add Date: 2018-11-21
         
         
         
Test environment: Ubuntu 12.04.2 Server 64-bit, expect 5.45, GNU bash 4.2.24(1)-release (x86_64-pc-linux-gnu)

Note: the resulting automated Hadoop configuration puts one namenode, one secondarynamenode and one jobtracker on the same machine, with the datanode and tasktracker processes running on the other slave machines (modify the shell scripts as needed if a different layout is required).

How do you automate the Hadoop configuration? Several pieces are involved. Assume a bare-metal cluster in which every machine shares the same user name and password, the expect tool is installed, and the machine name and IP of every machine in the cluster are already listed in /etc/hosts. The work then breaks down into three parts: (1) automated ssh configuration; (2) automated JDK deployment and configuration; (3) automated Hadoop configuration.

(1) Automated ssh configuration: first create a slaves.host file on the namenode listing the machine names of all the slaves in the cluster, then run a script that generates the id_rsa.pub file on the namenode, builds the authorized_keys file from it, and distributes that file to every slave, which completes the ssh setup.

(2) JDK configuration: mainly unpack a jdk.tar.gz archive, set up the .bashrc file, and then distribute both the .bashrc file and the unpacked JDK directory to the slaves in the cluster, which completes the JDK setup (a sketch of this step follows below).
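The article does not include a script for the JDK step, so here is only a minimal sketch of the idea in the same shell style, assuming slaves.host sits in the current directory, passwordless ssh is already working from step (1), and the jdk.tar.gz archive unpacks into a directory named jdk (all of these names are placeholders, not taken from the article):

#!/bin/bash
# sketch: unpack the JDK locally, add JAVA_HOME to .bashrc,
# then push both to every slave listed in slaves.host
# (placeholder names; passwordless ssh from step (1) is assumed)

JDK_TARBALL=jdk.tar.gz
JDK_DIR=$HOME/jdk        # assumed directory created by unpacking the archive

tar -xzf $JDK_TARBALL -C $HOME

# append the Java environment variables to .bashrc
cat >> $HOME/.bashrc << EOF
export JAVA_HOME=$JDK_DIR
export PATH=\$JAVA_HOME/bin:\$PATH
EOF

# distribute the unpacked JDK and the updated .bashrc to the slaves
for slave in `cat slaves.host`
do
 scp -r $JDK_DIR $slave:$HOME/
 scp $HOME/.bashrc $slave:$HOME/
done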

(3) Automated Hadoop configuration: this mostly means configuring the files under the conf folder. First download the Hadoop installation package and unpack it, adjust the common settings in conf, then write the JDK path, the namenode machine name and the slave machine names into the .xml and .env files on the namenode, and finally distribute the modified, unpacked Hadoop directory to each slave (a sketch follows below).
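No script is given for this third step either; the following is a rough sketch under the same assumptions (passwordless ssh from step (1), slaves.host in the current directory) for a Hadoop 1.x layout, where the tarball name, the $HOME/jdk path and the 9000/9001 ports are placeholders chosen for illustration:

#!/bin/bash
# sketch: unpack Hadoop, write the JDK path, the namenode machine name and
# the slave list into the conf files, then copy the tree to every slave
# (placeholder names and ports; not taken from the article)

HADOOP_TARBALL=hadoop-1.2.1.tar.gz
HADOOP_DIR=$HOME/hadoop-1.2.1
NAMENODE=$HOSTNAME

tar -xzf $HADOOP_TARBALL -C $HOME

# JDK path goes into the env file
echo "export JAVA_HOME=$HOME/jdk" >> $HADOOP_DIR/conf/hadoop-env.sh

# namenode machine name goes into the .xml files
cat > $HADOOP_DIR/conf/core-site.xml << EOF
<configuration>
 <property><name>fs.default.name</name><value>hdfs://$NAMENODE:9000</value></property>
</configuration>
EOF

cat > $HADOOP_DIR/conf/mapred-site.xml << EOF
<configuration>
 <property><name>mapred.job.tracker</name><value>$NAMENODE:9001</value></property>
</configuration>
EOF

# masters holds the secondarynamenode host; slaves is the list of slave machine names
echo $NAMENODE > $HADOOP_DIR/conf/masters
cp slaves.host $HADOOP_DIR/conf/slaves

# distribute the configured Hadoop directory to the slaves
for slave in `cat slaves.host`
do
 scp -r $HADOOP_DIR $slave:$HOME/
done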

Here is the shell script for the first part, the ssh configuration:

#!/bin/bash
# auto-generate the ssh key and distribute the authorized_keys file to the slave machines
# the script should run on the namenode machine

if [ $# -lt 2 ]; then
cat << HELP
 generate_ssh_v1 -- generate an ssh key for login without typing a password;
this script should run on the namenode machine and the user should edit the slaves.host file

USAGE: ./generate_ssh_v1 user password

EXAMPLE: ./generate_ssh_v1 hadoop1 1234
HELP
        exit 0
fi

user=$1
ip=$HOSTNAME
pass=$2
rm -rf ~/.ssh

echo ''
echo "#####################################################"
echo "generate the rsa public key on $HOSTNAME ..."
echo "#####################################################"

# log in to the namenode itself, generate the key pair and append it to authorized_keys
expect -c "
 set timeout 3
 spawn ssh $user@$ip
 expect \"yes/no\"
 send -- \"yes\r\"
 expect \"password:\"
 send -- \"$pass\r\"
 expect \"$ \"
 send \"ssh-keygen -t rsa -P '' -f $HOME/.ssh/id_rsa\r\"
 expect \"$ \"
 send \"ssh-copy-id -i $HOME/.ssh/id_rsa.pub $HOSTNAME\r\"
 expect \"password\"
 send -- \"$pass\r\"
 expect eof
"
echo ''
echo "#####################################################"
echo "copy the namenode's authorized_keys to slaves ..."
echo "#####################################################"

# recreate an empty ~/.ssh directory on every slave
for slave in `cat slaves.host`
do
 expect -c "
  set timeout 3
  spawn ssh $user@$slave
  expect \"yes/no\"
  send -- \"yes\r\"
  expect \"password\"
  send -- \"$pass\r\"
  expect \"$ \"
  send \"rm -rf $HOME/.ssh\r\"
  expect \"$ \"
  send \"mkdir $HOME/.ssh\r\"
  expect \"$ \"
  expect eof
 "
done

# push the namenode's authorized_keys file to every slave
for slave in `cat slaves.host`
do
 expect -c "
  set timeout 3
  spawn scp $HOME/.ssh/authorized_keys $user@$slave:$HOME/.ssh/
  expect \"password\"
  send -- \"$pass\r\"
  expect eof
 "
done

/etc/hosts:

192.168.128.138 hadoop
192.168.128.130 ubuntu

slaves.host:

hadoop

Test output:

hadoop1@ubuntu:~$ ./generate_ssh_v1 hadoop1 1234

#####################################################
 generate the rsa public key on ubuntu ...
#####################################################
spawn ssh hadoop1@ubuntu
The authenticity of host 'ubuntu (192.168.128.130)' can't be established.
ECDSA key fingerprint is 53:c7:7a:dc:3b:bc:34:00:4a:6d:18:1c:5e:87:e7:e8.
Are you sure you want to continue connecting (yes/no)? yes
Warning: Permanently added 'ubuntu,192.168.128.130' (ECDSA) to the list of known hosts.
hadoop1@ubuntu's password:
Welcome to Ubuntu 12.04.2 LTS (GNU/Linux 3.5.0-23-generic x86_64)

 * Documentation: https://help.ubuntu.com/

Last login: Mon Sep 23 15:22:03 2013 from ubuntu
ssh-keygen -t rsa -P '' -f /home/hadoop1/.ssh/id_rsa
hadoop1@ubuntu:~$ ssh-keygen -t rsa -P '' -f /home/hadoop1/.ssh/id_rsa
Generating public/private rsa key pair.
Your identification has been saved in /home/hadoop1/.ssh/id_rsa.
Your public key has been saved in /home/hadoop1/.ssh/id_rsa.pub.
The key fingerprint is:
e1:5e:20:9d:4e:11:f8:dc:05:35:08:83:5d:ce:99:ed hadoop1@ubuntu
The key's randomart image is:
[RSA 2048 randomart omitted]
hadoop1@ubuntu:~$ ssh-copy-id -i /home/hadoop1/.ssh/id_rsa.pub ubuntu
hadoop1@ubuntu's password:
Now try logging into the machine, with "ssh 'ubuntu'", and check in:

  ~/.ssh/authorized_keys

to make sure we haven't added extra keys that you weren't expecting.

hadoop1@ubuntu:~$
#####################################################
 copy the namenode's authorized_keys to slaves ...
#####################################################
spawn ssh hadoop1@hadoop
The authenticity of host 'hadoop (192.168.128.138)' can't be established.
ECDSA key fingerprint is 10:8f:d1:8e:63:0a:af:1e:fb:d9:a8:bb:9a:39:ab:46.
Are you sure you want to continue connecting (yes/no)? yes
Warning: Permanently added 'hadoop,192.168.128.138' (ECDSA) to the list of known hosts.
hadoop1@hadoop's password:
Welcome to Ubuntu 12.04.2 LTS (GNU/Linux 3.5.0-23-generic i686)

 * Documentation: https://help.ubuntu.com/

  System information as of Tue Aug 6 20:11:49 CST 2013

  System load: 0.1 Processes: 76
  Usage of /: 24.8% of 7.12GB Users logged in: 2
  Memory usage: 34% IP address for eth0: 192.168.128.138
  Swap usage: 0%

  Graph this data and manage this system at https://landscape.canonical.com/

85 packages can be updated.
45 updates are security updates.

Last login: Tue Aug 6 20:11:16 2013 from 192.168.128.130
hadoop1@hadoop:~$ rm -rf /home/hadoop1/.ssh
hadoop1@hadoop:~$ mkdir /home/hadoop1/.ssh
hadoop1@hadoop:~$ spawn scp /home/hadoop1/.ssh/authorized_keys hadoop1@hadoop:/home/hadoop1/.ssh/
hadoop1@hadoop's password:
authorized_keys                               100%  396   0.4KB/s   00:00
hadoop1@ubuntu:~$ ssh ubuntu
Welcome to Ubuntu 12.04.2 LTS (GNU/Linux 3.5.0-23-generic x86_64)

 * Documentation: https://help.ubuntu.com/

Last login: Mon Sep 23 16:13:39 2013 from ubuntu
hadoop1@ubuntu:~$ exit
logout
Connection to ubuntu closed.
hadoop1@ubuntu:~$ ssh hadoop
Welcome to Ubuntu 12.04.2 LTS (GNU/Linux 3.5.0-23-generic i686)

 * Documentation: https://help.ubuntu.com/

  System information as of Tue Aug 6 20:12:17 CST 2013

  System load: 0.12 Processes: 76
  Usage of /: 24.8% of 7.12GB Users logged in: 2
  Memory usage: 34% IP address for eth0: 192.168.128.138
  Swap usage: 0%

  Graph this data and manage this system at https://landscape.canonical.com/

85 packages can be updated.
45 updates are security updates.

Last login: Tue Aug 6 20:11:50 2013 from 192.168.128.130
hadoop1@hadoop:~$ exit
logout
Connection to hadoop closed.

Summary: when I first started writing this I typed the spawn commands directly at the shell prompt, and expect could never read what came after; spawn, expect and send are expect commands, so they only work inside an expect session such as the expect -c blocks used above.
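As a minimal illustration of that point, an inline snippet like the one below works because the whole session runs inside expect -c, whereas typing the same spawn/send lines directly at a bash prompt fails since bash has no spawn command (user, host and pass are placeholders, and the host key is assumed to be known already):

# minimal inline expect session: log in over ssh and run one command
# user, host and pass are placeholders; the yes/no host-key prompt is not handled
expect -c '
 set timeout 5
 spawn ssh user@host
 expect "password:"
 send "pass\r"
 expect "\\$ "
 send "hostname\r"
 expect "\\$ "
 send "exit\r"
 expect eof
'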
     
         
         
         