Automating ssh configuration for Hadoop
     
  Add Date : 2018-11-21      
         
         
         
Test environment: Ubuntu 12.04.2 server 64-bit, expect 5.45, GNU bash 4.2.24(1)-release (x86_64-pc-linux-gnu)

Note: the result of this automated Hadoop configuration is a cluster with one namenode, one secondary namenode, and one JobTracker, all three processes running on the same machine, while the datanode and tasktracker processes run on the other slave machines (modify the shell scripts if you need a different layout).

How can Hadoop configuration be automated? Several things are involved. Assume a bare-metal cluster in which every machine shares the same user name and password, the expect tool is installed, and /etc/hosts on each machine lists the host names and IPs of the whole cluster. The work then breaks down into three parts: (1) automated ssh configuration; (2) automated JDK deployment and configuration; (3) automated Hadoop configuration.

(1) Automated ssh configuration: on the namenode, first create a slaves.host file containing the host names of all slave machines in the cluster. The script then generates an id_rsa.pub file on the namenode, builds the authorized_keys authorization file from it, and distributes that file to all the slaves, completing the ssh configuration.

(2) JDK configuration: this mainly means unpacking a jdk.tar.gz archive and configuring the .bashrc file, then distributing the .bashrc file and the unpacked JDK to the slaves in the cluster; that completes the JDK configuration.
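As a rough illustration of step (2), here is a minimal sketch in the same spirit as the ssh script below. The file names (jdk.tar.gz, java_env.sh) and the JDK path are assumptions for illustration, not taken from the original post:

```shell
#!/bin/bash
# Hypothetical sketch: push a jdk.tar.gz archive and Java environment
# settings to every slave listed in slaves.host.

# Write the environment lines we want each slave to pick up in ~/.bashrc.
# The jdk1.6.0 directory name is an assumed example.
cat > java_env.sh <<'EOF'
export JAVA_HOME=$HOME/jdk1.6.0
export PATH=$JAVA_HOME/bin:$PATH
EOF

# Distribute the archive and settings, then unpack remotely. This part only
# runs when a slaves.host file is present (password-less ssh from step 1
# is assumed to already work).
if [ -f slaves.host ]; then
  for slave in $(cat slaves.host); do
    scp jdk.tar.gz java_env.sh "$slave:"                      # copy archive + env file
    ssh "$slave" 'tar -xzf jdk.tar.gz && cat java_env.sh >> ~/.bashrc'
  done
fi
```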

(3) Automated Hadoop configuration: this mainly means editing the files under the conf folder. First download the Hadoop installation package and extract it, then adjust the general settings in conf: put the JDK path, the namenode machine name, and the slave machine names into the appropriate .xml and .env files. Finally, distribute the modified, unpacked Hadoop package to the slaves.
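The conf-patching part of step (3) can be sketched with sed. This is a hypothetical example, not the original post's script; the MASTER placeholder, the port, and the miniature template are assumptions for illustration:

```shell
#!/bin/bash
# Hypothetical sketch: fill the real namenode host name into a core-site.xml
# template before the conf directory is distributed to the slaves.

master=$HOSTNAME

# A minimal core-site.xml template with a MASTER placeholder (assumed example).
cat > core-site.xml <<'EOF'
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://MASTER:9000</value>
  </property>
</configuration>
EOF

# Substitute the placeholder with the actual namenode host name; the same
# trick works for mapred-site.xml and the JAVA_HOME line in hadoop-env.sh.
sed -i "s/MASTER/$master/" core-site.xml
```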

Here is the shell script for the first part, the ssh configuration:

#!/bin/bash
# Auto-generate an ssh key and distribute authorized_keys to the slave machines.
# This script should run on the namenode machine.

if [ $# -lt 2 ]; then
cat << HELP
 generate_ssh_v1 -- generate an ssh key for login without typing a password;
 this script should run on the namenode machine, and the user should edit the slaves.host file first

 USAGE: ./generate_ssh_v1 user password

 EXAMPLE: ./generate_ssh_v1 hadoop1 1234
HELP
        exit 0
fi

user=$1
ip=$HOSTNAME
pass=$2
rm -rf ~/.ssh

echo ''
echo "####################################################"
echo "generate the rsa public key on $HOSTNAME ..."
echo "####################################################"

expect -c "
 set timeout 3
 spawn ssh $user@$ip
 expect \"yes/no\"
 send -- \"yes\r\"
 expect \"password:\"
 send -- \"$pass\r\"
 expect \"\\\$\"
 send \"ssh-keygen -t rsa -P '' -f $HOME/.ssh/id_rsa\r\"
 expect \"\\\$\"
 send \"ssh-copy-id -i $HOME/.ssh/id_rsa.pub $HOSTNAME\r\"
 expect \"password\"
 send -- \"$pass\r\"
 expect eof
"

echo ''
echo "####################################################"
echo "copy the namenode's authorized_keys to slaves ..."
echo "####################################################"

for slave in `cat slaves.host`
do
 expect -c "
  set timeout 3
  spawn ssh $user@$slave
  expect \"yes/no\"
  send -- \"yes\r\"
  expect \"password\"
  send -- \"$pass\r\"
  expect \"\\\$\"
  send \"rm -rf $HOME/.ssh\r\"
  expect \"\\\$\"
  send \"mkdir $HOME/.ssh\r\"
  expect eof
 "
done

for slave in `cat slaves.host`
do
 expect -c "
  set timeout 3
  spawn scp $HOME/.ssh/authorized_keys $user@$slave:$HOME/.ssh/
  expect \"password\"
  send -- \"$pass\r\"
  expect eof
 "
done

/etc/hosts:

192.168.128.138 hadoop
192.168.128.130 ubuntu

slaves.host:

hadoop

Test information:

hadoop1@ubuntu:~$ ./generate_ssh_v1 hadoop1 1234

####################################################
 generate the rsa public key on ubuntu ...
####################################################
spawn ssh hadoop1@ubuntu
The authenticity of host 'ubuntu (192.168.128.130)' can't be established.
ECDSA key fingerprint is 53:c7:7a:dc:3b:bc:34:00:4a:6d:18:1c:5e:87:e7:e8.
Are you sure you want to continue connecting (yes/no)? yes
Warning: Permanently added 'ubuntu,192.168.128.130' (ECDSA) to the list of known hosts.
hadoop1@ubuntu's password:
Welcome to Ubuntu 12.04.2 LTS (GNU/Linux 3.5.0-23-generic x86_64)

 * Documentation: https://help.ubuntu.com/

Last login: Mon Sep 23 15:22:03 2013 from ubuntu
ssh-keygen -t rsa -P '' -f /home/hadoop1/.ssh/id_rsa
hadoop1@ubuntu:~$ ssh-keygen -t rsa -P '' -f /home/hadoop1/.ssh/id_rsa
Generating public/private rsa key pair.
Your identification has been saved in /home/hadoop1/.ssh/id_rsa.
Your public key has been saved in /home/hadoop1/.ssh/id_rsa.pub.
The key fingerprint is:
e1:5e:20:9d:4e:11:f8:dc:05:35:08:83:5d:ce:99:ed hadoop1@ubuntu
The key's randomart image is:
+--[ RSA 2048]----+
|+=+o+o           |
|o..*+...         |
|. .o*=..         |
|=oo..            |
|. S E            |
|..               |
|.                |
|                 |
|                 |
+-----------------+
hadoop1@ubuntu:~$ ssh-copy-id -i /home/hadoop1/.ssh/id_rsa.pub ubuntu
hadoop1@ubuntu's password:
Now try logging into the machine, with "ssh 'ubuntu'", and check in:

  ~/.ssh/authorized_keys

to make sure we haven't added extra keys that you weren't expecting.

hadoop1@ubuntu:~$
####################################################
 copy the namenode's authorized_keys to slaves ...
####################################################
spawn ssh hadoop1@hadoop
The authenticity of host 'hadoop (192.168.128.138)' can't be established.
ECDSA key fingerprint is 10:8f:d1:8e:63:0a:af:1e:fb:d9:a8:bb:9a:39:ab:46.
Are you sure you want to continue connecting (yes/no)? yes
Warning: Permanently added 'hadoop,192.168.128.138' (ECDSA) to the list of known hosts.
hadoop1@hadoop's password:
Welcome to Ubuntu 12.04.2 LTS (GNU/Linux 3.5.0-23-generic i686)

 * Documentation: https://help.ubuntu.com/

  System information as of Tue Aug 6 20:11:49 CST 2013

  System load:  0.1                Processes:           76
  Usage of /:   24.8% of 7.12GB    Users logged in:     2
  Memory usage: 34%                IP address for eth0: 192.168.128.138
  Swap usage:   0%

  Graph this data and manage this system at https://landscape.canonical.com/

85 packages can be updated.
45 updates are security updates.

Last login: Tue Aug 6 20:11:16 2013 from 192.168.128.130
hadoop1@hadoop:~$ rm -rf /home/hadoop1/.ssh
hadoop1@hadoop:~$ mkdir /home/hadoop1/.ssh
spawn scp /home/hadoop1/.ssh/authorized_keys hadoop1@hadoop:/home/hadoop1/.ssh/
hadoop1@hadoop's password:
authorized_keys                              100%  396   0.4KB/s   00:00
hadoop1@ubuntu:~$ ssh ubuntu
Welcome to Ubuntu 12.04.2 LTS (GNU/Linux 3.5.0-23-generic x86_64)

 * Documentation: https://help.ubuntu.com/

Last login: Mon Sep 23 16:13:39 2013 from ubuntu
hadoop1@ubuntu:~$ exit
logout
Connection to ubuntu closed.
hadoop1@ubuntu:~$ ssh hadoop
Welcome to Ubuntu 12.04.2 LTS (GNU/Linux 3.5.0-23-generic i686)

 * Documentation: https://help.ubuntu.com/

  System information as of Tue Aug 6 20:12:17 CST 2013

  System load:  0.12               Processes:           76
  Usage of /:   24.8% of 7.12GB    Users logged in:     2
  Memory usage: 34%                IP address for eth0: 192.168.128.138
  Swap usage:   0%

  Graph this data and manage this system at https://landscape.canonical.com/

85 packages can be updated.
45 updates are security updates.

Last login: Tue Aug 6 20:11:50 2013 from 192.168.128.130
hadoop1@hadoop:~$ exit
logout
Connection to hadoop closed.

Summary: when I first started writing this script, I tried typing the spawn and expect commands directly at the bash prompt, and they never worked; spawn and expect are Tcl commands and must be run inside an expect script (or via expect -c), not in the shell itself.
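A tiny demonstration of that point: the same spawn/expect pair fails at a bash prompt but works when handed to the expect interpreter with `expect -c`. This assumes expect is installed; the guard simply skips the demo when it is not:

```shell
#!/bin/bash
# spawn/expect are Tcl commands, so they must be passed to the expect
# interpreter (here via `expect -c`), not typed at a bash prompt.
if command -v expect >/dev/null 2>&1; then
  expect -c '
    set timeout 3
    spawn echo hello
    expect "hello"
    puts "matched"
  '
else
  echo "expect is not installed; skipping demo"
fi
```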
     
         
         
         