
Hadoop-1.2.1 Cluster Setup on Virtual Machines (Part 1) -- Environment Preparation


VM configuration:

(Screenshot of the VM settings from the original post, not reproduced here.)

 

NAT network configuration reference:
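The original post links out to an external NAT guide for this step. Since the hosts file below uses fixed 192.168.153.x addresses, each VM needs a static IP on the VMware NAT network; a hedged sketch for hadoop01 on CentOS 6, assuming the NAT adapter shows up as eth0 and the NAT gateway is 192.168.153.2 (check your own VMware Virtual Network Editor settings):

[root@hadoop01 hadoop]# vi /etc/sysconfig/network-scripts/ifcfg-eth0
DEVICE=eth0
ONBOOT=yes
BOOTPROTO=static
IPADDR=192.168.153.101        # matches the hadoop01 entry in /etc/hosts below
NETMASK=255.255.255.0
GATEWAY=192.168.153.2         # assumption: VMware NAT gateway, usually .2 on the NAT subnet
DNS1=192.168.153.2            # assumption: resolve through the same NAT gateway
[root@hadoop01 hadoop]# service network restart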

Preparation before installing Hadoop (on every host):

Configure sudo (optional):

[root@hadoop01 hadoop]# chmod u+w /etc/sudoers
[root@hadoop01 hadoop]# vi /etc/sudoers
Add the following line (hadoop is the user granted passwordless sudo):
hadoop      ALL=(ALL)       NOPASSWD: ALL
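After saving, it is good practice to restore the read-only permission on /etc/sudoers and confirm that passwordless sudo works; a minimal check, using the hadoop user from above:

[root@hadoop01 hadoop]# chmod u-w /etc/sudoers      # restore the original permissions on sudoers
[hadoop@hadoop01 ~]$ sudo whoami                    # should print "root" without prompting for a password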
 

Hostname setup:

[root@hadoop03 hadoop]# vi /etc/hosts
127.0.0.1   localhost localhost.localdomain localhost4 localhost4.localdomain4
::1         localhost localhost.localdomain localhost6 localhost6.localdomain6
192.168.153.101 hadoop01
192.168.153.102 hadoop02
192.168.153.103 hadoop03
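The /etc/hosts mapping above must be identical on all three machines. On CentOS 6 (which the chkconfig and /etc/init.d commands below suggest), the hostname itself is usually made persistent as well; a short sketch for hadoop01, to be repeated on each node with its own name:

[root@hadoop01 hadoop]# vi /etc/sysconfig/network
HOSTNAME=hadoop01                            # persistent hostname, read at boot
[root@hadoop01 hadoop]# hostname hadoop01    # apply immediately without rebooting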

Disable iptables:

[root@hadoop01 hadoop]# sudo chkconfig iptables off
[root@hadoop01 hadoop]# sudo /etc/init.d/iptables stop
iptables: Flushing firewall rules:                         [  OK  ]
iptables: Setting chains to policy ACCEPT: filter          [  OK  ]
iptables: Unloading modules:                               [  OK  ]
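To confirm the firewall stays off across reboots, the service status and run-level settings can be checked; a quick verification on the same hosts:

[hadoop@hadoop01 ~]$ sudo service iptables status    # should report that the firewall is not running
[hadoop@hadoop01 ~]$ chkconfig --list iptables       # every run level should show "off"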

Disable SELinux:

[root@hadoop02 install]# sudo vi /etc/selinux/config

# This file controls the state of SELinux on the system.
# SELINUX= can take one of these three values:
#     enforcing - SELinux security policy is enforced.
#     permissive - SELinux prints warnings instead of enforcing.
#     disabled - No SELinux policy is loaded.
SELINUX=disabled
# SELINUXTYPE= can take one of these two values:
#     targeted - Targeted processes are protected,
#     mls - Multi Level Security protection.
SELINUXTYPE=targeted
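Editing /etc/selinux/config only takes effect after a reboot; to turn SELinux off in the running system as well, a common approach (standard commands, not shown in the original post) is:

[root@hadoop02 install]# setenforce 0     # switch to permissive mode immediately
[root@hadoop02 install]# getenforce       # prints Permissive (or Disabled after a reboot)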

NTP service configuration:

Reference:
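Cluster nodes should agree on time so that HDFS and MapReduce heartbeats and logs line up. The original post points to an external reference for this; a minimal sketch on CentOS 6, assuming the nodes can reach a public NTP pool:

[hadoop@hadoop01 ~]$ sudo yum install -y ntp    # install the NTP daemon
[hadoop@hadoop01 ~]$ sudo chkconfig ntpd on     # start it at boot
[hadoop@hadoop01 ~]$ sudo service ntpd start    # start it now
[hadoop@hadoop01 ~]$ ntpq -p                    # list the peers being synchronized against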

Configure passwordless SSH login:

Generate a public/private key pair on each host:

[hadoop@hadoop01 ~]$ mkdir ~/.ssh
[hadoop@hadoop01 ~]$ chmod 700 ~/.ssh
[hadoop@hadoop01 ~]$ ssh-keygen -t dsa -P '' -f ~/.ssh/id_dsa

[hadoop@hadoop02 ~]$ mkdir ~/.ssh
[hadoop@hadoop02 ~]$ chmod 700 ~/.ssh
[hadoop@hadoop02 ~]$ ssh-keygen -t dsa -P '' -f ~/.ssh/id_dsa

[hadoop@hadoop03 ~]$ mkdir ~/.ssh
[hadoop@hadoop03 ~]$ chmod 700 ~/.ssh
[hadoop@hadoop03 ~]$ ssh-keygen -t dsa -P '' -f ~/.ssh/id_dsa

 

Copy each slave host's id_dsa.pub to the master host:

[hadoop@hadoop02 .ssh]$ scp id_dsa.pub hadoop@hadoop01:/home/hadoop/.ssh/id_dsa.pub.hadoop02
[hadoop@hadoop03 .ssh]$ scp id_dsa.pub hadoop@hadoop01:/home/hadoop/.ssh/id_dsa.pub.hadoop03

 

On the master, collect all the id_dsa.pub keys into authorized_keys:

[hadoop@hadoop01 .ssh]$  cat id_dsa.pub >> authorized_keys
[hadoop@hadoop01 .ssh]$  cat id_dsa.pub.hadoop02 >> authorized_keys
[hadoop@hadoop01 .ssh]$  cat id_dsa.pub.hadoop03 >> authorized_keys

 

Distribute authorized_keys from the master to each slave host:

[hadoop@hadoop01 .ssh]$ scp authorized_keys hadoop@hadoop02:/home/hadoop/.ssh/authorized_keys
[hadoop@hadoop01 .ssh]$ scp authorized_keys hadoop@hadoop03:/home/hadoop/.ssh/authorized_keys
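SSH refuses to use an authorized_keys file with loose permissions, so it is worth tightening it on every node and then testing the passwordless login; a short check on the same three hosts:

[hadoop@hadoop01 .ssh]$ chmod 600 ~/.ssh/authorized_keys    # required by sshd's default StrictModes setting
[hadoop@hadoop02 .ssh]$ chmod 600 ~/.ssh/authorized_keys
[hadoop@hadoop03 .ssh]$ chmod 600 ~/.ssh/authorized_keys
[hadoop@hadoop01 .ssh]$ ssh hadoop02 date                   # should print the date without asking for a password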

 

Install the JDK:

sudo tar -xvf jdk-7u55-linux-x64.gz
sudo chown -R root:root jdk1.7.0_55
Extraction creates the jdk1.7.0_55 directory; confirm that its permissions are drwxr-xr-x (755).
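The JAVA_HOME exported below points at /usr/lib/java/jdk1.7.0_55, and the later prompt shows a java working directory, so the archive is presumably unpacked under /usr/lib/java; a sketch under that assumption:

[hadoop@hadoop01 ~]$ sudo mkdir -p /usr/lib/java    # assumed install location, matching JAVA_HOME below
[hadoop@hadoop01 ~]$ cd /usr/lib/java               # then run the tar and chown commands above from here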
 
vi /etc/profile
export JAVA_HOME=/usr/lib/java/jdk1.7.0_55
export PATH=$JAVA_HOME/bin:$PATH

[hadoop@hadoop01 java]$ source /etc/profile
[hadoop@hadoop01 java]$ java -version
java version "1.7.0_55"
Java(TM) SE Runtime Environment (build 1.7.0_55-b13)
Java HotSpot(TM) 64-Bit Server VM (build 24.55-b03, mixed mode)

 


Original post: http://www.cnblogs.com/gongice/p/4338546.html
