1. Set a password for the root user and log in as root. The setup is as follows:
sudo -s
gedit /etc/lightdm/lightdm.conf
[SeatDefaults]
greeter-session=unity-greeter
user-session=Ubuntu
greeter-show-manual-login=true
allow-guest=false
Enable the root account (Ubuntu disables it by default):
sudo passwd root
After setting the password, reboot, choose "Login" at the greeter, enter "root", then enter the password.
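As a quick check, you can also confirm the new root password immediately, without rebooting:
su -
whoami
The second command should print root.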
2. Configure /etc/hosts and /etc/hostname on each machine, install SSH, and set up passwordless login among the three machines. In /etc/hostname, set the hostnames of the three machines to SparkMaster, SparkWorker1, and SparkWorker2 respectively; then in /etc/hosts on every machine, configure the following IP-to-hostname mappings:
127.0.0.1 localhost
192.168.32.131 SparkMaster
192.168.32.132 SparkWorker1
192.168.32.133 SparkWorker2
# The following lines are desirable for IPv6 capable hosts
::1 ip6-localhost ip6-loopback
fe00::0 ip6-localnet
ff00::0 ip6-mcastprefix
ff02::1 ip6-allnodes
ff02::2 ip6-allrouters
Use ifconfig to check each machine's IP address.
You can ping SparkWorker1 to verify the hostname mapping works.
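One way to apply the hostnames, assuming the classic /etc/hostname mechanism on these Ubuntu versions (repeat with the matching name on each machine):
echo SparkMaster > /etc/hostname
hostname SparkMaster
Then, from SparkMaster, confirm name resolution:
ping -c 3 SparkWorker1
ping -c 3 SparkWorker2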
Next, configure passwordless SSH login:
1) apt-get install ssh
2) /etc/init.d/ssh start to start the service
3) ps -e | grep ssh to verify the service started correctly
4) For passwordless login, generate the private/public key pair:
ssh-keygen -t rsa -P ""
This generates two files in /root/.ssh: id_rsa (the private key) and id_rsa.pub (the public key). Append the public key to authorized_keys:
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
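As an aside, on systems that ship ssh-copy-id, the scp-and-append exchange below can be shortened to one command per host pair; it appends the local public key to the remote authorized_keys automatically. For example, on each worker:
ssh-copy-id root@SparkMaster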
Copy id_rsa.pub from SparkWorker1 and SparkWorker2 to SparkMaster using scp:
On SparkWorker1:
scp ~/.ssh/id_rsa.pub root@SparkMaster:~/.ssh/id_rsa.pub.SparkWorker1
On SparkWorker2:
scp ~/.ssh/id_rsa.pub root@SparkMaster:~/.ssh/id_rsa.pub.SparkWorker2
Then append the workers' public keys to authorized_keys on SparkMaster.
On SparkMaster:
cd ~/.ssh
cat id_rsa.pub.SparkWorker1 >> authorized_keys
cat id_rsa.pub.SparkWorker2 >> authorized_keys
Then copy SparkMaster's authorized_keys back into the .ssh directory on SparkWorker1 and SparkWorker2:
scp authorized_keys root@SparkWorker1:~/.ssh/authorized_keys
scp authorized_keys root@SparkWorker2:~/.ssh/authorized_keys
At this point, passwordless SSH is configured. Verify it:
ssh SparkMaster
ssh SparkWorker1
ssh SparkWorker2
Any one machine can now log in to the others without a password.
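A quick loop to confirm key-based login to all three hosts (run from any of them; each ssh should return a hostname without prompting for a password):
for h in SparkMaster SparkWorker1 SparkWorker2; do ssh root@$h hostname; done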
3. Configure the Java environment
On SparkMaster:
mkdir /usr/lib/java
cd /usr/lib/java
Extract the JDK tarball (here assumed to be jdk-8u25-linux-x64.tar.gz, matching the JDK 8u25 path used below):
tar -zxvf jdk-8u25-linux-x64.tar.gz
gedit ~/.bashrc
Append the following at the end (all of these will be used later):
#JAVA
export JAVA_HOME=/usr/lib/java/jdk1.8.0_25
export JRE_HOME=${JAVA_HOME}/jre
export CLASSPATH=.:${JAVA_HOME}/lib:${JRE_HOME}/lib
export HADOOP_HOME=/usr/local/hadoop/hadoop-2.6.0
export HADOOP_COMMON_LIB_NATIVE_DIR=${HADOOP_HOME}/lib/native
export HADOOP_OPTS="-Djava.library.path=${HADOOP_HOME}/lib"
export SCALA_HOME=/usr/lib/scala/scala-2.11.4
export SPARK_HOME=/usr/local/spark/spark-1.2.0-bin-hadoop2.4
export IDEA_HOME=/usr/local/idea/idea-IC-139.659.2
export PATH=${IDEA_HOME}/bin:${SPARK_HOME}/bin:${SCALA_HOME}/bin:${HADOOP_HOME}/bin:${JAVA_HOME}/bin:$PATH
Run source ~/.bashrc to make the configuration take effect.
Run java -version to check the version number and verify the installation succeeded.
Configure SparkWorker1 and SparkWorker2 the same way, or copy the files over with scp, as sketched below.
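A minimal sketch of the scp route, assuming the same directory layout on the workers:
On SparkMaster:
scp -r /usr/lib/java root@SparkWorker1:/usr/lib/
scp -r /usr/lib/java root@SparkWorker2:/usr/lib/
scp ~/.bashrc root@SparkWorker1:~/
scp ~/.bashrc root@SparkWorker2:~/
Then run source ~/.bashrc on each worker.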