Compiling Hadoop 3.x from Source on CentOS 7 (with zstd Compression Support)

Tools

Components used for this build (all available from the Apache download sites):

- JDK 1.8.0_131
- Hadoop 3.1.1
- Maven 3.6.0
- CMake 3.13.1
- ProtocolBuffer 2.5.0

Packages produced by the build:

- hadoop-3.1.1: hadoop 3.1.1 compiled from source on CentOS 7.4
- native-hadoop-3.1.1: the native libraries generated by the hadoop 3.1.1 source build, including zstd

Tools and Packages Required for the Build
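Hadoop's BUILDING.txt (quoted below) sets minimum versions for several of these tools. Whether the versions chosen above satisfy those minimums can be checked mechanically; a minimal sketch using GNU `sort -V`, with version numbers taken from the list above:

```shell
# version_ge A B: succeeds when version A >= version B (relies on GNU sort -V).
version_ge() { [ "$(printf '%s\n' "$2" "$1" | sort -V | head -n1)" = "$2" ]; }

# Chosen versions vs. the minimums Hadoop's BUILDING.txt states.
version_ge 1.8.0_131 1.8 && echo "JDK ok"    # JDK 1.8
version_ge 3.6.0 3.3     && echo "Maven ok"  # Maven 3.3 or later
version_ge 3.13.1 3.1    && echo "CMake ok"  # CMake 3.1 or newer
```

This is only a convenience check; the authoritative list is the Requirements section of BUILDING.txt in the source tree.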
Download the hadoop-3.1.1 source package, extract it, and inspect the build requirements:

```
[hadoop@hadoop src]$ cat ...
Requirements:

* Unix System
* JDK 1.8
* Maven 3.3 or later
* ProtocolBuffer 2.5.0
* CMake 3.1 or newer (if compiling native code)
* Zlib devel (if compiling native code)
* openssl devel (if compiling native hadoop-pipes and to get the best HDFS encryption performance)
* Linux FUSE (Filesystem in Userspace) version 2.6 or above (if compiling fuse_dfs)
* Internet connection for first build (to fetch all Maven and Hadoop dependencies)
* python (for releasedocs)
* bats (for shell code testing)
* Node.js / bower / Ember-cli (for YARN UI v2 building)
...
```

Installing the Software

Install the JDK

Download the JDK and upload it to /home/hadoop/src. As root, create the /usr/java directory, then install the JDK there.
Extract the JDK:

```
[hadoop@hadoop src]$ tar -zxvf /home/hadoop/src/ -C /usr/java/
```

Change the owner of the directory:

```
[hadoop@hadoop ~]$ chown -R hadoop:hadoop /usr/java
```

Configure the environment variables:

```
[root@hadoop jdk1.8.0_131]# vim /etc/profile

## ADD java environment
export JAVA_HOME=/usr/java/jdk1.8.0_131
export PATH=$JAVA_HOME/bin:$PATH
```

Apply them with source:

```
[hadoop@hadoop ~]$ source /etc/profile
```

Verify the JDK:

```
[hadoop@hadoop ~]$ java -version
java version "1.8.0_131"
Java(TM) SE Runtime Environment (build 1.8.0_131-b11)
Java HotSpot(TM) 64-Bit Server VM (build 25.131-b11, mixed mode)
[hadoop@hadoop ~]$
```

Install Maven

Download apache-maven-3.6.0 and upload it to /home/hadoop/src. As root, create the /usr/local/maven directory:

```
[root@hadoop ~]# mkdir /usr/local/maven                     ## create the maven install directory
[root@hadoop ~]# chown -R hadoop:hadoop /usr/local/maven    ## change the directory owner
```
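The `export PATH=...:$PATH` lines added to /etc/profile work by prepending a bin directory, so binaries there are found before any system copies. A portable illustration of that lookup order (the throwaway `java` script below is a stand-in created for the demo, not a real JDK):

```shell
# Demonstrate PATH precedence, the mechanism behind PATH=$JAVA_HOME/bin:$PATH.
tmp=$(mktemp -d)
mkdir -p "$tmp/bin"
printf '#!/bin/sh\necho fake-java\n' > "$tmp/bin/java"   # throwaway stand-in
chmod +x "$tmp/bin/java"

PATH="$tmp/bin:$PATH"        # same prepend pattern as the profile edit
resolved=$(command -v java)  # lookup now hits $tmp/bin first
out=$("$resolved")
echo "$resolved"
echo "$out"

rm -rf "$tmp"                # clean up the demo directory
```

The same precedence is why sourcing /etc/profile makes the freshly installed JDK and Maven shadow any older system versions.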
Extract Maven:

```
[hadoop@hadoop src]$ tar -zxvf apache-maven-3.6. -C /usr/local/maven/
```

Configure the Maven environment variables:

```
[root@hadoop jdk1.8.0_131]# vim /etc/profile

## ADD maven environment
export MAVEN_HOME=/usr/local/maven/apache-maven-3.6.0
export PATH=$MAVEN_HOME/bin:$PATH
```

Apply them with source:

```
[hadoop@hadoop ~]$ source /etc/profile
```

Verify Maven:

```
[hadoop@hadoop ~]$ mvn -version
Apache Maven 3.6.0 (97c98ec64a1fdfee7767ce5ffb20918da4f719f3; 2018-10-25T02:41:47+08:00)
Maven home: /usr/local/maven/apache-maven-3.6.0
Java version: 1.8.0_131, vendor: Oracle Corporation, runtime: /usr/java/jdk1.8.0_131/jre
Default locale: en_US, platform encoding: UTF-8
OS name: "linux", version: "3.10.0-693.el7.x86_64", arch: "amd64", family: "unix"
[hadoop@hadoop ~]$
```

Configure a domestic (Aliyun) Maven mirror:

```
[hadoop@hadoop conf]$ vim /usr/local/maven/apache-maven-3.6.0/l ...
<mirrors>
  <mirror>
    <id>nexus-aliyun</id>
    <mirrorOf>central</mirrorOf>
    <name>Nexus aliyun</name>
    <url>maven.aliyun.com/nexus/content/groups/public</url>
  </mirror>
</mirrors>
...
```

Install the Tool Packages (as root)

Install the gcc and make tools:

```
[root@hadoop ~]# yum install -y gcc* make
```

Install the compression tools:

```
[root@hadoop ~]# yum -y install snappy* bzip2* lzo* zlib* lz4* gzip*
```

Install some basic tools:

```
[root@hadoop ~]# yum -y install openssl* svn ncurses* autoconf automake libtool
```

Install zstd

Install the EPEL extension repository:

```
[root@hadoop ~]# yum -y install epel-release
```

Install zstd:

```
[root@hadoop ~]# yum -y install *zstd*
```

Install CMake

CMake 3.1 or newer is required; here we use 3.13.1. Download the cmake-3.13.1 source package and upload it to /home/hadoop/src.

Extract cmake:

```
[hadoop@hadoop src]$ tar -zxvf cmake-3.13.
```

Change into the source directory:

```
[hadoop@hadoop src]$ cd cmake-3.13.1
```
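Before running CMake's bootstrap, the toolchain installed by the yum steps above can be spot-checked; a sketch (the tool names come from those steps, and a MISSING line simply flags a package to revisit; nothing here modifies the system):

```shell
# Spot-check that the build tools installed above are on PATH.
report=$(for t in gcc make autoconf automake libtool zstd; do
    if command -v "$t" >/dev/null 2>&1; then
        echo "$t: found"
    else
        echo "$t: MISSING"
    fi
done)
echo "$report"
```

`./bootstrap` below performs its own, far more thorough dependency check; this loop just catches a forgotten yum install early.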
Check the dependencies:

```
[hadoop@hadoop cmake-3.13.1]$ ./bootstrap
```

Compile and install cmake (as root):

```
[hadoop@hadoop cmake-3.13.1]$ su root
Password:
[root@hadoop cmake-3.13.1]# make && make install
```

Check the cmake version:

```
[hadoop@hadoop ~]$ cmake -version
cmake version 3.13.1

CMake suite maintained and supported by Kitware ().
[hadoop@hadoop ~]$
```

Install ProtocolBuffer

Download protobuf-2.5.0 and upload it to /home/hadoop/src. As root, create the /usr/local/protobuf directory:

```
[root@hadoop ~]# mkdir /usr/local/protobuf                    ## create the ProtocolBuffer install directory
[root@hadoop ~]# chown -R hadoop:hadoop /usr/local/protobuf   ## change the directory owner
```

Extract ProtocolBuffer:

```
[hadoop@hadoop src]$ tar -zxvf protobuf-2.5. -C /home/hadoop/src/
```

Change into the /home/hadoop/src/protobuf-2.5.0 directory.
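The write-up breaks off during the ProtocolBuffer install, before the Hadoop build itself. For completeness, the eventual Maven invocation for a native build with zstd typically looks like the following. This is a hedged sketch based on the native-build flags documented in Hadoop 3.x's BUILDING.txt, not a command shown in the original, so verify the flags against your source tree before running it:

```shell
# Hypothetical final build step (not part of the original text above).
# -Pdist,native builds the distribution plus the native libraries;
# -Drequire.zstd makes the build fail if zstd support cannot be compiled in.
build_cmd='mvn clean package -Pdist,native -DskipTests -Dtar -Drequire.zstd'
echo "$build_cmd"   # run this from the extracted hadoop source directory
```

After a successful build, `hadoop checknative` on the resulting distribution should report zstd as true.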