(Hadoop 1) Setting Up a Fully Distributed Hadoop Cluster

Setting up the Hadoop platform

1. Create a new virtual machine

2. Grant sudo to a regular user

Why: editing certain system files requires root privileges, and switching users back and forth invites mistakes, so grant the regular user sudo rights instead.

How: switch to the root user, then

               vim /etc/sudoers

               and add the line: zyd ALL=(ALL) NOPASSWD:ALL
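In practice it is safer to put the rule in a drop-in file and let visudo validate it before it takes effect. A minimal sketch, assuming the username zyd from above (the sketch only writes a scratch copy under /tmp; the commented commands are what you would actually run as root):

```shell
# Safer variant of the step above: a drop-in under /etc/sudoers.d/ instead of
# editing /etc/sudoers directly, validated with visudo before it takes effect.
SUDO_LINE='zyd ALL=(ALL) NOPASSWD:ALL'
echo "$SUDO_LINE" > /tmp/sudoers-zyd.example
# As root, the real steps would be:
#   cp /tmp/sudoers-zyd.example /etc/sudoers.d/zyd && visudo -cf /etc/sudoers.d/zyd
cat /tmp/sudoers-zyd.example
```

A syntax error in /etc/sudoers can lock you out of sudo entirely, which is why the visudo check matters.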

3. Configure the network

         (1) Right-click the VM -> Settings -> Network Adapter -> NAT mode -> OK

         (2) Change the hostname (set HOSTNAME=master on this node)

                    sudo vim /etc/sysconfig/network

         (3) sudo vim /etc/sysconfig/network-scripts/ifcfg-eth0

                    BOOTPROTO=static
                    ONBOOT=yes
                    IPADDR=   (Edit -> Virtual Network Editor -> VMnet8 -> NAT Settings -> subnet IP)
                    NETMASK=  (Edit -> Virtual Network Editor -> VMnet8 -> NAT Settings -> subnet mask)
                    GATEWAY=  (Edit -> Virtual Network Editor -> VMnet8 -> NAT Settings -> gateway)
                    DNS1=     (Edit -> Virtual Network Editor -> VMnet8 -> NAT Settings -> gateway)

         (4) Update the hostname mappings

                    sudo vim /etc/hosts

                    ip hostname
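Put together, ifcfg-eth0 ends up looking like the sketch below. Every address is a placeholder from a hypothetical 192.168.100.0/24 VMnet8 subnet; substitute the values from your own NAT Settings dialog. The sketch writes to /tmp so it can be inspected without touching the real file:

```shell
# Example ifcfg-eth0 for static NAT addressing. All addresses below are
# placeholders for a hypothetical 192.168.100.0/24 VMnet8 subnet.
cat > /tmp/ifcfg-eth0.example <<'EOF'
DEVICE=eth0
TYPE=Ethernet
BOOTPROTO=static
ONBOOT=yes
IPADDR=192.168.100.10
NETMASK=255.255.255.0
GATEWAY=192.168.100.2
DNS1=192.168.100.2
EOF
grep -c '=' /tmp/ifcfg-eth0.example    # 8 key=value settings
```

With VMware NAT, the gateway is conventionally .2 on the VMnet8 subnet, and pointing DNS1 at it (as the original steps do) lets the NAT device forward DNS queries.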

4. Disable the firewall

         Why: with the firewall up, this machine's web UIs cannot be reached from a remote browser.

         How (CentOS 6 iptables service):

                     (1) Stop it now:        service iptables stop

                     (2) Disable it on boot: chkconfig iptables off

                     (3) Check its status:   chkconfig iptables --list

                     (4) Re-enable it:       chkconfig iptables on

5. Set up passwordless SSH login

         (1) As the regular user, run ssh-keygen -t rsa and press Enter at every prompt.

         (2) In ~/.ssh, run:

                     cat id_rsa.pub >> authorized_keys

                     chmod 600 authorized_keys
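sshd silently ignores the key file if the permissions are too loose, which is why the chmod step matters: ~/.ssh must be 700 and authorized_keys 600. The required bits, demonstrated on a scratch directory:

```shell
# The permission bits sshd requires before key-based login works, demonstrated
# on a scratch directory (/tmp/ssh-demo stands in for ~/.ssh).
d=/tmp/ssh-demo
mkdir -p "$d" && chmod 700 "$d"
: > "$d/authorized_keys" && chmod 600 "$d/authorized_keys"
stat -c '%a' "$d" "$d/authorized_keys"    # prints 700, then 600
```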

6. Install Hadoop and the JDK

         (1) Right-click the VM -> Settings -> Options -> Shared Folders -> Enable -> Add -> Next -> Browse (avoid non-ASCII characters in the folder name)

         (2) Create an apps directory under ~

         (3) Go to /mnt/hgfs/… and copy the Hadoop and JDK archives into the apps directory

         (4) Install the JDK

                  ① Check whether a JDK is already installed:

                           rpm -qa | grep java

                           If so, remove every installed package: sudo rpm -e --nodeps *

                           Note: * stands for the installed JDK packages.

                  ② cd into the apps directory.

                  ③ Extract the JDK archive:

                           tar -zvxf *

                           Note: * stands for the JDK archive.

                  ④ Create a symlink to the extracted JDK directory:

                           ln -s * jdk

                           Note: * stands for the extracted JDK directory.

                  ⑤ Set the environment variables:

                           vim ~/.bash_profile

                           JAVA_HOME=/home/zyd/apps/jdk
                           PATH=$PATH:$JAVA_HOME/bin:$JAVA_HOME/jre/bin
                           export JAVA_HOME PATH

                  ⑥ Reload the profile:

                           source ~/.bash_profile

                  ⑦ Verify the setup:

                           java -version
                           javac
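If java -version still picks up the wrong JDK, a quick check is whether both JDK bin directories actually landed on PATH. A sketch, assuming the /home/zyd home directory used elsewhere in this guide (it only inspects the string, so it can run on any machine):

```shell
# After "source ~/.bash_profile", both JDK bin directories should be on PATH.
JAVA_HOME=/home/zyd/apps/jdk
PATH=$PATH:$JAVA_HOME/bin:$JAVA_HOME/jre/bin
echo "$PATH" | tr ':' '\n' | grep -c "^$JAVA_HOME" | tee /tmp/jdk-path-entries    # 2
```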

         (5) Install Hadoop

                  ① cd into the apps directory.

                  ② Extract the Hadoop archive:

                           tar -zvxf *

                           Note: * stands for the Hadoop archive.

                  ③ Create a symlink to the extracted Hadoop directory:

                           ln -s * hadoop

                           Note: * stands for the extracted Hadoop directory.

                  ④ Set the environment variables:

                           vim ~/.bash_profile

                           HADOOP_HOME=/home/zyd/apps/hadoop
                           PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin
                           export JAVA_HOME HADOOP_HOME PATH

                  ⑤ Reload the profile:

                           source ~/.bash_profile

                  ⑥ Verify the setup:

                           hadoop version

7. Configure Hadoop

         (1) In /home/zyd/apps/hadoop/ run mkdir tmp, then cd into /home/zyd/apps/hadoop/etc/hadoop

         (2) Edit core-site.xml:

                  <configuration>
                      <property>
                          <name>fs.defaultFS</name>
                          <value>hdfs://master:9000</value>
                      </property>
                      <property>
                          <name>hadoop.tmp.dir</name>
                          <value>file:/home/zyd/apps/hadoop/tmp</value>
                      </property>
                  </configuration>

         (3) Edit hdfs-site.xml:

                  <configuration>
                      <property>
                          <name>dfs.replication</name>
                          <value>3</value>
                      </property>
                      <property>
                          <name>dfs.namenode.name.dir</name>
                          <value>file:/home/zyd/apps/hadoop/tmp/dfs/name</value>
                      </property>
                      <property>
                          <name>dfs.datanode.data.dir</name>
                          <value>file:/home/zyd/apps/hadoop/tmp/dfs/data</value>
                      </property>
                  </configuration>

         (4) Edit mapred-site.xml:

                  <configuration>
                      <property>
                          <name>mapreduce.framework.name</name>
                          <value>yarn</value>
                      </property>
                      <property>
                          <name>mapreduce.jobhistory.address</name>
                          <value>master:10020</value>
                      </property>
                      <property>
                          <name>mapreduce.jobhistory.webapp.address</name>
                          <value>master:19888</value>
                      </property>
                  </configuration>

                  Note: this file does not exist by default; create it from the template first:

                           cp mapred-site.xml.template mapred-site.xml

         (5) Edit yarn-site.xml:

                  <configuration>
                      <property>
                          <name>yarn.nodemanager.aux-services</name>
                          <value>mapreduce_shuffle</value>
                      </property>
                      <property>
                          <name>yarn.resourcemanager.hostname</name>
                          <value>master</value>
                      </property>
                  </configuration>

         (6) Edit hadoop-env.sh:

                  JAVA_HOME=/home/zyd/apps/jdk

         (7) Edit yarn-env.sh:

                  JAVA_HOME=/home/zyd/apps/jdk

         (8) Edit the slaves file:

                  Delete the default localhost entry and add the four data nodes.
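Step (8) then leaves the slaves file looking like the sketch below. This assumes the master also runs a DataNode, which is how the guide arrives at four data nodes from one master plus the three clones of step 8; drop the master line if it should run only the NameNode. Written to /tmp here so it can be inspected without touching the real file:

```shell
# Example etc/hadoop/slaves content: one hostname per line, matching the
# /etc/hosts mappings created earlier.
cat > /tmp/slaves.example <<'EOF'
master
slave1
slave2
slave3
EOF
grep -c '' /tmp/slaves.example    # 4 data nodes
```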

8. Clone three VMs as worker nodes

         (1) Right-click the master VM -> Manage -> Clone -> Create a full clone

         (2) Change the hostname:

                  sudo vim /etc/sysconfig/network

                  HOSTNAME=slave1

         (3) Fix the NIC rules:

                  sudo vim /etc/udev/rules.d/70-persistent-net.rules

                  Comment out the eth0 entry and rename the eth1 entry to eth0.

         (4) Update the MAC address and IP address:

                  sudo vim /etc/sysconfig/network-scripts/ifcfg-eth0

                  Note: use the MAC address of the renamed entry from step (3).

         (5) Update the hostname mappings:

                  sudo vim /etc/hosts
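With all three clones configured, a loop like this run from the master checks that passwordless login reaches every node. Shown as a dry run that only prints the commands; remove the echo to actually connect:

```shell
# Dry-run connectivity check over all four nodes; the hostnames assume the
# /etc/hosts mappings set up earlier. Remove "echo" to actually run ssh.
for h in master slave1 slave2 slave3; do
    echo ssh "$h" hostname
done
```

If any node still prompts for a password, its authorized_keys content or permissions (step 5) are the usual culprits.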
