- System initialization
- Installing and configuring MySQL 5.6
- Installing and configuring Hive
1: System environment initialization
1.1 System environment:
CentOS 6.4 x86_64
A working pseudo-distributed Hadoop environment
Required packages:
apache-hive-0.13.1-bin.tar.gz
mysql-connector-java-5.1.27.tar.gz
mysql-server-5.6.24-1.el6.x86_64.rpm
mysql-client-5.6.24-1.el6.x86_64.rpm
Upload all of them to /home/hadoop/yangyang/
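Before installing anything, it is worth confirming that the pseudo-distributed Hadoop environment is actually up. A minimal sanity check, assuming Hadoop is installed under /home/hadoop/yangyang/hadoop (the path used throughout this guide):
cd /home/hadoop/yangyang/hadoop
jps                         # expect NameNode, DataNode and SecondaryNameNode (plus YARN daemons if started)
bin/hdfs dfsadmin -report   # HDFS should report at least one live datanode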
2: Install the MySQL 5.6 database (run as the root user)
2.1 Install the MySQL 5.6 packages
rpm -qa | grep mysql          # check for an existing MySQL installation
rpm -e --nodeps mysql-libs    # remove the conflicting mysql-libs package shipped with CentOS

rpm -ivh /home/hadoop/yangyang/mysql-*
service mysql start
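Optionally, confirm the server is running and have it start on boot (standard CentOS 6 service commands; adjust the service name if your RPMs register it differently):
service mysql status
chkconfig mysql on
chkconfig --list mysql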

2.2 Change the MySQL root password
mysql -uroot -p
mysql> set password = password("123456");
mysql> flush privileges;

2.3 Grant MySQL access to the host
mysql> grant all privileges on *.* to root@'namenode01.hadoop.com' identified by '123456';
mysql> flush privileges;
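To confirm the account and host entry were created, a quick check from the shell (using the password set above):
mysql -uroot -p123456 -e "select host, user from mysql.user;"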

3: Install Hive
3.1 Create the Hive warehouse and scratch directories on HDFS
cd /home/hadoop/yangyang/hadoop
bin/hdfs dfs -mkdir -p /tmp
bin/hdfs dfs -mkdir -p /user/hive/warehouse
bin/hdfs dfs -chmod g+w /user/hive/warehouse
bin/hdfs dfs -chmod g+w /tmp
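The directories and their group-write permission can be verified with (still from the Hadoop directory):
bin/hdfs dfs -ls /
bin/hdfs dfs -ls /user/hive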
3.2 Install Hive and modify its configuration files
cd /home/hadoop/yangyang
tar -zxvf apache-hive-0.13.1-bin.tar.gz
mv apache-hive-0.13.1-bin hive
cd hive/conf
3.3 Edit hive-env.sh
cp -p hive-env.sh.template hive-env.sh
vim hive-env.sh
export HADOOP_HOME=/home/hadoop/yangyang/hadoop
# Hive Configuration Directory can be controlled by:
export HIVE_CONF_DIR=/home/hadoop/yangyang/hive/conf

3.4 Edit hive-site.xml
cp -p hive-default.xml.template hive-site.xml
vim hive-site.xml
<property>
  <name>hive.cli.print.header</name>
  <value>true</value> <!-- changed from false to true -->
  <description>Whether to print the names of the columns in query output.</description>
</property>
<property>
  <name>hive.cli.print.current.db</name>
  <value>true</value> <!-- changed from false to true -->
  <description>Whether to include the current database in the Hive prompt.</description>
</property>
3.5 Add the MySQL connection properties:
<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:mysql://namenode01.hadoop.com:3306/metastore?createDatabaseIfNotExist=true</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>com.mysql.jdbc.Driver</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionUserName</name>
  <value>root</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionPassword</name>
  <value>123456</value>
</property>
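With createDatabaseIfNotExist=true, the metastore database itself is created automatically the first time Hive connects, so nothing has to be created in MySQL by hand. Before moving on, it can be useful to confirm that the JDBC URL above is reachable, for example (hostname, port and credentials as configured above):
mysql -h namenode01.hadoop.com -P 3306 -uroot -p123456 -e "select version();"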
3.6 Enable Hive logging
cp -p hive-log4j.properties.template hive-log4j.properties
vim hive-log4j.properties
hive.log.dir=/home/hadoop/yangyang/hive
hive.log.file=hive.log
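After Hive has been started at least once, the log file should appear under the directory configured above; a quick way to check:
ls -l /home/hadoop/yangyang/hive/hive.log
tail -n 20 /home/hadoop/yangyang/hive/hive.log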
3.7 Add the MySQL JDBC driver jar to Hive's lib directory
cd /home/hadoop/yangyang
tar -zxvf mysql-connector-java-5.1.27.tar.gz
cd mysql-connector-java-5.1.27/
cp -p mysql-connector-java-5.1.27-bin.jar /home/hadoop/yangyang/hive/lib/
3.8 Run Hive
cd /home/hadoop/yangyang/hive
bin/hive
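A minimal smoke test, assuming the steps above succeeded (test_tb is just a throwaway example table name); the prompt shows the current database because hive.cli.print.current.db was set to true:
hive (default)> show databases;
hive (default)> create table test_tb(id int, name string);
hive (default)> show tables;
The metastore side can then be checked from another shell, since the metastore database is created in MySQL on this first run:
mysql -uroot -p123456 -e "show tables from metastore;"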
