@songlaf 2016-05-19T09:33:05.000000Z · 3090 words · 1342 reads

Assignment 13: Hive Installation, Deployment, and Testing

Beifeng.com Big Data Training


1) Installing Hive

1.1) Install MySQL

  # Install from rpm packages
  rpm -qa | grep mysql
  rpm -e --nodeps mysql-libs-5.1.66-2.el6_3.x86_64
  rpm -ivh /home/beifeng/MySQL-client-5.6.24-1.el6.x86_64.rpm
  rpm -ivh /home/beifeng/MySQL-server-5.6.24-1.el6.x86_64.rpm
  # Or install via yum
  # Install the MySQL server:
  yum install -y mysql-server mysql-devel mysql-libs
  # Install the MySQL client:
  yum install mysql
  # Service management commands:
  # Start the service
  service mysqld start
  # Start on boot:
  chkconfig mysqld on
  # Set the root password
  mysqladmin -u root password 123456
  # Log in
  mysql -uroot -p123456

1.2) Install the MySQL JDBC driver

  cp mysql-connector-java-5.1.27-bin.jar /opt/modules/apache-hive-0.13.1-bin/lib/

1.3) Configuration changes

1.3.1) Edit /etc/profile

  # Hadoop home directory
  export HADOOP_HOME=/opt/modules/hadoop-2.5.0
  # Hive home directory
  export HIVE_HOME=/opt/modules/apache-hive-0.13.1-bin
  # Append $HIVE_HOME/bin to PATH
  export PATH=$PATH:$HIVE_HOME/bin


1.3.2) Edit hive-env.sh

Copy hive-env.sh.template to hive-env.sh

  # Hadoop home directory (already exported in /etc/profile)
  export HADOOP_HOME=${HADOOP_HOME}
  # Hive configuration directory
  export HIVE_CONF_DIR=${HIVE_HOME}/conf

1.3.3) Edit hive-site.xml

Copy hive-default.xml.template to hive-site.xml

  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://njt.song.s9:3306/metastore?createDatabaseIfNotExist=true</value>
    <description>JDBC connection URL for the MySQL metastore</description>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.jdbc.Driver</value>
    <description>MySQL JDBC driver class</description>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>root</value>
    <description>MySQL user name</description>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>123456</value>
    <description>MySQL password</description>
  </property>
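It is easy to fat-finger one of these four properties. A quick sanity check pulls every `javax.jdo` value out of the file with grep and sed. The heredoc below is a minimal hypothetical stand-in for the real file; point `SITE` at `$HIVE_HOME/conf/hive-site.xml` when checking an actual install.

```shell
# Sketch: extract the JDBC-related <value> entries from hive-site.xml.
# SITE is a temporary stand-in here, mirroring the values above.
SITE=$(mktemp)
cat > "$SITE" <<'EOF'
<configuration>
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://njt.song.s9:3306/metastore?createDatabaseIfNotExist=true</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.jdbc.Driver</value>
  </property>
</configuration>
EOF
# -A1 prints the <value> line that follows each matching <name> line
vals=$(grep -A1 'javax.jdo.option' "$SITE" \
       | sed -n 's/.*<value>\(.*\)<\/value>.*/\1/p')
echo "$vals"
rm -f "$SITE"
```

A typo in the URL or driver class shows up immediately in the printed values, instead of as an obscure metastore stack trace on first `bin/hive` startup.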
1.3.4) Grant MySQL privileges

  # Log in to MySQL
  mysql -uroot -p123456
  # Note: 'njt.song.s6' is the host Hive connects from; use '%' to allow any host
  mysql> grant all privileges on *.* to root@'njt.song.s6' identified by '123456';
  Query OK, 0 rows affected (0.02 sec)
  mysql> flush privileges;
  Query OK, 0 rows affected (0.00 sec)
1.3.5) Create Hive's directories in HDFS

  # When Hive loads local data into HDFS, it first writes to a scratch
  # directory, then moves the files into the warehouse
  bin/hdfs dfs -mkdir /tmp
  # Warehouse directory
  bin/hdfs dfs -mkdir -p /user/hive/warehouse
  bin/hdfs dfs -chmod g+w /user/hive/warehouse
  bin/hdfs dfs -chmod g+w /tmp
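The two `chmod g+w` calls matter because Hive, running as the logged-in user, must be able to create files under these directories. HDFS permissions follow the POSIX rwx model, so the effect of `g+w` can be previewed on an ordinary local directory (a local sketch, not HDFS):

```shell
# Sketch: what `-chmod g+w` does, shown on a local temporary directory.
d=$(mktemp -d)
chmod 755 "$d"                  # rwxr-xr-x: group has no write bit
chmod g+w "$d"                  # rwxrwxr-x: group members may now create files
mode=$(ls -ld "$d" | cut -c1-10)
echo "$mode"                    # drwxrwxr-x
rm -rf "$d"
```

If the warehouse directory lacks group write, `CREATE TABLE` and `LOAD DATA` fail with an HDFS `AccessControlException` for any user other than the directory's owner.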
1.3.6) Start the Hive CLI

  # Start HDFS and YARN (MapReduce) first
  bin/hive

2) Testing

  -- Create a database
  create database if not exists song_test_db
  comment 'my test data base'
  with dbproperties('creator'='song','date'='2016-05-18');
  -- Create a table
  create table Employee
  (
    name string,
    work_place ARRAY<string>,
    sex_age STRUCT<sex: string, age: int>,
    skills_score MAP<string, int>,
    depart_title MAP<STRING, ARRAY<STRING>>
  )
  COMMENT 'Employee info'
  ROW FORMAT DELIMITED
  FIELDS TERMINATED BY '|'
  COLLECTION ITEMS TERMINATED BY ','
  MAP KEYS TERMINATED BY ':';
  -- Load data from a local file
  LOAD DATA LOCAL INPATH '/home/sjf/employee.txt' OVERWRITE INTO TABLE employee;
  -- Query the table
  select * from employee;


The contents of employee.txt are as follows:

  Michael| Montreal, Toronto| Male, 30| DB: 80| Product: Developer^DLead
  Will| Montreal| Male, 35| Perl: 85| Product: Lead, Test: Lead
  Shelley| New York| Female, 27| Python: 80| Test: Lead, COE: Architect
  Lucy| Vancouver| Female, 57| Sales: 89, HR: 94| Sales: Lead
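Note how the three delimiters from the CREATE TABLE statement layer: `|` separates fields, `,` separates collection items (array elements, struct members, map entries), and `:` separates map keys from values. Also note the spaces after `|` in the listing above: Hive keeps them as part of the loaded values, so the real file should have none. The split can be sketched outside Hive with awk, using simplified records (a local illustration, not how Hive itself parses):

```shell
# Sketch: split an employee.txt record the way the table's
# ROW FORMAT DELIMITED clauses would. Pure awk, no Hive needed.
cat > /tmp/employee_demo.txt <<'EOF'
Michael|Montreal,Toronto|Male,30|DB:80|Product:Developer
Will|Montreal|Male,35|Perl:85|Product:Lead,Test:Lead
EOF
out=$(awk -F'|' '{
  n = split($2, places, ",")          # work_place: ARRAY<string>
  split($3, sa, ",")                  # sex_age: STRUCT<sex,age>
  split($4, kv, ":")                  # skills_score: MAP<string,int>
  printf "%s: %d place(s), sex=%s age=%s, %s=%s\n", $1, n, sa[1], sa[2], kv[1], kv[2]
}' /tmp/employee_demo.txt)
echo "$out"
rm -f /tmp/employee_demo.txt
```

Hive's LazySimpleSerDe applies the same layered splitting at query time, which is why a misplaced delimiter shows up as NULLs in the query result rather than as a load error.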