
Installing and configuring hive-2.1.0 on Ubuntu


1. Install mysql-server and mysql-client

root@SparkSingleNode:/usr/local# sudo apt-get install mysql-server mysql-client      (on Ubuntu)


During installation I set the MySQL root password to rootroot.
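To confirm the packages installed correctly, a minimal check is to print the client version and the server status (the exact output varies by Ubuntu release):

mysql --version              # prints the installed client version
sudo service mysql status    # reports whether mysqld is running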

 

 

 

 

 

2. Start the MySQL service


root@SparkSingleNode:/usr/local# sudo /etc/init.d/mysql start      (on Ubuntu)
* Starting MySQL database server mysqld [ OK ]
root@SparkSingleNode:/usr/local#
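To confirm the server is actually accepting connections, you can check that mysqld is listening on its default port (this assumes the default port 3306 and that net-tools is installed):

sudo netstat -tlnp | grep 3306    # should show mysqld bound to port 3306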

 

 

 

3. Log in to MySQL

One convenience of the MySQL packages on Ubuntu is that the root@ accounts (localhost, the hostname, 127.0.0.1, ::1) are already created and configured by default.


root@SparkSingleNode:/usr/local# mysql -uroot -p
Enter password:   // enter rootroot
Welcome to the MySQL monitor. Commands end with ; or \g.
Your MySQL connection id is 43
Server version: 5.5.53-0ubuntu0.14.04.1 (Ubuntu)

Copyright (c) 2000, 2016, Oracle and/or its affiliates. All rights reserved.

Oracle is a registered trademark of Oracle Corporation and/or its
affiliates. Other names may be trademarks of their respective
owners.

Type 'help;' or '\h' for help. Type '\c' to clear the current input statement.

mysql> create database if not exists hive_metadata;
Query OK, 1 row affected (0.00 sec)

mysql> grant all privileges on hive_metadata.* to 'hive'@'%' identified by 'hive';
Query OK, 0 rows affected (0.00 sec)

mysql> grant all privileges on hive_metadata.* to 'hive'@'localhost' identified by 'hive';
Query OK, 0 rows affected (0.00 sec)

mysql> grant all privileges on hive_metadata.* to 'hive'@'SparkSingleNode' identified by 'hive';        // Note: SparkSingleNode is my hostname; replace it with your own rather than copying it verbatim
Query OK, 0 rows affected (0.00 sec)

mysql> flush privileges;
Query OK, 0 rows affected (0.00 sec)

mysql> use hive_metadata;
Database changed
mysql> select user,host,password from mysql.user;
+------------------+-----------------+-------------------------------------------+
| user             | host            | password                                  |
+------------------+-----------------+-------------------------------------------+
| root             | localhost       | *6C362347EBEAA7DF44F6D34884615A35095E80EB |
| root             | sparksinglenode | *6C362347EBEAA7DF44F6D34884615A35095E80EB |
| root             | 127.0.0.1       | *6C362347EBEAA7DF44F6D34884615A35095E80EB |
| root             | ::1             | *6C362347EBEAA7DF44F6D34884615A35095E80EB |
| debian-sys-maint | localhost       | *5DD77395EB71A702D01A6B0FADD8F2C0C88830C5 |
| hive             | %               | *4DF1D66463C18D44E3B001A8FB1BBFBEA13E27FC |
| hive             | localhost       | *4DF1D66463C18D44E3B001A8FB1BBFBEA13E27FC |
| hive             | sparksinglenode | *4DF1D66463C18D44E3B001A8FB1BBFBEA13E27FC |
+------------------+-----------------+-------------------------------------------+
8 rows in set (0.00 sec)

mysql> exit;
Bye
root@SparkSingleNode:/usr/local#
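As a quick sanity check of the grants above, log back in as the new hive user and point it at the metastore database (assuming the grants and flush privileges succeeded, this should return without an access-denied error):

mysql -uhive -phive -e "use hive_metadata; show tables;"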

 

 

4. Install Hive

This step is straightforward; a sketch of the extraction is shown below, followed by the resulting directory layout.
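A sketch of the extraction, assuming the apache-hive-2.1.0-bin.tar.gz tarball has already been downloaded to the spark user's home directory (adjust the paths to your own layout):

sudo mkdir -p /usr/local/hive                                 # create the install directory
sudo chown spark:spark /usr/local/hive                        # let the spark user own it
tar -zxvf ~/apache-hive-2.1.0-bin.tar.gz -C /usr/local/hive   # unpack the Hive distribution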


spark@SparkSingleNode:/usr/local/hive$ ll
total 12
drwxr-xr-x 3 spark spark 4096 11月 21 10:39 ./
drwxr-xr-x 15 root root 4096 11月 21 10:25 ../
drwxrwxr-x 9 spark spark 4096 11月 21 10:38 apache-hive-2.1.0-bin/
spark@SparkSingleNode:/usr/local/hive$ mv apache-hive-2.1.0-bin hive-2.1.0
spark@SparkSingleNode:/usr/local/hive$ ls
hive-2.1.0
spark@SparkSingleNode:/usr/local/hive$ cd hive-2.1.0/
spark@SparkSingleNode:/usr/local/hive/hive-2.1.0$ ls
bin examples jdbc LICENSE README.txt scripts
conf hcatalog lib NOTICE RELEASE_NOTES.txt
spark@SparkSingleNode:/usr/local/hive/hive-2.1.0$ cd conf/
spark@SparkSingleNode:/usr/local/hive/hive-2.1.0/conf$ ls
beeline-log4j2.properties.template ivysettings.xml
hive-default.xml.template llap-cli-log4j2.properties.template
hive-env.sh.template llap-daemon-log4j2.properties.template
hive-exec-log4j2.properties.template parquet-logging.properties
hive-log4j2.properties.template
spark@SparkSingleNode:/usr/local/hive/hive-2.1.0/conf$ cp hive-default.xml.template hive-site.xml
spark@SparkSingleNode:/usr/local/hive/hive-2.1.0/conf$

 

 

5. Configure Hive

In conf/hive-site.xml (created from the template above), set the following metastore connection properties:

<property>
<name>javax.jdo.option.ConnectionURL</name>
<value>jdbc:mysql://SparkSingleNode:3306/hive_metadata?createDatabaseIfNotExist=true</value>
<description>
JDBC connect string for a JDBC metastore.
To use SSL to encrypt/authenticate the connection, provide database-specific SSL flag in the connection URL.
For example, jdbc:postgresql://myhost/db?ssl=true for postgres database.
</description>
</property>

 

 


<property>
<name>javax.jdo.option.ConnectionDriverName</name>
<value>com.mysql.jdbc.Driver</value>
<description>Driver class name for a JDBC metastore</description>
</property>

 

 


<property>
<name>javax.jdo.option.ConnectionUserName</name>
<value>hive</value>
<description>Username to use against metastore database</description>
</property>

 

 


<property>
<name>javax.jdo.option.ConnectionPassword</name>
<value>hive</value>
<description>password to use against metastore database</description>
</property>
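As an optional sanity check, a quick grep over hive-site.xml lets you eyeball the values of the four connection properties above:

spark@SparkSingleNode:/usr/local/hive/hive-2.1.0/conf$ grep -A 1 "javax.jdo.option.Connection" hive-site.xml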

 

 

 

spark@SparkSingleNode:/usr/local/hive/hive-2.1.0/conf$ cp hive-env.sh.template hive-env.sh
spark@SparkSingleNode:/usr/local/hive/hive-2.1.0/conf$ vim hive-env.sh
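A minimal example of what is typically set in hive-env.sh, using the paths from this walkthrough (adjust to your own layout):

export HADOOP_HOME=/usr/local/hadoop/hadoop-2.6.0
export HIVE_CONF_DIR=/usr/local/hive/hive-2.1.0/conf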

 

 

 


spark@SparkSingleNode:/usr/local/hive/hive-2.1.0/bin$ vim hive-config.sh 


export JAVA_HOME=/usr/local/jdk/jdk1.8.0_60
export HIVE_HOME=/usr/local/hive/hive-2.1.0
export HADOOP_HOME=/usr/local/hadoop/hadoop-2.6.0

 

 

 

 vim /etc/profile

#hive
export HIVE_HOME=/usr/local/hive/hive-2.1.0
export PATH=$PATH:$HIVE_HOME/bin

source /etc/profile
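A quick way to verify that the PATH change took effect:

hive --version    # should report Hive 2.1.0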

 

Copy mysql-connector-java-***.jar (the MySQL JDBC driver) into the lib directory under the Hive installation.
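For example, keeping the version placeholder from the sentence above (substitute the actual connector jar you downloaded):

cp mysql-connector-java-***.jar /usr/local/hive/hive-2.1.0/lib/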

 

 

 


Finally, start Hadoop, since Hive needs HDFS (and YARN) running:

spark@SparkSingleNode:/usr/local/hadoop/hadoop-2.6.0$ sbin/start-all.sh
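Once the daemons are up, jps confirms they started (the exact list depends on your Hadoop configuration):

jps    # typically shows NameNode, DataNode, SecondaryNameNode, ResourceManager, NodeManager, Jps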

 

Original article: http://www.cnblogs.com/zlslch/p/6084704.html

(0)
(0)
   
举报
评论 一句话评论(0
登录后才能评论!
© 2014 mamicode.com 版权所有  联系我们:gaon5@hotmail.com
迷上了代码!