How to install the Hive ecosystem under Hadoop?


Step 1 : Go to google.com, search for "Download Apache Hive", and click the result "Downloads – Apache Hive – The Apache Software Foundation".

Step 2 : The following page will open. Click on "Download a release now!".

Step 3 : Once you click the link shown in Step 2, the following page will open. Click on this URL: http://www-eu.apache.org/dist/hive/

Step 4 : After Step 3, the following page will open. Here I will be downloading hive-2.1.1, so click on hive-2.1.1/.

Step 5 : Once you click on hive-2.1.1/, download apache-hive-2.1.1-bin.tar.gz.

Note : http://apache.mesi.com.ar/ is another mirror where you will find all the Apache tools under one roof, so you can download packages according to your requirements.

Step 6 : Below is how I downloaded the apache-hive-2.1.1-bin.tar.gz file under Linux.

[hduser@storage hadoop]$ cd ~/Softwares/

[hduser@storage Softwares]$  wget http://www-us.apache.org/dist/hive/stable-2/apache-hive-2.1.1-bin.tar.gz
--2017-03-10 18:36:08--  http://www-us.apache.org/dist/hive/stable-2/apache-hive-2.1.1-bin.tar.gz
Resolving www-us.apache.org... 140.211.11.105
Connecting to www-us.apache.org|140.211.11.105|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 149756462 (143M) [application/x-gzip]
Saving to: "apache-hive-2.1.1-bin.tar.gz"

100%[=============================================================================================================================>] 149,756,462 4.96K/s   in 16m 6s

2017-03-10 18:52:15 (151 KB/s) - "apache-hive-2.1.1-bin.tar.gz" saved [149756462/149756462]

[hduser@storage Softwares]$ ls
apache-hive-2.1.1-bin.tar.gz  hadoop-2.6.5.tar.gz  sqoop  sqoop-1.4.6.bin__hadoop-2.0.4-alpha
[hduser@storage Softwares]$  ls -ltr
total 341216
-rw-r--r--. 1 hduser hadoop 199635269 Oct 11 10:31 hadoop-2.6.5.tar.gz
-rw-r--r--. 1 hduser hadoop 149756462 Dec  8 21:45 apache-hive-2.1.1-bin.tar.gz
lrwxrwxrwx. 1 hduser hadoop        35 Feb 27 10:20 sqoop -> sqoop-1.4.6.bin__hadoop-2.0.4-alpha
drwxr-xr-x. 9 hduser hadoop      4096 Feb 27 14:12 sqoop-1.4.6.bin__hadoop-2.0.4-alpha

Step 7 : Extract the apache-hive-2.1.1-bin.tar.gz file
[hduser@storage Softwares]$ tar -xzf apache-hive-2.1.1-bin.tar.gz


[hduser@storage Softwares]$  ls -ltr
total 341220
-rw-r--r--. 1 hduser hadoop 199635269 Oct 11 10:31 hadoop-2.6.5.tar.gz
-rw-r--r--. 1 hduser hadoop 149756462 Dec  8 21:45 apache-hive-2.1.1-bin.tar.gz
lrwxrwxrwx. 1 hduser hadoop        35 Feb 27 10:20 sqoop -> sqoop-1.4.6.bin__hadoop-2.0.4-alpha
drwxr-xr-x. 9 hduser hadoop      4096 Feb 27 14:12 sqoop-1.4.6.bin__hadoop-2.0.4-alpha
drwxr-xr-x. 9 hduser hadoop      4096 Mar 11 12:15 apache-hive-2.1.1-bin
[hduser@storage Softwares]$

Step 8 : Now let's create a SYMBOLIC LINK to the apache-hive-2.1.1-bin directory. Here I will name it hive, so that you don't need to type the full apache-hive-2.1.1-bin directory name every time; you can simply refer to it as hive.

[hduser@storage Softwares]$  ls -ltr
total 341220
-rw-r--r--. 1 hduser hadoop 199635269 Oct 11 10:31 hadoop-2.6.5.tar.gz
-rw-r--r--. 1 hduser hadoop 149756462 Dec  8 21:45 apache-hive-2.1.1-bin.tar.gz
lrwxrwxrwx. 1 hduser hadoop        35 Feb 27 10:20 sqoop -> sqoop-1.4.6.bin__hadoop-2.0.4-alpha
drwxr-xr-x. 9 hduser hadoop      4096 Feb 27 14:12 sqoop-1.4.6.bin__hadoop-2.0.4-alpha
drwxr-xr-x. 9 hduser hadoop      4096 Mar 11 12:15 apache-hive-2.1.1-bin

[hduser@storage Softwares]$ ln -s apache-hive-2.1.1-bin hive   # create the symbolic link

[hduser@storage Softwares]$ pwd
/home/hduser/Softwares
[hduser@storage Softwares]$

Step 9 : Once the symbolic link is created, open your ~/.bashrc file and add the following lines (type them or copy-paste).

[hduser@storage Softwares]$ vi ~/.bashrc

# Add Hive bin / directory to PATH
export HIVE_HOME=/home/hduser/Softwares/apache-hive-2.1.1-bin
export PATH=$PATH:$HIVE_HOME/bin/
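After saving ~/.bashrc you would normally reload it with `source ~/.bashrc`. The sketch below applies the same two exports directly (using the HIVE_HOME path from above) and verifies that the Hive bin directory actually landed on the PATH:

```shell
# Same exports as in ~/.bashrc above.
export HIVE_HOME=/home/hduser/Softwares/apache-hive-2.1.1-bin
export PATH=$PATH:$HIVE_HOME/bin
# Confirm the Hive bin directory is now searchable by the shell:
case ":$PATH:" in
  *":$HIVE_HOME/bin:"*) echo "HIVE_HOME/bin is on PATH" ;;
  *) echo "PATH update failed" ;;
esac
```

Once the hive command resolves, `which hive` should print the launcher path under the Hive bin directory.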

Step 10 : Create the Hive folders in HDFS

[hduser@storage Desktop]$ hadoop fs -ls /homd/hduser/
17/03/11 17:30:14 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform… using builtin-java classes where applicable
ls: `/homd/hduser/’: No such file or directory
[hduser@storage Desktop]$ hadoop fs -ls /home/hduser/
17/03/11 17:30:23 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform… using builtin-java classes where applicable
Found 1 items
drwxr-xr-x   – hduser supergroup          0 2017-03-06 19:44 /home/hduser/sqoop
[hduser@storage Desktop]$ hadoop fs -mkdir /home/hduser/hive/
17/03/11 17:30:45 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform… using builtin-java classes where applicable
[hduser@storage Desktop]$ hadoop fs -ls /home/hduser/
17/03/11 17:30:54 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform… using builtin-java classes where applicable
Found 2 items
drwxr-xr-x   – hduser supergroup          0 2017-03-11 17:30 /home/hduser/hive
drwxr-xr-x   – hduser supergroup          0 2017-03-06 19:44 /home/hduser/sqoop
[hduser@storage Desktop]$ hadoop fs -mkdir -p  /home/hduser/hive/datawarehouse/
17/03/11 17:31:54 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform… using builtin-java classes where applicable
[hduser@storage Desktop]$ hadoop fs -ls /home/hduser/hive/datawarehouse/
17/03/11 17:32:04 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform… using builtin-java classes where applicable
[hduser@storage Desktop]$ hadoop fs -ls /home/hduser/hive/
17/03/11 17:32:23 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform… using builtin-java classes where applicable
Found 1 items
drwxr-xr-x   – hduser supergroup          0 2017-03-11 17:31 /home/hduser/hive/datawarehouse
[hduser@storage Desktop]$

Step 11 : Grant group write permission on the datawarehouse folder

[hduser@storage Desktop]$ hadoop fs -chmod g+w /home/hduser/hive/datawarehouse
17/03/11 17:34:46 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform… using builtin-java classes where applicable
[hduser@storage Desktop]$

Step 12 : Create the tmp folder in HDFS.

[hduser@storage ~]$ hadoop fs -ls /home/hduser/
17/03/11 17:38:20 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform… using builtin-java classes where applicable
Found 2 items
drwxr-xr-x   – hduser supergroup          0 2017-03-11 17:31 /home/hduser/hive
drwxr-xr-x   – hduser supergroup          0 2017-03-06 19:44 /home/hduser/sqoop
[hduser@storage ~]$ hadoop fs -mkdir -p  /tmp
17/03/11 17:38:36 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform… using builtin-java classes where applicable
[hduser@storage ~]$ hadoop fs -ls  /
17/03/11 17:38:51 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform… using builtin-java classes where applicable
Found 3 items
drwxr-xr-x   – hduser supergroup          0 2017-03-06 17:52 /home
drwxr-xr-x   – hduser supergroup          0 2017-03-11 17:38 /tmp
drwxr-xr-x   – hduser supergroup          0 2017-03-07 14:16 /user
[hduser@storage ~]$ hadoop fs -chmod g+w /tmp
17/03/11 17:39:09 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform… using builtin-java classes where applicable
[hduser@storage ~]$
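The HDFS preparation from Steps 10–12 can be collected into one small guarded script. The paths below are the ones used above; adjust them for your own cluster. The guard simply skips the hadoop calls when the command is not on the PATH:

```shell
# HDFS directories Hive expects: the warehouse directory and a
# group-writable /tmp for scratch space.
WAREHOUSE_DIR=/home/hduser/hive/datawarehouse
if command -v hadoop >/dev/null 2>&1; then
    hadoop fs -mkdir -p "$WAREHOUSE_DIR"
    hadoop fs -chmod g+w "$WAREHOUSE_DIR"
    hadoop fs -mkdir -p /tmp
    hadoop fs -chmod g+w /tmp
else
    echo "hadoop not on PATH; run these commands on the cluster node"
fi
```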

Step 13 : Go to /home/hduser/Softwares/apache-hive-2.1.1-bin/bin, edit hive-config.sh, and add the following line to it: export HADOOP_HOME=/home/hduser/hadoop-2.6.5

[hduser@storage bin]$ pwd
/home/hduser/Softwares/apache-hive-2.1.1-bin/bin
[hduser@storage bin]$

[hduser@storage bin]$ ls -ltr
total 64
-rwxr-xr-x. 1 hduser hadoop  884 Nov 29 05:32 schematool
-rwxr-xr-x. 1 hduser hadoop  832 Nov 29 05:32 metatool
-rwxr-xr-x. 1 hduser hadoop 2278 Nov 29 05:32 hplsql.cmd
-rwxr-xr-x. 1 hduser hadoop 1030 Nov 29 05:32 hplsql
-rwxr-xr-x. 1 hduser hadoop  885 Nov 29 05:32 hiveserver2
-rwxr-xr-x. 1 hduser hadoop 1900 Nov 29 05:32 hive-config.sh
-rwxr-xr-x. 1 hduser hadoop 1584 Nov 29 05:32 hive-config.cmd
-rwxr-xr-x. 1 hduser hadoop 8823 Nov 29 05:32 hive.cmd
-rwxr-xr-x. 1 hduser hadoop 2553 Nov 29 05:32 beeline.cmd
-rwxr-xr-x. 1 hduser hadoop 1261 Nov 29 05:32 beeline
-rwxr-xr-x. 1 hduser hadoop 8692 Nov 29 05:35 hive
drwxr-xr-x. 3 hduser hadoop 4096 Mar 11 12:14 ext
[hduser@storage bin]$ vi hive-config.sh
export HADOOP_HOME=/home/hduser/hadoop-2.6.5
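The same edit can also be made non-interactively by appending the line to hive-config.sh. The sketch below uses a local stand-in file so it can be run anywhere; on a real install, point HIVE_CONFIG at the actual file under the Hive bin directory:

```shell
# On the real system this would be:
# HIVE_CONFIG=/home/hduser/Softwares/apache-hive-2.1.1-bin/bin/hive-config.sh
HIVE_CONFIG=hive-config.sh
touch "$HIVE_CONFIG"   # stand-in; the real file already exists in the Hive bin dir
echo 'export HADOOP_HOME=/home/hduser/hadoop-2.6.5' >> "$HIVE_CONFIG"
tail -n 1 "$HIVE_CONFIG"   # shows the line that was just appended
```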

Step 14 : Now check whether hive starts successfully. Before starting hive, make sure Hadoop is running by checking the daemon STATUS with the jps command ([hduser@storage bin]$ jps). If the daemons are up, simply type hive, as demonstrated below. While starting hive you might get the following error. The reason is that the Hive distribution ships log4j-slf4j-impl-2.4.1.jar in its lib folder, and a similar SLF4J binding jar is also present in the Hadoop 2.x lib folder. When you launch hive, the classpath contains both, and SLF4J cannot decide which binding to pick, so you get the warning "SLF4J: Class path contains multiple SLF4J bindings." To avoid this, go to the Hive lib folder and delete log4j-slf4j-impl-2.4.1.jar. (The separate, fatal "MetaException: Version information not found in metastore" error at the bottom of the trace is resolved later, once the metastore is set up in Steps 16–19.) For the demonstration of deleting log4j-slf4j-impl-2.4.1.jar, please follow Step 15.

[hduser@storage bin]$ jps
11041 NameNode
11475 ResourceManager
11331 SecondaryNameNode
11141 DataNode
11577 NodeManager
12187 Jps

If you did not get the above status output from the jps command, start the daemons by running start-all.sh (deprecated in Hadoop 2.x in favour of start-dfs.sh and start-yarn.sh, but still functional).

[hduser@storage bin]$ start-all.sh

[hduser@storage bin]$ hive
which: no hbase in (/usr/lib64/qt-3.3/bin:/usr/local/bin:/usr/bin:/bin:/usr/local/sbin:/usr/sbin:/sbin:/usr/local/jdk1.8.0_111/bin:/home/hduser/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/pig-0.15.0/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/scala/:/bin/:/home/hduser/Softwares/sqoop:/bin/:/home/hduser/Softwares/hive:/bin/:/home/hduser/bin:/home/hduser/scala//bin/:/home/hduser/Softwares/sqoop//bin/:/usr/local/jdk1.8.0_111/bin:/home/hduser/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/pig-0.15.0/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/scala/:/bin/:/home/hduser/Softwares/sqoop:/bin/:/home/hduser/Softwares/hive:/bin/:/usr/local/jdk1.8.0_111/bin:/home/hduser/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/pig-0.15.0/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/scala/:/bin/:/home/hduser/Softwares/sqoop/bin/:/home/hduser/Softwares/apache-hive-2.1.1-bin/bin/:/usr/local/jdk1.8.0_111/bin:/home/hduser/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/pig-0.15.0/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/scala/:/bin/:/home/hduser/Softwares/sqoop/bin/:/home/hduser/Softwares/apache-hive-2.1.1-bin/bin/)
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/hduser/Softwares/apache-hive-2.1.1-bin/lib/log4j-slf4j-impl-2.4.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/hduser/hadoop-2.6.5/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]

Logging initialized using configuration in jar:file:/home/hduser/Softwares/apache-hive-2.1.1-bin/lib/hive-common-2.1.1.jar!/hive-log4j2.properties Async: true
Exception in thread “main” java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:591)
at org.apache.hadoop.hive.ql.session.SessionState.beginStart(SessionState.java:531)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:705)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:641)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:226)
at org.apache.hadoop.hive.ql.metadata.Hive.<init>(Hive.java:366)
at org.apache.hadoop.hive.ql.metadata.Hive.create(Hive.java:310)
at org.apache.hadoop.hive.ql.metadata.Hive.getInternal(Hive.java:290)
at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:266)
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:558)
… 9 more
Caused by: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1654)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:80)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:130)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:101)
at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3367)
at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3406)
at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3386)
at org.apache.hadoop.hive.ql.metadata.Hive.getAllFunctions(Hive.java:3640)
at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:236)
at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:221)
… 14 more
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1652)
… 23 more
Caused by: MetaException(message:Version information not found in metastore. )
at org.apache.hadoop.hive.metastore.ObjectStore.checkSchema(ObjectStore.java:7753)
at org.apache.hadoop.hive.metastore.ObjectStore.verifySchema(ObjectStore.java:7731)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.hive.metastore.RawStoreProxy.invoke(RawStoreProxy.java:101)
at com.sun.proxy.$Proxy21.verifySchema(Unknown Source)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:565)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:626)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:416)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:78)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:84)
at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:6490)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:238)
at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:70)
… 28 more
[hduser@storage bin]$

Step 15 : This step demonstrates deleting log4j-slf4j-impl-2.4.1.jar

[hduser@storage lib]$ cd /home/hduser/Softwares/apache-hive-2.1.1-bin/lib

[hduser@storage lib]$ ls -ltr /home/hduser/Softwares/apache-hive-2.1.1-bin/lib/log4j-slf4j-impl-2.4.1.jar
-rw-r--r--. 1 hduser hadoop 22935 Nov 29 02:42 /home/hduser/Softwares/apache-hive-2.1.1-bin/lib/log4j-slf4j-impl-2.4.1.jar
[hduser@storage lib]$ rm /home/hduser/Softwares/apache-hive-2.1.1-bin/lib/log4j-slf4j-impl-2.4.1.jar
[hduser@storage lib]$
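If you would rather not delete the jar outright, moving it aside removes the duplicate SLF4J binding just as well and is easy to undo. A guarded sketch, using the same path as above:

```shell
JAR=/home/hduser/Softwares/apache-hive-2.1.1-bin/lib/log4j-slf4j-impl-2.4.1.jar
if [ -f "$JAR" ]; then
    # Hive will now fall back to Hadoop's SLF4J binding.
    mv "$JAR" "$JAR.bak"
    echo "moved $JAR aside"
else
    echo "jar not found (already removed?)"
fi
```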

Step 16 : Set up the metastore for Hive.

To set up the metastore, make sure a database is installed on your OS. In my case I will install the MySQL (RDBMS) database. The following post demonstrates the steps for installing MySQL under Linux.

How to Setup MySQL in RedHat for Hadoop?

Step 17 : Once MySQL is set up, log in to mysql as the root user

[hduser@storage bin]$ mysql -u root -p
Enter password:
Welcome to the MySQL monitor.  Commands end with ; or \g.
Your MySQL connection id is 2
Server version: 5.1.73 Source distribution

Copyright (c) 2000, 2013, Oracle and/or its affiliates. All rights reserved.

Oracle is a registered trademark of Oracle Corporation and/or its
affiliates. Other names may be trademarks of their respective
owners.

Type 'help;' or '\h' for help. Type '\c' to clear the current input statement.

mysql> create database hive;
Query OK, 1 row affected (0.05 sec)

mysql> use hive;
Reading table information for completion of table and column names
You can turn off this feature to get a quicker startup with -A

Database changed
mysql> CREATE USER 'hive'@'storage.castrading.com' IDENTIFIED BY 'hive';
Query OK, 0 rows affected (0.00 sec)

mysql> grant all privileges on hive to 'hive'@'storage.castrading.com' identified by 'hive' with grant option;
Query OK, 0 rows affected (0.08 sec)

mysql> grant all privileges on hive.* to 'hive'@'storage.castrading.com' identified by 'hive' with grant option;
Query OK, 0 rows affected (0.09 sec)

mysql> grant all on *.* to 'hive'@'%' identified by 'hive';
Query OK, 0 rows affected (0.00 sec)

mysql> flush privileges;
Query OK, 0 rows affected (0.00 sec)

mysql>

 

Step 18 : Try logging in to mysql as the hive user, as demonstrated below.

Note : If you cannot log in as the hive user even with the correct credentials, run [hduser@storage ~]$ /usr/bin/mysql_secure_installation (following only step 6 from the MySQL setup post linked above) and then try logging in to mysql with the hive user again, as shown below.

[hduser@storage bin]$ mysql -u hive -p
Enter password:
Welcome to the MySQL monitor.  Commands end with ; or \g.
Your MySQL connection id is 3
Server version: 5.1.73 Source distribution

Copyright (c) 2000, 2013, Oracle and/or its affiliates. All rights reserved.

Oracle is a registered trademark of Oracle Corporation and/or its
affiliates. Other names may be trademarks of their respective
owners.

Type 'help;' or '\h' for help. Type '\c' to clear the current input statement.

mysql>

mysql> use hive;
Database changed
mysql>

mysql> show tables;
Empty set (0.00 sec)

mysql> select user,host from mysql.user;
+------------+------------------------+
| user       | host                   |
+------------+------------------------+
|            | %                      |
| %          | %                      |
| retail_dba | %                      |
| root       | %                      |
| root       | 127.0.0.1              |
| retail_dba | 192.168.0.227          |
| root       | 192.168.0.227          |
| root       | localhost              |
| root       | sotrage.castrading.com |
| hive       | storage.castrading.com |
| root       | storage.castrading.com |
+------------+------------------------+
11 rows in set (0.00 sec)

mysql>

Next, load the Hive metastore schema shipped with the distribution. Note that hive-schema-2.1.0.mysql.sql pulls in hive-txn-schema-2.1.0.mysql.sql via a relative SOURCE statement, so start mysql from inside the scripts/metastore/upgrade/mysql/ directory; otherwise the transaction tables fail to load with a "Failed to open file" error. (Any "Duplicate key" / "Duplicate entry" errors simply mean the script has already been run against this database before.)

mysql> SOURCE /home/hduser/Softwares/apache-hive-2.1.1-bin/scripts/metastore/upgrade/mysql/hive-schema-2.1.0.mysql.sql;
Query OK, 0 rows affected (0.06 sec)

Query OK, 0 rows affected (0.00 sec)

[... many identical "Query OK, 0 rows affected" lines omitted ...]

ERROR 1061 (42000): Duplicate key name 'PCS_STATS_IDX'

[... more "Query OK, 0 rows affected" lines omitted ...]

ERROR 1061 (42000): Duplicate key name 'CONSTRAINTS_PARENT_TABLE_ID_INDEX'
ERROR:
Failed to open file 'hive-txn-schema-2.1.0.mysql.sql', error: 2
ERROR 1062 (23000): Duplicate entry '1' for key 'PRIMARY'

[... remaining "Query OK, 0 rows affected" lines omitted ...]

mysql> show tables;
+---------------------------+
| Tables_in_hive            |
+---------------------------+
| BUCKETING_COLS            |
| CDS                       |
| COLUMNS_V2                |
| DATABASE_PARAMS           |
| DBS                       |
| DB_PRIVS                  |
| DELEGATION_TOKENS         |
| FUNCS                     |
| FUNC_RU                   |
| GLOBAL_PRIVS              |
| IDXS                      |
| INDEX_PARAMS              |
| KEY_CONSTRAINTS           |
| MASTER_KEYS               |
| NOTIFICATION_LOG          |
| NOTIFICATION_SEQUENCE     |
| NUCLEUS_TABLES            |
| PARTITIONS                |
| PARTITION_EVENTS          |
| PARTITION_KEYS            |
| PARTITION_KEY_VALS        |
| PARTITION_PARAMS          |
| PART_COL_PRIVS            |
| PART_COL_STATS            |
| PART_PRIVS                |
| ROLES                     |
| ROLE_MAP                  |
| SDS                       |
| SD_PARAMS                 |
| SEQUENCE_TABLE            |
| SERDES                    |
| SERDE_PARAMS              |
| SKEWED_COL_NAMES          |
| SKEWED_COL_VALUE_LOC_MAP  |
| SKEWED_STRING_LIST        |
| SKEWED_STRING_LIST_VALUES |
| SKEWED_VALUES             |
| SORT_COLS                 |
| TABLE_PARAMS              |
| TAB_COL_STATS             |
| TBLS                      |
| TBL_COL_PRIVS             |
| TBL_PRIVS                 |
| TYPES                     |
| TYPE_FIELDS               |
| VERSION                   |
+---------------------------+
46 rows in set (0.00 sec)

mysql>

Step 19 : Install the MySQL JDBC connector (libmysql-java) and initialize the metastore schema with schematool.

[root@storage ~]# yum install libmysql-java
Loaded plugins: refresh-packagekit, security, ulninfo
Setting up Install Process
bintray–sbt-rpm                                                                                                                      | 1.3 kB     00:00
public_ol6_UEK_base                                                                                                                   | 1.2 kB     00:00
public_ol6_ga_base                                                                                                                    | 1.4 kB     00:00
public_ol6_latest                                                                                                                     | 1.4 kB     00:00

[hduser@storage ~]$ schematool -dbType mysql -initSchema
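One thing worth spelling out: schematool and the hive CLI read the MySQL connection details from $HIVE_HOME/conf/hive-site.xml. A minimal sketch of that file, assuming the hive database, hive user, and hive password created in Step 17 and a metastore database on localhost (the heredoc writes to the current directory as a stand-in; place the real file in $HIVE_HOME/conf/):

```shell
# Write a minimal hive-site.xml pointing the metastore at MySQL.
cat > hive-site.xml <<'EOF'
<configuration>
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://localhost:3306/hive?createDatabaseIfNotExist=true</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.jdbc.Driver</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>hive</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>hive</value>
  </property>
  <property>
    <name>hive.metastore.warehouse.dir</name>
    <value>/home/hduser/hive/datawarehouse</value>
  </property>
</configuration>
EOF
grep -c '<name>' hive-site.xml   # 5 properties written
```

Also make sure the connector jar installed by libmysql-java (typically /usr/share/java/mysql-connector-java.jar) is copied or symlinked into $HIVE_HOME/lib; otherwise schematool cannot load com.mysql.jdbc.Driver.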

Sources : https://cwiki.apache.org/confluence/display/Hive/Hive+Schema+Tool
