Install Hive 3.1.2 on CentOS 7

Date: 2020-10-19

Hive 3.1.2 installation

1. Download

wget https://mirrors.tuna.tsinghua.edu.cn/apache/hive/hive-3.1.2/apache-hive-3.1.2-bin.tar.gz

2. Extract the archive

tar -zxvf apache-hive-3.1.2-bin.tar.gz

3. Rename

mv apache-hive-3.1.2-bin hive
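
Later steps reference Hive under /usr/share/hive; if the archive was extracted somewhere else, move the renamed directory there first (a minimal sketch, assuming the hive directory from the previous step is in the current working directory):

#Move Hive to the location used throughout the rest of this guide
mv hive /usr/share/hive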

Configure Hive

1. Edit hive-site.xml

#Create the hive-site.xml file
vim /usr/share/hive/conf/hive-site.xml
#Add the following content (MySQL is used as the metastore database)
<configuration>
  
    <property>
      <name>hive.exec.scratchdir</name>
      <value>/home/hadoop/scratchdir</value>
    </property>

    <property>
      <name>hive.metastore.warehouse.dir</name>
      <value>/home/hadoop/warehouse</value>
    </property>

    <property>
      <name>hive.metastore.uris</name>
      <value>thrift://hadoop:9083</value>
    </property>

    <property>
      <name>javax.jdo.option.ConnectionDriverName</name>
      <value>com.mysql.cj.jdbc.Driver</value>
    </property>

    <property>
      <name>javax.jdo.option.ConnectionURL</name>
      <value>jdbc:mysql://hadoop:3306/hive?createDatabaseIfNotExist=true&amp;useSSL=false</value>
    </property>

    <property>
      <name>javax.jdo.option.ConnectionUserName</name>
      <value>hive</value>
    </property>

    <property>
      <name>javax.jdo.option.ConnectionPassword</name>
      <value>hive</value>
    </property>
</configuration>
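
The hive.exec.scratchdir and hive.metastore.warehouse.dir values above are HDFS paths, so it helps to create them before the first run. A minimal sketch, assuming HDFS is already running and using the same paths as above:

#Create the scratch and warehouse directories in HDFS and make them group-writable
hdfs dfs -mkdir -p /home/hadoop/scratchdir
hdfs dfs -mkdir -p /home/hadoop/warehouse
hdfs dfs -chmod g+w /home/hadoop/scratchdir
hdfs dfs -chmod g+w /home/hadoop/warehouse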

MySQL configuration

Why configure MySQL? Hive's built-in embedded Derby metastore only allows a single session at a time, so MySQL is used to hold the metastore data instead.
1. Download the MySQL JDBC driver (Connector/J), extract it, copy the jar into hive/lib, and adjust its permissions

wget https://downloads.mysql.com/archives/get/p/3/file/mysql-connector-java-8.0.11.tar.gz
tar -zxvf mysql-connector-java-8.0.11.tar.gz
cd mysql-connector-java-8.0.11
chmod 777 mysql-connector-java-8.0.11.jar
cp mysql-connector-java-8.0.11.jar /usr/share/hive/lib/
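
To confirm the driver ended up where Hive will look for it (path as used above):

#Should list mysql-connector-java-8.0.11.jar
ls /usr/share/hive/lib/ | grep mysql-connector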

2. Create the corresponding user and database in MySQL

CREATE USER 'hive'@'%' IDENTIFIED BY 'hive';
GRANT ALL PRIVILEGES ON *.* TO 'hive'@'%';
DELETE FROM mysql.user WHERE user='';
FLUSH PRIVILEGES;
CREATE DATABASE hive charset=utf8;
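
Before initializing the schema, it is worth checking that the hive user can actually reach MySQL from the Hive host (the host name hadoop matches the JDBC URL in hive-site.xml):

#Should connect and list the newly created hive database
mysql -h hadoop -u hive -phive -e "SHOW DATABASES;"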

Add environment variables

vim /etc/bashrc
#Add the following at the end
export HIVE_HOME=/usr/share/hive
export PATH=$PATH:/usr/share/miniconda3/bin:$HADOOP_HOME/bin:$HIVE_HOME/bin:$HBASE_HOME/bin
#Save and exit, then run:
source /etc/bashrc
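
A quick check that the variables took effect in the current shell:

echo $HIVE_HOME    #expect /usr/share/hive
which hive         #expect /usr/share/hive/bin/hive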

Start Hive

1. Initialize the metastore schema

schematool -dbType mysql -initSchema
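
If the initialization succeeds, the hive database in MySQL should now contain the metastore tables (names such as DBS, TBLS and VERSION):

mysql -h hadoop -u hive -phive -e "USE hive; SHOW TABLES;"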

2. Start the metastore service

hive --service metastore &
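
The trailing & only backgrounds the metastore in the current shell; a slightly more durable variant, plus a check that it is listening on the port configured in hive.metastore.uris (the log path is just an example):

nohup hive --service metastore > /tmp/metastore.log 2>&1 &
ss -tlnp | grep 9083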

3. Start the Hive CLI

hive
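
A short smoke test to confirm the CLI, the metastore and the warehouse directory are wired together (the table name is just an example):

#Create, query and drop a throwaway table from the shell
hive -e "CREATE TABLE smoke_test (id INT, name STRING); INSERT INTO smoke_test VALUES (1, 'hello'); SELECT * FROM smoke_test; DROP TABLE smoke_test;"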

Errors and fixes

1.

(base) [root@hadoop bin]# schematool -dbType mysql -initSchema
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/share/hive/lib/log4j-slf4j-impl-2.10.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/share/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Exception in thread "main" java.lang.NoSuchMethodError: com.google.common.base.Preconditions.checkArgument(ZLjava/lang/String;Ljava/lang/Object;)V
    at org.apache.hadoop.conf.Configuration.set(Configuration.java:1357)
    at org.apache.hadoop.conf.Configuration.set(Configuration.java:1338)
    at org.apache.hadoop.mapred.JobConf.setJar(JobConf.java:536)
    at org.apache.hadoop.mapred.JobConf.setJarByClass(JobConf.java:554)
    at org.apache.hadoop.mapred.JobConf.<init>(JobConf.java:448)
    at org.apache.hadoop.hive.conf.HiveConf.initialize(HiveConf.java:5141)
    at org.apache.hadoop.hive.conf.HiveConf.<init>(HiveConf.java:5104)
    at org.apache.hive.beeline.HiveSchemaTool.<init>(HiveSchemaTool.java:96)
    at org.apache.hive.beeline.HiveSchemaTool.main(HiveSchemaTool.java:1473)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.util.RunJar.run(RunJar.java:323)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:236)

1.1 The SLF4J warning is caused by conflicting SLF4J binding jars shipped with both Hadoop and Hive. Remove one of them:

rm -rf /usr/share/hive/lib/slf4j-log4j12-1.7.25.jar

1.2 The NoSuchMethodError is caused by a guava.jar version mismatch between Hive and Hadoop. Check which copy is older and replace it with the newer one (in my setup, Hadoop's guava was newer than Hive's).
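
To confirm which side actually ships the newer guava, compare the two lib directories first (the versions in the comments are what Hive 3.1.2 and Hadoop 3.x typically bundle; check your own install):

ls /usr/share/hive/lib/ | grep guava                          #typically guava-19.0.jar
ls /usr/share/hadoop/share/hadoop/common/lib/ | grep guava    #typically guava-27.0-jre.jar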

cd /usr/share/hadoop/share/hadoop/common/lib/
cp guava-27.0-jre.jar /usr/share/hive/lib/
#Remove the lower-version guava that ships with Hive (guava-19.0.jar for Hive 3.1.2)
rm /usr/share/hive/lib/guava-19.0.jar

2.

/usr/share/hadoop/libexec/hadoop-functions.sh: line 2366: HADOOP_ORG.APACHE.HADOOP.HBASE.UTIL.GETJAVAPROPERTY_USER: bad substitution
/usr/share/hadoop/libexec/hadoop-functions.sh: line 2461: HADOOP_ORG.APACHE.HADOOP.HBASE.UTIL.GETJAVAPROPERTY_OPTS: bad substitution

This happens when the HBase version is too new for Hadoop's shell scripts. Either switch to a lower HBase version (I switched to HBase 1.6 and the error disappeared) or simply ignore the warning.