Hive Configuration
Reposted from: https://www.edureka.co/blog/apache-hive-installation-on-ubuntu

(with a few personal adjustments)

Please follow the below steps to install Apache Hive on Ubuntu:

Step 1: Download the Hive tarball.

Command: wget http://archive.apache.org/dist/hive/hive-2.1.0/apache-hive-2.1.0-bin.tar.gz

Step 2:  Extract the tar file.

Command: tar -xzf apache-hive-2.1.0-bin.tar.gz

Command: ls

Step 3: Edit the “.bashrc” file to update the environment variables for your user.

Command:  sudo gedit .bashrc

Add the following at the end of the file:

# Set HIVE_HOME

export HIVE_HOME=/home/edureka/apache-hive-2.1.0-bin
export PATH=$PATH:/home/edureka/apache-hive-2.1.0-bin/bin

Also, make sure that the Hadoop path is set (see the sketch below).
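For example, assuming Hadoop is installed under /usr/local/hadoop (adjust the path to your own installation), the corresponding lines would look like:

# Set HADOOP_HOME (assumed path; use your own Hadoop directory)
export HADOOP_HOME=/usr/local/hadoop
export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin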

Run the command below to apply the changes in the same terminal.

Command: source .bashrc

Step 4: Check hive version.
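A quick way to verify, assuming the PATH change from Step 3 has taken effect:

Command: hive --version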

Step 5: Create Hive directories within HDFS. The ‘warehouse’ directory is the location where Hive stores its table data.

Command:

  • hdfs dfs -mkdir -p /user/hive/warehouse
  • hdfs dfs -mkdir /tmp

Step 6: Set write permissions on these directories.

Command:

In these commands, we are giving write permission to the group:

  • hdfs dfs -chmod g+w /user/hive/warehouse
  • hdfs dfs -chmod g+w /tmp
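To verify the directories and their permissions (an optional check):

Command: hdfs dfs -ls /user/hive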

Step 7: Set the Hadoop path in hive-env.sh.

Command: cd apache-hive-2.1.0-bin/

Command: gedit conf/hive-env.sh

Set HADOOP_HOME in this file (the original post shows this in a screenshot, which is not reproduced here).
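A minimal sketch of the relevant line, assuming Hadoop is installed under /usr/local/hadoop (substitute your own path):

# Set HADOOP_HOME to point at your Hadoop installation (assumed path)
export HADOOP_HOME=/usr/local/hadoop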

Step 8: Edit hive-site.xml

Command: cp conf/hive-default.xml.template conf/hive-site.xml

Command: gedit conf/hive-site.xml

<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?><!--
Licensed to the Apache Software Foundation (ASF) under one or more
contributor license agreements. See the NOTICE file distributed with
this work for additional information regarding copyright ownership.
The ASF licenses this file to You under the Apache License, Version 2.0
(the "License"); you may not use this file except in compliance with
the License. You may obtain a copy of the License at

http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
-->
<configuration>
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:derby:;databaseName=/home/edureka/apache-hive-2.1.0-bin/metastore_db;create=true</value>
    <description>
      JDBC connect string for a JDBC metastore.
      To use SSL to encrypt/authenticate the connection, provide database-specific SSL flag in the connection URL.
      For example, jdbc:postgresql://myhost/db?ssl=true for postgres database.
    </description>
  </property>
  <property>
    <name>hive.metastore.warehouse.dir</name>
    <value>/user/hive/warehouse</value>
    <description>location of default database for the warehouse</description>
  </property>
  <property>
    <name>hive.metastore.uris</name>
    <value/>
    <description>Thrift URI for the remote metastore. Used by metastore client to connect to remote metastore.</description>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>org.apache.derby.jdbc.EmbeddedDriver</value>
    <description>Driver class name for a JDBC metastore</description>
  </property>
  <property>
    <name>javax.jdo.PersistenceManagerFactoryClass</name>
    <value>org.datanucleus.api.jdo.JDOPersistenceManagerFactory</value>
    <description>class implementing the jdo persistence</description>
  </property>
</configuration>

The following properties also need to be changed (I found these elsewhere). Their defaults reference variables such as ${system:java.io.tmpdir}, which Hive 2.x may fail to resolve at startup, so point them at concrete /tmp paths:

<property>
  <name>hive.exec.scratchdir</name>
  <value>/tmp/hive-${user.name}</value>
</property>
<property>
  <name>hive.exec.local.scratchdir</name>
  <value>/tmp/${user.name}</value>
</property>
<property>
  <name>hive.downloaded.resources.dir</name>
  <value>/tmp/${user.name}_resources</value>
</property>
<property>
  <name>hive.scratch.dir.permission</name>
  <value>733</value>
</property>

Step 9: By default, Hive uses the Derby database for its metastore. Initialize the Derby database.

Command: sudo chown -R hduser:hduser /usr/local/hive   (in my case, Hive is installed under /usr/local/hive)

Command: bin/schematool -initSchema -dbType derby

 

Step 10: Launch Hive.

Command: hive

Step 11: Run a few queries in the Hive shell.

Command: show databases;

Command: create table employee (id string, name string, dept string) row format delimited fields terminated by '\t' stored as textfile;

Command: show tables;
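As a follow-up, you could load some data into the new table and query it. The file path below is hypothetical; any tab-separated file with three columns would do:

Command: load data local inpath '/home/edureka/employee.txt' into table employee;

Command: select * from employee;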

Step 12: To exit from Hive:

Command: exit;

 

To use MySQL instead of Derby as the metastore database, see: http://blog.csdn.net/x_i_y_u_e/article/details/46845609
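For reference, a minimal sketch of the hive-site.xml properties involved when switching to a MySQL metastore (host, database name, user, and password are placeholders, and the MySQL JDBC driver jar must be on Hive's classpath):

<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:mysql://localhost:3306/hive?createDatabaseIfNotExist=true</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>com.mysql.jdbc.Driver</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionUserName</name>
  <value>hiveuser</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionPassword</name>
  <value>hivepassword</value>
</property>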




Common Errors

I tried to install Hive on a Raspberry Pi 2. I installed Hive by uncompressing the zipped Hive package and configuring $HADOOP_HOME and $HIVE_HOME manually under the hduser user/group I created. When running hive, I got the following error message:

ERROR StatusLogger No log4j2 configuration file found. Using default configuration: logging only errors to the console.

Exception in thread "main" java.lang.RuntimeException: Hive metastore database is not initialized. Please use schematool (e.g. ./schematool -initSchema -dbType ...) to create the schema. If needed, don't forget to include the option to auto-create the underlying database in your JDBC connection string (e.g. ?createDatabaseIfNotExist=true for mysql)

So I ran the command suggested in the error message above (schematool -dbType derby -initSchema) and got this error:

Error: FUNCTION 'NUCLEUS_ASCII' already exists. (state=X0Y68,code=30000) org.apache.hadoop.hive.metastore.HiveMetaException: Schema initialization FAILED! Metastore state would be inconsistent !! * schemaTool failed *


Solution

The answer is:

  1. Before you run hive for the first time, run

    schematool -initSchema -dbType derby

  2. If you already ran hive and then tried to initSchema and it's failing, move the old metastore out of the way (metastore_db is created in the directory from which hive was launched, unless hive-site.xml points it elsewhere, as the databaseName setting above does):

    mv metastore_db metastore_db.tmp

  3. Re-run

    schematool -initSchema -dbType derby

  4. Run hive again

