Now let's start to install Apache Hive 3.1.2 on WSL.

In WSL bash terminal, run the following command to download the package: wget

Unzip binary package

If you have configured Hadoop 3.3.0 successfully by following the Kontext guide (in the prerequisites section), there should already be one folder named hadoop in your home folder: $ ls -l

Now unzip the Hive package using the following command: tar -xvzf apache-hive-3.1. -C ~/hadoop

In the hadoop folder there are now at least two subfolders (one for Hadoop and another for Hive).

Setup environment variables

In the prerequisites section, we've already configured some environment variables like the following:

export JAVA_HOME=/usr/lib/jvm/java-1.8.0-openjdk-amd64
export HADOOP_HOME=/home/tangr/hadoop/hadoop-3.3.0

Let's run the following command to add the environment variables Hive requires into the .bashrc file too: vi ~/.bashrc

Add the following line to the end of the file: export HIVE_HOME=/home/tangr/hadoop/apache-hive-3.1.2-bin

* Change the user name (tangr) to your own.

Run the following command to source the variables: source ~/.bashrc

Verify the environment variables: echo $HIVE_HOME

/home/tangr/hadoop/apache-hive-3.1.2-bin

Setup Hive HDFS folders

Start your Hadoop services (if you have not done so) by running the following command: $HADOOP_HOME/sbin/start-all.sh

In WSL, you may need to restart your ssh service if ssh doesn't work: localhost: ssh: connect to host localhost port 22: Connection refused

To restart the service, run the following command: sudo service ssh restart

Run the following command (jps) to make sure all the services are running successfully: $ jps

As you can see, all the services are running successfully in my WSL.

Now let's set up the HDFS folders for Hive. Run the following commands:

hadoop fs -mkdir /tmp
hadoop fs -chmod g+w /user/hive/warehouse

Or alternatively, directly run the following command: $HIVE_HOME/bin/init-hive-dfs.sh

Guava lib issue

Hive 3.1.2 bundles an older Guava library than Hadoop 3.3.0, so launching Hive can fail with java.lang.NoSuchMethodError; replacing Hive's Guava jar with Hadoop's copy resolves it.
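The Guava lib issue mentioned above has a well-known remedy: Hive 3.1.2 ships guava-19.0.jar while Hadoop 3.3.0 ships guava-27.0-jre.jar, and the older jar triggers java.lang.NoSuchMethodError at Hive startup. A minimal sketch of the swap, assuming the install paths used in this guide (the default-path fallbacks and the directory guard are my additions):

```shell
#!/bin/sh
# Sketch: replace Hive's bundled Guava with Hadoop's newer copy.
# Paths assume the layout from this guide; adjust to your own install.
HIVE_HOME="${HIVE_HOME:-$HOME/hadoop/apache-hive-3.1.2-bin}"
HADOOP_HOME="${HADOOP_HOME:-$HOME/hadoop/hadoop-3.3.0}"

# Bail out quietly if the expected folders are not present on this machine.
[ -d "$HIVE_HOME/lib" ] && [ -d "$HADOOP_HOME/share/hadoop/common/lib" ] || exit 0

# Remove Hive's old guava jar and copy in Hadoop's newer one.
rm -f "$HIVE_HOME"/lib/guava-*.jar
cp "$HADOOP_HOME"/share/hadoop/common/lib/guava-*.jar "$HIVE_HOME/lib/"

# Show the jar now in place.
ls "$HIVE_HOME"/lib/guava-*.jar
```

After the swap, commands such as $HIVE_HOME/bin/schematool should start without the NoSuchMethodError.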
Prerequisites

This guide assumes you have installed WSL in a system or non-system drive on your Windows 10 and then installed Hadoop 3.3.0 on it, by following these guides:

* Install Windows Subsystem for Linux on a Non-System Drive
* Install Hadoop 3.3.0 on Linux (mandatory)

Warning: Apache Hive is impacted by Log4j vulnerabilities; refer to the page Apache Log4j Security Vulnerabilities to find out the fixes.
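Returning to the environment-variable step above: the .bashrc edit can be scripted instead of done by hand in vi. A minimal sketch, assuming the guide's example path (user tangr — substitute your own); the grep guard that makes the append idempotent is my addition:

```shell
#!/bin/sh
# Sketch: append HIVE_HOME to ~/.bashrc only if it is not already there.
BASHRC="$HOME/.bashrc"
LINE='export HIVE_HOME=/home/tangr/hadoop/apache-hive-3.1.2-bin'

# grep -qxF: quiet, whole-line, fixed-string match; append only on a miss,
# so running the script twice does not duplicate the line.
grep -qxF "$LINE" "$BASHRC" 2>/dev/null || echo "$LINE" >> "$BASHRC"

# Reload the file and verify the variable is set.
. "$BASHRC"
echo "$HIVE_HOME"
```

Run it once, open a new terminal (or source ~/.bashrc), and echo $HIVE_HOME should print the Hive install path.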