Automated bash script to export all Hive DDLs from an existing environment at one go!
Exporting all Hive DDLs from an existing environment
The script below scans through all the databases on your Hive system and writes the CREATE TABLE statement for every table to a single file.
This is helpful when you need to set up a new environment based on an existing one; it has been tested on databases with over 500 tables. The steps to be performed are as follows:
1. Create a script, say hive_ddls_export.sh, on the Hadoop box that has the Hive CLI installed, with the following content:
#!/bin/bash
databases=`hive -e "show databases;"`
all_db_names=${databases}
datetoday=`date +%Y-%m-%d_%H:%M:%S`
touch dev_hive_ext_tables_$datetoday.sql
chmod 744 dev_hive_ext_tables_$datetoday.sql
for listofdatabases in $all_db_names
do
tables=`hive -e "use $listofdatabases;show tables;"`
all_tab_names=`echo "${tables}"`
echo " /**** Start DDLs for Tables in ${listofdatabases} ****/ " >> dev_hive_ext_tables_$datetoday.sql
for listoftables in $all_tab_names
do
buildcreate="show create table "
buildcreatetable="$buildcreate${listofdatabases}.${listoftables};"
echo " ====== Running SHOW CREATE TABLE Statement for ${listofdatabases}.${listoftables} ======= : "
hive -e "use ${listofdatabases}; $buildcreatetable" >> dev_hive_ext_tables_$datetoday.sql
echo "; " >> dev_hive_ext_tables_$datetoday.sql
done
echo " /**** End DDLs for Tables in ${listofdatabases} ****/ " >> dev_hive_ext_tables_$datetoday.sql
done
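The nested-loop logic above can be exercised without a live cluster by stubbing out the hive command. The sketch below is illustrative only: the database and table names, the stub's canned output, and the out.sql filename are all made up; the loop structure mirrors the script above.

```shell
# Stub standing in for the Hive CLI: returns fixed, made-up names
# instead of querying a real Hive metastore.
hive() {
  case "$2" in
    "show databases;") printf 'sales\nhr\n' ;;
    *"show tables;"*)  printf 'orders\ncustomers\n' ;;
    *)                 printf 'CREATE TABLE dummy (id INT)\n' ;;
  esac
}

outfile=out.sql
: > "$outfile"   # truncate/create the output file

# Same shape as the real script: outer loop over databases,
# inner loop over that database's tables.
databases=`hive -e "show databases;"`
for db in $databases
do
  tables=`hive -e "use $db;show tables;"`
  for tab in $tables
  do
    hive -e "use $db; show create table $db.$tab;" >> "$outfile"
    echo ";" >> "$outfile"
  done
done
```

With two stub databases of two tables each, out.sql ends up with four CREATE statements, each terminated by a `;` line, which is exactly the shape the real export produces.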
2. Run the script by issuing the following command; it generates a concatenated file whose name starts with dev_hive_ext_tables:
./hive_ddls_export.sh
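Once the script has run, a couple of grep counts give a quick sanity check on the export. The snippet below sketches this; the sample file it creates is a stand-in mimicking the real dev_hive_ext_tables_*.sql output, so its contents are illustrative only.

```shell
# Stand-in for a real export file, in the same format the script emits
# (the database and table names here are made up).
cat > dev_hive_ext_tables_sample.sql <<'EOF'
 /**** Start DDLs for Tables in sales ****/
CREATE EXTERNAL TABLE `sales.orders`(
  `id` int)
;
 /**** End DDLs for Tables in sales ****/
EOF

ddl_file=dev_hive_ext_tables_sample.sql
# Number of databases covered by the export
# (one Start marker is written per database):
grep -c 'Start DDLs for Tables in' "$ddl_file"
# Number of CREATE statements exported:
grep -c 'CREATE' "$ddl_file"
```

Comparing these counts against `show databases` and the expected table totals is a cheap way to catch databases or tables the export skipped.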
Watch this space for many more technical articles related to the Big Data world. Happy learning!!
Thanks,
Vishal.