Hari Hive Assignment

The document describes creating three tables in Hive called emp, dept, and salgrade with different column definitions. It then uses Sqoop to import sample data from Oracle tables into HDFS files and loads those files into the Hive tables.

Uploaded by

Hari Sampathirao

Hive:

HQL--Creating Tables:

CREATE TABLE emp (empno INT, ename STRING, job STRING, mgr INT, hiredate STRING, sal INT, comm INT, deptno INT)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n';

Note: INT is used for the numeric columns here; BIGINT can be used instead when higher precision is needed.

CREATE TABLE dept (deptno INT, dname STRING, loc STRING)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n';

CREATE TABLE salgrade (grade INT, losal INT, hisal INT)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n';
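To make the row format concrete, here is an illustrative Python sketch (not Hive itself) of how `ROW FORMAT DELIMITED FIELDS TERMINATED BY ','` splits one text line into columns; the column list matches the EMP columns used in the Sqoop import below:

```python
# Illustrative sketch (an assumption, not Hive internals): split a
# comma-delimited text record into named fields, the way the emp table's
# ROW FORMAT DELIMITED clause interprets each line of the HDFS file.
EMP_COLUMNS = ["empno", "ename", "job", "mgr", "hiredate", "sal", "comm", "deptno"]

def parse_emp_line(line: str) -> dict:
    """Split a comma-delimited record and pair fields with column names."""
    fields = line.rstrip("\n").split(",")
    return dict(zip(EMP_COLUMNS, fields))

row = parse_emp_line("7369,SMITH,CLERK,7902,1980-12-17,800,,20\n")
print(row["ename"])   # SMITH
print(row["deptno"])  # 20
```

Note that the empty field between SAL and DEPTNO (a NULL COMM) still occupies a position, which is why the delimiter count per line must match the table's column count.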
I have used Sqoop to import sample data from Oracle into HDFS, then loaded it into the Hive tables.

$ sqoop import --connect jdbc:oracle:thin:@172.27.133.63:1521/xe --username hari_source --password hari_source \
  --table EMP --columns EMPNO,ENAME,JOB,MGR,HIREDATE,SAL,COMM,DEPTNO --as-textfile \
  --target-dir /user/practice/oracle_emp -m 1

$ sqoop import --connect jdbc:oracle:thin:@172.27.133.63:1521/xe --username hari_source --password hari_source \
  --table DEPT --columns DEPTNO,DNAME,LOC --as-textfile \
  --target-dir /user/practice/oracle_dept -m 1

$ sqoop import --connect jdbc:oracle:thin:@172.27.133.63:1521/xe --username hari_source --password hari_source \
  --table SALGRADE --columns GRADE,LOSAL,HISAL --as-textfile \
  --target-dir /user/practice/oracle_salgrade -m 1
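Since the three invocations differ only in table, column list, and target directory, a small hypothetical helper (not part of Sqoop; the function name and defaults are assumptions for illustration) can assemble the command string consistently:

```python
# Hypothetical helper (an assumption, not a Sqoop API): build the import
# command line used above so the three near-identical invocations stay
# consistent. Connection defaults mirror the commands in this document.
def sqoop_import_cmd(table: str, columns: list, target_dir: str,
                     host: str = "172.27.133.63", service: str = "xe",
                     user: str = "hari_source", password: str = "hari_source") -> str:
    return (
        "sqoop import"
        f" --connect jdbc:oracle:thin:@{host}:1521/{service}"
        f" --username {user} --password {password}"
        f" --table {table} --columns {','.join(columns)}"
        f" --as-textfile --target-dir {target_dir} -m 1"
    )

cmd = sqoop_import_cmd("DEPT", ["DEPTNO", "DNAME", "LOC"], "/user/practice/oracle_dept")
print(cmd)
```

`-m 1` forces a single mapper, so each import produces exactly one `part-m-00000` file, which is what the LOAD DATA statements below expect.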
Files in HDFS:

Loading Data into Hive Tables:

Emp table loading:

LOAD DATA INPATH '/user/practice/oracle_emp/part-m-00000' INTO TABLE emp;

Dept table loading:

LOAD DATA INPATH '/user/practice/oracle_dept/part-m-00000' INTO TABLE dept;


Salgrade table loading:

LOAD DATA INPATH '/user/practice/oracle_salgrade/part-m-00000' INTO TABLE salgrade;
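LOAD DATA moves the file into the table's location without validating it; rows with the wrong field count surface later as NULL columns. An illustrative pre-load sanity check (an assumption, not a Hive step; the function and expected counts are hypothetical) might look like:

```python
# Illustrative pre-load check (an assumption, not part of Hive): confirm
# every line of a part-m-00000 file has the field count its target table
# expects, since LOAD DATA performs no validation on the data itself.
# Expected counts follow the Sqoop --columns lists used in this document.
EXPECTED_FIELDS = {"emp": 8, "dept": 3, "salgrade": 3}

def check_delimited_file(lines, table: str) -> bool:
    """Return True only if every record has the expected number of fields."""
    want = EXPECTED_FIELDS[table]
    return all(len(line.rstrip("\n").split(",")) == want for line in lines)

sample_dept = ["10,ACCOUNTING,NEW YORK\n", "20,RESEARCH,DALLAS\n"]
print(check_delimited_file(sample_dept, "dept"))  # True
```

In practice the lines would come from `hdfs dfs -cat` output rather than an in-memory sample.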
