SQL Interview Questions 2021
1. How to create an empty table in hive from another table without copying data?
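Ans: use CREATE TABLE ... LIKE, which copies only the schema (including partitioning) of the source table, not its data. A minimal sketch (the target name payments_empty is illustrative):
create table payments_empty like payments;
The INSERT script below loads the sample payments data used by the rest of the questions.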
insert into `payments` values
(103,'HQ336336','2016-10-19','6066.78'),(103,'JM555205','2016-10-05','14571.44'),(103,'OM314933','2016-10-18','1676.14'),
(112,'BO864823','2016-10-17','14191.12'),(112,'HQ55022','2016-10-06','32641.98'),(112,'ND748579','2016-10-20','33347.88'),
(114,'GG31455','2016-10-20','45864.03'),(114,'MA765515','2016-10-15','82261.22'),(114,'NP603840','2016-10-31','7565.08'),
(114,'NR27552','2016-10-10','44894.74'),(119,'DB933704','2016-10-14','19501.82'),(119,'LN373447','2016-10-08','47924.19'),
(119,'NG94694','2016-10-22','49523.67'),(121,'DB889831','2016-10-16','50218.95'),(121,'FD317790','2016-10-28','1491.38'),
(121,'KI831359','2016-10-04','17876.32'),(121,'MA302151','2016-10-28','34638.14'),(124,'AE215433','2016-10-05','101244.59'),
(124,'BG255406','2016-10-28','85410.87'),(124,'CQ287967','2016-10-11','11044.30'),(124,'ET64396','2016-10-16','83598.04'),
(124,'HI366474','2016-10-27','47142.70'),(124,'HR86578','2016-10-02','55639.66'),(124,'KI131716','2016-10-15','111654.40'),
(124,'LF217299','2016-10-26','43369.30'),(124,'NT141748','2016-10-25','45084.38'),(128,'DI925118','2016-10-28','10549.01'),
(128,'FA465482','2016-10-18','24101.81'),(128,'FH668230','2016-10-24','33820.62'),(128,'IP383901','2016-10-18','7466.32'),
(129,'DM826140','2016-10-08','26248.78'),(129,'ID449593','2016-10-11','23923.93'),(129,'PI42991','2016-10-09','16537.85'),
(131,'CL442705','2016-10-12','22292.62'),(131,'MA724562','2016-10-02','50025.35'),(131,'NB445135','2016-10-11','35321.97'),
(141,'AU364101','2016-10-19','36251.03'),(141,'DB583216','2016-10-01','36140.38'),(141,'DL460618','2016-10-19','46895.48'),
(141,'HJ32686','2016-10-30','59830.55'),(141,'ID10962','2016-10-31','116208.40'),(141,'IN446258','2016-10-25','65071.26'),
(141,'JE105477','2016-10-18','120166.58'),(141,'JN355280','2016-10-26','49539.37'),(141,'JN722010','2016-10-25','40206.20'),
(141,'KT52578','2016-10-09','63843.55'),(141,'MC46946','2016-10-09','35420.74'),(141,'MF629602','2016-10-16','20009.53'),
(141,'NU627706','2016-10-17','26155.91'),(144,'IR846303','2016-10-12','36005.71'),(144,'LA685678','2016-10-09','7674.94'),
(145,'CN328545','2016-10-03','4710.73'),(145,'ED39322','2016-10-26','28211.70'),(145,'HR182688','2016-10-01','20564.86'),
(145,'JJ246391','2016-10-20','53959.21'),(146,'FP549817','2016-10-18','40978.53'),(146,'FU793410','2016-10-16','49614.72'),
(146,'LJ160635','2016-10-10','39712.10'),(148,'BI507030','2016-10-22','44380.15'),(148,'DD635282','2016-10-11','2611.84'),
(148,'KM172879','2016-10-26','105743.00'),(148,'ME497970','2016-10-27','3516.04'),(151,'BF686658','2016-10-22','58793.53'),
(151,'GB852215','2016-10-26','20314.44'),(151,'IP568906','2016-10-18','58841.35'),(151,'KI884577','2016-10-14','39964.63'),
(157,'HI618861','2016-10-19','35152.12'),(157,'NN711988','2016-10-07','63357.13'),(161,'BR352384','2016-10-14','2434.25'),
(161,'BR478494','2016-10-18','50743.65'),(161,'KG644125','2016-10-02','12692.19'),(161,'NI908214','2016-10-05','38675.13'),
(166,'BQ327613','2016-10-16','38785.48'),(166,'DC979307','2016-10-07','44160.92'),(166,'LA318629','2016-10-28','22474.17'),
(167,'ED743615','2016-10-19','12538.01'),(167,'GN228846','2016-10-03','85024.46'),(171,'GB878038','2016-10-15','18997.89'),
(171,'IL104425','2016-10-22','42783.81'),(172,'AD832091','2016-10-09','1960.80'),(172,'CE51751','2016-10-04','51209.58'),
(172,'EH208589','2016-10-20','33383.14'),(173,'GP545698','2016-10-13','11843.45'),(173,'IG462397','2016-10-29','20355.24'),
(175,'CITI3434344','2016-10-19','28500.78'),(175,'IO448913','2016-10-19','24879.08'),(175,'PI15215','2016-10-10','42044.77'),
(177,'AU750837','2016-10-17','15183.63'),(177,'CI381435','2016-10-19','47177.59'),(181,'CM564612','2016-10-25','22602.36'),
(181,'GQ132144','2016-10-30','5494.78'),(181,'OH367219','2016-10-16','44400.50'),(186,'AE192287','2016-10-10','23602.90'),
(186,'AK412714','2016-10-27','37602.48'),(186,'KA602407','2016-10-21','34341.08'),(187,'AM968797','2016-10-03','52825.29'),
(187,'BQ39062','2016-10-08','47159.11'),(187,'KL124726','2016-10-27','48425.69'),(189,'BO711618','2016-10-03','17359.53'),
(189,'NM916675','2016-10-01','32538.74'),(198,'FI192930','2016-10-06','9658.74'),(198,'HQ920205','2016-10-06','6036.96'),
(198,'IS946883','2016-10-21','5858.56'),(201,'DP677013','2016-10-20','23908.24'),(201,'OO846801','2016-10-15','37258.94'),
(202,'HI358554','2016-10-18','36527.61'),(202,'IQ627690','2016-10-08','33594.58'),(204,'GC697638','2016-10-13','51152.86'),
(204,'IS150005','2016-10-24','4424.40'),(205,'GL756480','2016-10-04','3879.96'),(205,'LL562733','2016-10-05','50342.74'),
(205,'NM739638','2016-10-06','39580.60'),(209,'BOAF82044','2016-10-03','35157.75'),(209,'ED520529','2016-10-21','4632.31'),
(209,'PH785937','2016-10-04','36069.26'),(211,'BJ535230','2016-10-09','45480.79'),(216,'BG407567','2016-10-09','3101.40'),
(216,'ML780814','2016-10-06','24945.21'),(216,'MM342086','2016-10-14','40473.86'),(219,'BN17870','2016-10-02','3452.75'),
(219,'BR941480','2016-10-18','4465.85'),(227,'MQ413968','2016-10-31','36164.46'),(227,'NU21326','2016-10-02','53745.34'),
(233,'BOFA23232','2016-10-20','29070.38'),(233,'II180006','2016-10-01','22997.45'),(233,'JG981190','2016-10-18','16909.84'),
(239,'NQ865547','2016-10-15','80375.24'),(240,'IF245157','2016-10-16','46788.14'),(240,'JO719695','2016-10-28','24995.61'),
(242,'AF40894','2016-10-22','33818.34'),(242,'HR224331','2016-10-03','12432.32'),(242,'KI744716','2016-10-21','14232.70'),
(249,'IJ399820','2016-10-19','33924.24'),(249,'NE404084','2016-10-04','48298.99'),(250,'EQ12267','2016-10-17','17928.09'),
(250,'HD284647','2016-10-30','26311.63'),(250,'HN114306','2016-10-18','23419.47'),(256,'EP227123','2016-10-10','5759.42'),
(256,'HE84936','2016-10-22','53116.99'),(259,'EU280955','2016-10-06','61234.67'),(259,'GB361972','2016-10-07','27988.47'),
(260,'IO164641','2016-10-30','37527.58'),(260,'NH776924','2016-10-24','29284.42'),(276,'EM979878','2016-10-09','27083.78'),
(276,'KM841847','2016-10-13','38547.19'),(276,'LE432182','2016-10-28','41554.73'),(276,'OJ819725','2016-10-30','29848.52'),
(278,'BJ483870','2016-10-05','37654.09'),(278,'GP636783','2016-10-02','52151.81'),(278,'NI983021','2016-10-24','37723.79'),
(282,'IA793562','2016-10-03','24013.52'),(282,'JT819493','2016-10-02','35806.73'),(282,'OD327378','2016-10-03','31835.36'),
(286,'DR578578','2016-10-28','47411.33'),(286,'KH910279','2016-10-05','43134.04'),(298,'AJ574927','2016-10-13','47375.92'),
(298,'LF501133','2016-10-18','61402.00'),(299,'AD304085','2016-10-24','36798.88'),(299,'NR157385','2016-10-05','32260.16'),
(311,'DG336041','2016-10-15','46770.52'),(311,'FA728475','2016-10-06','32723.04'),(311,'NQ966143','2016-10-25','16212.59'),
(314,'LQ244073','2016-10-09','45352.47'),(314,'MD809704','2016-10-03','16901.38'),(319,'HL685576','2016-10-06','42339.76'),
(319,'OM548174','2016-10-07','36092.40'),(320,'GJ597719','2016-10-18','8307.28'),(320,'HO576374','2016-10-20','41016.75'),
(320,'MU817160','2016-10-24','52548.49'),(321,'DJ15149','2016-10-03','85559.12'),(321,'LA556321','2016-10-15','46781.66'),
(323,'AL493079','2016-10-23','75020.13'),(323,'ES347491','2016-10-24','37281.36'),(323,'HG738664','2016-10-05','2880.00'),
(323,'PQ803830','2016-10-24','39440.59'),(324,'DQ409197','2016-10-13','13671.82'),(324,'FP443161','2016-10-07','29429.14'),
(324,'HB150714','2016-10-23','37455.77'),(328,'EN930356','2016-10-16','7178.66'),(328,'NR631421','2016-10-30','31102.85'),
(333,'HL209210','2016-10-15','23936.53'),(333,'JK479662','2016-10-17','9821.32'),(333,'NF959653','2016-10-01','21432.31'),
(334,'CS435306','2016-10-27','45785.34'),(334,'HH517378','2016-10-16','29716.86'),(334,'LF737277','2016-10-22','28394.54'),
(339,'AP286625','2016-10-24','23333.06'),(339,'DA98827','2016-10-28','34606.28'),(344,'AF246722','2016-10-24','31428.21'),
(344,'NJ906924','2016-10-02','15322.93'),(347,'DG700707','2016-10-18','21053.69'),(347,'LG808674','2016-10-24','20452.50'),
(350,'BQ602907','2016-10-11','18888.31'),(350,'CI471510','2016-10-25','50824.66'),(350,'OB648482','2016-10-29','1834.56'),
(353,'CO351193','2016-10-10','49705.52'),(353,'ED878227','2016-10-21','13920.26'),(353,'GT878649','2016-10-21','16700.47'),
(353,'HJ618252','2016-10-09','46656.94'),(357,'AG240323','2016-10-16','20220.04'),(357,'NB291497','2016-10-15','36442.34'),
(362,'FP170292','2016-10-11','18473.71'),(362,'OG208861','2016-10-21','15059.76'),(363,'HL575273','2016-10-17','50799.69'),
(363,'IS232033','2016-10-16','10223.83'),(363,'PN238558','2016-10-05','55425.77'),(379,'CA762595','2016-10-12','28322.83'),
(379,'FR499138','2016-10-16','32680.31'),(379,'GB890854','2016-10-02','12530.51'),(381,'BC726082','2016-10-03','12081.52'),
(381,'CC475233','2016-10-19','1627.56'),(381,'GB117430','2016-10-03','14379.90'),(381,'MS154481','2016-10-22','1128.20'),
(382,'CC871084','2016-10-12','35826.33'),(382,'CT821147','2016-10-01','6419.84'),(382,'PH29054','2016-10-27','42813.83'),
(385,'BN347084','2016-10-02','20644.24'),(385,'CP804873','2016-10-19','15822.84'),(385,'EK785462','2016-10-09','51001.22'),
(386,'DO106109','2016-10-18','38524.29'),(386,'HG438769','2016-10-18','51619.02'),(398,'AJ478695','2016-10-14','33967.73'),
(398,'DO787644','2016-10-21','22037.91'),(398,'JPMR4544','2016-10-18','615.45'),(398,'KB54275','2016-10-29','48927.64'),
(406,'BJMPR4545','2016-10-23','12190.85'),(406,'HJ217687','2016-10-28','49165.16'),(406,'NA197101','2016-10-17','25080.96'),
(412,'GH197075','2016-10-25','35034.57'),(412,'PJ434867','2016-10-14','31670.37'),(415,'ER54537','2016-10-28','31310.09'),
(424,'KF480160','2016-10-07','25505.98'),(424,'LM271923','2016-10-16','21665.98'),(424,'OA595449','2016-10-31','22042.37'),
(447,'AO757239','2016-10-15','6631.36'),(447,'ER615123','2016-10-25','17032.29'),(447,'OU516561','2016-10-17','26304.13'),
(448,'FS299615','2016-10-18','27966.54'),(448,'KR822727','2016-10-30','48809.90'),(450,'EF485824','2016-10-21','59551.38'),
(452,'ED473873','2016-10-15','27121.90'),(452,'FN640986','2016-10-20','15130.97'),(452,'HG635467','2016-10-03','8807.12'),
(455,'HA777606','2016-10-05','38139.18'),(455,'IR662429','2016-10-12','32239.47'),(456,'GJ715659','2016-10-13','27550.51'),
(456,'MO743231','2016-10-30','1679.92'),(458,'DD995006','2016-10-15','33145.56'),(458,'NA377824','2016-10-06','22162.61'),
(458,'OO606861','2016-10-13','57131.92'),(462,'ED203908','2016-10-15','30293.77'),(462,'GC60330','2016-10-08','9977.85'),
(462,'PE176846','2016-10-27','48355.87'),(471,'AB661578','2016-10-28','9415.13'),(471,'CO645196','2016-10-10','35505.63'),
(473,'LL427009','2016-10-17','7612.06'),(473,'PC688499','2016-10-27','17746.26'),(475,'JP113227','2016-10-09','7678.25'),
(475,'PB951268','2016-10-13','36070.47'),(484,'GK294076','2016-10-26','3474.66'),(484,'JH546765','2016-10-29','47513.19'),
(486,'BL66528','2016-10-14','5899.38'),(486,'HS86661','2016-10-23','45994.07'),(486,'JB117768','2016-10-20','25833.14'),
(487,'AH612904','2016-10-28','29997.09'),(487,'PT550181','2016-10-29','12573.28'),(489,'OC773849','2016-10-04','22275.73'),
(489,'PO860906','2016-10-31','7310.42'),(495,'BH167026','2016-10-26','59265.14'),(495,'FN155234','2016-10-14','6276.60'),
(496,'EU531600','2016-10-25','30253.75'),(496,'MB342426','2016-10-16','32077.44'),(496,'MN89921','2016-10-31','52166.01');
Efficient way - a single multi-row INSERT (as above), or INSERT ... SELECT from an existing/staging table, rather than one INSERT statement per record.
3. Can we insert data into a Hive table - multiple records at a time - without using the LOAD
command?
Yes, refer to the script mentioned above.
Duplicate handling:
Option 1:
select distinct * from payments; -- 274 records
Option 2: (Better approach)
select customerNumber,checkNumber,paymentDate,amount
from payments
group by customerNumber,checkNumber,paymentDate,amount; -- 274 records
5. How do you identify which records in the customer table are duplicated, and how many
duplicates there are? When do you use WHERE and when do you use HAVING to filter data?
Ans: WHERE is used for direct (row-level) filtering, whereas HAVING is used for filtering on aggregated values.
select - from - where - group by - having - order by - limit;
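A minimal sketch of the duplicate check (against the customers table used later in this document):
select customernumber, count(1) as dup_count
from customers
group by customernumber
having count(1) > 1;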
6. How to remove/delete duplicate payments and retain only the de-duplicated payment
information in a table with or without partition?
How do you delete duplicate data from a Hive table? Or can you delete data from a Hive table at all?
Approaches: GROUP BY -> DISTINCT -> window functions -> CTE, unbounded preceding/succeeding; a window-function sketch follows the queries below.
--don’t run this insert - insert overwrite table payments select distinct * from payments;
select * from payments_dup;
set hive.exec.dynamic.partition.mode=nonstrict;
insert overwrite table payments_part partition (paymentdate)
select customernumber, checknumber, amount, paymentdate from payments;
select customernumber,checknumber,amount,paymentdate,count(1)
from payments_part
group by customernumber,checknumber,amount,paymentdate
having count(1)>1;
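A window-function sketch of the de-dup itself (this result would typically feed an INSERT OVERWRITE into a clean table):
select customernumber, checknumber, paymentdate, amount
from (select customernumber, checknumber, paymentdate, amount,
             row_number() over (partition by customernumber, checknumber, paymentdate, amount
                                order by paymentdate) as rn
      from payments) as t
where rn = 1;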
7. Show the customer info for whoever made the very first payment and the last payment to our
company? Or who was the very first customer of the company?
select * from payments_part
where paymentdate in (select min(paymentdate) from payments_part );
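The last payment is symmetric:
select * from payments_part
where paymentdate in (select max(paymentdate) from payments_part);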
Using a correlated subquery (show the last payment made by each of the customers):
select a.* from payments as a
where a.paymentdate in (select max(b.paymentdate) from payments as b
                        where b.customernumber = a.customernumber);
9. Show the last-but-one (second most recent) payment made by customer 496?
Nested subqueries could achieve this, but that is not the right way: Hive does not support
subqueries nested more than one level deep. A windowing function is the better option; see the sketch below.
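A row_number sketch (assuming the payments_part table built above):
select customernumber, checknumber, paymentdate, amount
from (select p.*,
             row_number() over (partition by customernumber order by paymentdate desc) as rn
      from payments_part as p
      where customernumber = 496) as t
where rn = 2;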
Windowing/Analytical Functions:
10. Show the lowest payment, the highest payment, and the number of payments made by customer 496?
insert into `payments_part` partition(paymentdate='2016-10-30') values
(496,'HQ336436',52166.01);
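A direct aggregate answer (a sketch against the same table):
select customernumber, min(amount) as lowest, max(amount) as highest, count(1) as num_payments
from payments_part
where customernumber = 496
group by customernumber;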
Windowing Functions:
Cume_dist
Returns the cumulative distribution of a value, as a number between 0 and 1. For example, if the result set has 10 rows,
the cume_dist for the first row is 1/10, for the second 2/10, and so on up to 10/10.
Rank
The rank function returns the rank of each value within the result set of the over clause. If two values are the same, both
get the same rank, and the subsequent rank is skipped for the next value.
Row_number
Returns a continuous sequence of numbers for all rows in the result set of the over clause, with no gaps and no repeats even for ties.
Dense_rank
Same as the rank() function, except that when duplicate values are present the rank is not skipped for the
subsequent rows; each unique value gets the next rank in sequence.
set hive.cli.print.header=true;
select customernumber, paymentdate, amount,
       rank() over (partition by customernumber order by amount desc) as rnk,
       dense_rank() over (partition by customernumber order by amount desc) as d_rank,
       row_number() over (partition by customernumber order by amount desc) as rownum,
       cume_dist() over (partition by customernumber order by paymentdate) as cumulative_dist
from payments_part
where customernumber = 496;
Sample output for customer 114 (no ties, so rank, dense_rank and row_number agree):
custno  checkno    amount     paymentdate  rank
114     MA765515   82261.22   2016-10-15   1
114     GG31455    45864.03   2016-10-20   2
114     NR27552    44894.74   2016-10-10   3
114     NP603840   7565.08    2016-10-31   4
Sample output for customer 496 (tie on 52166.01; R_N = row_number, D_R = dense_rank):
custno  checkno    amount     paymentdate  R_N  D_R  Rank
496     MN89921    52166.01   2016-10-30   1    1    1
496     HQ336436   52166.01   2016-10-31   2    1    1
496     MB342426   32077.44   2016-10-16   3    2    3
496     EU531600   30253.75   2016-10-25   4    3    4
11. Show the second and third largest payments made by customer 496?
select customernumber, paymentdate, amount, rownumdt
from (select customernumber, paymentdate, amount,
             rank() over (partition by customernumber order by amount desc) as rnk,
             dense_rank() over (partition by customernumber order by amount desc) as d_rank,
             row_number() over (partition by customernumber order by amount desc) as rownum,
             cume_dist() over (partition by customernumber order by paymentdate) as cumulative_dist,
             row_number() over (partition by customernumber order by paymentdate desc) as rownumdt
      from payments_part
      where customernumber = 496) as temp
where rownum in (2,3);
12. Show whether the customer's purchase rate is trending up or down relative to the overall
largest and smallest payments made?
13. Show whether the customer's purchase rate is trending up or down relative to the immediately
previous and next payments made?
Analytical Functions:
-- case when <condition> then <expr> when <condition2> then <expr> else <expr> end as <alias>
select customernumber, paymentdate, amount,
       case when amount_paid_previous_day > amount then "purchase capacity is improved"
            when amount_paid_previous_day = amount then "no change in the purchase amount"
            else "purchase capacity is reduced" end as lag,
       case when amount_paid_next_day > amount then "next day payment is high"
            when amount_paid_next_day = amount then "next day payment is same"
            else "next day payment is low" end as lead
from (select customernumber, paymentdate, amount,
             lag(amount) over (partition by customernumber order by paymentdate) as amount_paid_previous_day,
             lead(amount) over (partition by customernumber order by paymentdate) as amount_paid_next_day
      from payments_part
      where customernumber in (496)) as temp
order by customernumber, paymentdate;
16. How to create version numbers for the payments made by the customers?
17.
current data
cid ver amt
10 1 1000
10 2 1100
10 3 3000
10 4 2000
10 5 3050
10 6 1000
next day
cid amt
10 2000
10 3050
10 1000
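For Q16, a sketch that assigns version numbers to each customer's payments in date order (using the payments table defined earlier):
select customernumber, checknumber, amount,
       row_number() over (partition by customernumber order by paymentdate) as ver
from payments;
For the incremental case above (a next-day batch arriving without version numbers), a common pattern is to look up max(ver) per cid from the current table and add row_number() over the new batch on top of it.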
Joins: Inner, Outer (left, right, full), Semi, Anti, Self, Cross
insert into `payments` values
(1000,'HQ336336','2016-10-19','6066.78'),(1003,'JM555205','2016-10-05','14571.44');
18. Show me only the customers who didn't make any payments.
e.g. customernumber 481
select * from payments where customernumber=481;
select c.customernumber, c.customername, p.amount
from customers c left join payments p
  on c.customernumber = p.customernumber
where
  -- p.amount is null and   (uncomment this line to keep only the customers with no payment)
  c.customernumber in (496, 481);
19. Show me only the customers who made payments, and what the payment amount is (this
is possible only with a join).
A join, subquery, or EXISTS can be used; but for a comparison on multiple columns (e.g. customernumber
and customer phone number) a plain subquery can't be used, and a join is the only option when we also need
columns from the payments table.
select c.customernumber,c.customername,p.amount
from customers c inner join payments p
on c.customernumber=p.customernumber and c.customernumber in (496,481);
21. Show the customers and payments info in both the above cases
select c.customernumber,c.customername,p.amount from customers c full outer join payments
p on c.customernumber=p.customernumber;
24. Show me the combined result of two different tables, picking 1 column from table1 and 2
columns from table2. Is it possible to union two tables that contain different numbers of
columns or different data types?
Yes, by adding dummy columns and by casting the datatypes accordingly; see the sketch below.
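A minimal sketch against the customers/payments tables above (the null placeholders are the dummy columns; types are assumed):
select customernumber, customername, cast(null as double) as amount
from customers
union all
select customernumber, cast(null as string) as customername, amount
from payments;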
25. Customers who made the payment (INTERSECT), and who are the anonymous customers (MINUS)?
(Same use cases as above.)
26. Show me the customers who have more than 1 phone number and more than 1 address? (The insert below creates such a duplicate; a query sketch follows it.)
insert into `customers` values (103,'Atelier graphique','Schmitt','Carine ','908-199-0411','55, rue
Royale',NULL,'Nantes',NULL,'44000','France',1370,'21000.00');
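A sketch of the check (phone and addressline1 are assumed to be the column names in this customers table, following the classicmodels-style layout of the insert above):
select customernumber
from customers
group by customernumber
having count(distinct phone) > 1 and count(distinct addressline1) > 1;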
Complex types:
27. How do you create structured data from a set of columns in Hive? How do you group a list of
columns to form a complex type like a struct or a JSON object?
select named_struct("id",customernumber,"name",customername) from customers;
28. How to collect a column's values as an array, i.e. how do you group or convert a given column's
values into a single row in Hive? E.g. group all phone numbers of a given customer? Or how do you
pivot a column into a row?
select customernumber,collect_list(phone) as grouped_phone from customers where
customernumber in (103,112) group by customernumber;
29. How to expand an array into rows? E.g. ungroup all webpage-visit numbers of a given
customer? Or how do you unpivot an array column into rows of values?
select explode(pagenavigation) from orderpages ;
30. Show me the position of the array elements in an array column in Hive? Or show the order in
which the customer navigated the webpage, where the navigation is stored as an array?
select posexplode(pagenavigation) from orderpages ;
31. Display the customer number, comments and the pages navigated from the array data ?
A lateral view is used in conjunction with user-defined table-generating functions (UDTFs) such as
explode(). A lateral view first applies the UDTF to each row of the base table and then joins the
resulting output rows to the input rows to form a virtual table with the supplied table alias.
select customernumber,comments,idx,pgnavigation_column
from orderpages lateral view posexplode(pagenavigation) exploded_tbl as idx,pgnavigation_column;
Which is the first or second page every customer visits on the webpage?
select customernumber,comments,idx,pgnavigation_column
from orderpages lateral view posexplode(pagenavigation) exploded_tbl as idx,pgnavigation_column
where idx=1;
Note that posexplode positions are 0-based, so idx = 0 is the first page and idx = 1 the second.
Ordering:
32. Why is ORDER BY in Hive more costly? How can you avoid using it?
ORDER BY runs through a single reducer; it can be avoided with a multi-reducer operation
using DISTRIBUTE BY together with SORT BY (or CLUSTER BY when distributing and sorting on the same keys).
set mapred.reduce.tasks=3;
select * from payments order by customernumber,amount ;
select * from payments distribute by customernumber sort by customernumber,amount ;
select * from payments cluster by customernumber,amount;
34. which hive analytic functions and windowing functions you used in project?
Row_Number
Rank and Dense Rank
Sum
Cumulative Sum
Average
Count
Min and Max
First_Value and Last_Value
Lead and Lag
35. Is it possible to delete and update in a Hive table? Have you used it in your project?
Yes; we did it partition-wise, or using ACID (transactional) tables, or using INSERT OVERWRITE.
37. Table A has 2000 records and table B has 1000 records; 500 records match between them.
What count do you get if you do an inner join, left join, right join, full outer join, or cross join?
inner join - 500 - customers who made the payment once or multiple times.
left semi join - 500 - customers who made the payment at least once.
left outer - 2000 - all customers, whether or not they made a payment.
right outer - 1000 - all payments made by the customers, regardless of an entry in the customer table.
full outer - 2500 (1500 left-only rows + 500 right-only rows + 500 rows common to both) - customers who made the
payment, customers who didn't, and all payments regardless of an entry in the customer table.
cross join - 2000 x 1000 = 2,000,000 - e.g. route/fare combinations on sites like goibibo.com or makemytrip.com.
40. Where do you see the SQL PIVOT kind of feature in Hive/Spark?
To convert rows to columns and vice versa.
explode, lateral view, or pivot-style options can be used.
costly approach (one subquery per category value, self-joined on custno):
select t1.custno, t1.category as games_category, t2.category as puzzles_category
from (select custno, category from txnrecords where category = 'Games') as t1
inner join (select custno, category from txnrecords where category = 'Puzzles') as t2
  on t1.custno = t2.custno
union
select custno, category, cast(null as string)
from txnrecords
where category = 'Games';
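A cheaper and more common sketch is conditional aggregation, i.e. a case-based pivot (txnrecords is assumed to carry the amount column used in Q46 below):
select custno,
       sum(case when category = 'Games' then amount else 0 end) as games_amount,
       sum(case when category = 'Puzzles' then amount else 0 end) as puzzles_amount
from txnrecords
group by custno;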
46. Write a query: how to fetch the latest record? Or how to fetch the most recent transaction?
select * from (select row_number() over () as rno,t.* from txnrecords as t) as temp
where rno in (select count(1) from txnrecords);
select txnno,custno,amount,category,product,city,state,spendby,
from_unixtime(unix_timestamp(txndate,'MM-dd-yyyy'),'yyyy-MM-dd'),
from_unixtime(unix_timestamp(txndate,'MM-dd-yyyy'))
from txnrecords;
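Since txndate is stored as 'MM-dd-yyyy' text, ordering on the raw string is unreliable; a sketch that picks the truly latest transaction orders on the normalized date instead:
select *
from (select t.*,
             row_number() over (order by unix_timestamp(txndate,'MM-dd-yyyy') desc) as rn
      from txnrecords as t) as temp
where rn = 1;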