Convert Integer or decimal to Date format
Thursday, 16 December 2021
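A minimal sketch in Hive, assuming the integer holds a yyyyMMdd value (e.g. 20211216) or Unix epoch seconds; from_unixtime, unix_timestamp and to_date are standard Hive built-ins, and a decimal epoch can be cast down to bigint first:
-- integer stored as yyyyMMdd
hive> select to_date(from_unixtime(unix_timestamp(cast(20211216 as string), 'yyyyMMdd')));
-- integer stored as Unix epoch seconds (1639612800 = 2021-12-16 00:00:00 UTC)
hive> select from_unixtime(1639612800, 'yyyy-MM-dd');
-- decimal epoch: truncate to whole seconds first
hive> select from_unixtime(cast(1639612800.25 as bigint), 'yyyy-MM-dd');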
Tuesday, 31 August 2021
Hive with Parquet file
File location on HDFS:
/data44/2/part-00000-ddd75d27-e608-4a17-a96d-4d631c71e875-c000.snappy.parquet
Create external table in Hive
create external table test_paq4(deptid string, name string, id int) stored as parquet location 'hdfs://127.0.0.1:9000/data44/2';
hive> select * from test_paq4;
Note: with Parquet, the columns in the table definition can be listed in any order, e.g. id, name, deptid or deptid, name, id. Only the column names have to match the columns in the Parquet file, because each Parquet file carries its own schema (column names).
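The embedded schema can be inspected directly, and a second table over the same location with a different column order returns the same rows (parquet-tools is a separate utility, and test_paq5 is just an illustrative name):
parquet-tools schema part-00000-ddd75d27-e608-4a17-a96d-4d631c71e875-c000.snappy.parquet
hive> create external table test_paq5(id int, name string, deptid string) stored as parquet location 'hdfs://127.0.0.1:9000/data44/2';
hive> select * from test_paq5;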
Monday, 28 June 2021
Sqoop Commands
sqoop import --connect "jdbc:mysql://localhost/test" --username "root" --table "t1" --target-dir input1 -m 1
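The rows land under input1 in the importing user's HDFS home directory; a quick check (the part-file name assumes the default single-mapper output):
hadoop fs -ls input1
hadoop fs -cat input1/part-m-00000 | head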
sqoop import --connect "jdbc:mysql://localhost/db2" --username "root" --table "t4" --hive-import -m 1
sqoop import --connect "jdbc:mysql://localhost/db2" --username "root" --table "t4" --hive-import --hive-table bigdb.t12 -m 1
sqoop import --connect "jdbc:mysql://localhost/db2" --username "root" --table "t4" --warehouse-dir /user/hive/warehouse/bigdb -m 1
sqoop export --connect "jdbc:mysql://localhost/db2" --username "root" --table "t5" --export-dir /user/hive/warehouse/t5
sqoop export --connect "jdbc:mysql://localhost/mydb" --username "root" --table "total_gender" --export-dir /user/hive/warehouse/gender
sqoop export --connect "jdbc:mysql://localhost/mydb" --username "root" --password 'root123!' --table "gender" --export-dir /user/hive/warehouse/bigdb.db/user_gender --input-fields-terminated-by ',' --verbose -m 1
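sqoop export only writes into a table that already exists in MySQL, so create the target first; a sketch for the gender example (the column names here are assumptions, not taken from the Hive data):
mysql> create table gender (gender varchar(10), total int);
mysql> select * from gender;    -- populated once the export completes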
--- incremental load with Sqoop ---
sqoop import --connect "jdbc:mysql://localhost/test" --username "root" --table "city5" --hive-import --check-column upd_dt --incremental lastmodified --last-value 2015-05-26 -m 1
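Passing a fresh --last-value by hand on every run is error-prone; a saved Sqoop job records and updates the last value automatically (the job name inc_city5 and target dir are illustrative, and repeat runs into an existing directory also need --append or --merge-key):
sqoop job --create inc_city5 -- import --connect "jdbc:mysql://localhost/test" --username "root" --table "city5" --check-column upd_dt --incremental lastmodified --last-value 2015-05-26 --target-dir city5_inc --append -m 1
sqoop job --exec inc_city5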