Links:
http://hortonworks.com/hadoop-tutorial/loading-data-into-the-hortonworks-sandbox/
http://thinkbiganalytics.com/hadoop_nosql_services/freestone-framework/
CREATE USER 'user1'@'localhost' IDENTIFIED BY 'pass1';
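Before the imports below will work, the new user also needs read access to the source database; a minimal sketch, assuming the test database used in the next command:
GRANT ALL PRIVILEGES ON test.* TO 'user1'@'localhost';  -- scope down to SELECT in practice
FLUSH PRIVILEGES;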
sqoop import --connect "jdbc:mysql://localhost/test" --username "root" --table "t1" --target-dir input1 -m 1
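A quick sanity check of the import, assuming Sqoop's default part-file naming under the (relative) target directory:
hdfs dfs -ls input1
hdfs dfs -cat 'input1/part-m-*' | head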
sqoop import --connect "jdbc:mysql://localhost/db2" --username "root" --table "t4" --hive-import -m 1
sqoop import --connect "jdbc:mysql://localhost/db2" --username "root" --table "t4" --hive-import --hive-table bigdb.t12 -m 1
sqoop import --connect "jdbc:mysql://localhost/db2" --username "root" --table "t4" --warehouse-dir /user/hive/warehouse/bigdb -m 1
--- Import using query ---
bin/sqoop import --connect jdbc:teradata://dnedwt.edc.cingular.net/edwdb --driver com.teradata.jdbc.TeraDriver --username dy662t --password Dhiru+12 --query "select UserName, AccountName, UserOrProfile from DBC.AccountInfoV where \$CONDITIONS" --hive-import --hive-table tada_ipub.TMP_billing --split-by UserName --target-dir /user/hive/warehouse/test_tmp1 --verbose
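The literal \$CONDITIONS token is mandatory with --query: Sqoop substitutes a per-mapper range predicate on the --split-by column. The same pattern against the local MySQL test database (the t1 columns here are assumptions):
sqoop import --connect "jdbc:mysql://localhost/test" --username "root" --query "SELECT id, name FROM t1 WHERE \$CONDITIONS" --split-by id --target-dir query_out -m 2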
sqoop export --connect "jdbc:mysql://localhost/db2" --username "root" --table "t5" --export-dir /user/hive/warehouse/t5
sqoop export --connect "jdbc:mysql://localhost/mydb" --username "root" --table "total_gender" --export-dir /user/hive/warehouse/gender
sqoop export --connect "jdbc:mysql://localhost/mydb" --username "root" --password 'root123!' --table "gender" --export-dir /user/hive/warehouse/bigdb.db/user_gender --input-fields-terminated-by ',' --verbose -m 1
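--input-fields-terminated-by must match how the files were written: this table was stored comma-delimited, but a table written with Hive's defaults uses Ctrl-A, in which case the flag would read:
--input-fields-terminated-by '\001'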
--- Incremental load with Sqoop ---
sqoop import --connect "jdbc:mysql://localhost/test" --username "root" --table "city5" --hive-import --check-column upd_dt --incremental lastmodified --last-value 2015-05-26
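Instead of tracking --last-value by hand, a saved Sqoop job records it after every run. A sketch writing to a plain HDFS directory (the id merge key is an assumption, and some Sqoop 1 releases reject --incremental lastmodified together with --hive-import):
sqoop job --create city5_incr -- import --connect "jdbc:mysql://localhost/test" --username "root" --table "city5" --incremental lastmodified --check-column upd_dt --merge-key id --target-dir /user/hive/warehouse/city5 -m 1
sqoop job --exec city5_incr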
Sqoop the data directly into an external Hive table
-----------------
(1) Create an external table in Hive (sample DDL attached). Example: dist_daily_temp2
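The attached DDL is not reproduced here; a hypothetical sketch of its shape (column names are assumptions, and the delimiter and LOCATION must match the Sqoop command in step 2):
CREATE EXTERNAL TABLE dist_daily_temp2 (
  dstrb_id STRING,  -- assumed columns; per step 3, everything starts as STRING
  sls_dt STRING,
  sls_qty STRING
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\034'
LOCATION '/sales/channel/Talend/test/test2';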
(2) Run the Sqoop command, keeping the "--target-dir" path the same as the external table's location.
sqoop import --connect 'jdbc:oracle:thin:@(DESCRIPTION = (LOAD_BALANCE = yes)(SDU = 32768)(ADDRESS = (PROTOCOL = TCP)(HOST = ipaddress)(PORT = 1526))(CONNECT_DATA = (SERVICE_NAME = xxxI)))' --username 'AMSPODS_PUB' --password 'xxx' --table PUB_FCT_DSTRB_SLS_DAILY --fields-terminated-by '\034' --target-dir /sales/channel/Talend/test/test2 -m 1
(3) Keep all columns in the sample table as STRING so the data loads cleanly from Oracle; the data types can be changed later.
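Once the data looks right, a string column can be retyped in place; a sketch using the hypothetical sls_qty column from the DDL above:
ALTER TABLE dist_daily_temp2 CHANGE sls_qty sls_qty INT;  -- Hive re-parses the text files with the new type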