- Create a table in Teradata:
CREATE TABLE vmtest.test (a INTEGER, b TIMESTAMP(6) FORMAT 'yyyy-mm-ddbhh:mi:ss.s(6)') PRIMARY INDEX (a);
INSERT INTO vmtest.test VALUES (1, '2016-04-05 11:27:24.699022');
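- For reference, in the column's FORMAT string 'b' is Teradata's format character for a blank and 's(6)' asks for six fractional-second digits. A quick way to confirm the stored value keeps its microseconds is to apply the same nested cast the workaround below relies on, directly in Teradata (a minimal illustrative sketch; the alias b_char is arbitrary):
/* Illustrative check, e.g. in BTEQ or SQL Assistant: the inner cast applies the
   format, the outer cast to CHAR(40) returns the formatted string. */
SELECT a,
       CAST(CAST(b AS FORMAT 'YYYY-MM-DD HH:MI:SS.s(6)') AS CHAR(40)) AS b_char
FROM vmtest.test;
/* Expected b_char: 2016-04-05 11:27:24.699022 */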
- And the Sqoop import command:
sqoop import --connect jdbc:teradata://<hostname>/database=vmtest \
  --username dbc --password dbc --target-dir /tmp/test --delete-target-dir \
  --as-textfile --fields-terminated-by "," --table test
- Data is stored in HDFS as below; note that the fractional seconds are truncated from .699022 to .699:
$ hadoop fs -cat /tmp/test/part*
1,2016-04-05 11:27:24.699
- As a workaround, use --query and cast the timestamp to a character column on the Teradata side, so the full microsecond precision is transferred as text:
sqoop import --connect jdbc:teradata://<hostname>/database=vmtest \
  --username dbc --password dbc --target-dir /tmp/test \
  --delete-target-dir --as-textfile --fields-terminated-by "," \
  --query "SELECT a, cast(cast(b as format 'YYYY-MM-DD HH:MI:SS.s(6)') as char(40)) from test WHERE \$CONDITIONS" \
  --split-by a
- After import, data is stored in HDFS correctly:
$ hadoop fs -cat /tmp/test/part*
1,2016-04-05 11:27:24.699022
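- If you want to query the imported file afterwards, one option is a Hive external table over /tmp/test (a hypothetical follow-on sketch; the table name test_imported is made up here). Hive's TIMESTAMP keeps up to nanosecond precision, so the microseconds survive the cast back from string:
-- Hypothetical Hive table over the Sqoop output directory; the timestamp
-- arrives as text, so land it as STRING and cast when querying.
CREATE EXTERNAL TABLE test_imported (a INT, b STRING)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
LOCATION '/tmp/test';

SELECT a, CAST(b AS TIMESTAMP) AS b_ts FROM test_imported;
-- Expected: 1    2016-04-05 11:27:24.699022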