Thursday, August 8, 2013

Sqoop date format issue when importing from Oracle

An Oracle table has a DATE column with values like 2013/5/3 17:33:21. After importing it into HDFS with Sqoop, the value came out as 2013-05-03 and the hours, minutes and seconds were lost. Is there a way to tell Sqoop which format to use for the import?
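For context, the import that drops the time portion was presumably a plain table-level import along these lines (a sketch only; the connection string, credentials and table name are placeholders, not taken from the original post):

sqoop import \
--connect jdbc:oracle:thin:@HOST:1521:SID \
--username USER \
-P \
--table TB \
-m 1 \
--target-dir /user/hadoop/TB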
------ Solution --------------------------------------------
sqoop/bin/sqoop import \
--connect jdbc:XXXXXXXXXXX \
--username XXXXXXXXXXX \
-P \
--query "select TO_CHAR(date,'yyyy-mm-dd hh24:mi:ss') from TB" \
--hive-import \
--hive-table XXXXXXXXX \
-m 1 \
--append \
--target-dir XXXXXXXXXXXXXX
------ For reference only ----------------------------------
I have never run into this situation. Try using Oracle's TO_CHAR to format the date yourself in a free-form query:


sqoop/bin/sqoop import \
--connect jdbc:XXXXXXXXXXX \
--username XXXXXXXXXXX \
-P \
--query "select TO_CHAR(date,'yyyy-mm-dd hh24:mi:ss') from TB" \
--hive-import \
--hive-table XXXXXXXXX \
-m 1 \
--append \
--target-dir XXXXXXXXXXXXXX
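Before running the full import, the TO_CHAR format model can be checked quickly with sqoop eval, which simply runs a statement against the database and prints the result. A minimal sketch, reusing the placeholder connection settings from above and querying sysdate so no real column name is needed:

sqoop/bin/sqoop eval \
--connect jdbc:XXXXXXXXXXX \
--username XXXXXXXXXXX \
-P \
--query "select TO_CHAR(sysdate,'yyyy-mm-dd hh24:mi:ss') from dual"

If that prints the full timestamp, the same TO_CHAR expression can be used inside the import's --query.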

------ For reference only ----------------------------------


Thank you. This statement actually has one problem: even for an unconditional full-table import you still have to add where \$CONDITIONS to the query, otherwise Sqoop throws an error. My working script is below:
./sqoop import --connect jdbc:oracle:thin:@192.168.1.10:1521:crmdb --username aaa -P --query "select TO_CHAR(yy,'yyyy-mm-dd hh24:mi:ss') from testtable where \$CONDITIONS" -m 1 --append --target-dir apps/as/hive/testtable
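A side note on parallelism: -m 1 is what lets a free-form --query run without a split column. A hedged sketch of a parallel variant, assuming testtable has a numeric key column (ID below is hypothetical); with more than one mapper, --split-by is required and Sqoop substitutes \$CONDITIONS with each mapper's range predicate:

./sqoop import --connect jdbc:oracle:thin:@192.168.1.10:1521:crmdb --username aaa -P --query "select ID, TO_CHAR(yy,'yyyy-mm-dd hh24:mi:ss') from testtable where \$CONDITIONS" --split-by ID -m 4 --append --target-dir apps/as/hive/testtable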
