Sqoop import from Oracle to Hive: Hive exited with status 64
Import command:
./sqoop import -Dmapreduce.map.java.opts=-Xmx3000m -Dmapreduce.map.memory.mb=3200 --connect jdbc:oracle:thin:@192.168.113.17:1521:btobbi --username tianlianbi --P --table BIO_PRODUCT_MAIN --hive-import --hive-overwrite -m 4
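The same command, broken across lines for readability, with a note on each option (a readability sketch only; the values are copied verbatim from the command above, and the option meanings are as documented for Sqoop 1.x):

# -Dmapreduce.map.java.opts / -Dmapreduce.map.memory.mb : per-mapper JVM heap and container memory
# --P                              : prompt interactively for the Oracle password
# --hive-import --hive-overwrite   : after the HDFS import, load the data into Hive, replacing existing rows
# -m 4                             : run 4 parallel map tasks
./sqoop import \
  -Dmapreduce.map.java.opts=-Xmx3000m \
  -Dmapreduce.map.memory.mb=3200 \
  --connect jdbc:oracle:thin:@192.168.113.17:1521:btobbi \
  --username tianlianbi --P \
  --table BIO_PRODUCT_MAIN \
  --hive-import --hive-overwrite \
  -m 4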
The data has already landed in HDFS and the Hive table was created automatically, but the job fails at the final step and the Hive table ends up empty.
The error log is below:
17/03/23 17:19:23 INFO hive.HiveImport: OK
17/03/23 17:19:23 INFO hive.HiveImport: Time taken: 1.361 seconds
17/03/23 17:22:20 INFO hive.HiveImport: FAILED: SemanticException Line 2:17 Invalid path ''hdfs://cluster/user/root/BIO_PRODUCT_MAIN''
17/03/23 17:22:20 ERROR tool.ImportTool: Encountered IOException running import job: java.io.IOException: Hive exited with status 64
at org.apache.sqoop.hive.HiveImport.executeExternalHiveScript(HiveImport.java:389)
at org.apache.sqoop.hive.HiveImport.executeScript(HiveImport.java:339)
at org.apache.sqoop.hive.HiveImport.importTable(HiveImport.java:240)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:514)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
at org.apache.sqoop.Sqoop.main(Sqoop.java:236)
Any suggestions on how to resolve this?
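For reference, the HDFS path named in the SemanticException can be checked directly from the command line; a minimal sketch (the path is copied verbatim from the log above, and whether the staging directory still exists on this cluster is an assumption to verify, not something the log confirms):

# Does the staging directory Sqoop wrote to still exist, and does it contain the part files?
hdfs dfs -ls hdfs://cluster/user/root/BIO_PRODUCT_MAIN

# Was the table actually created in Hive, and where does its LOCATION point?
hive -e "DESCRIBE FORMATTED BIO_PRODUCT_MAIN;"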