Dropping a table in Hive fails with "Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException"
1: Here is the failing session:
[jifeng@jifeng02 hive-0.12.0-bin]$ hive
Logging initialized using configuration in jar:file:/home/jifeng/hadoop/hive-0.12.0-bin/lib/hive-common-0.12.0.jar!/hive-log4j.properties
hive> show tables;
OK
t1
tianq
tianqi
Time taken: 3.338 seconds, Fetched: 3 row(s)
hive> drop table t1;
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:javax.jdo.JDODataStoreException: Iteration request failed : SELECT `A0`.`COMMENT`,`A0`.`COLUMN_NAME`,`A0`.`TYPE_NAME`,`A0`.`INTEGER_IDX` AS NUCORDER0 FROM `COLUMNS_V2` `A0` WHERE `A0`.`CD_ID` = ? AND `A0`.`INTEGER_IDX` >= 0 ORDER BY NUCORDER0
    at org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:451)
    at org.datanucleus.api.jdo.JDOPersistenceManager.jdoRetrieve(JDOPersistenceManager.java:610)
    at org.datanucleus.api.jdo.JDOPersistenceManager.retrieve(JDOPersistenceManager.java:622)
    at org.datanucleus.api.jdo.JDOPersistenceManager.retrieve(JDOPersistenceManager.java:631)
    at org.apache.hadoop.hive.metastore.ObjectStore.removeUnusedColumnDescriptor(ObjectStore.java:2293)
    at org.apache.hadoop.hive.metastore.ObjectStore.preDropStorageDescriptor(ObjectStore.java:2321)
    at org.apache.hadoop.hive.metastore.ObjectStore.dropTable(ObjectStore.java:742)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.hive.metastore.RetryingRawStore.invoke(RetryingRawStore.java:124)
    at com.sun.proxy.$Proxy10.dropTable(Unknown Source)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table_core(HiveMetaStore.java:1192)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table_with_environment_context(HiveMetaStore.java:1328)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:103)
    at com.sun.proxy.$Proxy11.drop_table_with_environment_context(Unknown Source)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:671)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:647)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:89)
    at com.sun.proxy.$Proxy12.dropTable(Unknown Source)
    at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:869)
    at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:836)
    at org.apache.hadoop.hive.ql.exec.DDLTask.dropTable(DDLTask.java:3329)
    at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:277)
    at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:151)
    at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:65)
    at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1414)
    at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1192)
    at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1020)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:888)
    at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:259)
    at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:216)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:413)
    at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:781)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:675)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:614)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:160)
NestedThrowablesStackTrace:
com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near 'OPTION SQL_SELECT_LIMIT=DEFAULT' at line 1
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    at com.mysql.jdbc.Util.handleNewInstance(Util.java:406)
    at com.mysql.jdbc.Util.getInstance(Util.java:381)
    at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:1030)
    at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:956)
    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3491)
    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3423)
    at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:1936)
    at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:2060)
    at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2536)
    at com.mysql.jdbc.StatementImpl.executeSimpleNonQuery(StatementImpl.java:1463)
    at com.mysql.jdbc.PreparedStatement.executeQuery(PreparedStatement.java:1869)
    at com.jolbox.bonecp.PreparedStatementHandle.executeQuery(PreparedStatementHandle.java:172)
    at org.datanucleus.store.rdbms.ParamLoggingPreparedStatement.executeQuery(ParamLoggingPreparedStatement.java:381)
    at org.datanucleus.store.rdbms.SQLController.executeStatementQuery(SQLController.java:504)
    at org.datanucleus.store.rdbms.scostore.JoinListStore.listIterator(JoinListStore.java:748)
    at org.datanucleus.store.rdbms.scostore.AbstractListStore.listIterator(AbstractListStore.java:92)
    at org.datanucleus.store.rdbms.scostore.AbstractListStore.iterator(AbstractListStore.java:82)
    at org.datanucleus.store.types.backed.List.loadFromStore(List.java:300)
    at org.datanucleus.store.types.backed.List.load(List.java:273)
    at org.datanucleus.state.JDOStateManager.loadUnloadedFields(JDOStateManager.java:2967)
    at org.datanucleus.api.jdo.state.Hollow.transitionRetrieve(Hollow.java:168)
    at org.datanucleus.state.AbstractStateManager.retrieve(AbstractStateManager.java:598)
    at org.datanucleus.ExecutionContextImpl.retrieveObject(ExecutionContextImpl.java:1778)
    at org.datanucleus.ExecutionContextThreadedImpl.retrieveObject(ExecutionContextThreadedImpl.java:203)
    at org.datanucleus.api.jdo.JDOPersistenceManager.jdoRetrieve(JDOPersistenceManager.java:605)
    at org.datanucleus.api.jdo.JDOPersistenceManager.retrieve(JDOPersistenceManager.java:622)
    at org.datanucleus.api.jdo.JDOPersistenceManager.retrieve(JDOPersistenceManager.java:631)
    at org.apache.hadoop.hive.metastore.ObjectStore.removeUnusedColumnDescriptor(ObjectStore.java:2293)
    at org.apache.hadoop.hive.metastore.ObjectStore.preDropStorageDescriptor(ObjectStore.java:2321)
    at org.apache.hadoop.hive.metastore.ObjectStore.dropTable(ObjectStore.java:742)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.hive.metastore.RetryingRawStore.invoke(RetryingRawStore.java:124)
    at com.sun.proxy.$Proxy10.dropTable(Unknown Source)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table_core(HiveMetaStore.java:1192)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table_with_environment_context(HiveMetaStore.java:1328)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:103)
    at com.sun.proxy.$Proxy11.drop_table_with_environment_context(Unknown Source)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:671)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:647)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:89)
    at com.sun.proxy.$Proxy12.dropTable(Unknown Source)
    at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:869)
    at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:836)
    at org.apache.hadoop.hive.ql.exec.DDLTask.dropTable(DDLTask.java:3329)
    at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:277)
    at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:151)
    at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:65)
    at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1414)
    at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1192)
    at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1020)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:888)
    at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:259)
    at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:216)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:413)
    at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:781)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:675)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:614)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:160)
)
hive> show tables;
OK
t1
tianq
tianqi
Time taken: 0.033 seconds, Fetched: 3 row(s)
hive> drop table tianq;
FAILED: SemanticException [Error 10001]: Table not found tianq
hive> show tables;
OK
t1
tianq
tianqi
Time taken: 0.022 seconds, Fetched: 3 row(s)
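The nested exception is the real clue: the old MySQL JDBC driver used by the metastore issues the deprecated 'SET OPTION SQL_SELECT_LIMIT=DEFAULT' statement, which MySQL 5.6 and later reject with a syntax error, so the metastore cannot finish the DROP. If you want to confirm this yourself, you can run the two statement forms directly against the metastore database. A minimal check, where the user name is only a placeholder (use your own metastore credentials):

$ mysql -u hive -p
mysql> SELECT VERSION();                       -- a 5.6+ server is the trigger
mysql> SET OPTION SQL_SELECT_LIMIT=DEFAULT;    -- what the old driver sends; fails with ERROR 1064
mysql> SET SQL_SELECT_LIMIT=DEFAULT;           -- the form current servers accept

If the second statement fails and the third succeeds, the driver is the problem, not the Hive metadata.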
I tried a newer mysql-connector-java jar as well, and deleted mysql-connector-java-5.1.6-bin.jar from Hive's lib directory.
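For reference, the swap amounts to replacing the driver jar under Hive's lib directory and relaunching the CLI. A rough sketch, where the 5.1.34 file name is only an example of "a newer Connector/J jar", not the exact version used here:

cd /home/jifeng/hadoop/hive-0.12.0-bin/lib
rm mysql-connector-java-5.1.6-bin.jar
# the jar version below is a placeholder; any recent mysql-connector-java build should do
cp ~/mysql-connector-java-5.1.34-bin.jar .
# then exit and restart the hive CLI so the new driver is loaded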
After relaunching the Hive CLI the problem was gone, so this was presumably a compatibility bug between the old MySQL JDBC driver and the MySQL server:

[jifeng@jifeng02 hive-0.12.0-bin]$ hive
Logging initialized using configuration in jar:file:/home/jifeng/hadoop/hive-0.12.0-bin/lib/hive-common-0.12.0.jar!/hive-log4j.properties
hive> show tables;
OK
t1
tianq
tianqi
Time taken: 2.182 seconds, Fetched: 3 row(s)
hive> drop table t1;
OK
Time taken: 1.02 seconds
hive> select * from tianq;
OK
Time taken: 0.192 seconds
hive> drop table tianq;
OK
Time taken: 0.28 seconds
Summary

The DROP TABLE failure is not a problem with the Hive tables themselves but with the metastore connection: the old mysql-connector-java-5.1.6 driver sends 'SET OPTION SQL_SELECT_LIMIT=DEFAULT', which newer MySQL servers reject as a syntax error, so every metastore write such as dropping a table fails with the MetaException above. Replacing the connector jar in Hive's lib directory with a newer Connector/J release and restarting the CLI resolves the error.