check your cluster UI to ensure that workers are registered and have sufficient resources
The full warning is:
WARN TaskSchedulerImpl: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources
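When this warning really does mean the cluster cannot satisfy the job's resource request, explicitly asking for less than what each worker advertises on the master UI usually clears it. A minimal sketch, assuming a standalone master at spark://master:7077; the class name and jar path are placeholders:

```shell
# Hypothetical submit command: request fewer cores and less memory than the
# workers advertise on http://master:8080/ so the scheduler can place executors.
spark-submit \
  --master spark://master:7077 \
  --executor-memory 512m \
  --total-executor-cores 2 \
  --class com.example.Main \
  target/app.jar
```

In this case, however, the resources were fine and the warning was masking a different problem, found as follows.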
------------------------------------------------------------------------------------------------------------------------------------------------------------
① Open http://master:8080/ and click the application listed under Running Applications.
② In the executor list, click the stderr link for the RUNNING executor.
This reveals the underlying error:
java.io.InvalidClassException: scala.collection.mutable.WrappedArray$ofRef; local class incompatible: stream classdesc serialVersionUID = 3456489343829468865, local class serialVersionUID = 1028182004549731694
	at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:616)
	at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1843)
	at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1713)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2000)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1535)
	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2245)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2169)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2027)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1535)
	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:422)
	at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:76)
	at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:109)
	at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$deserialize$2(NettyRpcEnv.scala:292)
	at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
	at org.apache.spark.rpc.netty.NettyRpcEnv.deserialize(NettyRpcEnv.scala:345)
	at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$deserialize$1(NettyRpcEnv.scala:291)
	at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
	at org.apache.spark.rpc.netty.NettyRpcEnv.deserialize(NettyRpcEnv.scala:291)
	at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$7(NettyRpcEnv.scala:246)
	at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$7$adapted(NettyRpcEnv.scala:246)
	at org.apache.spark.rpc.netty.RpcOutboxMessage.onSuccess(Outbox.scala:90)
	at org.apache.spark.network.client.TransportResponseHandler.handle(TransportResponseHandler.java:194)
	at org.apache.spark.network.server.TransportChannelHandler.channelRead0(TransportChannelHandler.java:142)
	at org.apache.spark.network.server.TransportChannelHandler.channelRead0(TransportChannelHandler.java:53)
	at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:99)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
	at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:286)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
	at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
	at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:102)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
	at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
	at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919)
	at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:163)
	at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:714)
	at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:650)
	at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:576)
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:493)
	at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:989)
	at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
	at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
	at java.lang.Thread.run(Thread.java:748)
Final solution:
Run jps on both nodes to confirm that a Worker process has started on each.
In IntelliJ IDEA, go to File -> Project Structure -> Libraries and change the Scala SDK to the one installed at your own $SCALA_HOME, so that it matches the Scala version used by the Spark cluster (the InvalidClassException above comes from the two sides using incompatible Scala builds).
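A quick way to see which Scala version the application is actually running against (it should match the version the cluster's spark-shell reports on startup) is the standard library's Properties object:

```scala
// Print the Scala version of the current runtime; compare it with the
// Scala version the cluster's spark-shell banner shows.
println(scala.util.Properties.versionString)        // e.g. "version 2.12.15"
println(scala.util.Properties.versionNumberString)  // e.g. "2.12.15"
```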
Then rebuild and repackage the project:
mvn clean
mvn package
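The rebuild only helps if the build itself pins a Scala version that matches the cluster. A hypothetical pom.xml fragment (the version numbers are placeholders; use the ones your cluster actually reports):

```xml
<!-- Hypothetical fragment: keep scala.version in lockstep with the cluster -->
<properties>
  <scala.version>2.12.15</scala.version>
  <scala.compat.version>2.12</scala.compat.version>
  <spark.version>3.1.2</spark.version>
</properties>

<dependencies>
  <!-- Spark artifact IDs are suffixed with the Scala binary version -->
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_${scala.compat.version}</artifactId>
    <version>${spark.version}</version>
    <scope>provided</scope>
  </dependency>
</dependencies>
```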
Summary
The "Initial job has not accepted any resources" warning was misleading here: the real cause, visible in the executor's stderr, was an InvalidClassException triggered by a Scala version mismatch between the application and the cluster. Aligning the project's Scala SDK with the cluster's Scala version and rebuilding the jar resolved the problem.