Yarn application has already ended! It might have been killed or unable to launch application master
Launch command:
spark-shell --master yarn --driver-memory 4g --executor-memory 4g --num-executors 6 --executor-cores 4
The full error output is as follows:
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
2020-05-01 16:29:11 WARN Client:66 - Neither spark.yarn.jars nor spark.yarn.archive is set, falling back to uploading libraries under SPARK_HOME.
2020-05-01 16:31:48 ERROR SparkContext:91 - Error initializing SparkContext.
org.apache.spark.SparkException: Yarn application has already ended! It might have been killed or unable to launch application master.
    at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.waitForApplication(YarnClientSchedulerBackend.scala:89)
    at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:63)
    at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:164)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:500)
    at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2493)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:933)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:924)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:924)
    at org.apache.spark.repl.Main$.createSparkSession(Main.scala:103)
    at $line3.$read$$iw$$iw.<init>(<console>:15)
    at $line3.$read$$iw.<init>(<console>:43)
    at $line3.$read.<init>(<console>:45)
    at $line3.$read$.<init>(<console>:49)
    at $line3.$read$.<clinit>(<console>)
    at $line3.$eval$.$print$lzycompute(<console>:7)
    at $line3.$eval$.$print(<console>:6)
    at $line3.$eval.$print(<console>)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:786)
    at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1047)
    at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:638)
    at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:637)
    at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
    at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)
    at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:637)
    at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:569)
    at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:565)
    at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:807)
    at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:681)
    at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:395)
    at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1$$anonfun$apply$mcV$sp$1$$anonfun$apply$mcV$sp$2.apply(SparkILoop.scala:79)
    at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1$$anonfun$apply$mcV$sp$1$$anonfun$apply$mcV$sp$2.apply(SparkILoop.scala:79)
    at scala.collection.immutable.List.foreach(List.scala:381)
    at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(SparkILoop.scala:79)
    at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1$$anonfun$apply$mcV$sp$1.apply(SparkILoop.scala:79)
    at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1$$anonfun$apply$mcV$sp$1.apply(SparkILoop.scala:79)
    at scala.tools.nsc.interpreter.ILoop.savingReplayStack(ILoop.scala:91)
    at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply$mcV$sp(SparkILoop.scala:78)
    at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:78)
    at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:78)
    at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:214)
    at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:77)
    at org.apache.spark.repl.SparkILoop.loadFiles(SparkILoop.scala:110)
    at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:920)
    at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
    at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
    at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97)
    at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:909)
    at org.apache.spark.repl.Main$.doMain(Main.scala:76)
    at org.apache.spark.repl.Main$.main(Main.scala:56)
    at org.apache.spark.repl.Main.main(Main.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:894)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:198)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:228)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:137)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
2020-05-01 16:31:48 ERROR TransportClient:233 - Failed to send RPC 4694280693106493037 to /127.0.0.1:33232: java.nio.channels.ClosedChannelException
java.nio.channels.ClosedChannelException
    at io.netty.channel.AbstractChannel$AbstractUnsafe.write(...)(Unknown Source)
2020-05-01 16:31:48 ERROR YarnSchedulerBackend$YarnSchedulerEndpoint:91 - Sending RequestExecutors(0,0,Map(),Set()) to AM was unsuccessful
java.io.IOException: Failed to send RPC 4694280693106493037 to /127.0.0.1:33232: java.nio.channels.ClosedChannelException
    at org.apache.spark.network.client.TransportClient.lambda$sendRpc$2(TransportClient.java:237)
    at io.netty.util.concurrent.DefaultPromise.notifyListener0(DefaultPromise.java:507)
    at io.netty.util.concurrent.DefaultPromise.notifyListenersNow(DefaultPromise.java:481)
    at io.netty.util.concurrent.DefaultPromise.access$000(DefaultPromise.java:34)
    at io.netty.util.concurrent.DefaultPromise$1.run(DefaultPromise.java:431)
    at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:163)
    at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:403)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:463)
    at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)
    at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138)
    at java.lang.Thread.run(Thread.java:748)
Caused by: java.nio.channels.ClosedChannelException
    at io.netty.channel.AbstractChannel$AbstractUnsafe.write(...)(Unknown Source)
2020-05-01 16:31:48 ERROR Utils:91 - Uncaught exception in thread main
org.apache.spark.SparkException: Exception thrown in awaitResult:
    at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:205)
    at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75)
    at org.apache.spark.scheduler.cluster.CoarseGrainedSchedulerBackend.requestTotalExecutors(CoarseGrainedSchedulerBackend.scala:567)
    at org.apache.spark.scheduler.cluster.YarnSchedulerBackend.stop(YarnSchedulerBackend.scala:95)
    at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.stop(YarnClientSchedulerBackend.scala:155)
    at org.apache.spark.scheduler.TaskSchedulerImpl.stop(TaskSchedulerImpl.scala:508)
    at org.apache.spark.scheduler.DAGScheduler.stop(DAGScheduler.scala:1755)
    at org.apache.spark.SparkContext$$anonfun$stop$8.apply$mcV$sp(SparkContext.scala:1931)
    at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1360)
    at org.apache.spark.SparkContext.stop(SparkContext.scala:1930)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:579)
    at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2493)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:933)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:924)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:924)
    at org.apache.spark.repl.Main$.createSparkSession(Main.scala:103)
    at $line3.$read$$iw$$iw.<init>(<console>:15)
    at $line3.$read$$iw.<init>(<console>:43)
    at $line3.$read.<init>(<console>:45)
    at $line3.$read$.<init>(<console>:49)
    at $line3.$read$.<clinit>(<console>)
    at $line3.$eval$.$print$lzycompute(<console>:7)
    at $line3.$eval$.$print(<console>:6)
    at $line3.$eval.$print(<console>)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:786)
    at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1047)
    at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:638)
    at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:637)
    at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
    at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)
    at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:637)
    at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:569)
    at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:565)
    at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:807)
    at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:681)
    at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:395)
    at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1$$anonfun$apply$mcV$sp$1$$anonfun$apply$mcV$sp$2.apply(SparkILoop.scala:79)
    at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1$$anonfun$apply$mcV$sp$1$$anonfun$apply$mcV$sp$2.apply(SparkILoop.scala:79)
    at scala.collection.immutable.List.foreach(List.scala:381)
    at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(SparkILoop.scala:79)
    at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1$$anonfun$apply$mcV$sp$1.apply(SparkILoop.scala:79)
    at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1$$anonfun$apply$mcV$sp$1.apply(SparkILoop.scala:79)
    at scala.tools.nsc.interpreter.ILoop.savingReplayStack(ILoop.scala:91)
    at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply$mcV$sp(SparkILoop.scala:78)
    at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:78)
    at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:78)
    at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:214)
    at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:77)
    at org.apache.spark.repl.SparkILoop.loadFiles(SparkILoop.scala:110)
    at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:920)
    at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
    at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
    at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97)
    at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:909)
    at org.apache.spark.repl.Main$.doMain(Main.scala:76)
    at org.apache.spark.repl.Main$.main(Main.scala:56)
    at org.apache.spark.repl.Main.main(Main.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:894)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:198)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:228)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:137)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.io.IOException: Failed to send RPC 4694280693106493037 to /127.0.0.1:33232: java.nio.channels.ClosedChannelException
    at org.apache.spark.network.client.TransportClient.lambda$sendRpc$2(TransportClient.java:237)
    at io.netty.util.concurrent.DefaultPromise.notifyListener0(DefaultPromise.java:507)
    at io.netty.util.concurrent.DefaultPromise.notifyListenersNow(DefaultPromise.java:481)
    at io.netty.util.concurrent.DefaultPromise.access$000(DefaultPromise.java:34)
    at io.netty.util.concurrent.DefaultPromise$1.run(DefaultPromise.java:431)
    at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:163)
    at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:403)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:463)
    at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)
    at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138)
    at java.lang.Thread.run(Thread.java:748)
Caused by: java.nio.channels.ClosedChannelException
    at io.netty.channel.AbstractChannel$AbstractUnsafe.write(...)(Unknown Source)
2020-05-01 16:31:48 WARN MetricsSystem:66 - Stopping a MetricsSystem that is not running
org.apache.spark.SparkException: Yarn application has already ended! It might have been killed or unable to launch application master.
    at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.waitForApplication(YarnClientSchedulerBackend.scala:89)
    at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:63)
    at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:164)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:500)
    at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2493)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:933)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:924)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:924)
    at org.apache.spark.repl.Main$.createSparkSession(Main.scala:103)
    ... 55 elided
<console>:14: error: not found: value spark
       import spark.implicits._
              ^
<console>:14: error: not found: value spark
       import spark.sql
              ^
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.3.1
      /_/

Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_131)
Type in expressions to have them evaluated.
Type :help for more information.

scala> import spark.implicits._
<console>:17: error: not found: value spark
       import spark.implicits._
              ^

scala> :q

Do not trust any solution you find in blog posts online (including this one). Follow the process below and go read the logs yourself!
Link to the logs (the YARN ResourceManager web UI):
http://desktop:8088/cluster
In the page below, click the double-headed arrow to sort the applications by time, then open the top-most application on the left (the one whose ID is underlined).
[Screenshot: ResourceManager application list, sorted by time]
Clicking into that application takes you to the page below; select the most recent log.
[Screenshot: application page with its list of attempt logs]
This gives the following log path:
[Screenshot: the container log path]
Now for something counter-intuitive: the error message we need is not necessarily in stderr; it may well be in stdout, so check both.
[Screenshot: container log listing showing both stderr and stdout]
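If you prefer the command line to clicking through the web UI, the YARN CLI can usually retrieve the same information; yarn logs prints every log file of every container, so stdout and stderr both end up in one place. A sketch (the application ID is a placeholder; substitute the one you picked out above):

yarn application -list -appStates ALL       # find the application ID of the failed run
yarn logs -applicationId <application_id>   # dump all container logs, including stdout and stderr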
In the end, the error message we need turns up in stdout:
Invalid resource request, requested virtual cores < 0, or requested virtual cores > max configured, requestedVirtualCores=4, maxVirtualCores=2

What this message means:
The --executor-cores 4 in your launch command exceeds yarn.scheduler.maximum-allocation-vcores in yarn-site.xml; the value you request must not be larger than that limit. (Note that --executor-cores on the command line is equivalent to the SPARK_EXECUTOR_CORES setting in spark-env.sh.)
In other words, the number of virtual cores being requested (requestedVirtualCores=4) is larger than the configured ceiling (maxVirtualCores=2), so YARN rejects it as an invalid resource request.
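To confirm what limit is actually configured on your cluster, you can check yarn-site.xml directly. A sketch assuming the default Hadoop configuration directory (adjust the path to your installation; if the property is absent, the Hadoop default applies):

grep -A1 "yarn.scheduler.maximum-allocation-vcores" $HADOOP_HOME/etc/hadoop/yarn-site.xml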
There are two ways to fix this: either lower --executor-cores 4 to --executor-cores 2 in your launch command (see the command below), or raise the limit in yarn-site.xml.
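If you take the first option, the launch command stays the same except that the per-executor core count is lowered to the configured maximum:

spark-shell --master yarn --driver-memory 4g --executor-memory 4g --num-executors 6 --executor-cores 2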
Here we take the second route and modify yarn-site.xml. The existing setting:
<property>
    <name>yarn.scheduler.maximum-allocation-vcores</name>
    <value>2</value>
</property>

is changed to:
<property>
    <name>yarn.scheduler.maximum-allocation-vcores</name>
    <value>4</value>
</property>

Then restart Hadoop (do not forget this step).
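The author restarts Hadoop as a whole; since yarn.scheduler.maximum-allocation-vcores is read by the ResourceManager, restarting just the YARN daemons should be enough. A minimal sketch, assuming the standard sbin scripts of a Hadoop installation:

$HADOOP_HOME/sbin/stop-yarn.sh     # stop the ResourceManager and NodeManagers
$HADOOP_HOME/sbin/start-yarn.sh    # start them again so the new limit takes effect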
Run the launch command again:
spark-shell --master yarn --driver-memory 4g --executor-memory 4g --num-executors 6 --executor-cores 4
Repeat the log-checking procedure above; this time the error is:
Failed to connect to driver at x.x.x.x,retrying.....
The ApplicationMaster is failing to connect back to the driver, so tell Spark explicitly which host the driver runs on by changing the command to:
spark-shell --master yarn --driver-memory 4g --executor-memory 4g --num-executors 6 --executor-cores 4 --conf spark.driver.host=Desktop
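If you do not want to pass --conf spark.driver.host on every launch, the same property can be set once in $SPARK_HOME/conf/spark-defaults.conf. A sketch; Desktop is the hostname from the author's environment, so use whatever name the cluster nodes can actually resolve to the driver machine:

spark.driver.host    Desktop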
Takeaways:
The error shown in the terminal and the errors shown in the web UI are not the same thing; you have to work out which error in the web UI is actually behind the terminal error.
Note that this is a case of many causes, one symptom: several different root causes can produce exactly the same terminal error.
So do not just take away this article's conclusion; the point is the process. Go back and walk through the whole log-checking procedure described above yourself!