When I run spark-submit with a jar built from Scala + sbt, the exception below is raised and the Spark application fails.
The error occurs because a folder that Spark creates at startup cannot be deleted when Spark tries to remove it.
Does anyone know why this deletion fails?
Also, is it possible to change the location where this directory is created?
2018-06-01 00:10:27 INFO BlockManagerMaster:54 - BlockManagerMaster stopped
2018-06-01 00:10:27 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint:54 - OutputCommitCoordinator stopped!
2018-06-01 00:10:27 WARN SparkEnv:87 - Exception while deleting Spark temp dir: C:\Users\hoge\AppData\Local\Temp\spark-77ff060d-d64c-44e9-9663-d3eef3ddc312\userFiles-87718b89-3d72-4415-92bb-6c3dca83c423
java.io.IOException: Failed to delete: C:\Users\hoge\AppData\Local\Temp\spark-77ff060d-d64c-44e9-9663-d3eef3ddc312\userFiles-87718b89-3d72-4415-92bb-6c3dca83c423
	at org.apache.spark.util.Utils$.deleteRecursively(Utils.scala:1070)
	at org.apache.spark.SparkEnv.stop(SparkEnv.scala:103)
	at org.apache.spark.SparkContext$$anonfun$stop$11.apply$mcV$sp(SparkContext.scala:1940)
	at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1357)
	at org.apache.spark.SparkContext.stop(SparkContext.scala:1939)
	at org.apache.spark.SparkContext$$anonfun$2.apply$mcV$sp(SparkContext.scala:572)
	at org.apache.spark.util.SparkShutdownHook.run(ShutdownHookManager.scala:216)
	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ShutdownHookManager.scala:188)
	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:188)
	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:188)
	at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1988)
	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply$mcV$sp(ShutdownHookManager.scala:188)
	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:188)
	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:188)
	at scala.util.Try$.apply(Try.scala:192)
	at org.apache.spark.util.SparkShutdownHookManager.runAll(ShutdownHookManager.scala:188)
	at org.apache.spark.util.SparkShutdownHookManager$$anon$2.run(ShutdownHookManager.scala:178)
	at org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:54)
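For reference, this is roughly how I imagine the temp-dir location would be overridden at submit time. This is only a sketch: the class name, jar path, and target directory are placeholders, and I am assuming (not certain) that spark.local.dir and/or java.io.tmpdir are the properties that control where this spark-*/userFiles-* directory is created.

```shell
# Sketch of a spark-submit invocation with a candidate temp-dir override.
# com.example.Main, myapp.jar, and C:\spark-temp are placeholders;
# spark.local.dir / java.io.tmpdir are assumptions I have not verified.
spark-submit ^
  --class com.example.Main ^
  --master "local[*]" ^
  --conf spark.local.dir=C:\spark-temp ^
  --driver-java-options "-Djava.io.tmpdir=C:\spark-temp" ^
  target\scala-2.11\myapp.jar
```

(Using `^` for Windows cmd line continuation, since the paths above are Windows paths.) If either of those settings is the right one, I would expect the spark-* directory to appear under C:\spark-temp instead of %TEMP%.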