What I want to achieve
I want to set up a Spark development environment in IntelliJ IDEA.
Prerequisites
Build system: sbt
JDK: OpenJDK 17.0.6
Scala: 2.13.10
Apache Spark: 3.3.2
While implementing the ■■ feature, I ran into the error message below.
Problem / error message
```
Connected to the target VM, address: '127.0.0.1:51070', transport: 'socket'
Using Spark's default log4j profile: org/apache/spark/log4j2-defaults.properties
23/02/20 22:47:13 WARN Utils: Your hostname, hasegawayoshionoMacBook-Air.local resolves to a loopback address: 127.0.0.1; using 192.168.0.3 instead (on interface en0)
23/02/20 22:47:13 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
23/02/20 22:47:13 INFO SparkContext: Running Spark version 3.3.2
23/02/20 22:47:13 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
23/02/20 22:47:13 INFO ResourceUtils: ==============================================================
23/02/20 22:47:13 INFO ResourceUtils: No custom resources configured for spark.driver.
23/02/20 22:47:13 INFO ResourceUtils: ==============================================================
23/02/20 22:47:13 INFO SparkContext: Submitted application: 31c2c321-8c36-4df0-a1fa-979035f786da
23/02/20 22:47:13 INFO ResourceProfile: Default ResourceProfile created, executor resources: Map(cores -> name: cores, amount: 1, script: , vendor: , memory -> name: memory, amount: 1024, script: , vendor: , offHeap -> name: offHeap, amount: 0, script: , vendor: ), task resources: Map(cpus -> name: cpus, amount: 1.0)
23/02/20 22:47:13 INFO ResourceProfile: Limiting resource is cpu
23/02/20 22:47:13 INFO ResourceProfileManager: Added ResourceProfile id: 0
23/02/20 22:47:13 INFO SecurityManager: Changing view acls to: hasegawayoshio
23/02/20 22:47:13 INFO SecurityManager: Changing modify acls to: hasegawayoshio
23/02/20 22:47:13 INFO SecurityManager: Changing view acls groups to:
23/02/20 22:47:13 INFO SecurityManager: Changing modify acls groups to:
23/02/20 22:47:13 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(hasegawayoshio); groups with view permissions: Set(); users with modify permissions: Set(hasegawayoshio); groups with modify permissions: Set()
23/02/20 22:47:13 INFO Utils: Successfully started service 'sparkDriver' on port 51073.
23/02/20 22:47:13 INFO SparkEnv: Registering MapOutputTracker
23/02/20 22:47:13 INFO SparkEnv: Registering BlockManagerMaster
23/02/20 22:47:13 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
23/02/20 22:47:13 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
Exception in thread "main" java.lang.IllegalAccessError: class org.apache.spark.storage.StorageUtils$ (in unnamed module @0x791f145a) cannot access class sun.nio.ch.DirectBuffer (in module java.base) because module java.base does not export sun.nio.ch to unnamed module @0x791f145a
	at org.apache.spark.storage.StorageUtils$.<clinit>(StorageUtils.scala:213)
	at org.apache.spark.storage.BlockManagerMasterEndpoint.<init>(BlockManagerMasterEndpoint.scala:114)
	at org.apache.spark.SparkEnv$.$anonfun$create$9(SparkEnv.scala:353)
	at org.apache.spark.SparkEnv$.registerOrLookupEndpoint$1(SparkEnv.scala:290)
	at org.apache.spark.SparkEnv$.create(SparkEnv.scala:339)
	at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:194)
	at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:279)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:464)
	at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2714)
	at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$2(SparkSession.scala:953)
	at scala.Option.getOrElse(Option.scala:201)
	at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:947)
	at spark$.<clinit>(spark.scala:5)
	at spark.main(spark.scala)
Disconnected from the target VM, address: '127.0.0.1:51070', transport: 'socket'
```
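From the stack trace, the fatal part appears to be the `IllegalAccessError`: on JDK 17 the module system no longer exports the internal `sun.nio.ch` package to unnamed modules, and `org.apache.spark.storage.StorageUtils$` touches `sun.nio.ch.DirectBuffer` during initialization. My assumption (not yet verified) is that Spark's own launch scripts pass the required `--add-exports` flags, so a run configuration started directly from IntelliJ has to supply the flag itself, e.g. in Run/Debug Configurations → VM options:

```
--add-exports=java.base/sun.nio.ch=ALL-UNNAMED
```

Alternatively, running on JDK 8 or 11 (which Spark 3.3 also supports) would avoid the module restriction entirely.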
Relevant source code
build.sbt
```scala
ThisBuild / version := "0.1.0-SNAPSHOT"

ThisBuild / scalaVersion := "2.13.10"

lazy val root = (project in file("."))
  .settings(
    name := "spark"
  )

libraryDependencies += "org.apache.spark" %% "spark-sql" % "3.3.2"
libraryDependencies += "org.apache.spark" %% "spark-core" % "3.3.2"
```
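If the project is also run through `sbt run`, the same flag can presumably be forwarded by forking the run into its own JVM; a minimal sketch of the extra settings, assuming the single `--add-exports` flag above is sufficient:

```scala
// Sketch: fork the `run` task into a separate JVM so javaOptions take effect,
// then export sun.nio.ch to the unnamed module for Spark on JDK 17.
fork := true
javaOptions += "--add-exports=java.base/sun.nio.ch=ALL-UNNAMED"
```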
spark.scala
```scala
import org.apache.spark.sql.{DataFrame, SparkSession}

object spark {

  val sc: SparkSession = SparkSession.builder.master(master = "local[2]").getOrCreate()

  def main(args: Array[String]): Unit = {
    val df = sc.read.csv(path = "data.csv")
    df.show()
  }
}
```
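For reference, the same program as a self-contained sketch (my rewrite, not tested) that creates the session inside `main` and stops it when done; it assumes `data.csv` sits in the working directory:

```scala
import org.apache.spark.sql.SparkSession

object spark {
  def main(args: Array[String]): Unit = {
    // Local session with two worker threads; getOrCreate reuses an existing session if present.
    val session: SparkSession = SparkSession.builder.master("local[2]").getOrCreate()
    val df = session.read.csv("data.csv") // no schema inference: each column is read as a string
    df.show()
    session.stop() // shut the driver down cleanly
  }
}
```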
What I tried
I followed the YouTube video below:
https://www.youtube.com/watch?v=ugFBalvTEcE