Explore all Spark open source software, libraries, packages, source code, cloud functions and APIs.

Popular New Releases in Spark

elasticsearch: Elasticsearch 8.1.3
xgboost: Release candidate of version 1.6.0
kibana: Kibana 8.1.3
luigi: 3.0.3
mlflow: MLflow 1.25.1

Popular Libraries in Spark

elasticsearch
by elastic (Java), 59266 stars, NOASSERTION license
Free and Open, Distributed, RESTful Search Engine
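
As a rough illustration of talking to Elasticsearch from code, here is a minimal sketch using the official Python client (it assumes a node running locally on the default port; the index name and document are invented for the example, and keyword-argument names vary slightly between client versions):

from elasticsearch import Elasticsearch

# Connect to a locally running node (adjust the URL for your setup)
es = Elasticsearch("http://localhost:9200")

# Index a document, make it searchable, then run a full-text match query
es.index(index="articles", id="1", document={"title": "Searching Spark logs", "views": 42})
es.indices.refresh(index="articles")

resp = es.search(index="articles", query={"match": {"title": "spark"}})
for hit in resp["hits"]["hits"]:
    print(hit["_source"])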

spark
by apache (Scala), 32507 stars, Apache-2.0 license
Apache Spark - A unified analytics engine for large-scale data processing
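
As a quick taste of the DataFrame API, here is a minimal PySpark sketch (assuming a local installation with the pyspark package; the data and column names are made up for the example):

from pyspark.sql import SparkSession

# Start (or reuse) a local Spark session
spark = SparkSession.builder.master("local[*]").appName("kandi-demo").getOrCreate()

# A tiny in-memory DataFrame, purely for illustration
df = spark.createDataFrame(
    [("alice", 34), ("bob", 29), ("alice", 41)],
    ["name", "age"],
)

# A typical aggregation: average age per name
df.groupBy("name").avg("age").show()

spark.stop()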

xgboost
by dmlc (C++), 22464 stars, Apache-2.0 license
Scalable, Portable and Distributed Gradient Boosting (GBDT, GBRT or GBM) Library, for Python, R, Java, Scala, C++ and more. Runs on single machine, Hadoop, Spark, Dask, Flink and DataFlow
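
For reference, a minimal sketch of the Python interface (assuming xgboost, numpy and scikit-learn are installed; the data is synthetic and the hyperparameters are arbitrary):

import numpy as np
import xgboost as xgb
from sklearn.model_selection import train_test_split

# Synthetic binary-classification data, purely for illustration
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 10))
y = (X[:, 0] + X[:, 1] > 0).astype(int)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Train a small gradient-boosted model with the native DMatrix API
dtrain = xgb.DMatrix(X_train, label=y_train)
dtest = xgb.DMatrix(X_test, label=y_test)
params = {"objective": "binary:logistic", "max_depth": 3, "eta": 0.1}
model = xgb.train(params, dtrain, num_boost_round=50)

# Predicted probabilities for the held-out rows
print(model.predict(dtest)[:5])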

kafka
by apache (Java), 21667 stars, Apache-2.0 license
Mirror of Apache Kafka

data-science-ipython-notebooks
by donnemartin (Python), 21519 stars, NOASSERTION license
Data science Python notebooks: Deep learning (TensorFlow, Theano, Caffe, Keras), scikit-learn, Kaggle, big data (Spark, Hadoop MapReduce, HDFS), matplotlib, pandas, NumPy, SciPy, Python essentials, AWS, and various command lines.

flink
by apache (Java), 18609 stars, Apache-2.0 license
Apache Flink

kibana
by elastic (TypeScript), 17328 stars, NOASSERTION license
Your window into the Elastic Stack

luigi
by spotify (Python), 14716 stars, Apache-2.0 license
Luigi is a Python module that helps you build complex pipelines of batch jobs. It handles dependency resolution, workflow management, visualization etc. It also comes with Hadoop support built in.
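
A minimal sketch of what a Luigi task looks like (the file paths and task name are invented for the example):

import luigi


class WordCount(luigi.Task):
    """Count the words in an input file and write the total to an output file."""

    input_path = luigi.Parameter(default="input.txt")

    def output(self):
        # Luigi uses the output target to decide whether the task still needs to run
        return luigi.LocalTarget("word_count.txt")

    def run(self):
        with open(self.input_path) as src, self.output().open("w") as dst:
            dst.write(str(sum(len(line.split()) for line in src)))


if __name__ == "__main__":
    # local_scheduler=True avoids needing a running luigid instance
    luigi.build([WordCount()], local_scheduler=True)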

presto
by prestodb (Java), 13394 stars, Apache-2.0 license
The official home of the Presto distributed SQL query engine for big data

Trending New libraries in Spark

airbyte
by airbytehq (Java), 6468 stars, NOASSERTION license
Airbyte is an open-source EL(T) platform that helps you replicate your data in your warehouses, lakes and databases.

orchest
by orchest (Python), 2877 stars, AGPL-3.0 license
Build data pipelines, the easy way

SZT-bigdata
by geekyouth (Scala), 1137 stars, GPL-3.0 license
Shenzhen Metro big data passenger flow analysis system

goodreads_etl_pipeline
by san089 (Python), 832 stars, MIT license
An end-to-end GoodReads Data Pipeline for Building Data Lake, Data Warehouse and Analytics Platform.

notebooks
by huggingface (Jupyter Notebook), 824 stars, Apache-2.0 license
Notebooks using the Hugging Face libraries

dlink
by DataLinkDC (Java), 735 stars, Apache-2.0 license
Dinky is an out-of-the-box, one-stop real-time computing platform dedicated to unified batch & streaming and unified data lake & data warehouse workloads. Based on Apache Flink, Dinky can connect to many big data frameworks, including OLAP and data lake systems.

fugue
by fugue-project (Python), 626 stars, Apache-2.0 license
A unified interface for distributed computing. Fugue executes SQL, Python, and Pandas code on Spark and Dask without any rewrites.
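
A rough sketch of the idea, based on Fugue's transform entry point (the pandas function, column names and schema string are invented for the example, and details may differ between Fugue versions):

import pandas as pd
from fugue import transform

def add_double(df: pd.DataFrame) -> pd.DataFrame:
    # Plain pandas logic, written once
    return df.assign(doubled=df["value"] * 2)

data = pd.DataFrame({"value": [1, 2, 3]})

# Run locally on pandas; passing engine="spark" would (with pyspark installed)
# run the same function distributed on Spark without rewriting it
result = transform(data, add_double, schema="*, doubled:long")
print(result)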

notebooker
by man-group (Python), 603 stars, AGPL-3.0 license
Productionise & schedule your Jupyter Notebooks as easily as you wrote them.

Udacity-Data-Engineering-Projects
by san089 (Python), 521 stars, NOASSERTION license
A few projects related to Data Engineering, including Data Modeling, infrastructure setup on the cloud, Data Warehousing and Data Lake development.

Top Authors in Spark

1. PacktPublishing (101 Libraries, 2708)
2. apache (90 Libraries, 154349)
3. aws-samples (42 Libraries, 1369)
4. awslabs (24 Libraries, 9195)
5. databricks (22 Libraries, 16423)
6. MrPowers (21 Libraries, 1557)
7. jgperrin (20 Libraries, 220)
8. mraad (20 Libraries, 210)
9. pkourany (19 Libraries, 165)
10. knoldus (19 Libraries, 261)

Trending Kits in Spark

No Trending Kits are available at this moment for Spark

Trending Discussions on Spark

    spark-shell throws java.lang.reflect.InvocationTargetException on running
    Why joining structure-identic dataframes gives different results?
    AttributeError: Can't get attribute 'new_block' on <module 'pandas.core.internals.blocks'>
    Problems when writing parquet with timestamps prior to 1900 in AWS Glue 3.0
    NoSuchMethodError on com.fasterxml.jackson.dataformat.xml.XmlMapper.coercionConfigDefaults()
    Cannot find conda info. Please verify your conda installation on EMR
    How to set Docker Compose `env_file` relative to `.yml` file when multiple `--file` option is used?
    Read spark data with column that clashes with partition name
    How do I parse xml documents in Palantir Foundry?
    docker build vue3 not compatible with element-ui on node:16-buster-slim

QUESTION

spark-shell throws java.lang.reflect.InvocationTargetException on running

Asked 2022-Apr-01 at 19:53

When I execute run-example SparkPi, for example, it works perfectly, but when I run spark-shell, it throws these exceptions:

1WARNING: An illegal reflective access operation has occurred
2WARNING: Illegal reflective access by org.apache.spark.unsafe.Platform (file:/C:/big_data/spark-3.2.0-bin-hadoop3.2-scala2.13/jars/spark-unsafe_2.13-3.2.0.jar) to constructor java.nio.DirectByteBuffer(long,int)
3WARNING: Please consider reporting this to the maintainers of org.apache.spark.unsafe.Platform
4WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
5WARNING: All illegal access operations will be denied in a future release
6Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
7Setting default log level to "WARN".
8To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
9Welcome to
10      ____              __
11     / __/__  ___ _____/ /__
12    _\ \/ _ \/ _ `/ __/  '_/
13   /___/ .__/\_,_/_/ /_/\_\   version 3.2.0
14      /_/
15
16Using Scala version 2.13.5 (OpenJDK 64-Bit Server VM, Java 11.0.9.1)
17Type in expressions to have them evaluated.
18Type :help for more information.
1921/12/11 19:28:36 ERROR SparkContext: Error initializing SparkContext.
20java.lang.reflect.InvocationTargetException
21        at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
22        at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
23        at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
24        at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:490)
25        at org.apache.spark.executor.Executor.addReplClassLoaderIfNeeded(Executor.scala:909)
26        at org.apache.spark.executor.Executor.<init>(Executor.scala:160)
27        at org.apache.spark.scheduler.local.LocalEndpoint.<init>(LocalSchedulerBackend.scala:64)
28        at org.apache.spark.scheduler.local.LocalSchedulerBackend.start(LocalSchedulerBackend.scala:132)
29        at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:220)
30        at org.apache.spark.SparkContext.<init>(SparkContext.scala:581)
31        at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2690)
32        at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$2(SparkSession.scala:949)
33        at scala.Option.getOrElse(Option.scala:201)
34        at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:943)
35        at org.apache.spark.repl.Main$.createSparkSession(Main.scala:114)
36        at $line3.$read$$iw.<init>(<console>:5)
37        at $line3.$read.<init>(<console>:4)
38        at $line3.$read$.<clinit>(<console>)
39        at $line3.$eval$.$print$lzycompute(<synthetic>:6)
40        at $line3.$eval$.$print(<synthetic>:5)
41        at $line3.$eval.$print(<synthetic>)
42        at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
43        at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
44        at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
45        at java.base/java.lang.reflect.Method.invoke(Method.java:566)
46        at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:670)
47        at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1006)
48        at scala.tools.nsc.interpreter.IMain.$anonfun$doInterpret$1(IMain.scala:506)
49        at scala.reflect.internal.util.ScalaClassLoader.asContext(ScalaClassLoader.scala:36)
50        at scala.reflect.internal.util.ScalaClassLoader.asContext$(ScalaClassLoader.scala:116)
51        at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:43)
52        at scala.tools.nsc.interpreter.IMain.loadAndRunReq$1(IMain.scala:505)
53        at scala.tools.nsc.interpreter.IMain.$anonfun$doInterpret$3(IMain.scala:519)
54        at scala.tools.nsc.interpreter.IMain.doInterpret(IMain.scala:519)
55        at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:503)
56        at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:501)
57        at scala.tools.nsc.interpreter.IMain.$anonfun$quietRun$1(IMain.scala:216)
58        at scala.tools.nsc.interpreter.shell.ReplReporterImpl.withoutPrintingResults(Reporter.scala:64)
59        at scala.tools.nsc.interpreter.IMain.quietRun(IMain.scala:216)
60        at scala.tools.nsc.interpreter.shell.ILoop.$anonfun$interpretPreamble$1(ILoop.scala:924)
61        at scala.collection.immutable.List.foreach(List.scala:333)
62        at scala.tools.nsc.interpreter.shell.ILoop.interpretPreamble(ILoop.scala:924)
63        at scala.tools.nsc.interpreter.shell.ILoop.$anonfun$run$3(ILoop.scala:963)
64        at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18)
65        at scala.tools.nsc.interpreter.shell.ILoop.echoOff(ILoop.scala:90)
66        at scala.tools.nsc.interpreter.shell.ILoop.$anonfun$run$2(ILoop.scala:963)
67        at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18)
68        at scala.tools.nsc.interpreter.IMain.withSuppressedSettings(IMain.scala:1406)
69        at scala.tools.nsc.interpreter.shell.ILoop.$anonfun$run$1(ILoop.scala:954)
70        at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18)
71        at scala.tools.nsc.interpreter.shell.ReplReporterImpl.withoutPrintingResults(Reporter.scala:64)
72        at scala.tools.nsc.interpreter.shell.ILoop.run(ILoop.scala:954)
73        at org.apache.spark.repl.Main$.doMain(Main.scala:84)
74        at org.apache.spark.repl.Main$.main(Main.scala:59)
75        at org.apache.spark.repl.Main.main(Main.scala)
76        at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
77        at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
78        at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
79        at java.base/java.lang.reflect.Method.invoke(Method.java:566)
80        at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
81        at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:955)
82        at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
83        at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
84        at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
85        at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1043)
86        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1052)
87        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
88Caused by: java.net.URISyntaxException: Illegal character in path at index 42: spark://DESKTOP-JO73CF4.mshome.net:2103/C:\classes
89        at java.base/java.net.URI$Parser.fail(URI.java:2913)
90        at java.base/java.net.URI$Parser.checkChars(URI.java:3084)
91        at java.base/java.net.URI$Parser.parseHierarchical(URI.java:3166)
92        at java.base/java.net.URI$Parser.parse(URI.java:3114)
93        at java.base/java.net.URI.<init>(URI.java:600)
94        at org.apache.spark.repl.ExecutorClassLoader.<init>(ExecutorClassLoader.scala:57)
95        ... 67 more
9621/12/11 19:28:36 ERROR Utils: Uncaught exception in thread main
97java.lang.NullPointerException
98        at org.apache.spark.scheduler.local.LocalSchedulerBackend.org$apache$spark$scheduler$local$LocalSchedulerBackend$$stop(LocalSchedulerBackend.scala:173)
99        at org.apache.spark.scheduler.local.LocalSchedulerBackend.stop(LocalSchedulerBackend.scala:144)
100        at org.apache.spark.scheduler.TaskSchedulerImpl.stop(TaskSchedulerImpl.scala:927)
101        at org.apache.spark.scheduler.DAGScheduler.stop(DAGScheduler.scala:2516)
102        at org.apache.spark.SparkContext.$anonfun$stop$12(SparkContext.scala:2086)
103        at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1442)
104        at org.apache.spark.SparkContext.stop(SparkContext.scala:2086)
105        at org.apache.spark.SparkContext.<init>(SparkContext.scala:677)
106        at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2690)
107        at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$2(SparkSession.scala:949)
108        at scala.Option.getOrElse(Option.scala:201)
109        at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:943)
110        at org.apache.spark.repl.Main$.createSparkSession(Main.scala:114)
111        at $line3.$read$$iw.<init>(<console>:5)
112        at $line3.$read.<init>(<console>:4)
113        at $line3.$read$.<clinit>(<console>)
114        at $line3.$eval$.$print$lzycompute(<synthetic>:6)
115        at $line3.$eval$.$print(<synthetic>:5)
116        at $line3.$eval.$print(<synthetic>)
117        at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
118        at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
119        at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
120        at java.base/java.lang.reflect.Method.invoke(Method.java:566)
121        at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:670)
122        at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1006)
123        at scala.tools.nsc.interpreter.IMain.$anonfun$doInterpret$1(IMain.scala:506)
124        at scala.reflect.internal.util.ScalaClassLoader.asContext(ScalaClassLoader.scala:36)
125        at scala.reflect.internal.util.ScalaClassLoader.asContext$(ScalaClassLoader.scala:116)
126        at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:43)
127        at scala.tools.nsc.interpreter.IMain.loadAndRunReq$1(IMain.scala:505)
128        at scala.tools.nsc.interpreter.IMain.$anonfun$doInterpret$3(IMain.scala:519)
129        at scala.tools.nsc.interpreter.IMain.doInterpret(IMain.scala:519)
130        at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:503)
131        at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:501)
132        at scala.tools.nsc.interpreter.IMain.$anonfun$quietRun$1(IMain.scala:216)
133        at scala.tools.nsc.interpreter.shell.ReplReporterImpl.withoutPrintingResults(Reporter.scala:64)
134        at scala.tools.nsc.interpreter.IMain.quietRun(IMain.scala:216)
135        at scala.tools.nsc.interpreter.shell.ILoop.$anonfun$interpretPreamble$1(ILoop.scala:924)
136        at scala.collection.immutable.List.foreach(List.scala:333)
137        at scala.tools.nsc.interpreter.shell.ILoop.interpretPreamble(ILoop.scala:924)
138        at scala.tools.nsc.interpreter.shell.ILoop.$anonfun$run$3(ILoop.scala:963)
139        at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18)
140        at scala.tools.nsc.interpreter.shell.ILoop.echoOff(ILoop.scala:90)
141        at scala.tools.nsc.interpreter.shell.ILoop.$anonfun$run$2(ILoop.scala:963)
142        at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18)
143        at scala.tools.nsc.interpreter.IMain.withSuppressedSettings(IMain.scala:1406)
144        at scala.tools.nsc.interpreter.shell.ILoop.$anonfun$run$1(ILoop.scala:954)
145        at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18)
146        at scala.tools.nsc.interpreter.shell.ReplReporterImpl.withoutPrintingResults(Reporter.scala:64)
147        at scala.tools.nsc.interpreter.shell.ILoop.run(ILoop.scala:954)
148        at org.apache.spark.repl.Main$.doMain(Main.scala:84)
149        at org.apache.spark.repl.Main$.main(Main.scala:59)
150        at org.apache.spark.repl.Main.main(Main.scala)
151        at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
152        at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
153        at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
154        at java.base/java.lang.reflect.Method.invoke(Method.java:566)
155        at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
156        at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:955)
157        at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
158        at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
159        at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
160        at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1043)
161        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1052)
162        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
16321/12/11 19:28:36 WARN MetricsSystem: Stopping a MetricsSystem that is not running
16421/12/11 19:28:36 ERROR Main: Failed to initialize Spark session.
165java.lang.reflect.InvocationTargetException
166        at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
167        at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
168        at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
169        at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:490)
170        at org.apache.spark.executor.Executor.addReplClassLoaderIfNeeded(Executor.scala:909)
171        at org.apache.spark.executor.Executor.<init>(Executor.scala:160)
172        at org.apache.spark.scheduler.local.LocalEndpoint.<init>(LocalSchedulerBackend.scala:64)
173        at org.apache.spark.scheduler.local.LocalSchedulerBackend.start(LocalSchedulerBackend.scala:132)
174        at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:220)
175        at org.apache.spark.SparkContext.<init>(SparkContext.scala:581)
176        at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2690)
177        at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$2(SparkSession.scala:949)
178        at scala.Option.getOrElse(Option.scala:201)
179        at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:943)
180        at org.apache.spark.repl.Main$.createSparkSession(Main.scala:114)
181        at $line3.$read$$iw.<init>(<console>:5)
182        at $line3.$read.<init>(<console>:4)
183        at $line3.$read$.<clinit>(<console>)
184        at $line3.$eval$.$print$lzycompute(<synthetic>:6)
185        at $line3.$eval$.$print(<synthetic>:5)
186        at $line3.$eval.$print(<synthetic>)
187        at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
188        at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
189        at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
190        at java.base/java.lang.reflect.Method.invoke(Method.java:566)
191        at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:670)
192        at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1006)
193        at scala.tools.nsc.interpreter.IMain.$anonfun$doInterpret$1(IMain.scala:506)
194        at scala.reflect.internal.util.ScalaClassLoader.asContext(ScalaClassLoader.scala:36)
195        at scala.reflect.internal.util.ScalaClassLoader.asContext$(ScalaClassLoader.scala:116)
196        at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:43)
197        at scala.tools.nsc.interpreter.IMain.loadAndRunReq$1(IMain.scala:505)
198        at scala.tools.nsc.interpreter.IMain.$anonfun$doInterpret$3(IMain.scala:519)
199        at scala.tools.nsc.interpreter.IMain.doInterpret(IMain.scala:519)
200        at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:503)
201        at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:501)
202        at scala.tools.nsc.interpreter.IMain.$anonfun$quietRun$1(IMain.scala:216)
203        at scala.tools.nsc.interpreter.shell.ReplReporterImpl.withoutPrintingResults(Reporter.scala:64)
204        at scala.tools.nsc.interpreter.IMain.quietRun(IMain.scala:216)
205        at scala.tools.nsc.interpreter.shell.ILoop.$anonfun$interpretPreamble$1(ILoop.scala:924)
206        at scala.collection.immutable.List.foreach(List.scala:333)
207        at scala.tools.nsc.interpreter.shell.ILoop.interpretPreamble(ILoop.scala:924)
208        at scala.tools.nsc.interpreter.shell.ILoop.$anonfun$run$3(ILoop.scala:963)
209        at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18)
210        at scala.tools.nsc.interpreter.shell.ILoop.echoOff(ILoop.scala:90)
211        at scala.tools.nsc.interpreter.shell.ILoop.$anonfun$run$2(ILoop.scala:963)
212        at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18)
213        at scala.tools.nsc.interpreter.IMain.withSuppressedSettings(IMain.scala:1406)
214        at scala.tools.nsc.interpreter.shell.ILoop.$anonfun$run$1(ILoop.scala:954)
215        at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18)
216        at scala.tools.nsc.interpreter.shell.ReplReporterImpl.withoutPrintingResults(Reporter.scala:64)
217        at scala.tools.nsc.interpreter.shell.ILoop.run(ILoop.scala:954)
218        at org.apache.spark.repl.Main$.doMain(Main.scala:84)
219        at org.apache.spark.repl.Main$.main(Main.scala:59)
220        at org.apache.spark.repl.Main.main(Main.scala)
221        at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
222        at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
223        at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
224        at java.base/java.lang.reflect.Method.invoke(Method.java:566)
225        at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
226        at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:955)
227        at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
228        at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
229        at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
230        at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1043)
231        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1052)
232        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
233Caused by: java.net.URISyntaxException: Illegal character in path at index 42: spark://DESKTOP-JO73CF4.mshome.net:2103/C:\classes
234        at java.base/java.net.URI$Parser.fail(URI.java:2913)
235        at java.base/java.net.URI$Parser.checkChars(URI.java:3084)
236        at java.base/java.net.URI$Parser.parseHierarchical(URI.java:3166)
237        at java.base/java.net.URI$Parser.parse(URI.java:3114)
238        at java.base/java.net.URI.<init>(URI.java:600)
239        at org.apache.spark.repl.ExecutorClassLoader.<init>(ExecutorClassLoader.scala:57)
240        ... 67 more
24121/12/11 19:28:36 ERROR Utils: Uncaught exception in thread shutdown-hook-0
242java.lang.ExceptionInInitializerError
243        at org.apache.spark.executor.Executor.stop(Executor.scala:333)
244        at org.apache.spark.executor.Executor.$anonfun$stopHookReference$1(Executor.scala:76)
245        at org.apache.spark.util.SparkShutdownHook.run(ShutdownHookManager.scala:214)
246        at org.apache.spark.util.SparkShutdownHookManager.$anonfun$runAll$2(ShutdownHookManager.scala:188)
247        at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18)
248        at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:2019)
249        at org.apache.spark.util.SparkShutdownHookManager.$anonfun$runAll$1(ShutdownHookManager.scala:188)
250        at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18)
251        at scala.util.Try$.apply(Try.scala:210)
252        at org.apache.spark.util.SparkShutdownHookManager.runAll(ShutdownHookManager.scala:188)
253        at org.apache.spark.util.SparkShutdownHookManager$$anon$2.run(ShutdownHookManager.scala:178)
254        at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
255        at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
256        at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
257        at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
258        at java.base/java.lang.Thread.run(Thread.java:829)
259Caused by: java.lang.NullPointerException
260        at org.apache.spark.shuffle.ShuffleBlockPusher$.<clinit>(ShuffleBlockPusher.scala:465)
261        ... 16 more
26221/12/11 19:28:36 WARN ShutdownHookManager: ShutdownHook '' failed, java.util.concurrent.ExecutionException: java.lang.ExceptionInInitializerError
263java.util.concurrent.ExecutionException: java.lang.ExceptionInInitializerError
264        at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122)
265        at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:205)
266        at org.apache.hadoop.util.ShutdownHookManager.executeShutdown(ShutdownHookManager.java:124)
267        at org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:95)
268Caused by: java.lang.ExceptionInInitializerError
269        at org.apache.spark.executor.Executor.stop(Executor.scala:333)
270        at org.apache.spark.executor.Executor.$anonfun$stopHookReference$1(Executor.scala:76)
271        at org.apache.spark.util.SparkShutdownHook.run(ShutdownHookManager.scala:214)
272        at org.apache.spark.util.SparkShutdownHookManager.$anonfun$runAll$2(ShutdownHookManager.scala:188)
273        at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18)
274        at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:2019)
275        at org.apache.spark.util.SparkShutdownHookManager.$anonfun$runAll$1(ShutdownHookManager.scala:188)
276        at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18)
277        at scala.util.Try$.apply(Try.scala:210)
278        at org.apache.spark.util.SparkShutdownHookManager.runAll(ShutdownHookManager.scala:188)
279        at org.apache.spark.util.SparkShutdownHookManager$$anon$2.run(ShutdownHookManager.scala:178)
280        at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
281        at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
282        at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
283        at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
284        at java.base/java.lang.Thread.run(Thread.java:829)
285Caused by: java.lang.NullPointerException
286        at org.apache.spark.shuffle.ShuffleBlockPusher$.<clinit>(ShuffleBlockPusher.scala:465)
287        ... 16 more
288

As far as I can see, it is caused by Illegal character in path at index 42: spark://DESKTOP-JO73CF4.mshome.net:2103/C:\classes, but I don't understand what that means exactly or how to deal with it.

How can I solve this problem?

I use Spark 3.2.0, pre-built for Apache Hadoop 3.3 and later (Scala 2.13).

The JAVA_HOME, HADOOP_HOME and SPARK_HOME environment variables are set.

ANSWER

Answered 2022-Jan-07 at 15:11

I faced the same problem; I think Spark 3.2 itself is the problem.

After switching to Spark 3.1.2, it works fine.

Source https://stackoverflow.com/questions/70317481
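
As a quick sanity check after such a downgrade, a short PySpark session can confirm that the Spark installation starts and runs a trivial job (a sketch only; it assumes SPARK_HOME now points at the 3.1.2 distribution and that the pyspark package is available):

from pyspark.sql import SparkSession

# If the installation is healthy, this should start a local session without errors
spark = SparkSession.builder.master("local[*]").appName("smoke-test").getOrCreate()
print(spark.version)             # expected to report 3.1.2 after the downgrade
print(spark.range(10).count())   # trivial job to confirm the local scheduler works
spark.stop()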

Community Discussions contain sources that include the Stack Exchange Network.
