Crashing in NeoForge DevEnv with NPE
shartte opened this issue · 1 comment
Description
When using spark to profile my dev env, I get an NPE upon uploading the result.
Reproduction Steps
Clone AE2 dev env
Drop spark into run/mods
Run runClient
Profile & upload
Expected Behaviour
It should not crash.
Platform Information
- Minecraft Version: 1.21.1
- Platform Type: joined
- Platform Brand: NeoForge
- Platform Version: 21.1.1
Spark Version
v1.10.95
Logs and Configs
[00:46:28] [spark-worker-pool-1-thread-3/ERROR] [spark/]: Exception occurred whilst executing a spark command
java.lang.NullPointerException: Cannot invoke "Object.getClass()" because "value" is null
at TRANSFORMER/[email protected]/me.lucko.spark.proto.SparkProtos$PluginOrModMetadata.setAuthor(SparkProtos.java:16909)
at TRANSFORMER/[email protected]/me.lucko.spark.proto.SparkProtos$PluginOrModMetadata.access$34500(SparkProtos.java:16780)
at TRANSFORMER/[email protected]/me.lucko.spark.proto.SparkProtos$PluginOrModMetadata$Builder.setAuthor(SparkProtos.java:17151)
at TRANSFORMER/[email protected]/me.lucko.spark.common.sampler.source.SourceMetadata.toProto(SourceMetadata.java:77)
at TRANSFORMER/[email protected]/me.lucko.spark.common.platform.SparkMetadata.writeTo(SparkMetadata.java:132)
at TRANSFORMER/[email protected]/me.lucko.spark.common.sampler.AbstractSampler.writeMetadataToProto(AbstractSampler.java:191)
at TRANSFORMER/[email protected]/me.lucko.spark.common.sampler.java.JavaSampler.toProto(JavaSampler.java:195)
at TRANSFORMER/[email protected]/me.lucko.spark.common.command.modules.SamplerModule.handleUpload(SamplerModule.java:422)
at TRANSFORMER/[email protected]/me.lucko.spark.common.command.modules.SamplerModule.profilerStop(SamplerModule.java:408)
at TRANSFORMER/[email protected]/me.lucko.spark.common.command.modules.SamplerModule.profiler(SamplerModule.java:141)
at TRANSFORMER/[email protected]/me.lucko.spark.common.SparkPlatform.executeCommand0(SparkPlatform.java:462)
at TRANSFORMER/[email protected]/me.lucko.spark.common.SparkPlatform.lambda$executeCommand$2(SparkPlatform.java:359)
at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:572)
at java.base/java.util.concurrent.FutureTask.run$$$capture(FutureTask.java:317)
at java.base/java.util.concurrent.FutureTask.run(FutureTask.java)
at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:304)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
at java.base/java.lang.Thread.run(Thread.java:1583)
Extra Details
Since the debugger breaks on the exception in the IDE, I was able to walk up the stack frames and find the likely culprit:
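For illustration, here is a minimal, self-contained sketch of what the trace suggests is happening: protobuf-generated builders reject null in their setters, so a mod source with no author would make `setAuthor` throw exactly as shown above. The class and field names below are stand-ins, not spark's actual code, and the null guard is only one possible fix:

```java
// Stand-in for a protobuf-generated builder such as
// SparkProtos.PluginOrModMetadata.Builder (names are assumptions).
final class MetadataBuilder {
    private String author = "";

    // Generated protobuf setters throw NullPointerException on null input,
    // matching "Cannot invoke Object.getClass() because value is null".
    MetadataBuilder setAuthor(String value) {
        if (value == null) {
            throw new NullPointerException("value");
        }
        this.author = value;
        return this;
    }

    String author() {
        return author;
    }
}

public class NullAuthorDemo {
    public static void main(String[] args) {
        String modAuthor = null; // e.g. a dev-env mod with no declared author

        MetadataBuilder builder = new MetadataBuilder();
        // Guarding the setter call avoids the crash; the builder keeps
        // its default (empty) author instead.
        if (modAuthor != null) {
            builder.setAuthor(modAuthor);
        }
        System.out.println("author=\"" + builder.author() + "\"");
    }
}
```

Calling `builder.setAuthor(modAuthor)` directly with the null value reproduces the NPE; the guarded version prints `author=""`.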