
Exception starting profiler on NeoForge 1.21.2

jpenilla opened this issue · 1 comment

jpenilla commented

Description

An exception is printed to the log when starting the profiler.

Reproduction Steps

Start a profiler, e.g. run /spark profiler start --timeout 30 --interval 2

Expected Behaviour

Profiler starts normally

Platform Information

  • Minecraft Version: 1.21.2
  • Platform Type: client
  • Platform Brand: NeoForge
  • Platform Version: 21.2.0-beta

Spark Version

v1.10.113

Logs and Configs

[15:58:27] [spark-worker-pool-2-thread-1/ERROR] [spark/]: Exception occurred whilst executing a spark command
[15:58:27] [Render thread/INFO] [minecraft/ChatComponent]: [System] [CHAT] [⚡] Stopping the background profiler before starting... please wait
java.lang.NullPointerException: Cannot invoke "Object.hashCode()" because the return value of "me.lucko.spark.common.command.sender.AbstractCommandSender.getObjectForComparison()" is null
	at TRANSFORMER/[email protected]/me.lucko.spark.common.command.sender.AbstractCommandSender.hashCode(AbstractCommandSender.java:44)
	at java.base/java.util.HashMap.hash(HashMap.java:338)
	at java.base/java.util.HashMap.put(HashMap.java:618)
	at java.base/java.util.HashSet.add(HashSet.java:229)
	at java.base/java.util.stream.ReduceOps$3ReducingSink.accept(ReduceOps.java:169)
	at java.base/java.util.stream.ReferencePipeline$2$1.accept(ReferencePipeline.java:179)
	at java.base/java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:197)
	at java.base/java.util.stream.Streams$StreamBuilderImpl.forEachRemaining(Streams.java:411)
	at java.base/java.util.stream.Streams$ConcatSpliterator.forEachRemaining(Streams.java:735)
	at java.base/java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:509)
	at java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:499)
	at java.base/java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:921)
	at java.base/java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
	at java.base/java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:682)
	at TRANSFORMER/[email protected]/me.lucko.spark.common.command.CommandResponseHandler.allSenders(CommandResponseHandler.java:77)
	at TRANSFORMER/[email protected]/me.lucko.spark.common.command.CommandResponseHandler.broadcast(CommandResponseHandler.java:101)
	at TRANSFORMER/[email protected]/me.lucko.spark.common.command.CommandResponseHandler.broadcastPrefixed(CommandResponseHandler.java:121)
	at TRANSFORMER/[email protected]/me.lucko.spark.common.command.modules.SamplerModule.profilerStart(SamplerModule.java:230)
	at TRANSFORMER/[email protected]/me.lucko.spark.common.command.modules.SamplerModule.profiler(SamplerModule.java:145)
	at TRANSFORMER/[email protected]/me.lucko.spark.common.SparkPlatform.executeCommand0(SparkPlatform.java:473)
	at TRANSFORMER/[email protected]/me.lucko.spark.common.SparkPlatform.lambda$executeCommand$3(SparkPlatform.java:370)
	at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:572)
	at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
	at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:304)
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
	at java.base/java.lang.Thread.run(Thread.java:1583)
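
For context, the trace shows the NPE being triggered when CommandResponseHandler.allSenders collects the command senders into a HashSet: HashSet.add calls hashCode(), which dereferences the object returned by getObjectForComparison(), and on NeoForge 1.21.2 that object is null. Below is a minimal, hypothetical sketch of the same failure pattern; the class and field names are illustrative only, not spark's actual code.

import java.util.Set;
import java.util.stream.Collectors;
import java.util.stream.Stream;

class Sender {
    // stand-in for the object returned by getObjectForComparison();
    // null for whichever sender triggers the crash on NeoForge
    private final Object handle;

    Sender(Object handle) {
        this.handle = handle;
    }

    @Override
    public int hashCode() {
        return handle.hashCode(); // throws NullPointerException when handle is null
    }
}

public class Repro {
    public static void main(String[] args) {
        // mirrors CommandResponseHandler.allSenders collecting senders into a set
        Set<Sender> senders = Stream.of(new Sender(null)).collect(Collectors.toSet());
    }
}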

Extra Details

This issue doesn't happen on Fabric 1.21.2

commented

Pushed a fix, let me know if that works :)
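
For reference, a minimal sketch of a null-safe hashCode/equals on AbstractCommandSender, assuming the fix simply guards against getObjectForComparison() returning null; the class is heavily simplified here and the actual patch may differ.

import java.util.Objects;

public abstract class AbstractCommandSender {

    protected abstract Object getObjectForComparison();

    @Override
    public int hashCode() {
        // Objects.hashCode returns 0 for a null argument instead of throwing
        return Objects.hashCode(getObjectForComparison());
    }

    @Override
    public boolean equals(Object obj) {
        if (this == obj) {
            return true;
        }
        if (!(obj instanceof AbstractCommandSender other)) {
            return false;
        }
        // Objects.equals tolerates null on either side
        return Objects.equals(this.getObjectForComparison(), other.getObjectForComparison());
    }
}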