spark


Purpur 1.19 server: /spark profiler --stop does not work

authorless opened this issue · 1 comment

commented

I am running into a problem: when I want to check the server's performance with spark, I run /spark profiler and the async profiler starts. But as soon as I run /spark profiler --stop, an error appears in the console.
Java version: 17
I am using the Purpur 1.19 server core.
Current: git-Purpur-1735 (MC: 1.19)*

  • You are running the latest version
[17:07:55 INFO]: ThuDanirex issued server command: /spark profiler --stop
[17:07:55 INFO]: [⚡] The active profiler has been stopped! Uploading results...
[17:07:55 ERROR]: [spark] Exception occurred whilst executing a spark command
[17:07:55 WARN]: java.nio.BufferUnderflowException
[17:07:55 WARN]: at java.base/java.nio.Buffer.nextGetIndex(Buffer.java:699)
[17:07:55 WARN]: at java.base/java.nio.DirectByteBuffer.get(DirectByteBuffer.java:329)
[17:07:55 WARN]: at spark-1.9.33-bukkit.jar//me.lucko.spark.common.sampler.async.jfr.JfrReader.getVarint(JfrReader.java:455)
[17:07:55 WARN]: at spark-1.9.33-bukkit.jar//me.lucko.spark.common.sampler.async.jfr.JfrReader.readConstantPool(JfrReader.java:280)
[17:07:55 WARN]: at spark-1.9.33-bukkit.jar//me.lucko.spark.common.sampler.async.jfr.JfrReader.readChunk(JfrReader.java:216)
[17:07:55 WARN]: at spark-1.9.33-bukkit.jar//me.lucko.spark.common.sampler.async.jfr.JfrReader.<init>(JfrReader.java:80)
[17:07:55 WARN]: at spark-1.9.33-bukkit.jar//me.lucko.spark.common.sampler.async.AsyncSampler.aggregateOutput(AsyncSampler.java:186)
[17:07:55 WARN]: at spark-1.9.33-bukkit.jar//me.lucko.spark.common.sampler.async.AsyncSampler.toProto(AsyncSampler.java:166)
[17:07:55 WARN]: at spark-1.9.33-bukkit.jar//me.lucko.spark.common.command.modules.SamplerModule.handleUpload(SamplerModule.java:306)
[17:07:55 WARN]: at spark-1.9.33-bukkit.jar//me.lucko.spark.common.command.modules.SamplerModule.profilerStop(SamplerModule.java:300)
[17:07:55 WARN]: at spark-1.9.33-bukkit.jar//me.lucko.spark.common.command.modules.SamplerModule.profiler(SamplerModule.java:134)
[17:07:55 WARN]: at spark-1.9.33-bukkit.jar//me.lucko.spark.common.SparkPlatform.executeCommand0(SparkPlatform.java:382)
[17:07:55 WARN]: at spark-1.9.33-bukkit.jar//me.lucko.spark.common.SparkPlatform.lambda$executeCommand$2(SparkPlatform.java:292)
[17:07:55 WARN]: at org.bukkit.craftbukkit.v1_19_R1.scheduler.CraftTask.run(CraftTask.java:101)
[17:07:55 WARN]: at org.bukkit.craftbukkit.v1_19_R1.scheduler.CraftAsyncTask.run(CraftAsyncTask.java:57)
[17:07:55 WARN]: at com.destroystokyo.paper.ServerSchedulerReportingWrapper.run(ServerSchedulerReportingWrapper.java:22)
[17:07:55 WARN]: at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
[17:07:55 WARN]: at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
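For context on the trace: the failure happens inside spark's bundled JfrReader while it decodes a variable-length integer from the JFR recording, and a BufferUnderflowException there generally means the reader ran off the end of the buffer (for example because the recording was truncated or malformed). The sketch below is only an illustration of an LEB128-style varint read over a ByteBuffer, not spark's actual code; the class and method names are hypothetical.

    import java.nio.ByteBuffer;

    public class VarintSketch {
        // Read a 7-bits-per-byte varint; the high bit of each byte marks "more bytes follow".
        static long getVarint(ByteBuffer buf) {
            long result = 0;
            for (int shift = 0; ; shift += 7) {
                byte b = buf.get();                   // throws BufferUnderflowException at end of buffer
                result |= (long) (b & 0x7f) << shift;
                if (b >= 0) {                         // high bit clear: this was the last byte
                    return result;
                }
            }
        }

        public static void main(String[] args) {
            // A varint whose continuation bit promises another byte, but the buffer ends here:
            ByteBuffer truncated = ByteBuffer.wrap(new byte[] { (byte) 0x80 });
            getVarint(truncated); // -> java.nio.BufferUnderflowException, matching the report above
        }
    }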
commented

Closing as a duplicate of #225 - please see my comment there