
Anomalies with background profiler

SlimeDog opened this issue · 4 comments

commented

Spigot 1.19.2-3610
spark 1.10.17

On server start-up

[08:12:52] [Server thread/INFO]: [spark] Loading spark v1.10.17
[08:13:39] [Server thread/INFO]: [spark] Enabling spark v1.10.17
[08:13:39] [Server thread/INFO]: [spark] Using Bukkit scheduler for tick monitoring
[08:13:39] [Server thread/INFO]: [spark] Starting background profiler...
[08:13:40] [Server thread/INFO]: [PlaceholderAPI] Successfully registered expansion: spark [1.10.17]
[08:13:40] [Server thread/INFO]: [spark] Registered PlaceholderAPI placeholders
[08:14:45] [spark-asyncsampler-worker-thread/WARN]: [spark] Timed out waiting for world statistics
[08:15:45] [spark-asyncsampler-worker-thread/WARN]: [spark] Timed out waiting for world statistics

On spark profiler stop

>spark profiler info
[08:39:54] [Craft Scheduler Thread - 53/INFO]: [⚡] Profiler is already running!
[08:39:54] [Craft Scheduler Thread - 53/INFO]: [⚡] It was started automatically when spark enabled and has been running in the background for 26m 14s.
[08:39:54] [Craft Scheduler Thread - 53/INFO]: [⚡] To stop the profiler and upload the results, run:
[08:39:54] [Craft Scheduler Thread - 53/INFO]: [⚡]   /spark profiler stop
[08:39:54] [Craft Scheduler Thread - 53/INFO]: [⚡] To cancel the profiler without uploading the results, run:
[08:39:54] [Craft Scheduler Thread - 53/INFO]: [⚡]   /spark profiler cancel
>spark profiler stop
[08:40:06] [Craft Scheduler Thread - 53/INFO]: [⚡] Stopping the profiler & uploading results, please wait...
[08:40:10] [Craft Scheduler Thread - 44/WARN]: [spark] A command execution has not completed after 5 seconds, it might be stuck. Trace: 
  java.base/java.util.zip.ZipFile$Source.getEntryPos(ZipFile.java:1662)
  java.base/java.util.zip.ZipFile.getEntry(ZipFile.java:331)
  java.base/java.util.jar.JarFile.getEntry(JarFile.java:516)
  java.base/java.util.jar.JarFile.getJarEntry(JarFile.java:471)
  java.base/jdk.internal.loader.URLClassPath$JarLoader.getResource(URLClassPath.java:954)
  java.base/jdk.internal.loader.URLClassPath.getResource(URLClassPath.java:322)
  java.base/java.net.URLClassLoader$1.run(URLClassLoader.java:424)
  java.base/java.net.URLClassLoader$1.run(URLClassLoader.java:421)
  java.base/java.security.AccessController.doPrivileged(AccessController.java:712)
  java.base/java.net.URLClassLoader.findClass(URLClassLoader.java:420)
  java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:588)
  java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:575)
  org.bukkit.plugin.java.PluginClassLoader.loadClass0(PluginClassLoader.java:104)
  org.bukkit.plugin.java.JavaPluginLoader.getClassByName(JavaPluginLoader.java:207)
  org.bukkit.plugin.java.PluginClassLoader.loadClass0(PluginClassLoader.java:122)
  org.bukkit.plugin.java.PluginClassLoader.loadClass(PluginClassLoader.java:99)
  java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:521)
  java.base/java.lang.Class.forName0(Native Method)
  java.base/java.lang.Class.forName(Class.java:390)
  java.base/java.lang.Class.forName(Class.java:381)
  me.lucko.spark.common.util.ClassFinder.findClass(ClassFinder.java:63)
  me.lucko.spark.common.sampler.source.ClassSourceLookup$VisitorImpl.lambda$visitStackNode$0(ClassSourceLookup.java:281)
  me.lucko.spark.common.sampler.source.ClassSourceLookup$SourcesMap.computeIfAbsent(ClassSourceLookup.java:340)
  me.lucko.spark.common.sampler.source.ClassSourceLookup$VisitorImpl.visitStackNode(ClassSourceLookup.java:278)
  me.lucko.spark.common.sampler.source.ClassSourceLookup$VisitorImpl.visit(ClassSourceLookup.java:272)
  me.lucko.spark.common.sampler.AbstractSampler.writeDataToProto(AbstractSampler.java:201)
  me.lucko.spark.common.sampler.async.AsyncSampler.toProto(AsyncSampler.java:180)
  me.lucko.spark.common.command.modules.SamplerModule.handleUpload(SamplerModule.java:351)
  me.lucko.spark.common.command.modules.SamplerModule.profilerStop(SamplerModule.java:337)
  me.lucko.spark.common.command.modules.SamplerModule.profiler(SamplerModule.java:126)
  me.lucko.spark.common.SparkPlatform.executeCommand0(SparkPlatform.java:408)
  me.lucko.spark.common.SparkPlatform.lambda$executeCommand$2(SparkPlatform.java:317)
  org.bukkit.craftbukkit.v1_19_R1.scheduler.CraftTask.run(CraftTask.java:82)
  org.bukkit.craftbukkit.v1_19_R1.scheduler.CraftAsyncTask.run(CraftAsyncTask.java:54)
  java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
  java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
  java.base/java.lang.Thread.run(Thread.java:1589)
commented

Please could you share the full log file?

commented

Perhaps the delay could be made configurable?

commented
[08:13:39] [Server thread/INFO]: [spark] Starting background profiler...
[08:13:40] [Server thread/INFO]: [PlaceholderAPI] Successfully registered expansion: spark [1.10.17]
[08:13:40] [Server thread/INFO]: [spark] Registered PlaceholderAPI placeholders
[08:14:45] [spark-asyncsampler-worker-thread/WARN]: [spark] Timed out waiting for world statistics
[08:15:45] [spark-asyncsampler-worker-thread/WARN]: [spark] Timed out waiting for world statistics

The "timed out waiting for.." errors are because the server hadn't fully enabled 1 minute after spark loaded. You can safely ignore the warning.

[08:40:10] [Craft Scheduler Thread - 44/WARN]: [spark] A command execution has not completed after 5 seconds, it might be stuck. Trace: 

The command successfully completes shortly afterwards, so this warning is a false positive. :)
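For reference, the warning comes from a watchdog-style timer rather than from the command failing. The idea is roughly the sketch below (an assumed illustration, not spark's actual implementation; the class and method names are made up): a timer fires 5 seconds after the command starts and, if it hasn't finished, logs the worker thread's current stack trace, which is the trace quoted above. The command itself keeps running and completes normally.

```java
// Simplified illustration only (assumed behaviour, not spark's actual code):
// a watchdog that fires 5 seconds after a command starts and, if the command
// has not finished, logs the worker thread's current stack trace. The command
// keeps running and finishes normally, so the message can be a false positive.
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;
import java.util.logging.Logger;

public class CommandWatchdog {
    private static final Logger LOGGER = Logger.getLogger("spark");
    private final ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();

    /** Runs the command on the calling thread while a timer watches for slowness. */
    public void execute(Runnable command) {
        Thread worker = Thread.currentThread();
        CompletableFuture<Void> done = new CompletableFuture<>();

        scheduler.schedule(() -> {
            if (!done.isDone()) {
                StringBuilder msg = new StringBuilder(
                        "A command execution has not completed after 5 seconds, it might be stuck. Trace:");
                for (StackTraceElement element : worker.getStackTrace()) {
                    msg.append("\n  ").append(element);
                }
                LOGGER.warning(msg.toString());
            }
        }, 5, TimeUnit.SECONDS);

        try {
            command.run(); // e.g. building the profiler payload and uploading it
        } finally {
            done.complete(null);
        }
    }
}
```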