Error after opening live viewer (Forge)
TehMartinXz opened this issue · 2 comments
Server software: Forge 1.19.2-43.2.1; Java 17
The error happens after accessing the live viewer. It doesn't seem to interrupt anything in the viewer or on the server, though.
Error (after running `/spark profiler open`):

```
[14:51:55] [spark-worker-pool-1-thread-1/INFO] [minecraft/MinecraftServer]: [⚡] Profiler live viewer:
[14:51:55] [spark-worker-pool-1-thread-1/INFO] [minecraft/MinecraftServer]: https://spark.lucko.me/jIC3i3yPwq
[14:52:03] [ForkJoinPool.commonPool-worker-2/ERROR] [ne.mi.ev.EventSubclassTransformer/EVENTBUS]: Could not find parent me/lucko/spark/lib/protobuf/GeneratedMessageLite for class me/lucko/spark/proto/SparkWebSocketProtos$RawPacket in classloader jdk.internal.loader.ClassLoaders$AppClassLoader@5a07e868 on thread Thread[ForkJoinPool.commonPool-worker-2,5,SERVER]
[14:52:03] [ForkJoinPool.commonPool-worker-2/ERROR] [ne.mi.ev.EventSubclassTransformer/EVENTBUS]: An error occurred building event handler
java.lang.ClassNotFoundException: me.lucko.spark.lib.protobuf.GeneratedMessageLite
at jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:641) ~[?:?] {}
at jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:188) ~[?:?] {}
at java.lang.ClassLoader.loadClass(ClassLoader.java:520) ~[?:?] {}
at net.minecraftforge.eventbus.EventSubclassTransformer.buildEvents(EventSubclassTransformer.java:92) ~[eventbus-6.0.3.jar%2351!/:?] {}
at net.minecraftforge.eventbus.EventSubclassTransformer.transform(EventSubclassTransformer.java:44) ~[eventbus-6.0.3.jar%2351!/:?] {}
at net.minecraftforge.eventbus.EventBusEngine.processClass(EventBusEngine.java:26) ~[eventbus-6.0.3.jar%2351!/:?] {}
at net.minecraftforge.eventbus.service.ModLauncherService.processClassWithFlags(ModLauncherService.java:32) ~[eventbus-6.0.3.jar%2351!/:6.0.3+6.0.3+master.039e4ea9] {}
at cpw.mods.modlauncher.LaunchPluginHandler.offerClassNodeToPlugins(LaunchPluginHandler.java:88) ~[modlauncher-10.0.8.jar%2354!/:?] {}
at cpw.mods.modlauncher.ClassTransformer.transform(ClassTransformer.java:120) ~[modlauncher-10.0.8.jar%2354!/:?] {}
at cpw.mods.modlauncher.TransformingClassLoader.maybeTransformClassBytes(TransformingClassLoader.java:50) ~[modlauncher-10.0.8.jar%2354!/:?] {}
at cpw.mods.cl.ModuleClassLoader.readerToClass(ModuleClassLoader.java:113) ~[securejarhandler-2.1.4.jar:?] {}
at cpw.mods.cl.ModuleClassLoader.lambda$findClass$15(ModuleClassLoader.java:219) ~[securejarhandler-2.1.4.jar:?] {}
at cpw.mods.cl.ModuleClassLoader.loadFromModule(ModuleClassLoader.java:229) ~[securejarhandler-2.1.4.jar:?] {}
at cpw.mods.cl.ModuleClassLoader.findClass(ModuleClassLoader.java:219) ~[securejarhandler-2.1.4.jar:?] {}
at cpw.mods.cl.ModuleClassLoader.loadClass(ModuleClassLoader.java:135) ~[securejarhandler-2.1.4.jar:?] {}
at java.lang.ClassLoader.loadClass(ClassLoader.java:520) ~[?:?] {}
at me.lucko.spark.common.ws.ViewerSocketConnection.decodeRawPacket(ViewerSocketConnection.java:183) ~[?:?] {re:classloading}
at me.lucko.spark.common.ws.ViewerSocketConnection.onText(ViewerSocketConnection.java:105) ~[?:?] {re:classloading}
at me.lucko.spark.common.util.ws.BytesocksClientImpl$ListenerImpl.lambda$onText$1(BytesocksClientImpl.java:141) ~[?:?] {re:classloading}
at java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804) [?:?] {}
at java.util.concurrent.CompletableFuture$AsyncRun.exec(CompletableFuture.java:1796) [?:?] {}
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:373) [?:?] {}
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1182) [?:?] {}
at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1655) [?:?] {re:mixin,re:computing_frames}
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1622) [?:?] {re:mixin,re:computing_frames}
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:165) [?:?] {re:mixin}
```
Seems like a classloading bug: that class should exist. Not sure this is a spark issue, necessarily.
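The stack trace shows the packet decode running on a `ForkJoinPool.commonPool` worker, whose context class loader is the JDK's `AppClassLoader` rather than Forge's `TransformingClassLoader` that holds spark's relocated classes (`me.lucko.spark.lib.protobuf.*`). Assuming that is the root cause, one conventional mitigation (a sketch, not spark's actual code; `withContextLoader` and the thread name are made up for illustration) is to run such callbacks on an executor whose threads carry the right context class loader instead of letting them land on the common pool:

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.ThreadFactory;

public class ContextLoaderExecutor {

    // Creates a single-threaded executor whose worker thread carries the
    // given context class loader. Class lookups that go through the thread
    // context class loader (as many frameworks' do) would then see classes
    // visible to `loader`, instead of falling back to the AppClassLoader
    // used by ForkJoinPool.commonPool workers.
    public static ExecutorService withContextLoader(ClassLoader loader) {
        ThreadFactory factory = runnable -> {
            Thread t = new Thread(runnable, "viewer-socket-worker"); // hypothetical name
            t.setDaemon(true);
            t.setContextClassLoader(loader);
            return t;
        };
        return Executors.newSingleThreadExecutor(factory);
    }
}
```

Passing such an executor as the second argument to `CompletableFuture.supplyAsync`/`runAsync` keeps the WebSocket callbacks off the common pool entirely, which would also explain why the error is harmless here: only the eager event-handler scan fails, not the decode itself.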
I can confirm we also get the same issue on our 1.18.2 Forge server, but it only happens the first time you run /spark profiler open after a server restart.
The console throws the error messages above, but it doesn't seem to affect anything negatively: the link to the spark profile is fine and the data shows correctly. If we then run /spark profiler close, leave it for a few minutes, and run /spark profiler open again, it runs perfectly with no error messages being thrown.