spark

Server crashing because of spark (Different to the previous issue)

redstonerti opened this issue · 4 comments

commented

Ever since moving my server to Linux, spark kills the server whenever it does anything. I tried spark versions 1.9.45 and 1.10.9. I'm playing on Fabric 1.19 on a fresh install of Arch Linux.
Here's the jvm crash: https://mclo.gs/sMc5vna
Here's latest.log: https://mclo.gs/e0GUhkw
No crash report gets generated.
CPU: 7950X
RAM: 64GB DDR5 @ 5600MT/s

commented

I've implemented a temporary solution:

[Screenshot attached: 2022-11-27 at 23:37:31]

Unfortunately the problem is with async-profiler (the library that spark uses to gather profiling data), so there's not much I can do to resolve it. I have just updated spark to use the latest version though - so hopefully that will help!

commented

Thanks for trying to fix it! Are you aware that /spark profiler --force-java-sampler avoids the crash?
Also, unfortunately I downloaded the latest version of spark, 1.10.12, and the server crashes on startup. On 1.10.9 it also crashed on startup; on 1.9.45 it only crashed when I ran /spark profiler.
JDK17 latest.log: https://mclo.gs/TzuZqmH
JDK17 JVM crash: https://mclo.gs/N1dXfPb

JDK19 latest.log: https://mclo.gs/dnQM7kq
JDK19 JVM crash: https://mclo.gs/EosVb6f

commented

After the 2nd startup it should be fine. You can also use this option in the spark config file: https://spark.lucko.me/docs/Configuration#backgroundprofilerengine (set it to "java")
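For reference, here is a sketch of what that change might look like in spark's JSON config file (the path varies by platform - on Fabric it is typically config/spark/config.json; check the linked docs for your setup):

```json
{
  "backgroundProfilerEngine": "java"
}
```

Setting this to "java" makes the background profiler use the built-in Java sampler instead of async-profiler, which is the same workaround as the --force-java-sampler flag.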

commented

It is not fine after the second startup. It doesn't crash on startup if I set the background profiler engine to java, but it still crashes a lot. I tried starting it up again after stopping it once and it crashed again. After that, it also crashed on startup. To stop it crashing on startup, I had to close the server, remove spark, run the server, close it, add spark again, and run it again. There was no other way. Could you please open an issue in the library that you use? Spark is a really useful mod and it's unusable for me right now. Thanks.