Running the Spark profiler while using OpenJ9 causes a crash
MilleniumModsSources opened this issue · 6 comments
Apparently, running the Spark profiler on OpenJ9's Java 17 causes the process to abort.
`/spark profiler`

```
[14:21:26 INFO]: [⚡] Initializing a new profiler, please wait...
[14:21:26 INFO]: [⚡] Profiler now active! (async)
[14:21:26 INFO]: [⚡] Use '/spark profiler --stop' to stop profiling and upload the results.
JVMDUMP039I Processing dump event "abort", detail "" at 2022/05/05 14:21:28 - please wait.
JVMDUMP032I JVM requested System dump using '/home/container/core.20220505.142128.52.0001.dmp' in response to an event
JVMDUMP010I System dump written to /home/container/core.20220505.142128.52.0001.dmp
JVMDUMP032I JVM requested Java dump using '/home/container/javacore.20220505.142128.52.0002.txt' in response to an event
JVMDUMP010I Java dump written to /home/container/javacore.20220505.142128.52.0002.txt
JVMDUMP032I JVM requested Snap dump using '/home/container/Snap.20220505.142128.52.0003.trc' in response to an event
JVMDUMP010I Snap dump written to /home/container/Snap.20220505.142128.52.0003.trc
JVMDUMP032I JVM requested JIT dump using '/home/container/jitdump.20220505.142128.52.0004.dmp' in response to an event
JVMDUMP051I JIT dump occurred in 'Async-profiler Sampler' thread 0x00000000020A3A00
JVMDUMP010I JIT dump written to /home/container/jitdump.20220505.142128.52.0004.dmp
JVMDUMP013I Processed dump event "abort", detail "".
```

[javacore.20220505.142128.52.0002.txt](https://github.com/lucko/spark/files/8636195/javacore.20220505.142128.52.0002.txt)
Spark 1.9.5, but I believe it happened on every Spark version I was able to test.
More details: Paper 1.18.2 (also happened on Fabric and Purpur 1.18.1/1.18.2) | Running in a Pterodactyl container | Was able to replicate this on ~10 different servers.
Sometimes it takes around 15-20 minutes after starting the profiler for the issue to appear, and sometimes it happens just a few seconds after executing `/spark profiler`.
I've been able to reproduce - seems to be an issue with async-profiler.

In the meantime, you can start the profiler with `/spark profiler --force-java-sampler` to work around the problem :)
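For anyone who wants to apply the workaround automatically, a minimal sketch of how a plugin or startup script might detect OpenJ9 and prefer the Java sampler (this is an illustration, not spark's actual API; the detection via the `java.vm.name` system property and the `VmCheck` class name are my own assumptions):

```java
public final class VmCheck {
    /** Returns true when the given VM name string indicates Eclipse OpenJ9 (assumption: OpenJ9 reports a name containing "OpenJ9"). */
    static boolean isOpenJ9(String vmName) {
        return vmName != null && vmName.contains("OpenJ9");
    }

    public static void main(String[] args) {
        // Query the running JVM's name and decide which sampler to suggest.
        String vmName = System.getProperty("java.vm.name");
        if (isOpenJ9(vmName)) {
            System.out.println("OpenJ9 detected: use /spark profiler --force-java-sampler");
        } else {
            System.out.println("Non-OpenJ9 VM: the async sampler should be fine");
        }
    }
}
```

This simply routes OpenJ9 users away from the async-profiler path that triggers the abort above.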
Should be fixed by 06de991.