/spark heapdump crashes the server
JHarris12345 opened this issue · 2 comments
When my server has a lot of people on, /spark heapdump crashes the server and I never get to take the heapdump to see what is using the most memory.
Here is a pastebin of a heapdump I did with only 8 players online (this one didn't crash the server, but it froze for about 30 seconds before giving a stack trace; on my main servers they usually crash outright).
So your issue is that your server crashes, but you gave us info about the time it didn't crash, which doesn't help very much... Creating a heap dump takes a long time (depending on how much memory you have); this is not a spark-specific issue.
Are you sure the server actually crashed when you thought it did? The server not responding for a few minutes can be normal, but again, it depends on how much memory you have allocated.
spark just asks the VM to create a heapdump - it doesn't actually do anything special to produce the output itself.
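For context, here is a minimal sketch of the kind of call involved, assuming a HotSpot JVM (the class name and output path are just placeholders):

```java
import com.sun.management.HotSpotDiagnosticMXBean;

import java.io.IOException;
import java.lang.management.ManagementFactory;

public class HeapDumpSketch {
    public static void main(String[] args) throws IOException {
        // Look up the diagnostic bean exposed by the JVM itself.
        HotSpotDiagnosticMXBean diagnostics =
                ManagementFactory.getPlatformMXBean(HotSpotDiagnosticMXBean.class);

        // The JVM writes the whole .hprof file in one go; the caller just waits.
        // "live = true" dumps only reachable objects, which forces a full GC first,
        // so the pause scales with the size of the heap.
        // Note: dumpHeap fails if the target file already exists.
        diagnostics.dumpHeap("heap.hprof", true);
    }
}
```

Everything between the request and the finished file happens inside the JVM, which is why the whole server pauses while the dump is being written.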
If your server is crashing as a result, there's not much that can be done on spark's side. You might have better luck using another tool (not spark), something from the command line, to generate the heap dump instead.
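For example, the standard JDK tools can produce the same kind of .hprof file without going through the plugin: `jcmd <pid> GC.heap_dump /path/to/heap.hprof` or `jmap -dump:format=b,file=heap.hprof <pid>` (replace `<pid>` with the server process id). The pause while the JVM writes the dump will still happen, but it takes spark out of the picture.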