Spark showing wrong number of CPU threads in informations section
magrigry opened this issue · 2 comments
I am running a server through Pterodactyl. Pterodactyl is not configured to limit CPU.
However, spark shows that there is only one thread available.
Even if I set up Pterodactyl to pin the server to specific cores, spark still shows that only one CPU thread is available.
A similar server with similar hardware and the same configuration as above, but on a different dedicated server, shows the correct number of CPU threads.
I am not sure if I am misunderstanding something. I don't think my server is actually limited to 1 thread: the Pterodactyl graph shows CPU usage sometimes going over 100% (which I assume means more than one thread is in use), despite what spark says.
spark reports the result of this Java stdlib method call: https://docs.oracle.com/en/java/javase/17/docs/api/java.base/java/lang/Runtime.html#availableProcessors()
As far as I know, the value it returns is accurate: the JVM reports however many processors it can actually see, including any container CPU limits it detects.
Maybe you've found a Pterodactyl bug?
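If you want to verify independently of spark, you can run the same call yourself inside the affected server's environment. A minimal sketch (the class name is just for illustration):

```java
public class CpuCheck {
    public static void main(String[] args) {
        // This is the exact value spark displays. It reflects what the JVM
        // can see from inside the container, not the host's physical cores.
        int threads = Runtime.getRuntime().availableProcessors();
        System.out.println("Available processors: " + threads);
    }
}
```

If this also prints 1 inside the Pterodactyl container but the correct count on the other machine, the limit is coming from the container environment rather than from spark.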