TimedTeleport becoming intensive
james090500 opened this issue · 8 comments
Hello,
I've been having a look, and for some reason TimedTeleport seems to be using significantly more resources than I'd expect. I might be wrong, but I would love your input.
Paper 1.13.2 git-Paper-413
EssentialsX-2.15.0.55
https://timings.aikar.co/?id=7bcba1ea6de84610b6c53d4d2b80cfbc#timings
Thanks for the report. However, on its own, the Timings report does not provide enough detail to nail down what the problem is.
Is there any chance you could install spark, start the profiler with `/spark start --timeout 300`, perform a few timed teleports, then post the link generated after five minutes?
Will definitely do that. The issue is that the spikes sometimes seem to be random, but I have a stack trace which might be useful to you. In the meantime I'll install the plugin and see what I can do for you.
Here is the Spark Report
https://sparkprofiler.github.io/?hHUHeceMVm
By the looks of it, the server lags loading entities once a user has teleported, which suggests it may not be an Essentials issue? Please confirm.
Thanks for getting back with both the stack trace and the Spark report!
From your Paper watchdog thread dump, it seems the server locked up somewhere in Brigadier (Minecraft's new command handling system): when Paper detected the freeze, the same stack appears at the 10, 15 and 20 second marks. In particular, this stack trace recurs through each watchdog thread dump:
```
[13:24:25] [Paper Watchdog Thread/ERROR]: com.mojang.brigadier.tree.CommandNode.addChild(CommandNode.java:110)
[13:24:25] [Paper Watchdog Thread/ERROR]: net.minecraft.server.v1_13_R2.CommandDispatcher.a(CommandDispatcher.java:319)
[13:24:25] [Paper Watchdog Thread/ERROR]: net.minecraft.server.v1_13_R2.CommandDispatcher.a(CommandDispatcher.java:265)
[13:24:25] [Paper Watchdog Thread/ERROR]: net.minecraft.server.v1_13_R2.PlayerList.a(PlayerList.java:1172)
[13:24:25] [Paper Watchdog Thread/ERROR]: net.minecraft.server.v1_13_R2.PlayerList.f(PlayerList.java:825)
[13:24:25] [Paper Watchdog Thread/ERROR]: net.minecraft.server.v1_13_R2.PlayerList.moveToWorld(PlayerList.java:726)
[13:24:25] [Paper Watchdog Thread/ERROR]: org.bukkit.craftbukkit.v1_13_R2.entity.CraftPlayer.teleport(CraftPlayer.java:751)
[13:24:25] [Paper Watchdog Thread/ERROR]: com.earth2me.essentials.Teleport.now(Teleport.java:127)
[13:24:25] [Paper Watchdog Thread/ERROR]: com.earth2me.essentials.Teleport.teleport(Teleport.java:198)
[13:24:25] [Paper Watchdog Thread/ERROR]: com.earth2me.essentials.Teleport.teleport(Teleport.java:154)
[13:24:25] [Paper Watchdog Thread/ERROR]: com.earth2me.essentials.commands.Commandhome.goHome(Commandhome.java:97)
[13:24:25] [Paper Watchdog Thread/ERROR]: com.earth2me.essentials.commands.Commandhome.run(Commandhome.java:51)
```
Above that is this line:
```
[13:24:20] [Paper Watchdog Thread/ERROR]: java.util.stream.Collectors$$Lambda$34/684566052.accept(Unknown Source)
```
The fact that `$Lambda$34/684566052` is repeated three times implies Brigadier might have got caught up on one command for some reason (though I'm not familiar with Brigadier or what it's trying to do at that point).
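To illustrate why that stack might be expensive: `PlayerList.moveToWorld` ends up resending the command tree to the player, which involves merging nodes via `CommandNode.addChild`. Below is a toy, self-contained sketch of that kind of per-player tree copy. This is not Mojang's code; the `Node` class, its fields, and the counts are all made up for illustration:

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical, greatly simplified stand-in for a Brigadier-style command node.
class Node {
    static long addChildCalls = 0; // counts merge work done

    final String name;
    final Map<String, Node> children = new HashMap<>();

    Node(String name) { this.name = name; }

    // Merge a child in, recursing into grandchildren if a node with the
    // same name already exists (loosely how tree merging behaves).
    void addChild(Node child) {
        addChildCalls++;
        Node existing = children.get(child.name);
        if (existing == null) {
            children.put(child.name, child);
        } else {
            for (Node grandchild : child.children.values()) {
                existing.addChild(grandchild);
            }
        }
    }

    // Deep-copy this subtree into target, i.e. the kind of work a server
    // might redo for each player on respawn/teleport.
    void copyInto(Node target) {
        for (Node child : children.values()) {
            Node copy = new Node(child.name);
            child.copyInto(copy);
            target.addChild(copy);
        }
    }
}

class CommandTreeCopyDemo {
    public static void main(String[] args) {
        // Build a root with 500 commands, each with 3 argument nodes.
        Node root = new Node("root");
        for (int i = 0; i < 500; i++) {
            Node cmd = new Node("cmd" + i);
            for (int j = 0; j < 3; j++) cmd.addChild(new Node("arg" + j));
            root.addChild(cmd);
        }
        Node.addChildCalls = 0;

        // One "teleport": the whole tree is copied again for the player.
        Node perPlayer = new Node("player-root");
        root.copyInto(perPlayer);
        System.out.println(Node.addChildCalls); // prints 2000
    }
}
```

The point of the sketch: with a plugin-heavy server the real tree has thousands of nodes, so repeating that copy on every teleport can add up, even before any pathological merging behaviour.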
Would you happen to have lots of sheep in a condensed area somewhere? 🐑 There seems to be a lot of sheep collision checking consuming CPU time.
Looking back at your Timings report, `syncChunkLoad`s seem to take up a lot of the tick time spent processing EssentialsX teleportation commands like `/home` and `/tpaccept`. This is (currently) to be expected, as EssentialsX doesn't yet support async chunk loading. However, none of those commands seems to be taking up much time on its own. In addition, `syncChunkLoad` doesn't account for much of the `TimedTeleport` tick time, and there's nothing in the Spark report to suggest synchronous chunk loading is an issue.
One thing to try might be to run Spark, then generate a Timings report that covers the same period of time as the Spark report. That way, it's easier to spot patterns shared by them both.
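Roughly, that workflow would look something like the following (exact command syntax may differ between Paper/spark versions, so check `/timings` and `/spark` help on your build):

```
/timings on
/spark start --timeout 300
(reproduce a few timed teleports over the next five minutes)
/timings paste
```

That way both reports cover the same window and any spike should show up in both.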
Is that EntitySheep living tick actually collisions, or them just existing in the world?
Also, I'll run Timings and spark at the same time when I'm next on. Cheers for the help!
> Is that EntitySheep living tick actually collisions, or them just existing in the world?
If you expand it down far enough (and turn on the `Bukkit -> MCP (1.13.1)` mappings), you'll see that the majority of CPU time spent ticking sheep goes to (presumably) their physics calculations.
In the end, it turns out my VPS was just terribly underpowered: there was too much going on and the CPU couldn't handle it. I upgraded to a dedicated server and have had no issues since.