Shopkeepers

High CPU consumption

thedourn opened this issue · 11 comments

commented

Hello

I have a CPU consumption problem on my server, identified via a timings report:

Shopkeepers::Event: c.n.s.s.l.CreatureForceSpawnListener (CreatureSpawnEvent) count(17059357) total(33.82% 600.075s, 43.68% of tick) avg(0.04ms per - 21.84ms/620.95 per tick)

  • Shopkeepers version: 2.9.1
  • Paper version: 1.15.2 build 93

thanks

commented

Please attach a complete timings report. For comparison, please also try with Spigot and attach a timings report of that as well.
My guess is that this is not an issue in Shopkeepers, but a side effect of some other issue (e.g. something else causing excessive amounts of mob spawning or chunk reloading).

commented

https://timings.aikar.co/?id=f17573be87034926b44635e43a3c0abc

The server does not run well with Spigot; that's why I use Paper, which allows finer adjustments.

commented

I can't really explain the timings report. For instance, it shows Shopkeepers processing 484 CreatureSpawnEvents per tick, while all other plugins process on average 0 of the same event per tick. This indicates that something is wrong with the timings report. I can't derive an issue in Shopkeepers from it: the mentioned event handler does not perform any performance-intensive work. I don't know where these timing results come from or why they associate these costs with the Shopkeepers event handler.
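For context, here is a minimal sketch of what a Bukkit CreatureSpawnEvent handler of this kind typically looks like. This is not the actual Shopkeepers implementation; the class and field names are made up for illustration. The point is that a handler like this only performs a cheap check per event:

```java
// Minimal sketch of a typical Bukkit CreatureSpawnEvent handler, for illustration only.
// This is NOT the actual Shopkeepers code; class and field names are hypothetical.
import org.bukkit.entity.Entity;
import org.bukkit.event.EventHandler;
import org.bukkit.event.EventPriority;
import org.bukkit.event.Listener;
import org.bukkit.event.entity.CreatureSpawnEvent;

public class ExampleForceSpawnListener implements Listener {

    // Hypothetical marker for an entity the plugin is currently spawning itself.
    private Entity currentlyForceSpawnedEntity = null;

    @EventHandler(priority = EventPriority.HIGHEST, ignoreCancelled = false)
    public void onCreatureSpawn(CreatureSpawnEvent event) {
        // A cheap reference comparison and a flag update; no I/O and no iteration
        // over other entities. Work like this should cost far less than the
        // ~0.04 ms per call attributed to it in the timings report.
        if (event.getEntity() == currentlyForceSpawnedEntity) {
            event.setCancelled(false);
        }
    }
}
```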

Paper has some internal differences to Spigot which have caused subtle bugs in the past. To rule out a Paper modification being the underlying cause of this issue, I propose testing it on Spigot. And to rule out some odd interference with another plugin, I propose running the server without other plugins to see whether the issue disappears or persists.
You could, for example, make a local copy of your server to run these tests.

Something else you could try is running the CPU profiler of Spark (https://www.spigotmc.org/resources/spark.57242/) and sending me the resulting report.

Also, if you can reliably reproduce this issue (i.e. it persists across server restarts), could you try running the server without Shopkeepers and send me a timings report of that? It would be interesting to know whether the performance impact disappears or gets attributed to something else in the timings report.

commented

@thedourn Were you able to do some more testing? The Spark output in particular might help track down the issue.
Without further information I won't be able to help.

commented

I had time to test the server on Spigot, and the server is unplayable. Here are the timings with each:

With Spigot:
https://timings.spigotmc.org/?url=ozerokolum

With Paper:
https://timings.aikar.co/?id=e819babf76604b3eb3eb71ce53f89270

commented

Sorry, I have a lot of work. I should be able to do the tests next week.

commented

The timings (v2) now seem to attribute a significant amount of lag to the ticking of parrots. Since there is an open issue about timings randomly attributing lag to a certain entity type (PaperMC/Paper#2489), I assume this is a false positive as well, and the underlying cause for the lag is probably something different.

Looking at both of these timings, I would investigate the number of hoppers used on the server. Try to check for large hopper contraptions; cutting down on those may resolve your lag issue. I only searched very briefly, but wasn't able to find any good plugins for finding these hoppers. However, the 'regions' section of the v2 timings report seems to provide some tile entity statistics for various world regions.
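If you want a rough idea of the hopper count without installing an extra plugin, something like the following sketch could be dropped into a small custom plugin. This is a hypothetical helper (not an existing tool), and iterating all loaded chunks is itself somewhat costly, so run it once on demand rather than every tick:

```java
// Rough sketch of counting hopper block entities in currently loaded chunks.
// Hypothetical helper for illustration; not an existing plugin or Shopkeepers feature.
import org.bukkit.Bukkit;
import org.bukkit.Chunk;
import org.bukkit.World;
import org.bukkit.block.BlockState;
import org.bukkit.block.Hopper;

public class HopperCounter {

    public static void logHopperCounts() {
        for (World world : Bukkit.getWorlds()) {
            int hoppers = 0;
            // Only inspects chunks that are currently loaded.
            for (Chunk chunk : world.getLoadedChunks()) {
                for (BlockState state : chunk.getTileEntities()) {
                    if (state instanceof Hopper) {
                        hoppers++;
                    }
                }
            }
            Bukkit.getLogger().info(world.getName() + ": " + hoppers + " hoppers in loaded chunks");
        }
    }
}
```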

Other than that, I still propose giving a CPU profiler a try. It may produce more accurate results regarding how much of its processing time the server spends on its various tasks. The Spark plugin mentioned above makes this as convenient as producing a timings report.

So far I don't believe Shopkeepers is your issue here; it somehow ended up as a false positive in your original timings report. I will therefore close this issue.
If you send me a Spark report, I will look into it and try to further help you figure out what is causing this lag. But checking for and cutting down on extensive amounts of hoppers is probably a promising first step. Let me know in case you were able to resolve your lag issue.

commented

I had a hard time understanding the plugin because the commands have changed and I don't speak English.
But I managed to produce a report:

https://sparkprofiler.github.io/#dwAMjozhlf

commented

Hm, the results don't show anything unexpected, mostly regular server activity. Hoppers only account for 1.50% of your server's load there. However, the profiling period is rather short, so this might not be representative. Could you run it for a bit longer, and while your server is experiencing lag?

commented

Hello, here I redid an analysis:

https://sparkprofiler.github.io/#fdz6j8xmQr

commented
  • 16% of the time is spent ticking sheep (mostly movement and collision handling). Check whether there are excessive amounts of (possibly crammed) sheep.
  • 12.5% is spent on hopper ticking. Check for excessive amounts of hoppers and try to reduce their count.
  • 10-14% of the time is spent ticking villagers (6% AI behaviors): Consider disabling 'tick-inactive-villagers' in the Spigot config and see if that has an effect (see the config sketch after this list). Make sure that you have 'use-legacy-mob-behavior' disabled in the Shopkeepers config (it is by default). Other than that, this should have nothing to do with Shopkeepers' villagers, since those have their AI disabled. So this is the time spent ticking all regular villagers on the server.
  • 9-15% of the time is spent ticking chunks (checking whether they should load, unload, etc.): Player count, view distance, number of loaded chunks, entity AI, etc., might affect this.
  • 3% of the time is spent in ClearLag's MobLimitListener.
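For reference, here is a sketch of where the two settings mentioned above live. The exact key paths can differ between versions, so treat this as an assumption and verify it against the files on your server:

```yaml
# spigot.yml (per-world settings; shown here under the 'default' section)
world-settings:
  default:
    # Stops ticking villagers that are outside the entity activation range.
    tick-inactive-villagers: false

# plugins/Shopkeepers/config.yml
# Keeps the newer (cheaper) mob behavior for shopkeeper mobs; this is the default.
use-legacy-mob-behavior: false
```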