[Bug]: Worlds Remain Loaded Despite KeepSpawnInMemory: False Setting & Configs Reset
mercurialmusic opened this issue · 2 comments
/mv version -p
output
https://paste.gg/p/anonymous/d05b26409ce14974ac39626df5cbfc08
Server logs
No errors. https://gist.github.com/mercurialmusic/8a0a334bdaa65104511eecb833d5e9bd
Server Version
Paper version 1.21.1-40-master@2fdb2e9 (2024-08-21T01:35:33Z) (Implementing API version 1.21.1-R0.1-SNAPSHOT)
Bug Description
Configs aren't being reset across the board, exactly — so far this is the only setting we're noticing a change with. But despite setting keepspawninmemory to false manually while the server is offline, and also using /mv modify set keepspawninmemory false [world]
for all of our "accessory" worlds, all worlds are being kept loaded. On restart, the setting is automatically reverted to true. And yes, we know how to use YAML parsers.
Running the command in game does temporarily allow the world to unload as it should, but a server restart resets the setting again.
This only started happening since we updated to Multiverse-Core 4.3.13-pre.2.
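For reference, the offline edit described above would look something like this in worlds.yml. This is a sketch only — the world name is hypothetical, and the exact key layout is an assumption based on typical Multiverse-Core 4.x output:

```yaml
worlds:
  resource_world:              # hypothetical "accessory" world
    keepSpawnInMemory: 'false' # set manually while the server is offline; reverts to 'true' on restart
```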
Steps to reproduce
Using only Multiverse and a plugin that lists loaded worlds, I observed that nearly all worlds were loaded. I used the mvm set
command as described above to disable keepspawninmemory, and observed that all spawns unloaded appropriately. I then shut down the server and checked worlds.yml: every setting had been reverted to true. I set them back to false, started the server, and observed that all spawns were loaded again. The log of this test is below.
https://gist.github.com/mercurialmusic/b2f1235915e5da51e19ab71fbaf59b8d
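A quick way to audit the reverted values after a restart, without eyeballing the file, is a small stdlib-only scan of worlds.yml. This is a sketch under assumptions: the function name is made up, and the two-space-indented `worlds:` layout assumed here matches typical Multiverse-Core 4.x output but may differ on other versions:

```python
def spawns_kept_in_memory(worlds_yml_text):
    """Return {world_name: bool} for each keepSpawnInMemory entry found.

    Assumes world names sit at two-space indentation under a top-level
    'worlds:' key, as Multiverse-Core 4.x typically writes them.
    """
    results = {}
    current_world = None
    for line in worlds_yml_text.splitlines():
        stripped = line.strip()
        indent = len(line) - len(line.lstrip(" "))
        if indent == 2 and stripped.endswith(":"):
            # A new world section begins.
            current_world = stripped[:-1]
        elif current_world and stripped.startswith("keepSpawnInMemory:"):
            value = stripped.split(":", 1)[1].strip().strip("'\"")
            results[current_world] = value == "true"
    return results

sample = """\
worlds:
  world:
    keepSpawnInMemory: 'true'
  resource_world:
    keepSpawnInMemory: 'false'
"""
print(spawns_kept_in_memory(sample))  # any world still mapping to True was reverted
```

Running this before and after a restart makes the revert visible as a diff in the returned dict.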
Agreements
- I have searched for and ensured there isn't already an open or resolved issue(s) regarding this.
- I was able to reproduce my issue on a freshly setup and up-to-date server with the latest version of Multiverse plugins with no other plugins and with no kinds of other server or client mods.
Worlds being loaded and keepspawninmemory are 2 different things...
Which one is your issue?
Duplicate of #3075