[Bug]: Gamerules and world spawn reset when multiplayer server restarts
TheDeviantCrafter opened this issue · 6 comments
General Info
- I can reproduce this issue consistently in single-player
- I can reproduce this issue consistently in multi-player
- I have searched for this issue previously and it was either (1) not previously reported, or (2) previously fixed and I am having the same problem.
- I am crashing and can provide my crash report(s)
- I am using the latest version of the modpack
Modpack version
1.0.0
Java version
Java 8 update 301
Modpack Launcher Used
GDLauncher
Memory Allocated
4 GB
Minecraft World Environment
Server
Misc Client Info
- Optifine Installed
- Shaders Installed
Server Java Version
Java 8 update 301
Server Operating System
Windows 10
Misc Server Info
- Sponge or Non-Vanilla Forge Server
Issue Description
Each time the server restarts, gamerules and the /setworldspawn location reset to their default values. Changes made with these commands persist only until the next restart; until then, they work as expected.
Additional Information
I have not tested this in singleplayer or in version 1.0.1. I'm waiting on an official server download before I update.
Can you check whether the seed is staying the same between server restarts? We're tracking an issue where some users' server configs appear to be reset. I'm unable to replicate it, so any help would be awesome!
I can change the seed by editing the level.dat file in NBTExplorer. That is where gamerules are stored as well. Would that be helpful? Neither setting is stored in the server configs folder to the best of my knowledge.
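For context, here is a rough sketch of where these values live inside the Data compound of level.dat. This is a plain-dict illustration, not a real NBT parser; the field names follow the vanilla level format for this Minecraft era (RandomSeed, GameRules, SpawnX/Y/Z), and the concrete values are made-up examples.

```python
# Plain-dict illustration of the relevant level.dat fields (not a real NBT
# parser; field names follow the vanilla level format, values are examples).
level_data = {
    "RandomSeed": 5939652427529613824,  # the world seed
    "GameRules": {                      # gamerule values are stored as strings
        "keepInventory": "false",
        "doDaylightCycle": "true",
    },
    "SpawnX": 0, "SpawnY": 64, "SpawnZ": 0,  # /setworldspawn location
}

def reset_surface(data):
    """List the keys that would revert if level.dat were regenerated."""
    affected = {"RandomSeed", "GameRules", "SpawnX", "SpawnY", "SpawnZ"}
    return sorted(k for k in data if k in affected)

print(reset_surface(level_data))
# ['GameRules', 'RandomSeed', 'SpawnX', 'SpawnY', 'SpawnZ']
```

All three symptoms in this report (seed, gamerules, spawn) living in the same file is consistent with the whole level.dat being regenerated on restart rather than individual settings being reset.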
What I'm asking is for you to run the /seed command after your server restarts and tell me whether it stays the same. You don't need to make any changes in your files.
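That comparison can be sketched as a small helper: record the /seed output after each restart and flag the first restart at which it diverges from the original. The helper name is hypothetical; the seed values are the ones reported below in this thread.

```python
def detect_seed_reset(seeds):
    """Given /seed outputs recorded in restart order (fresh world first),
    return the index of the first restart whose seed differs from the
    original, or None if the seed never changed."""
    if not seeds:
        return None
    original = seeds[0]
    for i, seed in enumerate(seeds[1:], start=1):
        if seed != original:
            return i
    return None

# Seeds from this thread: fresh world, then after restarts 1 and 2.
seeds = [5939652427529613824, -4109549960344522187, -4109549960344522187]
print(detect_seed_reset(seeds))  # 1 -> the seed changed on the very first restart
```

The fact that restarts 1 and 2 share the same "wrong" seed suggests a reset to one fixed default rather than re-randomization on every restart.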
Freshly generated world: 5939652427529613824
After restart 1: -4109549960344522187
After restart 2: -4109549960344522187
The server resets to a default seed on restart.
Hello!
We are investigating this issue further, both with the mod's developer and through additional testing and review on our end. We'll update you here as soon as we have anything to report. We are fully aware of this issue and are working on it as quickly as possible.
Thanks!