Dynmap-Forge/Fabric

Properly store MarkerFile sets as relational data in SQL mode instead of giant JSON blobs

GNUGradyn opened this issue · 6 comments

commented

Hello, we were using the FileTree storage mode for our creative server, which was fine until the world started getting large. We switched to MySQL mode to rectify the issue, but it was extremely slow because the SQL server kept sporadically timing out.
After some discussion and investigation in the support Discord, we discovered that the issue was Plots2Dynmap creating a large number of markers in a single set. Dynmap stores the entire set as one gargantuan JSON string in a single database cell, and in our case that string cell was over 14 MB long!
So the source of the sporadic errors is that every time Plots2Dynmap updates the markers, the database has to rewrite this 14-million-character string, which hangs the entire Dynmap database for a while and prevents tiles from being created. In SQL mode, marker sets should be stored properly as relational data so that large numbers of markers do not slow things down.
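To illustrate what I mean by "stored properly as relational data", here is a rough sketch with made-up table and column names (not Dynmap's actual schema): each marker gets its own row, so updating one marker touches one small row instead of rewriting the whole 14 MB blob.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.Statement;

public class RelationalMarkerSketch {
    public static void main(String[] args) throws Exception {
        // Connection details are placeholders for this example.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:mysql://localhost/dynmap", "dynmap", "secret")) {
            // One row per marker instead of one huge JSON blob per marker set.
            try (Statement st = conn.createStatement()) {
                st.executeUpdate(
                    "CREATE TABLE IF NOT EXISTS Markers ("
                    + " SetID VARCHAR(64) NOT NULL,"
                    + " MarkerID VARCHAR(128) NOT NULL,"
                    + " World VARCHAR(64) NOT NULL,"
                    + " X DOUBLE NOT NULL, Y DOUBLE NOT NULL, Z DOUBLE NOT NULL,"
                    + " Label TEXT,"
                    + " PRIMARY KEY (SetID, MarkerID))");
            }
            // Updating a single plot marker now touches one small row.
            try (PreparedStatement ps = conn.prepareStatement(
                    "REPLACE INTO Markers (SetID, MarkerID, World, X, Y, Z, Label)"
                    + " VALUES (?, ?, ?, ?, ?, ?, ?)")) {
                ps.setString(1, "plots2dynmap.markerset");
                ps.setString(2, "plot_123");
                ps.setString(3, "creative");
                ps.setDouble(4, 100.0);
                ps.setDouble(5, 64.0);
                ps.setDouble(6, -200.0);
                ps.setString(7, "Plot 123");
                ps.executeUpdate();
            }
        }
    }
}
```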

commented

It is worth noting that this is an exceptionally extreme example, and Plots2Dynmap should not be creating these markers as persistent (see linked issue), but the same thing could still happen if too many persistent markers are created, so the data storage should probably be updated.

commented

Even vanilla MySQL has had a dedicated JSON datatype since 5.7.x, which should speed things up quite a bit.
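For illustration, a JSON column lets the server rewrite a single field in place (via functions like JSON_SET) rather than replacing the whole string; the table and column names here are hypothetical, not Dynmap's actual schema.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class JsonColumnSketch {
    public static void main(String[] args) throws Exception {
        // Connection details are placeholders; assumes Content is a MySQL JSON column.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:mysql://localhost/dynmap", "dynmap", "secret");
             PreparedStatement ps = conn.prepareStatement(
                 // JSON_SET rewrites only the addressed path inside the JSON document.
                 "UPDATE MarkerSets"
                 + " SET Content = JSON_SET(Content, '$.markers.plot_123.label', ?)"
                 + " WHERE SetID = ?")) {
            ps.setString(1, "Plot 123 (renamed)");
            ps.setString(2, "plots2dynmap.markerset");
            ps.executeUpdate();
        }
    }
}
```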

commented

The principal use of the marker sets is that they are returned to the web UI as a unit, one per marker set layer. This sort of extreme pathology will not be helped by encoding the fields using a JSON data type: they'll still be fetched as a unit, they'll just incur extra encoding/decoding overhead, and the web UI will still be served a 14 MB JSON file. Honestly, this isn't gonna get better anytime soon, since enabling this sort of pathological use of the marker sets would create more problems than it solved - every web UI request would then need to be processed to reflect (say) the visual bounds of the set, so that only the visible markers were returned, which would create additional query and caching impacts, particularly for external web servers (which currently just have those files published to them and updated when needed, but are otherwise unchanged, and are thus cached and do not involve per-caller processing overhead).
Otherwise, the DB access is on a 'when modified' basis - which is also where a pathologically large set like this will have problems.

Net-net: encoding using a JSON data type will accomplish nothing, as the access and update pattern would not benefit from it anyway.
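To spell out the access pattern (a simplified sketch, not Dynmap's actual storage code, and the table/column names are illustrative): the set is read back whole so it can be served to the web UI as one unit, and it is only written when something in it changes - at which point the whole value is replaced. A JSON column type changes neither side of that.

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class MarkerSetAccessSketch {
    // Read: fetch the entire set so it can be served to the web UI as one unit.
    static String readSet(Connection conn, String setId) throws Exception {
        try (PreparedStatement ps = conn.prepareStatement(
                "SELECT Content FROM MarkerSets WHERE SetID = ?")) {
            ps.setString(1, setId);
            try (ResultSet rs = ps.executeQuery()) {
                return rs.next() ? rs.getString(1) : null;
            }
        }
    }

    // Write: only happens when the set is modified, and replaces the whole value.
    static void writeSet(Connection conn, String setId, String json) throws Exception {
        try (PreparedStatement ps = conn.prepareStatement(
                "REPLACE INTO MarkerSets (SetID, Content) VALUES (?, ?)")) {
            ps.setString(1, setId);
            ps.setString(2, json);
            ps.executeUpdate();
        }
    }
}
```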

commented

(Also, the citation of the "Use non-persistent markers for Dynmap" issue is spot on - there is no reason for these to be persistent, and if they were not, they would not BE in the database at all. The only reason a plugin should use persistent markers is if Dynmap is the one that is going to be the 'source of truth': 99% of add-ons are presenting a marker based on something defined in THEIR data, so using a persistent marker just generates overhead AND potential loss of synchronization/coherence between the marker stored persistently in Dynmap and the object it is derived from elsewhere.)
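For reference, this is roughly what the non-persistent path looks like from an add-on's side (method signatures are from the MarkerAPI as I recall them - check the current interfaces): the persistence flags are simply passed as false, and the add-on re-creates its markers from its own data on startup.

```java
import org.dynmap.DynmapCommonAPI;
import org.dynmap.markers.MarkerAPI;
import org.dynmap.markers.MarkerIcon;
import org.dynmap.markers.MarkerSet;

public class NonPersistentMarkerExample {
    public static void publishPlot(DynmapCommonAPI dynmap, String plotId,
                                   String world, double x, double y, double z) {
        MarkerAPI api = dynmap.getMarkerAPI();
        // persistent = false: the set is never written to Dynmap's storage.
        MarkerSet set = api.getMarkerSet("plots");
        if (set == null) {
            set = api.createMarkerSet("plots", "Plots", null, false);
        }
        MarkerIcon icon = api.getMarkerIcon("default");
        // is_persistent = false: the marker lives in memory only; the add-on's own
        // plot data stays the source of truth and is republished on each startup.
        set.createMarker(plotId, "Plot " + plotId, world, x, y, z, icon, false);
    }
}
```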

commented

Thanks. I was not aware the entire blob of data was sent to the client at once, but that explains why it has to be done this way.

commented

Honestly, the whole marker system needs to be reworked, particularly given how high a marker population some folks have opted to use (even on the web browser side, having tens of thousands of icon or outline markers is VERY expensive) - it's just that it would be a top-to-bottom rework, including for internal versus external servers (e.g. how do you make it incremental and user-view-focused for static file servers or for S3 storage?). There are ways to do it that would involve things other people will hate... partitioning the sets into lots of files representing contiguous regions on the map, so that the UI could fetch the 'sections' of a marker layer much like it fetches the tiles, based on which ones are part of its current view port.
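A rough sketch of that partitioning idea (the region size and key format are made up for illustration, not a proposed Dynmap format): markers get bucketed into fixed-size map regions, and the web UI would only fetch the region files that overlap its current view port, much as it does for tiles.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class MarkerRegionSketch {
    static final int REGION_BLOCKS = 512;   // hypothetical region size in blocks

    record SimpleMarker(String id, double x, double z, String label) {}

    // "r_<rx>_<rz>" names one region file within a marker layer.
    static String regionKey(double x, double z) {
        int rx = (int) Math.floor(x / REGION_BLOCKS);
        int rz = (int) Math.floor(z / REGION_BLOCKS);
        return "r_" + rx + "_" + rz;
    }

    // Group a layer's markers by region, one bucket per would-be file.
    static Map<String, List<SimpleMarker>> partition(List<SimpleMarker> markers) {
        Map<String, List<SimpleMarker>> regions = new HashMap<>();
        for (SimpleMarker m : markers) {
            regions.computeIfAbsent(regionKey(m.x(), m.z()), k -> new ArrayList<>()).add(m);
        }
        return regions;
    }
}
```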