CC: Tweaked


textutils.unserializeJSON throwing 'Too long without yielding'

SammyForReal opened this issue · 3 comments

commented

Description

I noticed some unexpected behaviour when parsing large .json files. The file I used here is this one, which is about 25 MB in size.

When trying to unserialize the data of the file with textutils.unserializeJSON, the following message appears:
...rom/apis/textutils.lua:562: Too long without yielding

How to reproduce

This screenshot shows the steps I took to reproduce this error:
[screenshot]
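
The original steps are only visible in the screenshot, but they roughly boil down to reading the whole file and passing it to textutils.unserializeJSON. A reconstruction (the file name is illustrative):

```lua
-- Approximate reproduction; "large.json" stands in for the ~25 MB file.
local handle = fs.open("large.json", "r")
local text = handle.readAll()
handle.close()

-- This call runs for too long without yielding and is killed with:
-- ...rom/apis/textutils.lua:562: Too long without yielding
local data = textutils.unserializeJSON(text)
```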

MC Version: 1.19.X; Mod version: 1.104.0
This also happens in emulators such as CraftOS-PC or CCEmuX.

Expected behaviour

The expected behaviour would be to get the parsed result instead of this error, even if it takes some time.
As a compromise, perhaps some kind of pre-check could be included to make sure the given data is not too large?
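
A minimal sketch of what such a caller-side pre-check could look like (the size limit and wrapper function name are made up for illustration):

```lua
-- Illustrative only: reject inputs above an arbitrary size before parsing.
local MAX_JSON_BYTES = 1024 * 1024 -- hypothetical 1 MB limit

local function safeUnserializeJSON(text)
    if #text > MAX_JSON_BYTES then
        return nil, ("input too large (%d bytes)"):format(#text)
    end
    return textutils.unserializeJSON(text)
end
```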

Of course, the chance of someone parsing files this large is not that high. On the other hand, I did not test this with smaller files, which may be more realistic.

Either way, the current behaviour does not seem like a proper solution to me.
And even though parsing such large files seems unrealistic, it is not impossible.

I could not find an existing issue about this, which is why I thought I should open this one to make sure the community is aware of it.

commented

This is caused by the coroutine timeout: code that runs for too long without yielding back to the computer gets terminated.

commented

I don't think we'd ever aim to support parsing JSON files larger than 1MB, which should currently parse fine.

That said, there are probably some performance improvements which could be made here. I suspect the main bottleneck is parse_string, which we could probably improve a bit by using string.find rather than iterating over each character (much like we do for the Lua lexer).
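
A rough sketch of that string.find approach (this is not the actual textutils code; escape handling is simplified and \u escapes are omitted):

```lua
-- Sketch of a parse_string that consumes runs of ordinary characters with an
-- anchored string.find instead of stepping one character at a time.
local escapes = {
    ['"'] = '"', ["\\"] = "\\", ["/"] = "/",
    b = "\b", f = "\f", n = "\n", r = "\r", t = "\t",
}

local function parse_string(str, pos)
    local parts, n = {}, 0
    pos = pos + 1 -- skip the opening quote
    while true do
        -- Longest run of characters needing no special handling.
        local _, stop = str:find('^[^"\\%c]+', pos)
        if stop then
            n = n + 1
            parts[n] = str:sub(pos, stop)
            pos = stop + 1
        end

        local c = str:sub(pos, pos)
        if c == '"' then
            return table.concat(parts, "", 1, n), pos + 1
        elseif c == "\\" then
            local esc = escapes[str:sub(pos + 1, pos + 1)]
            if not esc then error("unsupported escape at position " .. (pos + 1)) end
            n = n + 1
            parts[n] = esc
            pos = pos + 2
        else
            error("unexpected character at position " .. pos)
        end
    end
end
```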

commented

Another option would be to let the caller pass a reader function and behave like load does. The reader function can yield every once in a while if the input is too long.
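
For what it's worth, here is a hypothetical sketch of how such an overload could be used from the caller's side. Note that unserializeJSON does not accept a reader function today, so the call below is an assumption about a possible future API, not the real one:

```lua
-- Hypothetical: a load-style reader overload for unserializeJSON.
-- The reader yields between chunks so the computer never runs
-- "too long without yielding", and returns nil at end of input.
local handle = fs.open("large.json", "r")
local data = textutils.unserializeJSON(function()
    os.queueEvent("json_parse_step") -- queue a dummy event...
    os.pullEvent("json_parse_step")  -- ...and pull it, yielding to the scheduler
    return handle.read(8 * 1024)     -- next chunk, or nil at end of file
end)
handle.close()
```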