Memory Leak in Shadow-item code with Automated Crafting mod
andrewaramsay opened this issue · 2 comments
Version Information
lithium-fabric-mc1.19.2-0.8.3.jar
Expected Behavior
Memory should not leak; usage should remain stable over time.
Actual Behavior
Memory usage grows rapidly, with millions of LithiumStackList and RefIntPair instances accumulating on the heap.
Reproduction Steps
- Install Automated Crafting mod (automated-crafting-1.4.7+MC1.19-1.19.2)
- Place Hopper
- Place an Automated Crafting Table (mod block) on top of the hopper (so items will be removed by the hopper)
- Add an item (e.g. squid ink) to the template side of the crafting table (the item is not actually consumed; a copy is placed in the UI for display purposes)
- Wait 15-20 seconds.
- Create a heap dump (I used Spark) and observe an ever-increasing number of LithiumStackList and RefIntPair instances. After 15-20 seconds there will already be a few hundred instances, and the count continues to grow indefinitely. For example: https://spark.lucko.me/KGopb8Su2q
Other Information
I am not sure how Automated Crafting has implemented its template inventory, but from analyzing heap dumps it appears that, however they did it, it leads Lithium into the multiple-inventories / shadow-item code path (as best as I can tell).
Is it possible that the RefIntPair allocated on line 91 isn't being garbage collected correctly?
I can provide a world download and/or heap dumps if necessary.
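The growth pattern in the heap dump is consistent with a cache that allocates a fresh key object per lookup and never gets a hit, so it inserts a new entry every time. A minimal, hypothetical Java sketch of that leak pattern (the names RefIntPair and LithiumStackList mirror the heap dump, but the logic here is illustrative, not Lithium's actual code):

```java
import java.util.HashMap;
import java.util.Map;

public class LeakSketch {
    // Hypothetical stand-in for a (reference, int) cache key.
    // Note: no equals()/hashCode() override, so two pairs wrapping the same
    // values are never equal -- every lookup with a fresh pair misses.
    static final class RefIntPair {
        final Object ref;
        final int index;
        RefIntPair(Object ref, int index) { this.ref = ref; this.index = index; }
    }

    public static void main(String[] args) {
        Map<RefIntPair, Object> cache = new HashMap<>();
        Object inventory = new Object();

        // Simulate 1000 hopper ticks: each tick allocates a new key,
        // never finds an existing entry, and inserts a new one.
        for (int tick = 0; tick < 1000; tick++) {
            RefIntPair key = new RefIntPair(inventory, 0);
            cache.computeIfAbsent(key, k -> new Object() /* e.g. a stack list */);
        }

        // Instead of 1 cached entry, we retain 1000 -- unbounded growth.
        System.out.println(cache.size());
    }
}
```

If something like this is happening, the instance count would scale with tick rate, which matches the steady growth seen while the hopper is active.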
Other Mods:
- automated-crafting-1.4.7+MC1.19-1.19.2
- fabric-api-0.67.1+1.19.2
- spark-1.9.42-fabric
- worldedit-mod-7.2.12
Lithium is accidentally applying some optimizations that make major assumptions to this modded inventory.
I think this Actions build should fix the issue: https://github.com/CaffeineMC/lithium-fabric/actions/runs/3615282106
If you want to test it, that would be great.
I ran through the steps in the world where I reproduced the issue in isolation, and it appears to be working properly now. The heap summary shows only one LithiumStackList and no growth over time: https://spark.lucko.me/vB0jRSRmTi. Tomorrow I can try loading a copy of our main world, where the problem first turned up, and see whether it works there as well, but I imagine the fix here will also be the fix there.
Thanks for the fast turnaround!