Dynmap-Forge/Fabric

Server Crash with NoClassDefFoundError when using S3 storage

qe201020335 opened this issue · 1 comment

commented

Issue Description:
I followed the wiki to set up my dynmap server to use aws_s3 as storage type. It crashes on start with java.lang.NoClassDefFoundError.
I am using CloudFlare R2 and I did make sure the server is working correctly with the default config before I switched to R2.

  • Dynmap Version: 3.7-beta-4-forge-1.20
  • Server Version: forge-1.20.1-47.2.0 with Java 17
  • Pastebin of Configuration.txt: https://pastebin.com/spAu3U2v
  • Server Host (if applicable): Self hosted
  • Pastebin of crashlogs or other relevant logs: Crash-report and Server Log
  • Other Relevant Data/Screenshots: The only mod I installed is dynmap
  • Steps to Replicate:
    • Download forge and make a server
    • Download dynmap from curseforge
    • Run the server with dynmap to make sure default config is working fine
    • Close server
    • Edit configuration.txt to use R2 as storage (a rough sketch of the storage section is shown after this list)
    • Run server and observe the crash.
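
For step 5, the aws_s3 storage section ends up looking roughly like the following (values redacted; the key names are what I remember from the wiki's S3 page, so double-check them there before copying):

    storage:
      # AWS S3 bucket storage (key names per the wiki; verify before use)
      type: aws_s3
      bucketname: "my-dynmap-bucket"
      region: auto
      aws_access_key_id: "<access key id>"
      aws_secret_access_key: "<secret access key>"
      # R2 is not AWS, so the S3-compatible endpoint has to be overridden:
      override_endpoint: "https://<account-id>.r2.cloudflarestorage.com"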

[✔️] I have looked at all other issues and this is not a duplicate
[✔️] I have been able to replicate this

I have also made a post in the dynmap Discord help channel, but I did not get any help there. It got somewhat technical as I tried to figure this out (see the Discord thread).

From the log and call stack, I can see that it tries to publish the static web files to the bucket, and if I understand it correctly, some sort of dependency injection (service/provider lookup) failed and caused the crash.
[screenshot of the stack trace]

Then I looked through the related code, and I believe the exception is thrown in this function:

https://github.com/webbukkit/s3-lite/blob/be0f3eee4c6f37a0aef01b077048a0c6b5b3e6df/core/src/main/java/io/github/linktosriram/s3lite/core/client/DefaultS3Client.java#L193-L211

    private static S3Exception handleErrorResponse(final ImmutableResponse httpResponse) {
        final HttpStatus status = httpResponse.getStatus();
        if (status.is3xxRedirection()) {
            ...
        } else {
            return httpResponse.getResponseBody()
                .map(inputStream -> {
                    try (final InputStream input = inputStream) {
                        XMLInputFactory fact = XMLInputFactory.newInstance();
                        XMLEventReader rdr = fact.createXMLEventReader(input);
                        ...
                    } catch (final IOException e) {
                        throw new UncheckedIOException(e);
                    } catch (XMLStreamException e) {
                        throw new RuntimeException(e);
                    }
                })
                ...
        }
    }

In it, the call XMLInputFactory fact = XMLInputFactory.newInstance(); is what causes the problem, as seen in the stack trace.

Since this is the error-handling code for the S3 API, I know my config is probably wrong, because publishing the files failed in the first place. However, because of the unhandled error inside the "error handling" code, I don't know exactly what is wrong or what status code R2 returned.
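
Just to illustrate what I mean, here is my own standalone sketch (hypothetical code, not the actual s3-lite implementation; the class and method names are made up): if the XML parsing used XMLInputFactory.newDefaultFactory() (Java 9+, which skips the provider lookup that newInstance() does) and fell back to the raw HTTP status whenever parsing the error body blows up, the real response code from R2 would at least survive:

    import java.io.ByteArrayInputStream;
    import java.io.InputStream;
    import java.nio.charset.StandardCharsets;
    import javax.xml.stream.XMLEventReader;
    import javax.xml.stream.XMLInputFactory;
    import javax.xml.stream.events.XMLEvent;

    // My own sketch, not the actual s3-lite code: parse the S3 <Error> body if
    // possible, but never let a failure in the "error handling" hide the HTTP status.
    public final class DefensiveErrorHandlingSketch {

        static RuntimeException toException(final int httpStatus, final InputStream errorBody) {
            try (InputStream input = errorBody) {
                // newDefaultFactory() (Java 9+) skips the ServiceLoader/FactoryFinder
                // lookup that newInstance() performs, which is what appears to fail here.
                final XMLInputFactory fact = XMLInputFactory.newDefaultFactory();
                final XMLEventReader rdr = fact.createXMLEventReader(input);
                final StringBuilder text = new StringBuilder();
                while (rdr.hasNext()) {
                    final XMLEvent event = rdr.nextEvent();
                    if (event.isCharacters()) {
                        text.append(event.asCharacters().getData()).append(' ');
                    }
                }
                return new RuntimeException("S3 request failed, HTTP " + httpStatus + ": " + text.toString().trim());
            } catch (Exception | NoClassDefFoundError e) {
                // If the XML parsing itself blows up, still report the status code.
                return new RuntimeException("S3 request failed, HTTP " + httpStatus
                    + " (error body could not be parsed: " + e + ")");
            }
        }

        public static void main(final String[] args) {
            final String body = "<Error><Code>AccessDenied</Code><Message>Access Denied</Message></Error>";
            System.out.println(toException(403,
                new ByteArrayInputStream(body.getBytes(StandardCharsets.UTF_8))).getMessage());
        }
    }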

I have tried multiple JDKs (Oracle, OpenJDK, Microsoft) on both Ubuntu and Windows, and they all crash with the same exception.

I did some more Google searches and I think I found some related issues (I am not sure):
https://stackoverflow.com/questions/64979229/eclipse-osgi-java-11-jaxb-and-the-classloader
https://stackoverflow.com/questions/72093691/jaxb-with-jdk17-classloader-issues
It seems there are some classloader problems with these factory/JAXB lookups on newer JDKs.
So I tried Java 11 and 13, but they can't even load the launcher jar:

Error occurred during initialization of boot layer
java.lang.module.FindException: Error reading module: libraries\cpw\mods\bootstraplauncher\1.1.2\bootstraplauncher-1.1.2.jar
Caused by: java.lang.module.InvalidModuleDescriptorException: Unsupported major.minor version 60.0

Class file major version 60 corresponds to Java 16, so the Forge launcher itself needs at least that. I guess I have to stay on Java 17, and I don't know what to do now.
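
If the classloader lookup those posts describe really is the culprit, the fix would have to happen in dynmap/s3-lite rather than in my config. Purely as a sketch of the workaround the answers suggest (hypothetical code, not anything that exists in dynmap), the factory lookup could be pinned to the mod's own class loader:

    import javax.xml.stream.XMLInputFactory;

    // Sketch of the context-class-loader workaround from the Stack Overflow
    // answers above; this is not existing dynmap or s3-lite code.
    public final class ContextClassLoaderSketch {

        static XMLInputFactory newFactoryWithOwnClassLoader() {
            final Thread thread = Thread.currentThread();
            final ClassLoader original = thread.getContextClassLoader();
            try {
                // Make the factory lookup use the class loader that loaded this
                // class, which can see the bundled/built-in StAX implementation.
                thread.setContextClassLoader(ContextClassLoaderSketch.class.getClassLoader());
                return XMLInputFactory.newInstance();
            } finally {
                thread.setContextClassLoader(original);
            }
        }
    }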

commented

S3 support is preliminary and basic; to quote the dev:
the S3 thing is just what I expected to see: the upgrade was a PR that promised to support non AWS S3 implementations, and I made it quite clear that:
a) I wasn't going to test the arbitrary number of non-AWS S3 implementations out there, as they cost money and time, and almost nobody uses the feature
b) Consequently, it is an experimental and thus unsupported feature - folks can contribute fixes for specific implementations, but actively supporting the dozen plus 'mostly S3 compatible' implementations out there - along with whatever voodoo each of them has, if any, to enable an S3 bucket to act as a web site - is just not practical.

It either works or it doesn't when you follow the guide on the wiki; the dev is not going to fix it, as he doesn't have the time to, and I wouldn't know where to even start.