40 TPS?

ruscalworld opened this issue · 4 comments

commented

I'm using spark on my production server, which runs Fabric, and I've noticed that /spark tps shows 20 TPS far too often, even when the server is lagging and the actual TPS seems much lower. In the screenshot below you can see that the MSPT is very high, which does not correspond to the TPS shown.

[Screenshot: /spark tps output showing 20 TPS alongside very high MSPT]
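
For context, TPS and MSPT are directly related: a tick is supposed to take at most 50 ms, so once the average MSPT climbs past 50 the server can no longer hold 20 TPS, and the two readings in the screenshot contradict each other. A rough sketch of the relationship I'd expect (my own illustration, not spark's internal formula):

// Illustration only: the effective TPS implied by an average tick time.
// A server targets one tick every 50 ms, so TPS is capped at 20 and
// drops once ticks take longer than 50 ms on average.
public final class TpsEstimate {
    public static double effectiveTps(double averageMspt) {
        return Math.min(20.0, 1000.0 / averageMspt);
    }

    public static void main(String[] args) {
        System.out.println(effectiveTps(45.0));  // 20.0 - keeping up
        System.out.println(effectiveTps(100.0)); // 10.0 - running at half speed
    }
}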

I then tried to use spark's API to get the real TPS. I made a simple mod that prints the TPS from the API to the console every second. It looks like this:

import me.lucko.spark.api.SparkProvider;
import me.lucko.spark.api.statistic.StatisticWindow;
import me.lucko.spark.api.statistic.types.DoubleStatistic;

import net.fabricmc.api.ModInitializer;

public class TestMod implements ModInitializer {
    @Override
    public void onInitialize() {
        new Thread(() -> {
            while (true) {
                try {
                    // SparkProvider.get() throws IllegalStateException until
                    // spark itself has finished loading, hence the catch below.
                    DoubleStatistic<StatisticWindow.TicksPerSecond> tps = SparkProvider.get().tps();
                    if (tps != null) {
                        System.out.println(tps.poll(StatisticWindow.TicksPerSecond.MINUTES_1));
                    }
                } catch (Exception ignored) { }

                // Sleep on every iteration (not just after a successful poll),
                // so the loop never busy-spins.
                try {
                    Thread.sleep(1000L);
                } catch (InterruptedException e) {
                    e.printStackTrace();
                }
            }
        }).start();
    }
}

After the server starts, it prints this to the console:

[16:28:45] [Server thread/INFO]: Done (1.543s)! For help, type "help"
20.0
20.343782390480282
20.694595471877648
21.057511195379373
21.433255322059033
21.823272870313037
22.227951723756263
...
37.51662772112439
38.72534919341197
40.01767714191615
39.99912561911397
39.99956400475235
40.001172034340605
40.000794549115724
39.99974800158759
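
Interestingly, the value converges on almost exactly 40.0, double the expected 20.0, which looks like the tick counter being incremented twice per tick. A purely hypothetical illustration of how double-counting produces a doubled reading (not spark's actual code):

import java.util.concurrent.atomic.AtomicInteger;

// Hypothetical TPS counter: count ticks over a window, divide by elapsed time.
public final class TickCounter {
    private final AtomicInteger ticks = new AtomicInteger();

    // If this callback were accidentally registered for both the start-of-tick
    // and end-of-tick events, it would run twice per tick...
    public void onTick() {
        ticks.incrementAndGet();
    }

    // ...and the computed rate would come out at exactly double the real TPS:
    // 2 increments per tick * 20 ticks per second = 40 "TPS".
    public double tpsOverWindow(double windowSeconds) {
        return ticks.getAndSet(0) / windowSeconds;
    }
}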

For this experiment I used a Fabric 1.16.5 server with spark 1.5.2 and Fabric API 0.32.5. Am I doing something wrong, or is this a bug in spark?

commented

Thanks for the report - can't believe this slipped by for so long... oops!

Should be fixed by the above commit

commented

It looks like it doesn't want to start at all now :)

[17:07:26] [Server thread/ERROR]: Encountered an unexpected exception
java.lang.RuntimeException: Platform has already been enabled!
        at me.lucko.spark.common.SparkPlatform.enable(SparkPlatform.java:123) ~[spark-fabric.jar:?]
        at me.lucko.spark.fabric.plugin.FabricSparkPlugin.enable(FabricSparkPlugin.java:61) ~[spark-fabric.jar:?]
        at me.lucko.spark.fabric.plugin.FabricServerSparkPlugin.register(FabricServerSparkPlugin.java:56) ~[spark-fabric.jar:?]
        at me.lucko.spark.fabric.FabricSparkMod.lambda$onInitialize$1(FabricSparkMod.java:51) ~[spark-fabric.jar:?]
        at net.minecraft.server.MinecraftServer.handler$zzc000$beforeSetupServer(MinecraftServer.java:1738) ~[intermediary-server.jar:?]
        at net.minecraft.server.MinecraftServer.method_29741(MinecraftServer.java:645) ~[intermediary-server.jar:?]
        at net.minecraft.server.MinecraftServer.method_29739(MinecraftServer.java:257) ~[intermediary-server.jar:?]
        at java.lang.Thread.run(Thread.java:834) [?:?]
[17:07:26] [Server thread/ERROR]: This crash report has been saved to: D:\Desktop\TestServer\.\crash-reports\crash-2021-05-23_17.07.26-server.txt
[17:07:26] [Server thread/INFO]: Stopping server
[17:07:26] [Server thread/INFO]: Saving worlds
[17:07:26] [Server thread/ERROR]: Exception stopping the server
java.lang.NullPointerException: null
        at net.minecraft.server.MinecraftServer.method_3723(MinecraftServer.java:572) ~[intermediary-server.jar:?]
        at net.minecraft.server.MinecraftServer.method_3782(MinecraftServer.java:599) ~[intermediary-server.jar:?]
        at net.minecraft.class_3176.method_3782(class_3176.java:567) ~[intermediary-server.jar:?]
        at net.minecraft.server.MinecraftServer.method_29741(MinecraftServer.java:707) ~[intermediary-server.jar:?]
        at net.minecraft.server.MinecraftServer.method_29739(MinecraftServer.java:257) ~[intermediary-server.jar:?]
        at java.lang.Thread.run(Thread.java:834) [?:?]

Using the same server with Fabric 1.16.5, Fabric API 0.32.5, and build 205 of spark.
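
The "Platform has already been enabled!" message suggests SparkPlatform.enable() is now being called twice during startup. Something like this hypothetical one-shot guard (not spark's actual code) would throw exactly that on the second call:

import java.util.concurrent.atomic.AtomicBoolean;

// Hypothetical one-shot lifecycle guard illustrating the failure mode above:
// if two startup hooks both call enable(), the second call throws.
public final class Platform {
    private final AtomicBoolean enabled = new AtomicBoolean(false);

    public void enable() {
        if (!enabled.compareAndSet(false, true)) {
            throw new RuntimeException("Platform has already been enabled!");
        }
        // ...normal initialisation would happen here...
    }
}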

commented

Ah, my bad - try again with the latest build.

commented

Now it works well. Thanks!