
Unable to use spark: permission not registered: spark.all
CynicalBusiness opened this issue · 2 comments
Description
We currently have a Forge-powered server running behind a Velocity proxy, but are unable to use Spark on the server for profiling.
We want to profile the Forge server itself, but usage of any spark commands results in an exception (see below). We are using LuckPerms, and have granted the relevant users both OP and any permissions we could think of (spark.all, spark.*, spark, spark.profiler, etc.), but none of these are working. It also does not work from the server console, for the same reason.
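For reference, the grants were applied along these lines (Notch is a placeholder username; the actual users and contexts varied):

lp user Notch permission set spark.all true
lp user Notch permission set spark.* true
lp user Notch permission set spark.profiler true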
While sparkv and sparkc do work fine, they profile the proxy server and client, respectively, which does not help us.
Reproduction Steps
- Execute spark profiler start
- See error in server console
Expected Behaviour
Spark's profiler starts
Platform Information
- Minecraft Version: 1.20.1
- Platform Type: Server/Proxy
- Platform Brand: Forge
- Platform Version: Forge 47.2.20 / Velocity 3.3.0
Spark Version
v1.10.53
Logs and Configs
The exception stack trace is as follows:
[21:34:40] [spark-worker-pool-1-thread-2/ERROR] [spark/]: Exception occurred whilst executing a spark command
java.lang.IllegalStateException: spark permission not registered: spark.all
at TRANSFORMER/[email protected]/me.lucko.spark.forge.plugin.ForgeServerSparkPlugin.hasPermission(ForgeServerSparkPlugin.java:204)
at TRANSFORMER/[email protected]/me.lucko.spark.forge.ForgeCommandSender.hasPermission(ForgeCommandSender.java:76)
at TRANSFORMER/[email protected]/me.lucko.spark.common.command.CommandResponseHandler.lambda$allSenders$0(CommandResponseHandler.java:74)
at java.base/java.util.stream.ReferencePipeline$2$1.accept(ReferencePipeline.java:178)
at java.base/java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:197)
at java.base/java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1625)
at java.base/java.util.stream.Streams$ConcatSpliterator.forEachRemaining(Streams.java:734)
at java.base/java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:509)
at java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:499)
at java.base/java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:921)
at java.base/java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
at java.base/java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:682)
at TRANSFORMER/[email protected]/me.lucko.spark.common.command.CommandResponseHandler.allSenders(CommandResponseHandler.java:75)
at TRANSFORMER/[email protected]/me.lucko.spark.common.command.CommandResponseHandler.broadcast(CommandResponseHandler.java:92)
at TRANSFORMER/[email protected]/me.lucko.spark.common.command.CommandResponseHandler.broadcastPrefixed(CommandResponseHandler.java:112)
at TRANSFORMER/[email protected]/me.lucko.spark.common.command.modules.SamplerModule.profilerStart(SamplerModule.java:229)
at TRANSFORMER/[email protected]/me.lucko.spark.common.command.modules.SamplerModule.profiler(SamplerModule.java:144)
at TRANSFORMER/[email protected]/me.lucko.spark.common.SparkPlatform.executeCommand0(SparkPlatform.java:431)
at TRANSFORMER/[email protected]/me.lucko.spark.common.SparkPlatform.lambda$executeCommand$2(SparkPlatform.java:336)
at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539)
at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:304)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
at java.base/java.lang.Thread.run(Thread.java:840)
Spark's LuckPerms permissions tree (screenshot):
Indeed, spark.all is not present.
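For context, the message suggests that spark's Forge permission check resolves the requested node against a registry of known nodes and throws when the node was never registered, rather than simply denying the permission. A minimal sketch of that failure mode (the class and field names here are illustrative assumptions, not spark's actual internals):

import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Illustration of a registry-backed permission check that fails the way
// the stack trace shows when a node was never registered.
public final class RegistryBackedPermissionCheck {
    private final Map<String, Boolean> registeredNodes = new ConcurrentHashMap<>();

    public boolean hasPermission(String permission) {
        Boolean value = registeredNodes.get(permission);
        if (value == null) {
            // The observed failure: an unregistered node is treated as an
            // illegal state instead of being denied.
            throw new IllegalStateException("spark permission not registered: " + permission);
        }
        return value;
    }
}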
Extra Details
All configs are at their defaults.
This occurred after we moved the server to new hardware; it had worked fine before this, AFAIK.
You're using an old version of spark; this issue should be fixed in newer versions.
We are running the latest version of Spark available for this version of Minecraft. Even so, after some testing, the issue persists on the latest version with MC 1.21.
We were able to "solve" the issue by patching the offending method with a bit of a bodge, but it works well enough to let us use Spark again:
public boolean hasPermission(CommandSource sender, String permission) {
    // Bodge: skip spark's registered-permission lookup entirely.
    if (sender instanceof ServerPlayer) {
        // Players pass if they have vanilla OP permission level 2 or higher.
        return ((ServerPlayer) sender).hasPermissions(2);
    } else {
        // Non-player senders (e.g. the server console) always pass.
        return true;
    }
}
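Note the tradeoff of this bodge: LuckPerms grants are bypassed entirely, players are gated only by vanilla OP level 2, and non-player senders always pass. A less invasive workaround, assuming the root cause is that the spark.all node simply never gets registered with Forge's PermissionAPI on this setup, might be to register it from a small companion mod during permission gathering. A rough sketch (the shim class is hypothetical, and this would itself fail if spark does register the node):

import net.minecraftforge.common.MinecraftForge;
import net.minecraftforge.server.permission.events.PermissionGatherEvent;
import net.minecraftforge.server.permission.nodes.PermissionNode;
import net.minecraftforge.server.permission.nodes.PermissionTypes;

// Hypothetical shim: registers a "spark.all" node with Forge's
// PermissionAPI so a registry lookup for it no longer fails.
public final class SparkPermissionShim {
    // Namespace "spark" + name "all" -> node "spark.all";
    // the default resolver falls back to vanilla OP level 2+.
    private static final PermissionNode<Boolean> SPARK_ALL = new PermissionNode<>(
            "spark", "all", PermissionTypes.BOOLEAN,
            (player, uuid, context) -> player != null && player.hasPermissions(2));

    public static void register() {
        // PermissionGatherEvent.Nodes fires on the Forge event bus during server start.
        MinecraftForge.EVENT_BUS.addListener(SparkPermissionShim::onGatherNodes);
    }

    private static void onGatherNodes(PermissionGatherEvent.Nodes event) {
        event.addNodes(SPARK_ALL);
    }
}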