LuckPerms

[Question] The last packet successfully received from the server was 359,978 milliseconds ago.

Closed this issue · 10 comments

commented

As the title says. After testing for a bit, I found that starting the MySQL server with --max-connections=500 --wait-timeout=600 (or either of those flags on its own) for LuckPerms (Bukkit & Bungee) makes LuckPerms complain, for example:

17:09:49 [SEVERE] [LuckPerms Pool Thread #724] WARN com.zaxxer.hikari.pool.PoolBase - luckperms - Failed to validate connection com.mysql.jdbc.JDBC4Connection@6e9b20d9 (Communications link failure

The last packet successfully received from the server was 359,978 milliseconds ago.  The last packet sent successfully to the server was 1 milliseconds ago.)

Launching the MySQL server without those flags seems to be fine, so I wonder what the best values for them would be?
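(Not from the original report, but it may help reproduce it: a minimal JDBC sketch for checking which timeout values the server actually ended up with after those startup flags. The URL, database name and credentials are placeholders.)

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class CheckTimeouts {
    public static void main(String[] args) throws Exception {
        // Placeholder connection details -- substitute your own server, user and password.
        String url = "jdbc:mysql://localhost:3306/luckperms?useSSL=false";
        try (Connection conn = DriverManager.getConnection(url, "lpuser", "secret");
             Statement st = conn.createStatement();
             // MySQL/MariaDB report the idle-connection limit as 'wait_timeout' (in seconds).
             ResultSet rs = st.executeQuery("SHOW VARIABLES LIKE '%timeout%'")) {
            while (rs.next()) {
                System.out.println(rs.getString("Variable_name") + " = " + rs.getString("Value"));
            }
        }
    }
}
```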

commented

I'm not too sure to be honest.

Here are the settings used by the LuckPerms pool:
https://github.com/lucko/LuckPerms/blob/master/common/src/main/java/me/lucko/luckperms/common/storage/backing/sqlprovider/MySQLProvider.java#L74-L101

You might find some more info to help here as well:
https://github.com/brettwooldridge/HikariCP/wiki
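For anyone who doesn't want to dig through those links, a HikariCP pool set up in that general shape looks roughly like the sketch below. The URL, credentials and numbers are placeholders for illustration, not LuckPerms' actual values.

```java
import com.zaxxer.hikari.HikariConfig;
import com.zaxxer.hikari.HikariDataSource;

public class PoolExample {
    public static HikariDataSource createPool() {
        HikariConfig config = new HikariConfig();
        config.setPoolName("luckperms");                             // the name that shows up in the log line above
        config.setJdbcUrl("jdbc:mysql://localhost:3306/luckperms");  // placeholder URL
        config.setUsername("lpuser");                                // placeholder credentials
        config.setPassword("secret");
        config.setMaximumPoolSize(10);        // upper bound on pooled connections
        config.setMinimumIdle(10);            // keep the pool full rather than shrinking when idle
        config.setConnectionTimeout(30_000);  // ms to wait for a free connection before failing
        config.setMaxLifetime(1_800_000);     // ms before a connection is retired and replaced
        // Driver-level tuning of the kind the HikariCP wiki suggests for MySQL:
        config.addDataSourceProperty("cachePrepStmts", "true");
        config.addDataSourceProperty("prepStmtCacheSize", "250");
        return new HikariDataSource(config);
    }
}
```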

commented

I'm still testing. I was using a MySQL server, then switched to a MariaDB server, and the problem still seems to exist. Maybe you want to keep the issue open so I can post more information here, and anyone else who has the same problem can discuss it here too?

commented

Okay, I found something interesting.

According to my research on Google, people say that the

The last packet successfully received from the server was 359,978 milliseconds ago.

exception can be caused by a MySQL DB connection that the MySQL server has already marked as timed out, so the DB driver won't be able to use that connection again.

Adjusting the MySQL server's wait-timeout should solve the problem. I'm still testing.
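The other way around the error, if you can't (or don't want to) raise wait-timeout on the server, is to make the pool retire connections before the server does. A sketch of that idea, assuming the server runs with --wait-timeout=600 as in the flags above; the "keep maxLifetime below the server's connection limit" rule is HikariCP's own advice, but the concrete margins here are just an illustration:

```java
import com.zaxxer.hikari.HikariConfig;

public class TimeoutAlignment {
    // waitTimeoutSeconds is whatever the MySQL server reports for wait_timeout,
    // e.g. 600 when started with --wait-timeout=600.
    public static HikariConfig alignWithServer(long waitTimeoutSeconds) {
        HikariConfig config = new HikariConfig();
        // Retire pooled connections roughly a minute before MySQL's wait_timeout would
        // kill them, so the pool never hands out a connection the server already closed.
        // (Assumes wait_timeout is comfortably above two minutes.)
        long maxLifetimeMs = (waitTimeoutSeconds * 1000L) - 60_000L;
        config.setMaxLifetime(maxLifetimeMs);
        // Drop idle connections even earlier, for the same reason.
        config.setIdleTimeout(maxLifetimeMs - 60_000L);
        return config;
    }
}
```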

commented

I get the same error sometimes. I believe the pool just creates a new connection when the old one goes inactive, so you shouldn't have any real problem from this. I tried increasing the timeout before, but then I got much worse performance with LuckPerms, so I just left it as it was.

commented

@simon44556 Left it like what?

commented

My MySQL configuration is set like this; I think it is the default config:
max_connections = 300
connect_timeout = 5
wait_timeout = 600
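For what it's worth, the wait_timeout = 600 line is the one the earlier comments point at: a stock MySQL server defaults to 28800 seconds (8 hours), so 600 is fairly aggressive. A sketch for raising it back at runtime, if you prefer the server-side route; the URL and credentials are placeholders, SET GLOBAL needs the appropriate privilege and only affects new sessions, and a permanent change still belongs in the MySQL config file:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class WaitTimeoutTweak {
    public static void main(String[] args) throws Exception {
        String url = "jdbc:mysql://localhost:3306/luckperms?useSSL=false"; // placeholder URL
        try (Connection conn = DriverManager.getConnection(url, "root", "secret"); // placeholder credentials
             Statement st = conn.createStatement()) {
            // Raise wait_timeout to MySQL's stock default of 8 hours; only sessions
            // opened after this statement runs will see the new value.
            st.execute("SET GLOBAL wait_timeout = 28800");
        }
    }
}
```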

commented

I saw this from the LiteBans config:

  # LiteBans utilizes connection pooling for extra performance and reliability.
  # min_connections: Minimum amount of pooled connections.
  # max_connections: Maximum amount of pooled connections. See: https://github.com/brettwooldridge/HikariCP/wiki/About-Pool-Sizing
  # timeout: Connection timeout.
  # idle_timeout: Maximum amount of time a pooled connection can remain idle before it is closed for inactivity.
  pool:
    min_connections: 1
    max_connections: 10
    timeout: 30 seconds
    idle_timeout: 1 minute

LiteBans uses the same MySQL driver, so LuckPerms could implement the same thing too?
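Those LiteBans options map fairly directly onto HikariCP settings, so the comparison is less about the driver than about which knobs a plugin exposes. A rough sketch of the equivalent pool configuration, with the values copied from the snippet above (the method names are HikariCP's, not LiteBans' or LuckPerms'):

```java
import com.zaxxer.hikari.HikariConfig;

public class LiteBansStylePool {
    public static HikariConfig fromLiteBansDefaults() {
        HikariConfig config = new HikariConfig();
        config.setMinimumIdle(1);             // min_connections: 1
        config.setMaximumPoolSize(10);        // max_connections: 10
        config.setConnectionTimeout(30_000);  // timeout: 30 seconds
        config.setIdleTimeout(60_000);        // idle_timeout: 1 minute
        return config;
    }
}
```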

commented

I linked you the LP settings above. It already uses similar options, and the pool size is configurable.

Idle timeout is whatever the default for Hikari is.
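For reference, HikariCP's stock defaults are a 10-minute idle timeout and a 30-minute max lifetime, which is why a 600-second wait_timeout on the server side bites: the server gives up on an idle connection long before the pool does. A small sketch that just prints the defaults from a fresh HikariConfig:

```java
import com.zaxxer.hikari.HikariConfig;

public class HikariDefaults {
    public static void main(String[] args) {
        // A freshly constructed HikariConfig carries the library defaults.
        HikariConfig config = new HikariConfig();
        System.out.println("connectionTimeout = " + config.getConnectionTimeout() + " ms"); // 30000
        System.out.println("idleTimeout       = " + config.getIdleTimeout() + " ms");       // 600000 (10 minutes)
        System.out.println("maxLifetime       = " + config.getMaxLifetime() + " ms");       // 1800000 (30 minutes)
    }
}
```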

commented

I've been watching this problem for several months now. The source isn't fully understood yet, but I do know that a third-party plugin is involved. If you remove ScoreboardStats, the problem is not observed, yet on another server that also runs ScoreboardStats the error is absent. The problem is quite serious, although I'm a little too lazy to fix it.