HawkEye

Archive instead of delete records

todd2982 opened this issue · 2 comments

todd2982 commented

On my server we use mcbans.com as our banning solution, so we need to keep our logs for 60 days. This makes the HawkEye database huge and slow for players to search.

As a workaround, I have created a second table, hawkeye_archive, with the same schema, and I set up a cron job that runs daily, exporting records older than 15 days to the archive table and then deleting the records from the production table that match the archive.

These are the SQL queries I'm using:
INSERT INTO hawkeye_archive SELECT * FROM hawkeye WHERE date <= DATE_SUB(CURDATE(), INTERVAL 15 DAY) LIMIT 100000;
DELETE FROM hawkeye WHERE data_id IN (SELECT data_id FROM hawkeye_archive WHERE data_id IS NOT NULL) LIMIT 100000;
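
In case it helps anyone copying this setup, the archive table itself can be cloned from the production schema in one statement. A minimal sketch; CREATE TABLE ... LIKE also copies the indexes, which the data_id lookup above relies on:

-- One-time setup: clone hawkeye's schema, indexes included,
-- so the data_id lookups against the archive stay fast.
CREATE TABLE hawkeye_archive LIKE hawkeye;

The daily cron job then just feeds the two queries above to the mysql command-line client.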

I'm sure many people will have a better way of doing this, but I'd like to see it incorporated into the plugin. The simplest way I can see people getting information from the archive would be a drop-down box in the web interface.

Even if this doesn't get added, keep doing what you are doing. This is a great plugin and a huge asset for my server!

Thanks again!

commented

Just out of curiosity, how many rows did your hawkeye table contain before it got noticeably slow? This might be an issue with indexes; I'd rather see them improved than archive the table itself.
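
If indexes are the suspect, here is a sketch of how to check. SHOW INDEX, EXPLAIN, and ALTER TABLE ... ADD INDEX are standard MySQL; the index name idx_date is made up, and whether the index is actually usable depends on the column's type (if date is stored as a string, the comparison may prevent index use):

-- List the indexes the table actually has.
SHOW INDEX FROM hawkeye;

-- Check whether a typical date filter (as in the archival queries above) can use an index.
EXPLAIN SELECT * FROM hawkeye WHERE date <= DATE_SUB(CURDATE(), INTERVAL 15 DAY);

-- If the date column is unindexed, both player searches and the archive job scan the whole table.
ALTER TABLE hawkeye ADD INDEX idx_date (date);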

I'll turn on the slow query log for my server and see if there are any bad queries from HawkEye.
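
A sketch of turning it on at runtime, no restart needed (these system variables are dynamic in MySQL 5.1 and later; the 2-second threshold is an arbitrary choice):

-- Enable the slow query log at runtime.
SET GLOBAL slow_query_log = 'ON';
-- Log anything that runs longer than 2 seconds.
SET GLOBAL long_query_time = 2;
-- Optionally also log queries that do full scans without using an index.
SET GLOBAL log_queries_not_using_indexes = 'ON';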

todd2982 commented

One reason this is slow is that MySQL and Bukkit run on two separate systems, and the MySQL server does not have nearly the same resources. HawkEye is the only place I notice a performance hit, though. With the current amount of data in one table, my players keep using the tool and searching repeatedly, which causes lag on the database. That in turn causes lag for HawkEye's writes to the table as well.

Current size and count of hawkeye and hawkeye_archive:

table             rows        data length (B)   index length (B)   size (MB)
hawkeye             242,914        25,214,976         23,134,208       46.11
hawkeye_archive   8,910,083       884,998,144        751,648,768    1,560.83
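
For anyone comparing, figures like these can be pulled straight from information_schema. A sketch; the schema name MC is taken from the stored procedure below, and note that table_rows is only an estimate for InnoDB tables:

SELECT table_name,
       table_rows,
       data_length,
       index_length,
       ROUND((data_length + index_length) / 1024 / 1024, 2) AS size_mb
FROM information_schema.TABLES
WHERE table_schema = 'MC'
  AND table_name IN ('hawkeye', 'hawkeye_archive');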

I revised my archive query to something a bit faster as well, and I'm now archiving data older than 5 days with a stored procedure:


-- Routine DDL
-- Note: comments before and after the routine body will not be stored by the server


DELIMITER $$

CREATE DEFINER=`jump`@`23.23.198.36` PROCEDURE HawkArchive()
BEGIN
  -- Get the current date, go back 5 days, and find the oldest data_id to keep.
  -- This will not work well during the first few days of a new month because
  -- the schema stores the date as a string rather than a DATE.
  SELECT MIN(data_id) INTO @Cutline FROM MC.hawkeye
    WHERE LEFT(date, 10) = DATE_SUB(CURDATE(), INTERVAL 5 DAY);

  -- Copy everything before the data_id cutline to the archive.
  INSERT INTO MC.hawkeye_archive SELECT * FROM MC.hawkeye WHERE data_id < @Cutline;

  -- After the copy, remove everything before the cutline from production.
  DELETE FROM MC.hawkeye WHERE data_id < @Cutline;
END$$

DELIMITER ;
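
If the database server should own the schedule, MySQL's event scheduler (5.1+) can replace the external cron job and call the procedure directly. A sketch, with an arbitrary event name:

-- The scheduler itself must be enabled once: SET GLOBAL event_scheduler = ON;
CREATE EVENT MC.hawkeye_daily_archive
  ON SCHEDULE EVERY 1 DAY
  DO CALL MC.HawkArchive();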

As I said, it's just a feature request, and the problem could very well be that I need better hardware for my database server.