os.epoch discussion.
Wojbie opened this issue · 4 comments
Just wondering, but are os.epoch values supposed to be returned in milliseconds for "utc" and "local"?
Is it supposed to be Unix time or just an arbitrary epoch?
Because if it's supposed to be Unix time, then the returned value has to be divided by 1000 to be in seconds instead of milliseconds.
As for "ingame" epoch.
Is it supposed to be something completely unrelated or is there a reason its calculated like that?
Sidenote - I do realize you can get unit epoch time by going math.floor(os.epoch("utc")/1000). But that does seem kinda illogical.
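For reference, a minimal sketch of that conversion (assuming "utc" really does return milliseconds, as observed):

```lua
-- Sketch: converting os.epoch("utc") (milliseconds) to whole Unix seconds.
local ms = os.epoch("utc")                 -- milliseconds since the Unix epoch
local unixSeconds = math.floor(ms / 1000)  -- seconds since the Unix epoch
print(ms, unixSeconds)
```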
I've noticed issues with "ingame" in a port. See cc-tweaked/cc-restitched#123 and cc-tweaked/cc-restitched#124.
Looking at the source of os.epoch (here, and most specifically here):
It's just the number of milliseconds since the epoch, stored in a long and returned by calendar.getTime().getTime().
This long is passed directly to LuaJ and is not manually cast to a different numeric type.
It looks like there's no specific reason this is done. It's just a matter of preference.
In my opinion this is the correct way to represent it, and you can easily convert it to Unix time by dividing it by 1000 and flooring it, like you described.
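As an aside, a quick sketch of why the value survives the trip from a Java long into a Lua number unchanged: the current millisecond epoch is on the order of 10^12, well below the 2^53 point where doubles stop representing integers exactly.

```lua
-- Sketch: millisecond epoch values fit inside a double's exact integer range,
-- so nothing is lost when the Java long reaches Lua.
local ms = os.epoch("utc")
print(ms % 1 == 0)            -- true: a whole number of milliseconds
print(ms < 2 ^ 53)            -- true: far below the exact-integer limit
print(math.floor(ms / 1000))  -- the Unix-seconds form discussed above
```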
Maybe a little off topic, but I'm very excited because it's something I've needed for a long time:
A huge advantage of this new feature is that we can now do millisecond-accurate performance profiling, where we were previously restricted to a precision of 50 ms (os.clock).
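For example, a rough sketch of that kind of timing (the work function here is just a hypothetical placeholder):

```lua
-- Sketch: millisecond-resolution timing with os.epoch("utc").
local function work()
  for _ = 1, 1e6 do end  -- stand-in for whatever you want to profile
end

local startMs = os.epoch("utc")
work()
local elapsedMs = os.epoch("utc") - startMs
print(("work() took %d ms"):format(elapsedMs))
```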
@Wojbie @CrazedProgrammer
Is this the real-life epoch or the in-game one?