If you want to measure the time that it takes to run an operation, a common solution is to look at the current time before and after and compare the results. But what if the computer's time is changed between the two measurements? Then the result could be anything.
To solve this issue, computers provide a "monotonic clock", which keeps advancing steadily even if the computer's time changes (maybe the system synchronized with a time server, or the user manually changed it). Some languages expose this clock separately, so you have to decide which clock to use based on what you're doing.
Go has an interesting solution: time.Now() returns the current time, and the returned value also carries a "monotonic clock reading". You can see it when you print the value: the m= part at the end is the monotonic reading. For example:
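A minimal sketch (the exact timestamp and monotonic reading will differ on every run):

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// Printing a time.Time that carries a monotonic clock reading
	// appends it to the output, e.g.:
	// 2017-08-30 11:42:07.516132 +0200 CEST m=+0.000123456
	fmt.Println(time.Now())
}
```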
Then, the functions that deal with durations decide whether or not to use this value. For example, t1.Sub(t2) (which calculates the duration between t1 and t2) uses the monotonic readings, so you can trust that it will return the proper duration even if the computer's wall clock changes in between.
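Here is a minimal sketch of measuring an operation this way (time.Sleep stands in for the real work):

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	start := time.Now() // carries a monotonic clock reading

	time.Sleep(2 * time.Second) // stand-in for the operation being measured

	// Sub uses the monotonic readings of both times, so the result is
	// correct even if the wall clock was changed while we were sleeping.
	elapsed := time.Now().Sub(start) // same as time.Since(start)
	fmt.Println(elapsed)
}
```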
I recently had a problem with this where I was storing the result of time.Now() in a database, which lost the "m" value, and then retrieving it and comparing it with another instance of time.Now(). The solution was to "round" the time so that this value was stripped before storing it: time.Now().Round(0).
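Round(0) is documented as the canonical way to strip the monotonic clock reading while leaving the wall clock time untouched:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	t := time.Now()
	fmt.Println(t)          // ends with the m= monotonic reading
	fmt.Println(t.Round(0)) // same wall clock time, monotonic reading stripped
}
```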
Cloudflare had an issue back in January 2017 where their service was down for a couple of hours due to a leap second: the wall clock stepped backwards, and a duration computed from it came out negative. The problem was in code written in Go, which at the time didn't support monotonic clocks.
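A contrived sketch of that failure mode, using hand-picked wall-clock values (time.Date never carries a monotonic reading, so Sub falls back to the wall clock):

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// Two wall-clock readings where the clock stepped backwards in
	// between, as can happen around a leap second. These values are
	// fabricated for illustration.
	before := time.Date(2017, 1, 1, 0, 0, 0, 500_000_000, time.UTC)
	after := time.Date(2016, 12, 31, 23, 59, 59, 800_000_000, time.UTC)

	// Neither value has a monotonic reading, so Sub uses the wall
	// clock and the "elapsed" time comes out negative.
	fmt.Println(after.Sub(before)) // -700ms
}
```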