If you want to measure how long an operation takes, a common solution is to read the current time before and after it runs and compare the two readings. But what if the computer's clock is adjusted between the two measurements? Then the result could be anything, even a negative duration.
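A minimal sketch of the naive approach in Python, using the standard time module; time.sleep stands in for the operation being measured:

```python
import time

# Wall-clock timing: if the system time is adjusted between the two
# readings, the difference can be wrong or even negative.
start = time.time()
time.sleep(0.1)  # stands in for the operation being measured
elapsed = time.time() - start
print(f"elapsed: {elapsed:.3f}s")
```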
To solve this issue, computers provide a "monotonic clock", which keeps advancing steadily even if the computer's wall-clock time changes (say, because the system synchronizes with a time server or the user manually adjusts it). Most languages let you read either clock, so you have to decide which one to use based on what you're doing with it: the monotonic clock for measuring durations, the wall clock for knowing the actual date and time.
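The same measurement with Python's monotonic clock, again as a sketch with time.sleep standing in for the real work:

```python
import time

# Monotonic timing: time.monotonic() is unaffected by wall-clock
# adjustments (NTP syncs, manual changes), so the difference is
# always a valid duration.
start = time.monotonic()
time.sleep(0.1)  # stands in for the operation being measured
elapsed = time.monotonic() - start
print(f"elapsed: {elapsed:.3f}s")
```

Note that the value returned by time.monotonic() is only meaningful as a difference between two readings; it is not related to the calendar date or time of day.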