Description
Describe the bug
I am currently unsure if this is desired behavior, so if it is, please let me know.
When invoking a delay with a `Duration`, the value of that duration is coerced to milliseconds and then delegated to `delay(Long)`. It is not documented that delaying with a duration loses nanosecond and microsecond granularity. For example, delaying with `998.75909ms` gets coerced to a call of `delay(998)` when delegating to `delay(Long)`.
The offending code is the following:

```kotlin
internal fun Duration.toDelayMillis(): Long =
    if (this > Duration.ZERO) inWholeMilliseconds.coerceAtLeast(1) else 0
```
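To make the truncation concrete, here is a small standalone check (mine, not from the library) showing how `inWholeMilliseconds`, which `toDelayMillis` relies on, drops the sub-millisecond part:

```kotlin
import kotlin.time.Duration.Companion.milliseconds
import kotlin.time.Duration.Companion.nanoseconds

fun main() {
    // 998ms plus 759.09µs, i.e. 998.75909ms in total
    val d = 998.milliseconds + 759090.nanoseconds
    println(d.inWholeMilliseconds) // 998: the 759.09µs remainder is silently dropped
}
```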
What happened? What should have happened instead?
The symptom of this issue: while implementing a cron-like API for delay-based job scheduling, I observed jobs being invoked up to 600µs before they should have been. The end result was that a job scheduled to run at the beginning of a second in a given time zone would fire early once, then fire again as expected in quick succession.
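For context, a minimal sketch of the kind of scheduling step where this shows up; the helper name and the use of `Instant.now()` are illustrative, not the actual implementation:

```kotlin
import java.time.Instant
import kotlinx.coroutines.delay
import kotlin.time.Duration.Companion.nanoseconds
import kotlin.time.Duration.Companion.seconds

// Illustrative helper: suspend until the next whole-second boundary.
suspend fun delayUntilNextSecond() {
    // Sub-second part of "now", e.g. 0.999097248s when close to the boundary.
    val elapsedInSecond = Instant.now().nano.nanoseconds
    val untilNext = 1.seconds - elapsedInSecond
    // `untilNext` almost always has a sub-millisecond component, which
    // delay(Duration) truncates away, so this can resume up to ~1ms before
    // the boundary; the job then runs once in the "previous" second.
    delay(untilNext)
}
```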
The above function `Duration.toDelayMillis` should likely round durations that have a sub-millisecond component up to `inWholeMilliseconds + 1`.
Locally, this implementation is working well for me:

```kotlin
internal fun Duration.myToMillis(): Long = if (this > Duration.ZERO) {
    val millis = inWholeMilliseconds
    // Round up if truncating to whole milliseconds lost a sub-millisecond remainder.
    if (millis * 1_000_000 < inWholeNanoseconds) millis + 1 else millis
} else 0
```
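As a quick sanity check (mine, not part of the library), with the extension above in scope the rounding behaves as intended for the value used in the reproducer below:

```kotlin
import kotlin.time.Duration.Companion.milliseconds
import kotlin.time.Duration.Companion.nanoseconds

fun main() {
    val d = 998.milliseconds + 902752.nanoseconds
    println(d.inWholeMilliseconds) // 998: the 902.752µs remainder is truncated
    println(d.myToMillis())        // 999: rounded up, so delay() cannot undershoot
}
```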
Provide a Reproducer
```kotlin
import kotlinx.coroutines.*
import kotlinx.coroutines.test.runTest
import kotlin.test.Test
import kotlin.test.assertTrue
import kotlin.time.Duration.Companion.days
import kotlin.time.Duration.Companion.milliseconds
import kotlin.time.Duration.Companion.nanoseconds
import kotlin.time.measureTime

@Test fun foo() = runTest(timeout = 1.days) {
    withContext(Dispatchers.Default) {
        while (true) {
            val expectedDuration = 998.milliseconds + 902752.nanoseconds
            val actualDuration = measureTime { delay(expectedDuration) }
            val difference = actualDuration - expectedDuration
            assertTrue(difference.isPositive()) // fails: delay() resumes early
        }
    }
}
```