Robert Važan

NLog timestamps with millisecond accuracy

What was wrong with the old NLog timestamps? Why was a new time source API needed? Because timestamps are important, and not everyone was happy with how NLog generated them.

The DateTime.UtcNow and DateTime.Now properties normally return time with roughly 16ms resolution, which corresponds to the default Windows timer interval (15.625ms, i.e. 64 ticks per second). Windows, however, has a nice feature that lets multimedia and other real-time applications temporarily increase timer resolution using timeBeginPeriod and related Windows API functions. Once the resolution is raised to 1ms, DateTime.UtcNow returns time in 1ms increments. Cool, let's use NLog to measure how much time each part of the code consumes. Unfortunately, to my surprise, NLog still outputs timestamps in 16ms increments. How is that possible? And how do we fix it?
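For the curious, raising the timer resolution from managed code takes only a pair of P/Invoke declarations. Here is a minimal sketch (timeBeginPeriod and timeEndPeriod are the standard multimedia timer functions from winmm.dll):

using System;
using System.Runtime.InteropServices;

static class TimerResolutionDemo
{
    // Multimedia timer API: request and release a minimum timer resolution.
    [DllImport("winmm.dll")]
    private static extern uint timeBeginPeriod(uint uPeriod);

    [DllImport("winmm.dll")]
    private static extern uint timeEndPeriod(uint uPeriod);

    static void Main()
    {
        timeBeginPeriod(1);     // ask Windows for 1ms timer resolution
        try
        {
            // While the request is in effect, DateTime.UtcNow advances
            // in ~1ms steps instead of the default ~16ms steps.
            Console.WriteLine(DateTime.UtcNow.ToString("HH:mm:ss.fff"));
        }
        finally
        {
            timeEndPeriod(1);   // always pair with timeBeginPeriod
        }
    }
}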

All NLog code that needs the current time passes through the following property getter:

internal class CurrentTimeGetter
{
    private static int lastTicks = -1;
    private static DateTime lastDateTime = DateTime.MinValue;

    /// <summary>
    /// Gets the current time in an optimized fashion.
    /// </summary>
    /// <value>Current time.</value>
    public static DateTime Now
    {
        get
        {
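            // Optimization: DateTime.Now is relatively expensive, so the cached
            // value is reused for as long as the cheap Environment.TickCount
            // reading stays the same.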
            int tickCount = Environment.TickCount;
            if (tickCount == lastTicks)
            {
                return lastDateTime;
            }

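            // The cache is stale; take a fresh reading from DateTime.Now.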
            DateTime dt = DateTime.Now;

            lastTicks = tickCount;
            lastDateTime = dt;
            return dt;
        }
    }
}

Apparently NLog uses Environment.TickCount to avoid frequent calls to the comparatively expensive DateTime.Now. The first thing that surprises me about this code is that it cures the symptom instead of the cause. Just replacing DateTime.Now with DateTime.UtcNow, which skips the costly time zone conversion, would likely resolve the performance issue. It would also help with consistent use of local/UTC time across NLog. Besides being convoluted, the code is also buggy: the two static fields are read and written without any synchronization, so one thread can pair a fresh lastTicks with a stale lastDateTime.
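For illustration, the simplified getter could look like this (a hypothetical sketch, not actual NLog code):

internal class CurrentTimeGetter
{
    /// <summary>
    /// Gets the current time in UTC. DateTime.UtcNow skips the time zone
    /// conversion that makes DateTime.Now slow, so no caching is needed.
    /// Conversion to local time, where required, is left to the layout level.
    /// </summary>
    public static DateTime Now
    {
        get { return DateTime.UtcNow; }
    }
}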

Even with Windows timers cycling at 1ms, Environment.TickCount still advances in 16ms increments. Why doesn't Environment.TickCount respect the increased timer resolution? I guess it's for the sake of consistency: many apps likely depend on Environment.TickCount ticking at a particular granularity. The consequence is that this code limits the accuracy of NLog timestamps to 16ms even when DateTime.UtcNow is accurate down to 1ms.
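This is easy to verify with a throwaway program (a sketch; run it with the 1ms timer resolution request from earlier in effect). It busy-waits until each clock changes and prints the step size:

using System;

static class ClockGranularityDemo
{
    static void Main()
    {
        // Measure the step size of Environment.TickCount.
        int t1 = Environment.TickCount;
        int t2;
        while ((t2 = Environment.TickCount) == t1) { }
        Console.WriteLine("Environment.TickCount step: {0}ms", t2 - t1);

        // Measure the step size of DateTime.UtcNow.
        DateTime d1 = DateTime.UtcNow;
        DateTime d2;
        while ((d2 = DateTime.UtcNow) == d1) { }
        Console.WriteLine("DateTime.UtcNow step: {0}ms", (d2 - d1).TotalMilliseconds);
    }
}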

I couldn't come up with a solution that would satisfy everyone, so I decided to make time sources configurable. The old code was left in place as the default option and complemented with an accurate option that skips the Environment.TickCount trick. Besides allowing for accurate time, pluggable time sources make it possible to choose between UTC and local time at the time source level, and application developers can supply their own time source, which is useful in many scenarios.
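Picking one of the built-in sources is a single element in NLog.config, e.g. <time type="AccurateUTC" />. A custom source is a small subclass of NLog.Time.TimeSource. The sketch below shows one that shifts every timestamp by a fixed offset, say to simulate a remote clock in tests (ShiftedTimeSource and its offset are made up for illustration; check your NLog version for the exact API surface):

using System;
using NLog.Time;

// Hypothetical time source that shifts all timestamps by a fixed offset.
[TimeSource("ShiftedTime")]
public class ShiftedTimeSource : TimeSource
{
    private readonly TimeSpan offset = TimeSpan.FromHours(1);

    // The timestamp NLog stamps on each log event.
    public override DateTime Time
    {
        get { return DateTime.UtcNow + offset; }
    }

    // Maps a system-clock value (e.g. a file timestamp) into this
    // time source's frame of reference.
    public override DateTime FromSystemTime(DateTime systemTime)
    {
        return systemTime.ToUniversalTime() + offset;
    }
}

Such a source can then be selected from configuration as <time type="ShiftedTime" /> once its assembly is registered under <extensions>, or installed programmatically by assigning TimeSource.Current.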