Hello 🙂
I think that for any serious application instrumentation, millisecond precision is necessary. It is a good middle ground between second precision (too coarse) and nanosecond precision (too fine).
Making it the default will simplify setup for new apps and help avoid surprises. When an app sends several events per second but, by default, only one of them is preserved, that is quite a surprise. It certainly was for me.
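To illustrate the surprise described above, here is a minimal hypothetical sketch (not this project's actual storage code): if events are keyed by timestamps truncated to whole seconds, later events in the same second silently overwrite earlier ones, while millisecond keys keep them all. The `store_events` function and its `precision` parameter are made up for this example.

```python
def store_events(events, precision):
    """Keep one event per timestamp bucket.

    Hypothetical model: a store keyed by truncated timestamps,
    where a later event with the same key overwrites an earlier one.
    """
    store = {}
    for ts, payload in events:
        if precision == "second":
            key = int(ts)  # truncate to whole seconds
        else:  # "millisecond"
            key = int(ts * 1000)  # keep millisecond resolution
        store[key] = payload  # same key => earlier event is lost
    return list(store.values())

# Three events fired within the same second, 100 ms apart.
base = 1_700_000_000.0
events = [(base + i * 0.1, f"event-{i}") for i in range(3)]

print(store_events(events, "second"))       # only the last event survives
print(store_events(events, "millisecond"))  # all three survive
```

With second precision the store collapses the three events into one; with millisecond precision, all three are preserved.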
What do you think?