I came across some code for Windows, written in C++.
It mentions that 1 tick is equal to 100 nanoseconds. Is this specific to Windows, or is it a common standard? If it is a standard, what is it called? Is a tick the same on other operating systems?
I ask because I have to write platform-independent code; if this is Windows-specific, I will have to wrap this part of the code in #ifdef WIN32.
This is Microsoft-specific:
A tick is the smallest unit of time; it is equal to 100 nanoseconds. A tick can be negative or positive.
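As a side note, this 100 ns unit maps naturally onto std::chrono, since it is just a duration with a period of 1/10,000,000 of a second. A minimal sketch (the alias win_tick is my own illustrative name, not anything from the Windows headers):

```cpp
#include <chrono>
#include <cstdint>
#include <ratio>

// A "tick" in the Microsoft sense: a duration with a 100 ns period,
// i.e. 1/10,000,000 of a second.
using win_tick = std::chrono::duration<std::int64_t, std::ratio<1, 10'000'000>>;

// Example: 5 milliseconds should come out as 50,000 ticks.
constexpr auto five_ms_in_ticks =
    std::chrono::duration_cast<win_tick>(std::chrono::milliseconds(5));
static_assert(five_ms_in_ticks.count() == 50'000, "5 ms == 50,000 ticks of 100 ns");
```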
On Linux you can use high-resolution timers to get 100-nanosecond accuracy, but you have to handle them differently.
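If it helps, here is a rough sketch of the #ifdef approach the question mentions. The function name now_in_ticks is made up for illustration, GetSystemTimePreciseAsFileTime requires Windows 8 or later, and note that the two branches count from different epochs (1601-01-01 vs 1970-01-01), so the values are only comparable within one platform:

```cpp
#include <cstdint>

#ifdef _WIN32
#  include <windows.h>

// Windows: FILETIME is already expressed in 100 ns ticks (counted from 1601-01-01).
std::int64_t now_in_ticks()
{
    FILETIME ft;
    GetSystemTimePreciseAsFileTime(&ft);   // Windows 8+; sub-microsecond precision
    ULARGE_INTEGER li;
    li.LowPart  = ft.dwLowDateTime;
    li.HighPart = ft.dwHighDateTime;
    return static_cast<std::int64_t>(li.QuadPart);
}

#else
#  include <time.h>

// Linux/POSIX: clock_gettime reports nanoseconds (counted from 1970-01-01),
// so fold the nanosecond field into 100 ns ticks by hand.
std::int64_t now_in_ticks()
{
    timespec ts{};
    clock_gettime(CLOCK_REALTIME, &ts);
    return static_cast<std::int64_t>(ts.tv_sec) * 10'000'000
         + ts.tv_nsec / 100;
}

#endif
```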