I have a problem with the Timer object. I guess everyone knows that Timer objects are not very precise: if I run a timer with a 100-millisecond interval for what should be 1 minute of actual time, the ticks only add up to about 58,000 milliseconds. So there is a difference of roughly 2 seconds (I'm not sure of the exact amount, but there is a difference) between the actual time and the timer's time.
Is there any way I could correct this, or write a timer function using an API such as timeGetTime?
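One common fix (not VB-specific, so here is a sketch in Python rather than VB): don't count timer ticks, because each tick's lateness accumulates. Instead, read a clock once at the start (that's the role timeGetTime would play in VB) and compute each tick's deadline as an absolute time, start + n * interval. Then a late tick never pushes the following ticks later. The function name `run_for` and the use of `time.monotonic` are my own choices for illustration:

```python
import time

def run_for(total_seconds, interval, on_tick=None):
    """Fire ticks at start + n*interval for total_seconds of wall-clock time.

    Each deadline is computed from the start time, so per-tick lateness
    does not accumulate the way counting fixed intervals does.
    """
    start = time.monotonic()            # read the clock once, like timeGetTime at startup
    ticks = int(total_seconds / interval)
    for n in range(1, ticks + 1):
        deadline = start + n * interval # absolute deadline, no accumulated error
        delay = deadline - time.monotonic()
        if delay > 0:
            time.sleep(delay)           # sleep only the remaining time, not the full interval
        if on_tick:
            on_tick(n)
    return time.monotonic() - start     # actual elapsed wall-clock time
```

In VB the equivalent would be to keep the Timer control but, inside each Timer event, compare timeGetTime() against the stored start value to decide whether the full minute has really elapsed, rather than counting how many times the event has fired.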