I just realized I was thinking in binary when I said "several tens", LOL. Still too much, though.

Microseconds (10^(-6) seconds) are only three orders of magnitude smaller than milliseconds (10^(-3) seconds).
That really depends. If a game's frame calculations take about as long as the timer granularity, then it runs fine. If they take much more or much less, there will be noticeable computational errors, especially if there are fluctuations in time consumption.

In most cases, 0.8.0's delta time is not noticeably less 'smooth' than 0.9.0's.
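The granularity argument can be made concrete with a small simulation. This is a sketch of my own (not code from the project): it models a timer that can only report time in fixed-size ticks, and shows that when the frame time is much smaller than the tick size, the measured deltas degenerate into runs of zero followed by one oversized value.

```python
def measure_deltas(frame_time, granularity, frames):
    """Simulate measuring per-frame delta times with a coarse timer.

    frame_time  -- the true duration of each frame, in seconds
    granularity -- the smallest step the timer can report, in seconds
    frames      -- how many frames to simulate

    Returns the list of deltas the game would observe; each is an
    integer multiple of `granularity`, not the true frame time.
    """
    deltas = []
    prev_tick = 0
    t = 0.0
    for _ in range(frames):
        t += frame_time
        tick = int(t / granularity)   # what a coarse timer would read
        deltas.append((tick - prev_tick) * granularity)
        prev_tick = tick
    return deltas

# Frames of 5 ms measured with a 16 ms timer: most frames appear to
# take zero time, and every third frame appears to take 16 ms.
# measure_deltas(0.005, 0.016, 10)
# → [0.0, 0.0, 0.0, 0.016, 0.0, 0.0, 0.016, 0.0, 0.0, 0.016]
```

When the frame time is close to the granularity, each measured delta is off by at most one tick, which is the "runs fine" case; the pathological output above is what "much less than the granularity" looks like.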
Because why not? And yes, I perfectly realize that any speed difference is negligible. But I stick to my philosophy of making it fast wherever possible. You know, you make this slow and you make that slow, and then you realize the whole thing is god damn slow. I kid you not, I once installed a DOSBox front-end that took TWO MINUTES to start up on my high-end PC at 100% CPU load, and every screen redraw took between 1 and 10 seconds. "Ridiculously slow" doesn't even begin to describe it. That's what happens when you don't pay enough attention to performance and "just write". Also, the British stock exchange server failure was due to inadequately performing software, so they switched to another system; granted, they also switched from Windows to Linux, but that alone doesn't make program execution several times faster.

I'm confused about why you chose to use local variables in the loop: the code is a bit harder to understand now (partly due to the naming), and if a few local instead of global variable accesses in the outermost run loop are causing measurable performance differences, there is likely little or no other code in that loop at all.
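The actual loop isn't shown here, but the kind of micro-optimization being debated — hoisting a global or attribute lookup into a local before a hot loop — looks like this in Python (the function and variable names are mine, for illustration only):

```python
def build_list_plain(items):
    """Straightforward version: `out.append` is looked up on every iteration."""
    out = []
    for item in items:
        out.append(item)
    return out

def build_list_hoisted(items):
    """Micro-optimized version: the attribute lookup is done once, outside
    the loop, and bound to a local name. Local-name access is cheaper than
    a global or attribute lookup in CPython, but the result is identical."""
    out = []
    append = out.append   # hoisted lookup; this is the readability cost
    for item in items:
        append(item)
    return out
```

Both functions return the same result; the hoisted version only pays off when the loop body is nearly empty, which is exactly the reviewer's point — if the hoist is measurable, the loop isn't doing much else.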