Unit testing for speed?


Simon is thinking about using unit testing to help with performance testing. Whilst I've found it useful to use unit tests as very focussed ways to run a profiler on a selection of code, I don't think it's a good idea to tie the success of a test to its completing within a particular length of time.

After all, you might decide to run all of your tests under a tool such as BoundsChecker, and it would be a pity to get false failures just because the tests run slower under another tool or on another machine.

Then again, I think it could be useful to display the time it took to run each test; that way you may notice that the tests are suddenly taking longer due to a design change, or whatever...
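The report-but-don't-fail idea might look something like this. A minimal sketch in Python (the post's context is C++, and names like TimedTestCase are mine): elapsed time is printed for information, while pass/fail remains entirely up to the assertions.

```python
import time
import unittest

class TimedTestCase(unittest.TestCase):
    """Record each test's elapsed time and report it, without ever
    failing a test just because it ran slowly."""

    def setUp(self):
        self._start = time.perf_counter()

    def tearDown(self):
        elapsed = time.perf_counter() - self._start
        # Report only; pass/fail is decided by the assertions alone.
        print(f"{self.id()}: {elapsed:.3f}s")

class ExampleTest(TimedTestCase):
    def test_sort(self):
        data = list(range(10_000, 0, -1))
        self.assertEqual(sorted(data)[0], 1)

if __name__ == "__main__":
    unittest.main(exit=False, verbosity=0)
```

A run under a heavyweight tool would then show inflated times in the output without producing any false failures.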

More thought needed, I think.

1 Comment

I don't think that it's a bad idea. Going down that path could lead to some interesting results, but it would need, well... some testing.

The unit tests would need to be relative to something. Maybe seconds/MHz would have to be less than 0.1. Or you could compare the runtime between different sizes of dataset to get an approximation of the big-O complexity. That way you could find out if someone shoves some really slow code into the middle of a well optimized algorithm.
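The dataset-comparison idea might be sketched like this (the function names are mine, and wall-clock ratios are noisy, so a real test would need generous margins): time the code at two input sizes and look at the ratio, which grows roughly with the algorithm's complexity rather than with the machine's speed.

```python
import time

def quadratic_pairs(data):
    # Deliberately O(n^2): touches every pair of elements.
    total = 0
    for x in data:
        for y in data:
            total += x < y
    return total

def measure(func, n, repeats=3):
    """Best-of-N wall-clock time for func on an input of size n."""
    best = float("inf")
    for _ in range(repeats):
        data = list(range(n, 0, -1))
        start = time.perf_counter()
        func(data)
        best = min(best, time.perf_counter() - start)
    return best

def growth_ratio(func, n, factor=4):
    """Time func at sizes n and n*factor and return the runtime ratio:
    roughly `factor` for O(n), roughly factor**2 for O(n^2)."""
    return measure(func, n * factor) / measure(func, n)
```

For example, `growth_ratio(quadratic_pairs, 400)` comes out well above `growth_ratio(sum, 400)`, flagging the quadratic code regardless of how fast the machine is.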

In the end, if a requirement is that it has to run in 10 seconds on a 1000 MHz computer, or that it has to get 30fps on a 500 MHz computer, one should test for that. The question is how you make the tests precise enough to narrow down which code changes actually make it fail.
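One way to make such a requirement portable across machines, in the spirit of the seconds-per-MHz suggestion above, is to calibrate against a fixed busy-loop and express measured times as multiples of that baseline. A rough sketch, with names of my own choosing:

```python
import time

def calibrate(iterations=1_000_000):
    """Time a fixed busy-loop to get a rough speed index for this machine."""
    start = time.perf_counter()
    x = 0
    for i in range(iterations):
        x += i
    return time.perf_counter() - start

def normalized(elapsed, baseline):
    """Express a measured time as a multiple of the calibration baseline,
    so a performance threshold follows the machine rather than the clock."""
    return elapsed / baseline
```

A test could then assert something like `normalized(run_time, baseline) < BUDGET` instead of `run_time < 10.0`, so the same threshold holds on a faster or slower box.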
