On Tue, Oct 17, 2017 at 09:02:18AM -0500, Dan Rue wrote:
> What I would like to see, and I don't know if it is even possible, is something that actually measures test coverage based on code paths in the linux kernel so that we have a means to actually measure our effectiveness. If we knew we were testing 43% (number pulled out of thin air) of the linux kernel code paths, then we would know what areas to focus on to bring that number up, and we would know which subsystems to have some confidence in, and which are uncovered.
Please read: http://blog.ploeh.dk/2015/11/16/code-coverage-is-a-useless-target-measure/
I worked with a team of developers over a decade ago on code-coverage analysis of the Linux kernel (many of those tests ended up in LTP). I'm pretty sure the ability is still there, but in the end it turned out to mean nothing at all.
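For reference, the in-tree ability mentioned here still exists as the kernel's gcov support. A rough sketch of how such a coverage run looks today, assuming a kernel built with CONFIG_GCOV_KERNEL and CONFIG_GCOV_PROFILE_ALL (paths per Documentation/dev-tools/gcov.rst; the test script name is a placeholder):

```shell
# Sketch only: collect kernel code-coverage counters via debugfs gcov.
# Assumes CONFIG_GCOV_KERNEL=y and CONFIG_GCOV_PROFILE_ALL=y.
mount -t debugfs none /sys/kernel/debug 2>/dev/null

# Run the workload whose coverage you want to measure.
./run-ltp-subset.sh          # hypothetical placeholder test script

# Capture the counters and render an HTML report with lcov/genhtml.
lcov --capture --directory /sys/kernel/debug/gcov -o kernel.info
genhtml kernel.info -o coverage-report

# Reset all counters between runs for per-test numbers.
echo 1 > /sys/kernel/debug/gcov/reset
```

Even with a report like this in hand, the percentage it prints says nothing about whether the exercised paths were exercised *meaningfully*, which is the point being made here.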
Heck, even when you turn on fun things like "fail kmalloc() X% of the time to exercise error paths", you still don't really test the overall system.
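The "fail kmalloc() X% of the time" knob is the kernel's fault-injection framework. A minimal sketch of enabling it, assuming a kernel built with CONFIG_FAILSLAB and CONFIG_FAULT_INJECTION_DEBUG_FS (knob names per Documentation/fault-injection/fault-injection.rst; the workload script is a placeholder):

```shell
# Sketch only: make the slab allocator fail a fraction of allocations
# so that error paths get exercised. Requires CONFIG_FAILSLAB=y.
cd /sys/kernel/debug/failslab

echo 10  > probability   # fail roughly 10% of eligible allocations
echo 100 > interval      # only consider every 100th allocation
echo -1  > times         # -1 = keep injecting with no limit
echo 1   > verbose       # log each injected failure to dmesg

# Run the workload and watch dmesg for oopses on the error paths.
./stress-test.sh         # hypothetical placeholder workload
```

And, as the paragraph above says: even with every error path forced this way, you still have not tested how the system behaves as a whole.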
So please, never think in terms of code coverage. Feature coverage, like what LTP is trying to accomplish, is a great metric to strive for.
thanks,
greg k-h