I have unit tests. If one of them fails, my build fails.
I would like to apply the same principle to performance. I have a series of microbenchmarks for several hot paths through a library. Empirically, slowdowns in these areas have a disproportionate effect on the library's overall performance.
It would be nice to have some notion of a "performance build" that fails when a performance regression exceeds some tolerance.
I had considered hard-coding thresholds that must not be exceeded. Something like:
```csharp
Assert.IsTrue(hotPathTestResult.TotalTime <= threshold);
```
but pegging that to an absolute value is hardware- and environment-dependent, and therefore brittle.
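For concreteness, this is roughly the shape of test I had in mind. `Library.HotPath.Process` and the 50 ms budget are placeholders for my actual hot path and whatever number happens to pass on my machine:

```csharp
using System;
using System.Diagnostics;
using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class HotPathPerformanceTests
{
    [TestMethod]
    public void Process_StaysWithinTimeBudget()
    {
        const int iterations = 10_000;
        // Arbitrary budget: only meaningful on the machine it was measured on.
        TimeSpan threshold = TimeSpan.FromMilliseconds(50);

        // Warm up so JIT compilation doesn't count against the budget.
        Library.HotPath.Process(); // hypothetical hot-path entry point

        var stopwatch = Stopwatch.StartNew();
        for (int i = 0; i < iterations; i++)
        {
            Library.HotPath.Process();
        }
        stopwatch.Stop();

        Assert.IsTrue(stopwatch.Elapsed <= threshold,
            $"Hot path took {stopwatch.Elapsed} for {iterations} iterations; budget is {threshold}.");
    }
}
```

It does fail the build when the hot path slows down, but the hard-coded budget is exactly the part that worries me.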
Has anyone implemented something like this? What does Microsoft do for Kestrel?