It seems to be an absolute truth in the development community that unit testing is a must-have, and that you should add it whatever it costs (I know it's not quite that absolute). Let me play devil's advocate here.
Management wants to introduce unit testing in the hope of minimizing regression mistakes in every development cycle. <- Here is where I think we may be applying the wrong remedy.
It's an MVC web application with a good level of decoupling, but with extensive .js code, stored procedures, etc. that are not readily testable. Much of the time, the regression errors happen because of incorrect implementations or merge errors.
So I'm not asking how to add unit testing to an existing codebase; that is amply answered in the link at the bottom. My initial plan was to build integration tests covering many scenarios, which would exercise the "whole" app. That seems more valuable than 5000+ unit tests. Then we could add unit tests as we go and see whether the benefit proves itself, if it really does.
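To make the comparison concrete, here is a minimal sketch of the kind of scenario-level integration test I have in mind: one that drives the app through its public HTTP surface so the routing, controller, services, and the stored-procedure-backed data access are all exercised together. The names (`app`, the `/orders` endpoints, the response shape) are hypothetical, and it assumes a Node/Express-style app tested with Jest and supertest; for an ASP.NET MVC app the equivalent would be an in-memory test host, but the idea is the same.

```typescript
// Hypothetical scenario-level integration test (assumed Jest + supertest stack).
import { describe, expect, it } from "@jest/globals";
import request from "supertest";
import { app } from "../src/app"; // hypothetical app entry point

describe("Order checkout scenario", () => {
  it("creates an order and returns its summary", async () => {
    // Drive the app the way a user (or browser) would: through its routes.
    const created = await request(app)
      .post("/orders")
      .send({ productId: 42, quantity: 2 })
      .expect(201);

    // The follow-up read verifies that the whole pipeline - controller,
    // services, repository/stored procedure - agrees on what was persisted.
    const summary = await request(app)
      .get(`/orders/${created.body.id}`)
      .expect(200);

    expect(summary.body.quantity).toBe(2);
  });
});
```

A few dozen of these scenarios would cover the merge-error and wrong-implementation regressions we actually see, which is why they feel more valuable to me than thousands of fine-grained unit tests.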
In addition, some of the claimed benefits of unit testing seem vague to me: it lets you replace frameworks without breaking the app, and it lets you refactor code without breaking the app.
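As I understand the claim, it looks something like the sketch below: a unit test pins the observable behaviour of one small, pure piece of logic, so if someone later rewrites the internals, the test still has to pass. The function and the business rule here are invented purely for illustration.

```typescript
// Hypothetical unit test pinning the behaviour of a small, pure function.
import { describe, expect, it } from "@jest/globals";

// Assumed business rule (illustrative only): 10% off orders of 10+ items.
export function discountedTotal(unitPrice: number, quantity: number): number {
  const total = unitPrice * quantity;
  return quantity >= 10 ? total * 0.9 : total;
}

describe("discountedTotal", () => {
  it("applies no discount below 10 items", () => {
    expect(discountedTotal(5, 9)).toBe(45);
  });

  it("applies a 10% discount at 10 items or more", () => {
    expect(discountedTotal(5, 10)).toBeCloseTo(45);
  });
});
```

My doubt is not whether this works for functions like that one; it's how much of our actual regression surface (views, .js glue, stored procedures, merges) is made of code that can be tested this way without substantial rework.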
Now, I ask:
Does it effectively minimize regression errors?
Can you write unit tests without rewriting the app substantially?
Can you promise that refactoring code won't generate expensive new bugs? (I know this is not a valid question.) How do you explain to the business that you broke the app while refactoring?
What about code history? Sometimes it is very important for auditing to know why some code was introduced, and refactoring loses that context; if you're lucky, you will only recover it after digging back through source control for a long time.
I know that reading this makes me sound like one of those closed-minded people who will never change their opinion; I promise I'm not!
At the end of the day, what we need is stability, much more than avoiding a handful of re-opened defects. And I'd like to find the most effective path to get started.
Last but not least, I did read this other thread, which is brilliant.
Please share your thoughts.
Thanks