I thought I would throw this out to the group and see if anyone might be able to assist me:
Are there any industry standards for calculating working hours for defect creation, development & validation (this would be a single hourly cost multiplier for all three areas)? The multiplier would probably differ with each phase of the SDLC (which I would need to know as well). I know that companies like Forrester, Gartner & Info-Tech exist for these types of issues, but I just don't have access to their data.
I've wondered the same thing. There are probably reports out there, but when I think about how to come up with a standard here, it boggles my mind. It would be so specific, as you pointed out...depending on phase, company size, programming language, program size/complexity, product cost, defect priority/severity, etc. And some of these are so subjective.... It is hard for me to imagine the worth, but I'm sure it's been done, especially with respect to a single company rather than an industry.
For companies with a similar enough data set, sure, there might be parallels in the numbers, but I have never seen a standard on this. And like most numbers, which can be manipulated any way you want, would it be worth it? There are generic graphs that show how the cost of detecting and fixing defects increases through the SDLC, but a formula for that might be going too far. What are you trying to do with such a formula or number?
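For what it's worth, the formula everyone seems to gesture at would look something like the sketch below. Every number in it is a made-up placeholder (the phase names, multipliers, and rate are illustrative, not sourced figures) — it just shows the shape of "hours across the three areas × hourly rate × phase multiplier":

```python
# Hypothetical sketch of the kind of formula being discussed: total defect
# cost = (creation/analysis hours + development hours + validation hours)
# * blended hourly rate * a per-phase multiplier. All numbers are placeholders.

# Illustrative multipliers reflecting the common claim that defects cost
# more the later in the SDLC they are found. NOT industry-standard values.
PHASE_MULTIPLIER = {
    "requirements": 1.0,
    "design": 2.0,
    "coding": 5.0,
    "testing": 10.0,
    "production": 30.0,
}

def defect_cost(analysis_hours, dev_hours, validation_hours,
                hourly_rate, phase):
    """Estimated cost of one defect found in the given SDLC phase."""
    hours = analysis_hours + dev_hours + validation_hours
    return hours * hourly_rate * PHASE_MULTIPLIER[phase]

# Example: a defect taking 2 + 6 + 3 hours at $80/hr, caught in testing.
print(defect_cost(2, 6, 3, 80, "testing"))  # 11 * 80 * 10 = 8800.0
```

The hard part, as the thread points out, isn't the arithmetic — it's that the multipliers would have to be calibrated per company (or per project), which is exactly the data nobody seems to publish.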