What struck me -- from the time that Tom first suggested this as a topic -- is the difficulty that most people would have in QAing a spreadsheet. Unless you're a wizard with Excel (or a competing product, but we all know it's mainly Excel), you just plug in data and formulae and hit a couple of Sum or Avg function buttons. I've never seen a really decent QA process that ensures a spreadsheet's accuracy. Granted, I've never gone out of my way to look... but software development, at least, acknowledges that the person who enters data may not do so correctly.
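One low-tech check that does exist is to recompute a sheet's aggregates independently and compare them to the totals the spreadsheet reports. The sketch below is a minimal, hypothetical example (the CSV layout, the `TOTAL` label, and the `check_totals` helper are all assumptions, not a standard tool): it assumes the sheet has been exported to CSV with a final totals row, re-sums each column in Python, and flags any column whose stored total disagrees.

```python
import csv
import io

def check_totals(csv_text, total_label="TOTAL"):
    """Recompute column sums from the data rows and compare them to the
    spreadsheet's own totals row. Returns a list of (column, expected,
    found) tuples for every mismatch."""
    rows = list(csv.reader(io.StringIO(csv_text)))
    header, *body = rows
    totals = body.pop()  # assume the last row holds the sheet's totals
    assert totals[0] == total_label, "expected a totals row at the bottom"
    mismatches = []
    for col in range(1, len(header)):
        expected = sum(float(r[col]) for r in body)
        found = float(totals[col])
        if abs(expected - found) > 1e-9:
            mismatches.append((header[col], expected, found))
    return mismatches

# Sample export in which the Q2 total was mistyped in the sheet.
sample = """Month,Q1,Q2
Jan,100,200
Feb,110,210
TOTAL,210,400
"""
print(check_totals(sample))  # → [('Q2', 410.0, 400.0)]
```

It's a crude oracle, but even this catches the classic failure mode: a SUM range that silently stopped including rows as the sheet grew.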
It's harder to test anything that:
- was developed without any real requirements
- has no documentation or comments
- was created by non-professional developers
- has no change control
Often (although not always), spreadsheet creation isn't viewed as "software development," and so it isn't considered worthy of a real "QA process."
To me, it's not the fact that there's a spreadsheet involved that makes this situation a challenge. The same problems would exist if you allowed an accountant to build financial systems in C++ and didn't require any real process.
Too many testing organizations use spreadsheets to track defects, requirements, and test case status. There are just too many places in a spreadsheet for a tester to make mistakes. Spreadsheets end up growing and growing, becoming more and more complex, to the point where no one can manage them effectively anymore. Testing organizations need to invest in proper testing tools to ensure their progress is tracked correctly.
Something tells me this handful of examples is the tip of the iceberg with regard to accidental (or not) accounting-related spreadsheet mistakes. Given the complexity of some spreadsheets, it's almost unthinkable that such errors aren't the norm.
Joe Strazzere's first post in this thread made excellent points.