In my last post, I said most people are arrogant and overconfident when it comes to reporting in Excel. It’s human nature to be overconfident. But the fact that accountants and analysts don’t test Excel spreadsheets in any proper manner is a serious issue! My background is in Finance, Business Intelligence, and corporate performance management, with a healthy dose of programming thrown in. The funny thing is, I run tests for everything except my spreadsheet work! So, yeah, I am just like you when it comes to Excel and testing. But I have chosen to change.
This led me to see what I could do to improve my spreadsheets. First, I had to have a stern talk with my ego: when it comes to spreadsheeting, there are errors, and I must test for them. Remember, 2%–5% of all formulas in spreadsheets used for corporate decision making contain errors. And the larger the workbook, and the more workbooks linked to it, the greater the margin for error. Yes, material errors!
In programming, the initial error rates are much the same. The difference, of course, is testing. After testing, errors are reduced to 0.1%–0.3%. Much better!
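What does “testing a spreadsheet” actually look like? One lightweight starting point, shown here as a minimal sketch rather than a full testing framework, is to export a report to CSV and run automated checks over it. Everything below is hypothetical: the account names, figures, and function names are illustrations, not part of any particular product or method.

```python
import csv
import io

# A toy month-end report exported from Excel to CSV.
# The accounts and figures are made up for illustration.
report_csv = """Account,Amount
Revenue,120000
COGS,-45000
Opex,-30000
Total,45000
"""

def total_reconciles(csv_text):
    """Check that the 'Total' row equals the sum of the detail rows."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    detail = [r for r in rows if r["Account"] != "Total"]
    total_row = next(r for r in rows if r["Account"] == "Total")
    computed = sum(float(r["Amount"]) for r in detail)
    return computed == float(total_row["Amount"])

def no_error_values(csv_text):
    """Scan for Excel error strings that survive an export."""
    errors = {"#REF!", "#VALUE!", "#DIV/0!", "#NAME?", "#N/A"}
    return not any(cell in errors
                   for row in csv.reader(io.StringIO(csv_text))
                   for cell in row)

print(total_reconciles(report_csv))  # True: 120000 - 45000 - 30000 == 45000
print(no_error_values(report_csv))   # True: no Excel error strings present
```

Checks like these catch the classic failure mode of a hard-coded total that drifts away from its detail rows, and they can be rerun automatically every month, which is exactly what manual cell-by-cell review cannot do.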
One way to achieve near error-free Excel reports is to move your deadlines and add more people to the team. This gives you enough time and manpower to review and test the spreadsheets for errors. So every month you will report later, but more accurately, and you will have increased your team’s headcount by at least 3 people to catch about 60% of the errors in your spreadsheets.
Fagan (1976, 1986) developed a formal inspection methodology at IBM for removing bugs from software, and the same approach can be applied to spreadsheets. It is a logical inspection method that requires ALL cells to be reviewed by the team. Studies have shown that individuals working alone find fewer than half of all errors, about 40% or less (Basili & Selby, 1986; Johnson & Tjahjono, 1997; Myers, 1978; Porter, Votta, & Basili, 1995; Porter & Johnson, 1997; Porter, Sly, Toman, & Votta, 1997; Porter & Votta, 1994). Johnson and Tjahjono’s (1997) study found that, on average, a team of 3 people finds significantly more errors than 1 person alone.
Panko’s study (1999) found that CPAs caught slightly more errors than MBAs, and that experienced spreadsheet users didn’t find more errors than inexperienced ones. Essentially, you need a new team, and because CPAs find the most errors, that will cost you an extra $300k a year.
For detail on best practices for accurate reporting with Excel, read Panko (2007), “Recommended Practices for Spreadsheet Testing”.
Instead of investing in a new team of testers and slowing the reporting process down, get a BI reporting tool like CALUMO. It will resolve the problem and save you hundreds of thousands of dollars in the process.
BI reporting tools like CALUMO take the complex business modeling out of Excel and into a database of cubes and tables that are easily tested and reviewed, removing significant errors and risk. When new data arrives, it is added to a trusted model rather than to yet more spreadsheets with yet more errors. If a new report is required, the model exposes the required calculations and aggregations so you can build it by dragging and dropping.
You will also save time and money. You save time by removing the manual work in Excel. I have clients who once spent 2 weeks preparing their month-end reports; it now takes 1 day with CALUMO. They also let 2 people go from the finance team, because those 2 people spent all their time creating, maintaining, and “checking” Excel reports.
So your choices are:
1. Stay with the error-ridden and high-risk spreadsheets.
2. Add 3 people to your team and extend your deadlines.
3. Use a BI tool like CALUMO for near error-free reporting, much faster than today, and save money along the way.
Basili, V. R. & Selby, R. W., Jr. (1986), “Four Applications of a Software Data Collection and Analysis Methodology”, Software System Design Methods. Ed. J. K. Skwirzynski. Berlin: Springer-Verlag. 3-33.
Fagan, M. E. (1986), “Advances in Software Inspections”, IEEE Transactions On Software Engineering, July 1986, pp. 744-751.
Fagan, M. E. (1976), “Design and Code Inspections to Reduce Errors in Program Development”, IBM Systems Journal, 15(3), 182-211.
Johnson, P. & Tjahjono, D. (1997, May), “Exploring the Effectiveness of Formal Technical Review Factors with CSRS, A Collaborative Software Review System”, Proceedings of the 1997 International Conference on Software Engineering, Boston, MA.
Myers, G. J. (1978, September), “A Controlled Experiment in Program Testing and Code Walkthroughs/Inspections”, Communications of the ACM, 21(9), 760-768.
Panko, R.R. (2007) “Recommended Practices for Spreadsheet Testing” EuSpRIG 2006 Proceedings, Cambridge, UK, Sponsored by the European Spreadsheet Risks Interest Group (EuSpRIG), July 5-7, 2006.
Panko, R. R. (1999, Fall), “Applying Code Inspection to Spreadsheet Testing”, Journal of Management Information Systems, 16(2), 159-176.
Porter, A., Votta, L. G., Jr., & Basili, V. R. (1995), “Comparing Detection Methods for Software Requirements Inspections: A Replicated Experiment”, IEEE Transactions on Software Engineering, 21(6), 563-575.
Porter, A. A., & Johnson, P. M. (1997), “Assessing Software Review Meetings: Results of a Comparative Analysis of Two Experimental Studies”, IEEE Transactions on Software Engineering, 23(3), 129-145.
Porter, A. A., Sly, H. P., Toman, C. A., & Votta, L. G. (1997), “An Experiment to Assess the Cost-Benefits of Code Inspections in Large Scale Software Development”, IEEE Transactions on Software Engineering, 23(6), 329-346.
Porter, A. A., & Votta, L. G. (1994, May 16-21), “An Experiment to Assess Different Defect Detection Methods for Software Requirements Inspections”, Proceedings of the 16th International Conference on Software Engineering, Sorrento, Italy.