Report Bug Tracking
February 12, 2010 11:33 AM   Subscribe

I have been tasked with leading report writing efforts for a new core banking system. We have spec'ed out around 500 reports and will be using OBIEE as our reporting / business intelligence tool. Has anyone worked with a methodology for developing and testing reports? It is easy to find articles and books on software development and testing, but reports seem to be a different animal. Maybe not? I am looking for process, procedures, and test cases that could be applied specifically for reports. Anyone have any experience with large scale report writing on a new enterprise system?
posted by kaizen to Computers & Internet (5 answers total)
 
I've worked spec'ing out reporting functionality in a couple of enterprise software products. Pure gruntwork. You design the reports users request or that you imagine they want and then you run a bunch and manually check to ensure they show the right thing. No magic really. If you've got 500, you either do it 500 times or do it 100 times and hope your failure rate is constant.
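The manual check described above can at least be semi-automated with a golden-file comparison: save an approved export of each report once, then diff every fresh run against it. A minimal sketch in Python, assuming the reports can be exported as CSV (the function and sample data are hypothetical, not anything OBIEE provides):

```python
import csv
import io

def report_matches_baseline(report_csv: str, baseline_csv: str) -> bool:
    """Compare a freshly generated report export to its approved baseline, row by row."""
    report_rows = list(csv.reader(io.StringIO(report_csv)))
    baseline_rows = list(csv.reader(io.StringIO(baseline_csv)))
    return report_rows == baseline_rows

# Example: a one-row report compared against its approved baseline.
baseline = "account,balance\nA-100,250.00\n"
fresh = "account,balance\nA-100,250.00\n"
print(report_matches_baseline(fresh, baseline))  # True
```

With 500 reports, even a crude harness like this turns "run a bunch and manually check" into a repeatable regression suite: the human effort goes into approving each baseline once.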
posted by GuyZero at 11:36 AM on February 12, 2010


Also, like all software, your reports will not launch 100% perfect, so you need a bug-reporting channel for when users find stuff that's wrong. Maybe that's obvious, but unless you do infinite testing and/or formally prove each report generation query, you can't guarantee anything, so you're going to need a feedback mechanism.
posted by GuyZero at 11:39 AM on February 12, 2010


What type of banking are we talking about here?

Considering just a couple of broad divisions - retail or investment banking - would offer different opportunities for corroborating the reports you're proposing to implement.

The specifics differ, but a large number of the reports you're likely to be creating for either group will tie back to regulatory-driven reporting. This linkage provides a natural control mechanism: what's (presumably already) being reported for regulatory purposes can, for all intents and purposes, be considered your gold standard; after all, the regulators are signing off on those reports. As you develop reports, they will tie back, either partially or in their entirety, to the regulatory-driven ones.

Aside from that, you're going to have to structure your implementation / QA testing / user acceptance testing hierarchically, so you're always building upon reports you already know to be correct. Reports detailing nothing of interest to the regulators will probably, depending upon complexity, require the most time to validate.
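One way to express that control against the regulatory gold standard: assert that each new report's control total reconciles to the corresponding regulatory figure within a tolerance. A minimal, hypothetical Python sketch (the function name, figures, and tolerance are illustrative only):

```python
def reconciles(report_total: float, regulatory_total: float, tolerance: float = 0.01) -> bool:
    """Check a new report's control total against the regulatory gold-standard figure."""
    return abs(report_total - regulatory_total) <= tolerance

# A report whose total matches the regulatory figure passes;
# a 50.00 discrepancy fails and flags the report for investigation.
print(reconciles(1_000_000.00, 1_000_000.00))  # True
print(reconciles(1_000_000.00, 999_950.00))    # False
```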

Pay a lot of attention to creating appropriate test cases; carefully structured data here can help identify bugs long before the product ends up in user acceptance testing.

Keep in mind that pretty much any bank will have very simple and very, very complex cases in actual practice; data mining can effectively drive the creation of appropriate test cases. It's not uncommon to see a baseline number of test cases sharply increase (doubling or tripling, perhaps) during regression testing.
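As a toy illustration of mining live data for test cases, here's a sketch that pulls boundary candidates (extremes, zero and negative balances) out of a plain list of values; the names and the notion of which values count as "interesting" are invented for the example:

```python
def boundary_test_cases(balances):
    """Pull boundary candidates from live data to seed report test cases:
    the extremes plus any zero or negative balances."""
    cases = {min(balances), max(balances)}
    cases.update(b for b in balances if b <= 0)
    return sorted(cases)

print(boundary_test_cases([250.0, -10.0, 0.0, 1_000_000.0, 42.0]))
# [-10.0, 0.0, 1000000.0]
```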

Once again, more than likely test cases will already exist that were used to confirm regulatory driven reporting.

Rather generic advice, I realise, but I can't be more specific without knowing the type of banking we're talking about here.
posted by Mutant at 12:18 PM on February 12, 2010


My experience with reports is that the initial specification of what the user wants to see will invariably be wrong. Once they see a copy of a live report, they will have about 10 changes to make. Then they will have one change every 6 months, forever. Bake that kind of uncertainty into any estimates.
posted by cseibert at 1:27 PM on February 12, 2010


When running new or changed reports over existing files or views, enable and pay attention to the database query optimizer output, and any analogous functions in your BI product. Reports that cause the query optimizer to create unique temporary indexes, or access paths, are suspect for missing records or mishandled data, for reasons such as join problems or data type mismatches. There is always a tradeoff between building task-based logical views over data files, which slow record-intensive database functions, and the speed/independence you may gain when running reports over those purpose-built logicals; but keeping an eye on those logicals (in terms of stats) can also help pinpoint report problems. New reports built over frequently used logicals that employ tested joins, sorts, and other manipulations are a lot easier to troubleshoot.
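To illustrate watching the optimizer's access paths, here's a small sketch using SQLite's EXPLAIN QUERY PLAN; the same idea applies to whatever plan output your production database and BI tool expose (the table and index names are made up):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE txn (acct TEXT, amt REAL)")

# With no index on acct, the planner falls back to a full table scan --
# the kind of access path worth flagging when reviewing a new report's query.
for row in conn.execute("EXPLAIN QUERY PLAN SELECT * FROM txn WHERE acct = ?", ("A-100",)):
    print(row[3])  # e.g. "SCAN txn"

conn.execute("CREATE INDEX idx_acct ON txn (acct)")

# After adding the index, the plan shows an index search instead.
for row in conn.execute("EXPLAIN QUERY PLAN SELECT * FROM txn WHERE acct = ?", ("A-100",)):
    print(row[3])  # e.g. "SEARCH txn USING INDEX idx_acct (acct=?)"
```

The exact plan text varies by SQLite version, but the pattern of checking for scans and ad-hoc access paths before a report ships is the same in any engine.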

Writing reports that are user parameterizable at runtime takes a little longer, but, in my experience, makes the resulting report a more useful tool for users, and one in which they are more reluctant to make fundamental changes.
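A runtime-parameterizable report boils down to binding user-supplied filters into the query instead of hard-coding them. A minimal sketch with SQLite (the schema, data, and function name are invented for illustration):

```python
import sqlite3

def branch_activity_report(conn, branch: str, since: str):
    """A runtime-parameterized report query: users supply the branch and cutoff date."""
    return conn.execute(
        "SELECT acct, amt FROM txn WHERE branch = ? AND posted >= ? ORDER BY acct",
        (branch, since),
    ).fetchall()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE txn (acct TEXT, branch TEXT, posted TEXT, amt REAL)")
conn.executemany("INSERT INTO txn VALUES (?, ?, ?, ?)", [
    ("A-100", "East", "2010-01-05", 250.0),
    ("A-200", "West", "2010-01-08", 90.0),
])
print(branch_activity_report(conn, "East", "2010-01-01"))  # [('A-100', 250.0)]
```

One parameterized report can then replace several near-duplicate fixed ones, which is part of why users are more reluctant to ask for fundamental changes to it.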
posted by paulsc at 9:47 PM on February 12, 2010

