The Data Analysis Maturity Model – Level Three: Distributed, consistent reporting systems
I’m covering a series of data analysis maturity levels, which are essential to performing Advanced Analytics. We’re often quick to adopt a new way of evaluating data while ignoring the fact that analysis is built on trustworthy data. An organization needs to follow a series of steps, starting with proper collection and continuing through a sound storage and processing strategy. As an aside, there’s a great reference to an ensemble approach to data storage and processing here. In the third level of analysis maturity, we concern ourselves with distributed, consistent reporting.
Unlike the previous level, consistency and consolidation in the toolset are not as important for reporting systems. The key in this maturity level lies in a central Data Dictionary and the use of abstracted views. If users know how to find the data they need and can quickly display it, the method they use (Excel, Access, Power BI, what have you) should be left up to them. In my estimation, up to 80% of an organization’s analytic needs can be met with effective reporting alone. This, of course, relies on the base data being clean, on the processing systems providing a good level of abstraction (such as database views), and on the data being secured properly so that each user has the correct controls on what they can access.
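To make the abstraction idea concrete, here is a minimal sketch using Python’s built-in sqlite3 module. The table and column names (`sales`, `region`, `amount`) are hypothetical; the point is that reporting tools query the view, never the base table, so the underlying schema can evolve without breaking reports.

```python
import sqlite3

# Minimal sketch of view-based abstraction; all object names are hypothetical.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (id INTEGER PRIMARY KEY, region TEXT, amount REAL);
    INSERT INTO sales (region, amount) VALUES
        ('East', 100.0), ('West', 250.0), ('East', 50.0);

    -- The view is the abstraction layer: reporting tools select from it,
    -- never from the base table.
    CREATE VIEW v_sales_by_region AS
        SELECT region, SUM(amount) AS total_amount
        FROM sales
        GROUP BY region;
""")

rows = conn.execute(
    "SELECT region, total_amount FROM v_sales_by_region ORDER BY region"
).fetchall()
print(rows)  # [('East', 150.0), ('West', 250.0)]
```

If the `sales` table is later split or renamed, only the view definition changes; every report built on `v_sales_by_region` keeps working.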
That tool proliferation comes with a caveat, however: users need to know how to use their reporting tool properly, which means access to training. Online training, much of it free, exists for nearly every reporting tool. The trick is to get users to take that training, and this should be part of any IT department’s charter: user education.
Another important consideration at this level is security. This is where flat-file systems are less useful for base data, because they have no inherent mechanism to limit what someone can see based on who they are, or to layer that access. An RDBMS, or some other engine that has an auditing mechanism, is kept patched, and supports role-based access, is more useful for securing the data. Views and other abstractions are additional ways to ensure that only the proper access is granted.
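As a hedged sketch of the role-based idea: SQLite has no built-in role system, so the example below simulates row-level security with a user-to-region mapping table joined into the query. All names (`orders`, `user_access`, the users `alice` and `bob`) are hypothetical; a full RDBMS would enforce this with real roles and grants instead.

```python
import sqlite3

# Hypothetical schema: each user may only see orders for regions
# they have been granted access to via the user_access table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (id INTEGER PRIMARY KEY, region TEXT, total REAL);
    INSERT INTO orders (region, total) VALUES
        ('East', 10.0), ('West', 20.0), ('North', 30.0);

    CREATE TABLE user_access (username TEXT, region TEXT);
    INSERT INTO user_access VALUES
        ('alice', 'East'), ('alice', 'West'), ('bob', 'North');
""")

def orders_for(conn, username):
    """Return only the rows the given user is entitled to see."""
    sql = """
        SELECT o.region, o.total
        FROM orders o
        JOIN user_access u ON u.region = o.region
        WHERE u.username = ?
        ORDER BY o.region
    """
    return conn.execute(sql, (username,)).fetchall()

print(orders_for(conn, "alice"))  # [('East', 10.0), ('West', 20.0)]
print(orders_for(conn, "bob"))    # [('North', 30.0)]
```

Wrapping this filter in a view or stored procedure keeps the security logic in one place rather than scattered across reports.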
Past that, I am in favor of putting data reporting “wherever the users live”: I’ve even made reports that output KPIs inside an Outlook panel. If the users live in e-mail, let them work with their data there; if in Excel, put the reporting there. Again, adoption is key. The more easily users can access the visualizations and outputs they need, the more effective the organization is as a whole.
What I normally set up within a reporting project follows what I call a “clicks paradigm”. It describes how data is pushed out to users, whether that data comes from OLAP or OLTP systems.
A "zero-click" report is "data that finds the user". It's an automated report or some other output mailed to the user or sent to their mobile device. I tend to keep these short and only for important information that needs to be reviewed or acted on quickly. No training is required to use this report, and its information should be designed to be “self-documenting”, in that the report itself explains how to interpret the data.
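A zero-click report might be assembled as follows. This is a minimal sketch using the standard-library `email.message` module; the KPI names, values, and targets are hypothetical, and the actual delivery step (e.g. via `smtplib`) is omitted. Each line carries its own metric, target, and status so the report explains itself.

```python
from email.message import EmailMessage

# Hypothetical KPIs: name -> (current value, target). Real systems would
# pull these from the reporting views discussed above.
kpis = {
    "Daily revenue": (12500.0, 10000.0),
    "Orders shipped": (42, 50),
}

lines = []
for name, (value, target) in kpis.items():
    # Self-documenting line: metric, value, target, and a plain status.
    status = "OK" if value >= target else "BELOW TARGET"
    lines.append(f"{name}: {value} (target {target}) - {status}")

msg = EmailMessage()
msg["Subject"] = "Daily KPI summary"
msg["To"] = "user@example.com"  # hypothetical recipient
msg.set_content("\n".join(lines))

print(msg.get_content())
```

The key design choice is brevity: only metrics that need quick review or action belong in a zero-click report.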
A "one-click" report is a static view that shows the results of a query either graphically or in tabular format (or both). Most company users live here. This type often includes role-based Dashboards and, once again, should be self-documenting, with hover-over explanations for interpretability.
A "two-click" report is a standard view (web or otherwise) that has dynamic selections to show various slices of data. Most managers live here. This is where user training becomes important, because the number of slicers or filters may require some sophistication to interpret properly.
A "three-click" report involves a tool that can create reports, such as Excel or Power BI, selecting data objects and displaying their output. Users at this level need to be documented, tracked, and trained to create these reports, and I often find that a repository of standard reports is useful, giving users a catalog from which to find a report they can use or modify. This increases overall organizational efficiency.
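The standard-report repository could be as simple as a tagged catalog that users search before building something new. This is a hedged sketch; the fields and entries are hypothetical, and a real implementation would likely live in a database or the reporting tool’s own workspace.

```python
from dataclasses import dataclass, field

# Hypothetical catalog entry for a standard report.
@dataclass
class ReportEntry:
    name: str
    owner: str
    tool: str                       # e.g. "Excel", "Power BI"
    tags: list = field(default_factory=list)

CATALOG = [
    ReportEntry("Sales by Region", "alice", "Power BI", ["sales", "region"]),
    ReportEntry("Inventory Aging", "bob", "Excel", ["inventory"]),
]

def find_reports(catalog, tag):
    """Return catalog entries whose tags include the search tag."""
    return [r for r in catalog if tag in r.tags]

print([r.name for r in find_reports(CATALOG, "sales")])  # ['Sales by Region']
```

Tracking the `owner` field also supports the documentation requirement for three-click users: it is always clear who created and maintains each standard report.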
In the next article, I’ll cover the next level of analysis maturity – practical Business Intelligence implementations.