
Data Analysis Is Not an End in Itself. The Truth Is: Not Everything That Can Be Counted Counts.

If you do not know what you actually want to know, it is very easy (provided the necessary tools are available) to convert every micro-development in the company, every process, every result into colorful charts with the tool of your choice. That keeps you busy for a while, and you end up with something to present to your colleagues that clearly shows everyone how hard you have been working over the last few days. Best of all: the more charts you produce, the more certain you can be that nobody will look at them closely and point out embarrassing mistakes. A triple win: the colleagues are impressed, the superiors are satisfied, and you can sit back and relax. Voilà!

In this narrative, the problem is obviously not just the protagonist's questionable work ethic, but above all the fact that data analysis is carried out purely for its own sake. And as far-fetched as the scene may seem, there is unfortunately an element of corporate truth to it, to this day: "The month is over - we have to report something." All the fancy charts that can be found (or produced quickly) are hastily put together, and the chore is done. But the analysis of data actually serves a completely different purpose than occupational therapy for producers and recipients: it serves to gain knowledge - wherever that knowledge can be put to effective use.

 

No, Gaining Knowledge Is Not an End in Itself Either

Because more information in itself is initially nothing more than exactly that: more information. In most cases, I don't need as much information as possible (see also my article Data is not the Highest Good of Digitalization), but the right information to guide my actions and steer decisions in the right direction. So it is usually not about mass, but about choosing the right data - and that depends on the objective of the respective user group:

 

1. Track Target Achievement: Less is More!

Regardless of whether they are classic sales or profit targets from the corporation, funnel goals from sales and marketing, or interdisciplinary goals from the corporate OKRs: if you want to work in a target-oriented manner, you should focus on a few key figures and, at reasonable intervals, compare only these with the status quo. If necessary, a few (a few!) supporting metrics can be helpful in reporting to provide more context and thus drive the supporting processes towards goal achievement alongside the respective core process. Here is an example of what such reports can look like.

This means: to track my target achievement, I don't need comprehensive dashboards that ultimately only dilute my work focus, but clearly tailored reports/dashboards that contain no more than the key figures relevant to me. The only exceptions are places where several target trackings converge, such as the corporate level or sales management as a position superior to several individual teams.
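To make this concrete, here is a minimal sketch of what such a tightly tailored plan/actual comparison could look like in code. All metric names and target values are purely illustrative assumptions, not taken from any real system:

```python
# Hypothetical sketch: a tailored plan/actual report limited to a handful of KPIs.
# Metric names and target values are illustrative, not from any real system.

def plan_actual_report(targets, actuals):
    """Compare a small, deliberately limited set of KPIs against their targets."""
    lines = []
    for kpi, target in targets.items():
        actual = actuals.get(kpi, 0.0)
        attainment = actual / target * 100 if target else 0.0
        lines.append(f"{kpi}: {actual:,.0f} / {target:,.0f} ({attainment:.1f}%)")
    return "\n".join(lines)

targets = {"revenue": 500_000, "new_customers": 1_200}
actuals = {"revenue": 430_000, "new_customers": 1_310}
print(plan_actual_report(targets, actuals))
```

The point of the sketch is the restriction itself: the report accepts only the few KPIs that were deliberately chosen, rather than everything the data warehouse could supply.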

 

2. Guide for Operational Work: Less is More!

In operational terms, data (apart from the regular plan/actual reporting, which is of course also relevant here) must generally guide action. Whether sales figures, margins, and return rates for the category manager or impressions, clicks, and order values for the marketing manager: the prepared data influences many small and large decisions every day - and should therefore be clearly tailored to the tasks of the person concerned.

This means: too much information in dashboards and reports should neither distract users from the urgent tasks of their generally tight daily schedule nor force them to work out the relevant key figures themselves through extensive analysis. A good approach is, for example, individually configured alerts for relevant developments (this is what this can look like in implementation), as well as in-depth but simply designed reports that make the relevant information quickly understandable.
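Such individually configured alerts can be thought of as simple per-user threshold rules evaluated against the day's figures. The following is a hedged sketch under assumed metric names and thresholds; real BI tools implement this with their own configuration syntax:

```python
# Hypothetical sketch: individually configured threshold alerts on daily KPIs.
# Rule format: metric -> (direction, threshold); all names/values are illustrative.

def check_alerts(metrics, rules):
    """Return an alert message for every rule whose threshold is breached."""
    alerts = []
    for metric, (direction, threshold) in rules.items():
        value = metrics.get(metric)
        if value is None:
            continue  # metric not delivered today - nothing to check
        if direction == "above" and value > threshold:
            alerts.append(f"ALERT: {metric} = {value} exceeded {threshold}")
        elif direction == "below" and value < threshold:
            alerts.append(f"ALERT: {metric} = {value} fell below {threshold}")
    return alerts

rules = {"return_rate": ("above", 0.15), "conversion_rate": ("below", 0.02)}
today = {"return_rate": 0.18, "conversion_rate": 0.025}
print(check_alerts(today, rules))
```

The design choice matters more than the code: the user only hears from the system when one of *their* thresholds is breached, instead of scanning a dashboard full of metrics that are behaving normally.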

 

3. Observe, Compare, Control: Less is More!

Systematic target tracking (see point 1) and active analysis work (see point 4 below) must be distinguished from simply observing, comparing, and checking developments using analytically prepared data. In management functions, in some operational areas, and of course in controlling, it can make sense to monitor the developments of the positions one is responsible for (or that relate to one's field of activity) in addition to the systematic plan/actual reporting and, if necessary, compare them with developments from comparison periods or the competition, for example - to stay informed and/or to ensure that things are going as they should.

This means: for such operational and controlling purposes, individually configurable, easy-to-filter dashboards are well suited in many cases - consumed directly in the BI/analysis system, sent by email in a daily/weekly/monthly rhythm, or projected onto the big screen on the office wall. The biggest trap in this scenario is surely the spontaneous mood of "Oh, that's interesting too!" - because if the monitoring dashboards are crammed with too much information, the regularity of reception disappears along with the clarity. They quickly degenerate into email spam that is deleted immediately upon receipt.

 

4. Discover Risks and Potentials: More Can Actually Be More

What remains is active analysis work - a task that is typically left to specialists due to the required expertise. In this case, more can actually be more, because more (or more thoroughly) analyzed data material increases the chance of discovering untapped potential or undetected risks. This applies all the more when the guiding question is less specific; if the analysis is purely explorative, the motto may indeed be: more is more!

This means: where (still) unknown risks and potentials need to be worked out of large amounts of data, there can in principle be no such thing as too much analysis - so the key here is ultimately to let the machines work where people can no longer do so efficiently. Rule of thumb: the less specific the question (i.e., the more exploratory the analysis), the higher the analysis effort - and the greater the shift from humans to machines.
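One very simple instance of "letting the machine work" is having it scan a metric for statistical outliers that no one was explicitly looking for. The sketch below uses a basic z-score test; the data, the threshold, and the metric are all assumptions for illustration - real exploratory analysis would use far richer methods:

```python
# Hypothetical sketch: machine-driven scan of a metric for statistical outliers
# via a simple z-score test. Data and threshold are illustrative assumptions.
import statistics

def find_outliers(values, z_threshold=2.0):
    """Return (index, value) pairs whose z-score exceeds the threshold."""
    mean = statistics.mean(values)
    stdev = statistics.stdev(values)
    if stdev == 0:
        return []  # all values identical - nothing can stand out
    return [(i, v) for i, v in enumerate(values)
            if abs(v - mean) / stdev > z_threshold]

# One anomalous day hidden in otherwise stable daily order counts:
daily_orders = [102, 98, 105, 97, 101, 99, 310, 103]
print(find_outliers(daily_orders))
```

Nobody asked "was day seven unusual?" - the point of the exploratory mode is that the machine surfaces the candidate, and the analyst then investigates whether it is a risk, a potential, or a data error.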

 

Conclusion: Know Your Use Cases

To distinguish these four use cases from one another and to provide each user group with what it needs to work with data in a targeted manner, it is crucial that the corresponding use cases are already known when the underlying BI solution is set up and are actively incorporated into its design (or into the initial user training for a purchased, use-case-oriented BI solution such as minubo).

Unfortunately, companies' BI project teams are often still very IT-heavy - a big mistake, because without representatives of the specialist departments, no use-case-oriented solution can be developed (see also the Commerce Intelligence blueprint from minubo). And surely nobody wants to end up like the gentlemen in the cartoon above.

 

 

As always, I look forward to a personal exchange on the subject - just contact me at lennard@minubo.com and I'll get back to you as soon as possible.

If you would like to be informed about this and other articles, sign up for the automatic email update - a new post appears approximately once a month.
