Messy data inertia hindering your performance management? Here’s how to clean it up and put it to work

One of the common hurdles companies face when digitalising their performance management processes is starting with ‘messy data’. Cleaning it up seems like a daunting task: tearing up systems, disrupting business as usual and, above all, costing time and money. It’s not difficult to see why operational transformation projects fail to get off the ground.

The reality, however, is not quite as painful as you might think. It’s more like getting fit – much more doable than you first expect. As long as you have a sustainable action plan, you can whip your performance process into shape. 

For data to be useful, it needs to go through four steps. If your data is messy, then there’s a problem somewhere within this process. This is what the four-step process looks like:

Step one: Recording

Often people think that their data quality issues originate right at the source. However, our client work shows that for most companies this is not actually the case. The quality of modern Point of Sale, staffing and procurement solutions means more data than ever is being accurately recorded.

Step two: Aggregation

For around 85% of companies, the issues start at the aggregation level. Sales has a different system to merchandising, procurement and finance. The key is to ensure that everything is consistent. The answer is not consolidation – there is still benefit in keeping some team-specific specialisms. It’s all about aggregation. To get it right, you need:

  1. Timely and automated connections to the right information
  2. Common definitions (e.g. of gross margin) in the hierarchy, so we know what people are talking about
  3. An appropriate Master Data Hierarchy that maps how data inter-relates for business users – for example, matching financial outcomes with their operational drivers, and grouping the right business units together.

This often makes companies nervous – it seems like a huge undertaking. It doesn’t have to be, however. One food services company we worked with had nine different EPOS systems. It took on a three-year project to try and consolidate everything onto one system. Unsurprisingly, it failed – the sheer inertia of ripping up and replacing systems stalled any progress.

What the customer really wanted was to be able to look at information in a consistent way. The EPOS was capturing that information, just storing it with different codes. The best way to solve this was: 

  1. Create output feeds from each system
  2. Create common definitions for all product codes (e.g. making sure croissants were all being coded as croissants consistently)
  3. Work with the client to ensure we appropriately captured the operational drivers that fuelled business performance (e.g. croissant sales were a function of coffees sold)

With this aggregation approach, the whole exercise took only three weeks.
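The aggregation approach described above can be sketched in a few lines. This is a minimal, hypothetical illustration – the system names, product codes and quantities are invented – showing how feeds from separate EPOS systems can be rolled up once a common-definition lookup exists:

```python
# Hypothetical sketch: aggregating product feeds from multiple EPOS systems
# by mapping each system's local product codes onto common definitions.
# All codes, names and quantities below are invented for illustration.

from collections import defaultdict

# Each EPOS system exports its own output feed with its own product codes
feeds = {
    "epos_a": [{"code": "CRS-01", "qty": 120}, {"code": "COF-01", "qty": 300}],
    "epos_b": [{"code": "9001", "qty": 80}, {"code": "9002", "qty": 210}],
}

# Common-definition lookup: (system, local code) -> canonical product name,
# so a croissant is always a croissant regardless of which system sold it
common_definitions = {
    ("epos_a", "CRS-01"): "croissant",
    ("epos_b", "9001"): "croissant",
    ("epos_a", "COF-01"): "coffee",
    ("epos_b", "9002"): "coffee",
}

def aggregate(feeds, definitions):
    """Roll every system's feed up to canonical product totals."""
    totals = defaultdict(int)
    for system, rows in feeds.items():
        for row in rows:
            product = definitions[(system, row["code"])]
            totals[product] += row["qty"]
    return dict(totals)

print(aggregate(feeds, common_definitions))
# -> {'croissant': 200, 'coffee': 510}
```

The point of the sketch is that no system needs replacing: each keeps its own codes, and only the lookup table and the output feeds are new.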

Step three: Analysis

Around 80% of a business finance analyst’s time is spent pulling together fairly standard analyses, such as variance analysis. Poor-quality aggregation means that these reports are often bespoke and not replicable. As a consequence, analysts lack the time to actually fix the anomalies they have uncovered by connecting their analytics into the action workflow.

The good news is that 75% of this type of analysis can be automated. At one of the top retailers in the UK, an army of analysts was producing variance analyses around which food lines were underperforming each week. However, the store managers couldn’t connect their actions to improvements. By automating variance analysis in a strong data hierarchy and connecting it to the workflow in stores, we could map the financial result directly to operational actions and impact over time. 

At one of our clients, we’re running 38 billion calculations a week – something that wouldn’t be possible in Excel (or most BI tools). This allows you to continuously look at all of your sites at a granular level: not just how revenue indexes by store, but down to, for example, how bananas are indexing by store. It provides much more specific and quantifiable areas for action.
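The indexing idea above can be illustrated with a toy example. This is a hypothetical sketch – store names and sales figures are invented – of indexing each store’s sales of each product line against the chain average, so under-performers surface automatically instead of through manual reports:

```python
# Hypothetical sketch of automated product-line indexing by store:
# index = 100 means a store sells a line at the chain average; anything
# well below that is flagged for action. Stores and figures are invented.

sales = {  # (store, product) -> weekly units sold
    ("Leeds", "bananas"): 90,
    ("York", "bananas"): 110,
    ("Leeds", "bread"): 200,
    ("York", "bread"): 200,
}

def index_by_store(sales):
    """Index each store's sales of each product against the cross-store average."""
    products = {p for _, p in sales}
    averages = {
        p: sum(q for (_, prod), q in sales.items() if prod == p)
           / len({s for (s, prod) in sales if prod == p})
        for p in products
    }
    return {(s, p): round(100 * q / averages[p]) for (s, p), q in sales.items()}

indices = index_by_store(sales)

# Flag store/product pairs indexing meaningfully below average
underperformers = [(s, p) for (s, p), i in indices.items() if i < 95]
print(underperformers)  # -> [('Leeds', 'bananas')]
```

At real scale the same calculation runs over every line in every store every week, which is where the billions of calculations come from.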

Step four: Reporting

Reporting data needs to do three things to be useful for performance:

  1. Prioritise the information you need to focus on, removing the irrelevant information
  2. Communicate it in a way that is understandable; 70-80% of managers say they struggle with data literacy. The options are to train them better (which takes time and is expensive) or to communicate data in a way they already understand
  3. Link it to something actionable. Just showing someone they are off-budget or down on last year doesn’t help. Let them know what levers they can pull in order to improve that performance
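The three reporting requirements above can be sketched together. This is a minimal, hypothetical illustration – the metrics, variance figures and suggested levers are invented – of a report that keeps only the biggest misses and pairs each with an action a manager can take:

```python
# Hypothetical sketch: turning raw variances into a prioritised, actionable
# report. Show only the largest negative variances, and attach an
# operational lever to each. All figures and levers are invented.

variances = [  # (metric, variance vs plan in GBP)
    ("bakery waste", -1200),
    ("coffee sales", -300),
    ("staff overtime", -2500),
    ("fruit sales", 150),
]

levers = {  # metric -> suggested operational action
    "bakery waste": "review end-of-day markdown timing",
    "staff overtime": "rebalance shift pattern against footfall",
    "coffee sales": "check machine downtime and upsell prompts",
}

def prioritised_report(variances, levers, top_n=3):
    """Keep the top_n negative variances, worst first, each paired with a lever."""
    misses = sorted((v for v in variances if v[1] < 0), key=lambda v: v[1])
    return [(m, gap, levers.get(m, "investigate")) for m, gap in misses[:top_n]]

for metric, gap, action in prioritised_report(variances, levers):
    print(f"{metric}: {gap:+} GBP -> {action}")
```

Note how the positive variance is dropped entirely (requirement 1), the output is a plain sentence rather than a chart (requirement 2), and each line ends with a lever to pull (requirement 3).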

Once you’ve found the source or sources of the problem, the next step is to put this data into a habitual weekly routine in your organisation. You need to be doing four things:

  1. Surfacing insights: picking out where performance varies from its potential and improvement can be made
  2. Taking action: prioritising the three-to-five actions most likely to drive improvement and focusing on them
  3. Measuring impact: matching continuing analytics to the workflow to measure the impact of actions
  4. Scaling success: picking out what does and doesn’t work and sharing it across the organisation

We’ve been helping many medium-to-large retailers and hospitality companies spot the issues within their four data steps, solve those problems and digitalise the performance process. If you would like a free consultation with our data team, get in touch.

This article was co-authored with Chris Argent, GenerationCFO and Phil Thorne, CFO at Quorso.

Quorso’s platform is designed to make granular, operations-linked performance management easy for businesses of any size. Visit Quorso’s website to discover more.
