It takes time – and the right questions – to really embrace modern data modelling and analytics

Excel used to be my go-to tool for data analysis and reporting. My reports were mainly created in a silo and shared with the standard stakeholder base, the executive team. There were times, however, when colleagues in professional services, sales and marketing required certain metrics, which I would generally create as one-offs.

After a very short time in this rapidly growing firm, I started to see the limitations of my traditional approach. The manual effort required to produce regular outputs of custom reporting wasn’t going to be scalable in the long run. But I took some convincing that an alternative tool could provide the solution.

A lot of my workbooks became quite large with 10, 20 or sometimes 30+ sheets. When I returned the following month to update the numbers with new data, I spent time refreshing myself on what I’d built, why I’d built it that way and where the numbers came from in the different sheets.

It was also difficult to interrogate during meetings. I’d be tracking down numbers buried in the depths of the workbook. I didn’t feel like I was giving the optimal experience to my colleagues but this was the traditional approach so I didn’t really question it too much.

When I was shown a modern data analytics tool for the first time, I almost immediately became guarded and defensive over the different methodology; it wasn’t tried and tested like Excel. Eventually, I opened my mind to it.

Excel is a hybrid tool: it’s both a database and an analysis tool. You can plug data straight into a sheet and create a bar chart straight from that data. As a database, though, it can only handle around 200k rows by 20 columns before it starts to slow down; build out some complex calculations and visuals on top of this and you’re into ‘spontaneous crash’ territory.

I’m not saying we should all move away from Excel. Excel is sufficient in certain circumstances, but we’re updating our paradigms and evolving with the needs of stakeholders. In other words: Excel is here to stay, but I need a modern data analytics tool to provide value to the business.

Approaching Data Analytics – Evolution

There are a few key questions to consider at the outset of data analysis. This is where the evolution of our work comes in.

Before looking at those questions, it is worth mentioning that you can and should ask for help during this process. It isn’t all down to you to make every improvement: involve your finance team, and involve other functions in the business as well.

It is also worth considering expert help from those in your business that already use a modern analytics tool, even if that takes a bit of humility. You’ll probably find that others are pleasantly surprised you’re reaching out to them for help. Ask your IT team, who know your IT systems and databases, to help you understand the datasets available now and what you could access using modern tools (99% of data is accessible so don’t take no for an answer).

Here are just a few of the questions I’m now asking myself before I approach analytics work. I’ve included the full list later on.

  • Time – how much time do you need?
  • Sharing – who will this be shared with?
  • Visualise – how do I make it comprehensible and digestible?
  • Automation – how do I automate the data refresh?
  • Live – how do I make it live?
  • Data, design and UX – how do I design this so it enables and empowers the users?

How do I start visualising data?

First, you need some data. Ideally, you’re connecting to the data in a source system either directly or via a middle-man database. You might connect directly to Salesforce, for example, and analyse the data in the source tables. Or you might have an extract job which pulls the data out of Salesforce into a middle-man database like Microsoft SQL Server.
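To make the ‘extract into a middle-man database’ idea concrete, here is a minimal sketch of such a job. The source rows are stubbed out (a real job would pull them from Salesforce via its API), and an in-memory SQLite database stands in for a staging database like Microsoft SQL Server; the table and column names are illustrative, not from any real system.

```python
import sqlite3

# Stubbed extract: in practice these rows would come from the source
# system (e.g. Salesforce) via its API or a scheduled export.
source_rows = [
    ("OPP-001", "Acme Ltd", 12000.0),
    ("OPP-002", "Globex", 8500.0),
]

# The 'middle-man' staging database. An in-memory SQLite DB stands in
# for something like Microsoft SQL Server here.
staging = sqlite3.connect(":memory:")
staging.execute(
    "CREATE TABLE opportunities (id TEXT PRIMARY KEY, account TEXT, amount REAL)"
)

# Load (or refresh) the staging table from the extract. INSERT OR REPLACE
# means re-running the job updates existing rows rather than duplicating them.
staging.executemany(
    "INSERT OR REPLACE INTO opportunities VALUES (?, ?, ?)", source_rows
)
staging.commit()

# The analytics tool then queries the staging layer, not a manual export.
total = staging.execute("SELECT SUM(amount) FROM opportunities").fetchone()[0]
print(total)  # 20500.0
```

The point of the staging layer is that every report queries the same refreshed tables, rather than each analyst re-exporting and re-manipulating the data by hand.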

The advantage of being connected to the source data is that your analytics is based on the most up-to-date information. The source data can be referred to as the ‘single source of the truth’. It’s worth bearing in mind that each layer you add between the source data and the analytics tool takes that tool a step further from the source, so each layer adds a degree of risk of variation from the source data. If disciplined data modelling is deployed, however, that risk can be mitigated. The least ‘best practice’ scenario is exporting data into Excel and doing a load of custom calculations and data manipulation there; as you can perhaps appreciate, that is the riskiest way of handling data in this scenario.

Second, you need an analytics tool. Example tools include:

  • Tableau
  • Power BI
  • Qlik

These are the top three leaders in the Gartner report, which Gartner produces each year. Check out Gartner’s Magic Quadrant for the rankings.

This is really just a brief glimpse into how our roles are evolving and how modern tools are playing an integral part. These tools are the key to unlocking our potential to add new value to businesses, but it takes us being open-minded and progressive. The opportunity to remain relevant and valuable to business is right there in front of us; we just need to reach out and take it.

Authored by Sam Ellis.

Sam Ellis is a data analytics expert and company accountant at InterWorks.