I love data and sharing it with anyone who’s interested, or better yet curious. But, as they say in the movies, “with great power comes great responsibility”.

If we’re giving people the chance to view their data in new ways, ways they never realised were possible, we need to make sure that what we’re opening up is well structured so they can dig around in it, and that the data is accurate.

Your business end users don’t want to hear an explanation of scripts, joins, views or whatever. It’s not that they’re not absolutely fascinated by how complex, or simple, the transformation script is … or that the data warehouse was built on a particular platform … but … well … they’re not interested! What they care about is that if they do a sum of all sales … that is what they get! If they want to be able to review figures by region, they have a field available to do that. Confidence in the figures they get will be a battle that, once won, will open up their world … but you need to be prepared for them to mistrust … especially, and ironically, if it’s “too easy”.

So, how do we go about making this new open world of data a place in which people have confidence? Well … one of the things that makes me a *bit* of a geek at times is my borderline OCD tendencies when it comes to data quality.

*cue groans from the back row*

But come on, you all know I speak the truth. If the quality of your data isn’t good, you’ll lose the trust of the business in a heartbeat, and winning it back is a much harder process. So, documenting the data transformation, including the quality and validation steps, is vital. There, I’ve said it … now, don’t go running off and sticking your head in a bucket … data quality checking can be fun, like a game. Guys? Are you still there? … just me then.

But seriously, you must factor in documentation, data quality and validation. These steps are SO important that it is amazing how many places don’t focus on them more. However, we all know they are frustrating and unpopular, because if you’re under pressure to deliver something that has the “wow factor” on a tight timeframe, these steps don’t appear to offer an immediate return on investment. But they do … the minute someone has had their “wow” moment, they’ll look at a number, and, if it’s wrong, “wow” will become “whoa”, “ooooOOOooo” will become “oh”, and all sorts of other negative noises. And while you’re discussing ways to check and revisit the logic, whoever is in that meeting is more likely to doubt everything the data will show.

One key thing: if you’re tight on time, start simple. For a POC, be clear what concept you’re testing – data accuracy or the art of the possible. If you have a complex project, or multiple visualisation tools that need to be served, perhaps look at technologies that could help – TimeXtender is a fantastic example of a tool that can help answer the key challenges of governance, data quality, data security, and multiple data sources. But it is all only as good as the people who build it. So, check, check and check again … you could have a sparkling picture of a sunrise that explodes into an image of the earth and beautifully illustrates your stock usage pattern globally for a year … but if the numbers are wrong, you might as well have used chalk on slate.
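To make the “check, check and check again” advice concrete, here is a minimal sketch of one common validation step: reconciling per-region sales totals between the source system and the warehouse. The region names and figures are invented for illustration, and in practice the two dictionaries would be populated from queries against each system rather than hard-coded.

```python
# Hypothetical reconciliation check: flag regions where the source system
# and the warehouse disagree, or where a region is missing from one side.

def reconcile(source_totals, warehouse_totals, tolerance=0.01):
    """Return (region, source_value, warehouse_value) rows that disagree
    by more than `tolerance`, including regions missing from either side."""
    issues = []
    for region in sorted(set(source_totals) | set(warehouse_totals)):
        s = source_totals.get(region)
        w = warehouse_totals.get(region)
        if s is None or w is None or abs(s - w) > tolerance:
            issues.append((region, s, w))
    return issues

# Made-up example data: "West" has drifted between the two systems.
source = {"North": 10500.00, "South": 8200.50, "West": 4300.25}
warehouse = {"North": 10500.00, "South": 8200.50, "West": 4100.25}

for region, s, w in reconcile(source, warehouse):
    print(f"Mismatch in {region}: source={s}, warehouse={w}")
```

A check like this is cheap to run after every load, and surfacing the mismatch before a business user does is exactly what keeps “wow” from becoming “whoa”.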

Remember, data is at the core of what you’re doing: it is the foundation on which the “pretty” and the “wow” sits. You want to build on rock, not sand!

This article was written by Emma Doherty, Process & Systems Lead at Vodafone. Emma comes from a business and consulting background, which gives her great insight into identifying user requirements and where real value can be added.