Becoming data-driven: Simplyhealth's governance journey

1 August 2019

Charlie Hutcheson from Simplyhealth answers our questions about their centralised governance structure. You can also read Charlie's Learning Tableau blog, where he writes educational pieces about his Tableau journey through Makeover Monday, Workout Wednesday and more.

Which governance model best fits your organisation?

We're currently in a delegated state, having started with the intention of a centralised state (from a data provision perspective). The issue we encountered was that the initial data provision was fairly aggregated, so it lacked the agility to disaggregate that data to answer the 'deeper' questions the business began to ask. That resulted in pockets of data appearing, based on custom SQL that either disaggregated existing data or pulled new or different data from other systems and sources.

How do you accept requests for data or content from the wider organisation? Is there a process for requests?

We use Jira to log calls requesting new data, but the process hasn't been sufficiently enforced, hence the appearance of those custom data sources. We're now transitioning to a more centralised model whereby we'll establish a formal data warehouse which, in theory, will capture the full current requirements of the business and resolve the data source issues we have. As part of that project, the enforcement of requests for new data or content will be more structured.

How do you work with the requester to create the data source or content?

To work with requesters regarding data or content, we ask our analysts (our Creator license holders) to act as a Babel fish and translate the end-user requirements into technical terms for our Data Engineering team if it's a data request. We've learned the hard way that it's best to ask questions of the requester (why do you need this data? what further questions will it prompt? what actions are you hoping to drive?) in order to reduce the need to extend that dataset again later.

When people request content, again we've been on a learning journey. We started off by meeting face to face to discuss the requirement; we'd jot down a few notes, pretty much launch a prototype straight into production, and then iterate to refine things. Since then we've taken more of a 'business partnering' approach: we'll try to create a draft dashboard layout face to face with the end user, so we can show them how things might look and interact. We'll then get to the prototype point and sit desk-side to show the unpublished dashboard in action, make any changes that come out of that meeting, then publish. We follow up with the requester after two and four weeks to check that all is as expected.

What user testing or certification processes for data and content do you have?

For certification, we first conduct a series of offline reconciliations to validate that reshaping the data to fit our needs hasn't fundamentally changed it. Sources on the server are then tagged as certified or not (and all of our cheeky unofficial ones aren't tagged). In terms of certifying content, we peer review to ensure that our content creators agree the new dashboard complies with best practice, is logically laid out, and so on. We obviously also have the 'OK' from the requester after our feedback meeting to confirm that the design and content are as per their expectations.
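If you want to audit which published sources carry that certified tag, Tableau Server exposes the flag through its REST API. Below is a minimal sketch using the tableauserverclient Python library; this isn't part of Simplyhealth's described process, and the server URL, token name and secret are placeholders.

    import tableauserverclient as TSC

    # Placeholder server URL and personal access token; swap in real values.
    server = TSC.Server("https://tableau.example.com", use_server_version=True)
    auth = TSC.PersonalAccessTokenAuth("audit-token", "TOKEN_SECRET", site_id="")

    with server.auth.sign_in(auth):
        # Page through every published data source and report its
        # certification flag, so untagged 'unofficial' sources stand out.
        for datasource in TSC.Pager(server.datasources):
            status = "certified" if datasource.certified else "NOT certified"
            print(f"{datasource.project_name} / {datasource.name}: {status}")

Scheduling a script like this gives a recurring view of how many uncertified sources are creeping onto the server.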

How do you monitor who has access to what on your Server? Do you need to?

Server access is administered by our Data Engineering team, but is largely 'open'. We have a couple of Projects where permissions are restricted and the user base is controlled, but in general terms people can pretty much access anything as nothing 'needs' to be restricted.
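Spot-checking which Projects carry restricted permission rules can be scripted in the same way. A hedged sketch, again assuming tableauserverclient with placeholder credentials:

    import tableauserverclient as TSC

    # Placeholder credentials, as in the previous sketch.
    server = TSC.Server("https://tableau.example.com", use_server_version=True)
    auth = TSC.PersonalAccessTokenAuth("audit-token", "TOKEN_SECRET", site_id="")

    with server.auth.sign_in(auth):
        for project in TSC.Pager(server.projects):
            server.projects.populate_permissions(project)
            for rule in project.permissions:
                # Each rule pairs a grantee (user or group) with a dict of
                # capability -> Allow/Deny, which is enough to pick out the
                # handful of restricted projects from the 'open' majority.
                print(project.name, rule.grantee.tag_name, rule.capabilities)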

When starting out, did you identify quick or big wins for data and content? Did you identify data or content that would guarantee users accessing Tableau? If so, how?

When starting out, my main aim was to deliver content to the areas with the most pressing need for timely data. For us, those were the sales areas, call centres and claims handling teams. Not only did those teams want up-to-date visibility of performance, but they also account for about 50% of the workforce. Delivering effective content there meant we'd immediately get engagement from a big chunk of the organisation.

To further catalyse the transition from the existing reporting, which was basically showing the same information, we simply decommissioned the old content, as there was no value in duplicating it - especially as the original content was manually refreshed rather than automatically updated via the published data sources.
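For illustration, that automatic updating typically runs on a server schedule, but a published data source's extract can also be refreshed on demand. A minimal sketch, assuming tableauserverclient as before; the data source name "Sales KPIs" is hypothetical:

    import tableauserverclient as TSC

    # Placeholder credentials and data source name.
    server = TSC.Server("https://tableau.example.com", use_server_version=True)
    auth = TSC.PersonalAccessTokenAuth("refresh-token", "TOKEN_SECRET", site_id="")

    with server.auth.sign_in(auth):
        opts = TSC.RequestOptions()
        opts.filter.add(TSC.Filter(TSC.RequestOptions.Field.Name,
                                   TSC.RequestOptions.Operator.Equals,
                                   "Sales KPIs"))
        matches, _ = server.datasources.get(opts)
        if matches:
            # Queues an asynchronous extract refresh job on the server.
            job = server.datasources.refresh(matches[0])
            print("Refresh job queued:", job.id)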

In addition, we gained buy-in from the middle-management layer, who essentially expected their staff to use the reporting to monitor performance and opportunities, and to use the content as a tool to drive change and explain things.

What do you think works well about your governance model?

Our current governance model works well in the sense that the majority of content is serviced by certified data sources. Where it doesn't work well is in the consistency of output and the management of new data requirements. By re-establishing a centralised approach, we should overcome those issues. Ultimately we'll also adopt a physical structure where the provision of BI / MI becomes centralised too, which should result in more effective management of data and content requests, and a more consistent look and feel to our end output. I hope it also drives a bit of enthusiasm and passion for data at the organisation, because there isn't enough at the moment - we need to establish a Centre of Excellence (CoE) culture.

What would you do differently if you were to do it all again? (Lessons learned)

1 - You can't rely on data literacy at an organisation. Even at Exec level, it remains surprising how poorly people can articulate requirements, and how poorly they can interpret charts. I don't know the solution to this! We have done what we can to tease out requirements and understand intentions, but it has been hard.

2 - Centralising EVERYTHING would work best for us - from data provision through to content provision. One team handling all requirements would drive efficiency and consistency, and establish a motivated, can-do culture.

3 - At our organisation, data is not treated with enough gravitas. Our analysts typically produce content as a secondary facet of a main role (as an example, I'm an underwriter, but occasionally I have to produce content relating to sales, web conversion rates, fraudulent claims etc.). Organisations need a focused analytical team in order to maximise the ROI on their data architecture and reporting infrastructure. Failing to resource it with the appropriate focus and capability risks unravelling everything else.

Author:
Emma Whyte