
You may have heard this phrase: it’s a catch-all for clarity, alignment and confidence (in other words, trust).
Data, like most things, is subjective, certainly in terms of its interpretation. Interpretation happens throughout the journey of any data point: it can happen at the consumption, processing or presentation stage, and it can happen uniformly (via automated logic) or subjectively (via human review and intervention). Knowing where and agreeing how these elements exist and behave, and perpetuating that information, is key to the trust element. If you want to succeed, the focus must be on having the domain knowledge and technical expertise to identify and define those aspects of any initiative designed to create a SSOT.
In theory, the idea of a SSOT is brilliant: everyone from the CFO to the social media exec working from the same numbers, derived in the same way from the same raw data, asking better questions and making faster decisions based on flawless data.
However, the SSOT feels elusive: something people know they want but don’t quite know where to start with. The reason is that achieving a true SSOT – one that genuinely streamlines data operations, ensures data quality and gets teams aligned – is a pretty massive undertaking, requiring a combination of technical expertise, a deep understanding of a business & its stakeholders, and an ability to introduce and enforce transformation in how people work. The reality is that it can’t be bought or achieved purely through technology, and it’s certainly too big a task for one person.
Of course, the scale of the challenge varies with the size of the organisation and the complexity of its data, but generally speaking the benefits of (and need for) a SSOT are greatest in larger organisations, handling larger budgets with more disconnected teams.
We’re going to discuss the technical requirements for a true SSOT, so that we know what we’ll have to put in place, as well as the organisational changes, roles and challenges we’ll face in rolling it out, maintaining it and ensuring it can evolve as your organisation’s requirements inevitably change.
Whether you are starting out on your journey to create a framework for a single source of truth or deep into a project or transformation initiative, it’s useful to think of the SSOT as having two key aspects: the technical and the organisational.
The technical aspect of the project may be defined as follows:
“A reporting framework underpinned by an automated and consistent approach to acquiring data and a centrally-defined, shared data model that guarantees that all users of a reporting system will have their data requests satisfied by the same raw data, in a consistent way, regardless of the context of their data request.”
By breaking this statement down into actionable ‘chunks’ you can start to identify the stakeholders who need to be involved in your project, ensure that they understand each component of the project, identify other team members and partners who need to be involved, and begin to look at the technology and expertise required, not just to deliver the project, but to maintain and extend it over time.
A single source of truth isn’t something you can buy: it’s something you build – carefully & collaboratively.
It’s about much more than adopting more technology: it’s about agreeing and understanding how data is processed – in accordance with defined logic, applied at agreed stages – with the implications widely agreed and published among stakeholders and end users.
We’ve deployed Bright Analytics reporting across some massive organisations, where multiple internal teams, departments and agencies are all (initially) doing things in their own ways. Implementing an approach to reporting & analytics that can genuinely be described as being based on a single source of truth is about so much more than a common set of metric definitions, or having a reporting tool that everyone uses: you have to get all the elements right, or none of it is right. This is why it’s elusive – but it’s possible.
Here’s our list of the critical points to start thinking about if you’re fed up with looking at wonky data and are serious about trying to deliver the elusive single source of truth!
It all starts here. If we’re all working off data that has been pulled in different ways, at different times, with different levels of granularity, then frankly we’re all stuffed and should give up now! Everything starts with a granular, automated, consistent approach to fetching the raw data all your reporting will be built upon.
This is where a robust ETL solution comes in. You need a solution to automatically import and store your data in a format that is reliable, accessible, flexible and accurate.
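To make the idea concrete, here is a minimal, illustrative Python sketch of that ETL step – the platform names, field names and units are entirely invented for the example. Two sources report the same facts under different names and in different units, and the import layer normalises both into one consistent, granular schema before anything downstream ever sees them:

```python
from datetime import date

# Illustrative only: two hypothetical ad platforms reporting the same
# facts under different field names and units.
RAW_FACEBOOK = [{"day": "2024-03-01", "spend_micros": 1_250_000, "clicks": 40}]
RAW_GOOGLE = [{"date": "2024-03-01", "cost": "2.50", "clicks": 10}]

def normalise_facebook(rows):
    # Standardise field names and convert micro-units to currency units.
    return [{"date": date.fromisoformat(r["day"]), "source": "facebook",
             "spend": r["spend_micros"] / 1_000_000, "clicks": r["clicks"]}
            for r in rows]

def normalise_google(rows):
    # Same target schema, so downstream reporting never sees platform quirks.
    return [{"date": date.fromisoformat(r["date"]), "source": "google",
             "spend": float(r["cost"]), "clicks": r["clicks"]}
            for r in rows]

# Every report is built from this one normalised table.
warehouse = normalise_facebook(RAW_FACEBOOK) + normalise_google(RAW_GOOGLE)
```

The point isn’t the code itself – a real ETL solution handles scheduling, history and failure too – but that every downstream consumer works from the same normalised rows.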
Probably the most obvious characteristic that jumps to mind when thinking of a single source of truth: if we’re not all using the same calculation for spend (Finance includes VAT, the Agency does not, but they both call it ‘Spend’) then we might as well give up now! It’s imperative that everyone involved in reporting & analysis not only knows, understands and agrees on a common set of definitions – they have to be prevented from using any other definition!
This is where you need a tool to centrally define and publish a shared data model through which all consumers of your data access it. You need expertise and business context to define KPIs correctly, and to ensure everyone agrees to the definitions. (One team’s definition of a ‘new’ customer may not align with another’s, and this is where you need not only the clarity and visibility of the definitions, but project stakeholders who will communicate and ensure alignment.)
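One way to picture that shared data model is a single, published registry of KPI definitions that every consumer computes through. The sketch below is purely illustrative – the field names and the “fees in, VAT out” rule are invented for the example:

```python
# Illustrative sketch: one central, published registry of KPI definitions.
METRICS = {
    # Agreed definition for this example: spend includes fees, excludes VAT.
    "spend": lambda r: r["media_cost"] + r["fees"],
    "cpa":   lambda r: (r["media_cost"] + r["fees"]) / r["conversions"],
}

def compute(metric, row):
    # Consumers can only use published definitions - no ad-hoc formulae.
    if metric not in METRICS:
        raise KeyError(f"'{metric}' is not a defined KPI")
    return METRICS[metric](row)
```

The design choice that matters is the `raise`: nobody can quietly invent their own version of ‘Spend’, because anything outside the registry is rejected rather than silently computed.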
You cannot simply have everyone working off the same raw data and then expect them all to apply the exact same approach to its analysis: you have to have a system and guardrails that force the data through the same analytical lens for all users.
If I’m grouping data using one definition of ‘Campaign Type’ and you’re using a different one, then even if our KPIs are identical our results will differ. Dimensions must be unique, serve a single purpose, and have a very clear definition behind them. If driven by rules, the rule logic must be visible in a UI, not buried under mountains of SQL that nobody understands and no one has touched for years.
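What “rule logic visible in a UI” might look like underneath is something declarative: an ordered list of (pattern, value) pairs that a screen can display and edit. The prefixes below are invented for the example:

```python
import re

# Illustrative sketch: 'Campaign Type' driven by ordered, declarative rules
# that a UI can display - not logic buried in SQL. Prefixes are invented.
CAMPAIGN_TYPE_RULES = [
    (r"^brand_", "Brand"),
    (r"^perf_", "Performance"),
]

def campaign_type(name):
    for pattern, value in CAMPAIGN_TYPE_RULES:
        if re.search(pattern, name.lower()):
            return value
    # Unmatched names are surfaced explicitly, never silently hidden.
    return "Unclassified"
```

Because the rules are data rather than code, the same list that drives the grouping can be rendered on screen for everyone to inspect.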
Rock-solid Dimension definitions are only really achievable with a solid taxonomy underlying all the names you are creating, so this is where you need to think about adopting a proper taxonomy tool. Multiple versions of an Excel doc built a few years ago, with no validation and free-form inputs all over the place, does not count!
It’s all very well introducing a solid taxonomy, but if people aren’t using it – or are regularly bypassing it – then your data is being compromised. Automated monitoring and exception reporting – so that you can see which campaigns are non-compliant and can intercept and fix them – is essential. Being able to trace the creator of a rogue campaign matters too, so you can track the culprit down and politely remind them that the taxonomy tool exists for a reason and needs to be used!
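A compliance check like that can be as simple as validating names against the agreed convention and reporting the exceptions with their owners. The naming convention below (market_channel_objective_freeText) is invented purely for illustration:

```python
import re

# Hypothetical naming convention for the example: market_channel_objective_freeText
TAXONOMY = re.compile(r"^(uk|us|de)_(search|social|display)_(awareness|conversion)_\w+$")

def non_compliant(campaigns):
    """Return (name, creator) for campaigns that bypass the taxonomy,
    so they can be intercepted, fixed, and traced back to their owner."""
    return [(c["name"], c["created_by"])
            for c in campaigns if not TAXONOMY.match(c["name"])]
```

Run on a schedule against new campaigns, a report like this is what turns a taxonomy from a document into something enforced.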
Filtering and Dimensions must be intricately linked: you must only be able to filter on the values produced by your common set of Dimensions. If our filtering differs, we can have identical KPI definitions and identical Dimension definitions, but still get different outputs from the data.
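That constraint can be sketched very simply: a filter is only accepted if it references a value the shared Dimensions actually produce. The dimension name and values here are invented for the example:

```python
# Illustrative sketch: filters may only reference values that the shared
# dimensions actually produce (names and values invented for the example).
DIMENSION_VALUES = {
    "campaign_type": {"Brand", "Performance", "Unclassified"},
}

def validate_filter(dimension, value):
    # Reject free-text filters that would silently diverge from the model.
    if value not in DIMENSION_VALUES.get(dimension, set()):
        raise ValueError(f"{value!r} is not a value of dimension {dimension!r}")
    return (dimension, value)
```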
This is where it’s important to have a suite of centrally defined, locked-down dashboards, built and maintained by a core team who know the numbers all teams need to access and can easily build and customise these core reports.
We’ve seen that there are some technical requirements to tick off before it’s a possibility – but what are the skillsets and changes needed to deliver it?
Getting teams to change their ways, abandon familiar processes and embrace your single source of truth project requires the ability and the skillset to get people to listen, to see the benefits, and to ensure they don’t lapse back into old ways.
This is where you need influential stakeholders backing the single source of truth project: you need to be able to influence internal processes and behaviour and effect permanent change, otherwise all your efforts will gradually come to nothing as people fall back into their old habits.
This may sound odd given that much of the emphasis with a single source of truth is on consistency and rigidity, but the flexibility to build and support new use cases and evolving requirements – without going back to the drawing board to revise fundamentals – is itself a core requirement of a SSOT.
Inevitably, different teams and stakeholders will want their own views of the data – and it’s important that these can be delivered quickly and easily. If small changes require a round trip through an over-stretched technical/data team and take two weeks to roll out, the project will never build the momentum it needs to become the only way reporting gets delivered.
Remember, a single source of truth does NOT mean a single view of the data … it has to support the different detail and context that each team needs.
A social team having the flexibility to easily see a view of ad-level performance for a specific campaign or audience is critical to them doing their roles well: the single source of truth means that the CFO is viewing the same rows of data through the exact same lens when looking at total Social spend by Audience.
It also has to be able to respond to time-sensitive data requirements. All too often when it’s an emergency teams will fall back on bad habits and revert to source platforms.
Without the ability to flex and evolve the data you can include and report on – without compromising the SSOT – adoption will stall, users will return to old habits and workarounds, and momentum will be lost.
You absolutely have to have sponsors who can influence behaviour and enforce change. People need to be comfortable saying ‘No’. Data model authoring and customisation needs to be easy so that it can be selectively granted. You don’t want to create bottlenecks and inflexibility, but you can’t have a free for all. Which brings us on to …
A key feature of a true single source of truth is the structure and boundaries it defines to enforce rigour where it’s critical. You must control who has access to extend and amend the shared data model. You need access-control rules and roles that define who can create and edit reports. You have to have guardrails to ensure the whole thing doesn’t come off the rails and you end up back where you started. Beginning with extreme lockdown for all but a tiny handful of core users, with flexibility gradually unlocked where appropriate, is a good way to start.
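As a minimal sketch of that “start locked down, unlock selectively” approach – with role names and actions invented for the example:

```python
# Illustrative sketch of role-based guardrails: start locked down and
# grant capabilities selectively. Role names and actions are invented.
ROLE_PERMISSIONS = {
    "viewer":      {"view_reports"},
    "analyst":     {"view_reports", "create_reports"},
    "model_owner": {"view_reports", "create_reports", "edit_data_model"},
}

def allowed(role, action):
    # Unknown roles get nothing - deny by default.
    return action in ROLE_PERMISSIONS.get(role, set())
```

Deny-by-default is the important choice: flexibility is something you grant, never something that leaks.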
Balancing flexibility in who has report-authoring rights against ensuring consistent views is a tricky balance to strike. The best approach is to build rich, detailed dashboards that contain enough detail that users don’t need to go looking elsewhere.
Achieving a single source of truth is only half the battle. The real work is in maintaining it — and, occasionally, policing it. There are no shortcuts here. Like planting a tree, you cannot dig a hole, plant the tree and then forget it. You have to keep checking in, looking at the usage data, checking people are engaged and not slipping back into old habits.
A single source of truth isn’t something you can buy: it’s something you build, carefully, collaboratively and with an eye on the future. It won’t happen overnight, because even though great technology makes it easier, the hard part is getting the internal buy-in and momentum behind the project, because delivering a single source of truth is really a change project.
But once it’s in place?
That’s the kind of truth it’s worth investing in.
Our self-service platform eats complexity for breakfast. With our team of friendly data specialists on hand to help, we’ll have a solution perfectly tailored to your needs up and running in under a week.
Get In Touch