In the past 10 years we’ve seen massive changes in the way data is viewed in business. Companies like Amazon, Facebook, and Google are high profile examples of organizations that have built their success on how well they capture, analyze, and leverage data.
As the role of data has changed, so too have the tools that we use to analyze and present it. Products like Tableau, Power BI, Domo, and other business intelligence and data visualization tools have taken analytics out of the spreadsheet and put it into the hands of everyday users.
If done right, easy access to fact-based information is transformative. If done incorrectly, the result is ineffective at best, and disastrous at worst. This is part two of a three-part series on data and dashboards covering a proven methodology for dashboard design, evaluation, and reporting. In part one, we discussed how to design a dashboard; this post focuses on evaluating your dashboard with three simple questions: Is it accurate? Is it actionable? Is it accessible?
1. Is it accurate?
While this may seem like an overly obvious question, there are subtleties many people miss in this area. You must be able to collect facts and distinguish them from speculation. Accuracy can be evaluated thoroughly in three ways:
Data collection and analysis can be fraught with inaccuracies. Common issues include duplicate or incomplete records and inconsistent naming conventions.
If any single component has an error, users will call the entire report into question.
To prevent this, ask your team to carefully check the raw data and determine appropriate fixes or precautions to take during their analysis. Once the calculations are complete, it’s essential to run careful and thorough checks throughout the work.
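To make those raw-data checks concrete, here is a minimal sketch in Python. The row fields, values, and cleanup rules are hypothetical; the point is that duplicates, incomplete records, and inconsistent naming can all be flagged programmatically before analysis begins:

```python
# Hypothetical raw records illustrating the three common data-quality issues.
rows = [
    {"id": 1, "region": "East", "revenue": 1200.0},
    {"id": 2, "region": "West", "revenue": 900.0},
    {"id": 2, "region": "West", "revenue": 900.0},  # duplicate record
    {"id": 3, "region": "east", "revenue": None},   # inconsistent name, missing value
]

# 1. Duplicate records: flag any row whose normalized key has been seen before.
seen, duplicates = set(), []
for r in rows:
    key = (r["id"], r["region"].lower(), r["revenue"])
    if key in seen:
        duplicates.append(r)
    seen.add(key)

# 2. Incomplete records: any row with a missing value.
incomplete = [r for r in rows if any(v is None for v in r.values())]

# 3. Inconsistent naming conventions: region names that break the Title Case rule.
names = {r["region"] for r in rows}
inconsistent = {n for n in names if n != n.title()}

print(len(duplicates), len(incomplete), sorted(inconsistent))
```

In practice these checks would run against the real source tables, with normalization rules agreed on by the team.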
As tedious as it is, each and every formula needs to be examined to ensure it not only measures what it's supposed to, but functions as intended at every level it is used. For a more pragmatic check, also examine anywhere that data is broken out to ensure components add up to aggregates.
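The breakout-versus-aggregate check lends itself to a simple automated assertion. The figures below are hypothetical, and the tolerance should match the rounding used in the report:

```python
# Verify that broken-out components sum to the reported aggregate
# (hypothetical regional revenue figures).
revenue_by_region = {"East": 1200.0, "West": 1800.0}
reported_total = 3000.0

difference = abs(sum(revenue_by_region.values()) - reported_total)
assert difference < 0.01, f"components differ from aggregate by {difference}"
print("components reconcile with the aggregate")
```

Running a check like this on every refresh catches the mismatches that erode trust in a report before users do.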
Sometimes people forget one of the first lessons they learn in data science: significant digits.
You should strive to be only as precise as you can be accurate.
In large companies, revenue and profit numbers may be hard to verify with great exactness. Showing pennies, or even beyond the thousand or hundred thousand level, may not be appropriate (or even helpful).
When a figure doesn’t match what you’ve seen elsewhere, it can call into question the accuracy of the report. To instill confidence, your team needs to be able to quickly provide details on their analysis when questions arise. These details should include:
Where the data was pulled from
When the data was pulled
What filters were applied
Any time delays known in the system
How component parts are calculated
How the metrics are defined
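One way to keep those answers at hand is to attach a small metadata record to every report as it is built. All field names and values below are hypothetical; the structure simply mirrors the checklist above:

```python
from datetime import datetime, timezone

# Hypothetical provenance record documenting where a report's numbers came from.
report_metadata = {
    "source": "sales_db.orders",                         # where the data was pulled from
    "pulled_at": datetime(2024, 1, 5, tzinfo=timezone.utc).isoformat(),  # when
    "filters": ["region = 'NA'", "status = 'closed-won'"],               # filters applied
    "known_delays": "CRM syncs nightly; figures may lag up to 24h",
    "calculations": {"net_revenue": "gross_revenue - refunds"},          # component parts
    "definitions": {"active_customer": "purchase within trailing 90 days"},
}

print(report_metadata["pulled_at"])
```

Stored alongside the dashboard, a record like this turns "let me get back to you" into an immediate answer.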
You will need to enforce this discipline on your team: document as they build so those answers are readily accessible. It’s more work upfront, but having those answers in your back pocket will save you major heartburn down the road.
If someone can tell you everything they know about where the data came from, then any remaining inaccuracies are far more likely to lie in the source than in the analysis.
2. Is it actionable?
As a self-proclaimed data junkie, I find almost all analysis interesting, but just because it piques someone’s curiosity doesn’t mean it’s noteworthy. To help determine if data is appropriate to use and share, consider the following factors:
I think of insight as presenting a unique view that highlights relationships, trends, and opportunities.
To ensure your data is providing insights, make the connection between metrics and business impact explicit.
The easiest way to determine if a report is insightful is to ask yourself if there’s information that will guide business actions. Metrics may need to be accompanied by additional information to drive an insight home.
There’s no tried and true way to ensure insight is delivered. If you ask yourself “So what?” and there’s no answer, chances are your team needs to go back to the drawing board. You may need to provide them with more context on the business goals or operations.
Just because you provide insight into something doesn’t mean it’s worthwhile for the end user. My favorite example of this is the finding that areas with more telephone poles have a higher number of cancer cases. There is correlation between the two, but is it relevant? There are more telephone poles and cancer cases because there are more people, not because telephone poles cause cancer.
That’s not to say correlation doesn’t matter. Just because you can’t prove one action drives another doesn’t mean they aren’t related.
The point is, you want the information to be at a level where the reader can easily see the potential connections.
Closely related to relevancy, practicality is about how reasonable it is to act on the information. If you receive a report showing your channel could double its revenue if the enablement team delivered three times more trainings in the next two months, that may not be practical, and therefore is not helpful information.
To evaluate practicality ask yourself two questions:
Do I have the power and influence to take the suggested action?
Is the suggested action feasible?
3. Is it accessible?
With the current business intelligence and data visualization advancements outlined at the start of this article, it’s getting easier and easier to make reporting accessible. However, that doesn’t mean just posting a dashboard makes it accessible. When you publish a report, make sure it meets the following conditions:
Web-based reports have made it much easier to find and interact with dashboards and reporting. However, those tools may not always be available in your organization, or they may not make sense for a particular scenario.
Discoverability is about ensuring the target audience can easily find the report and the links that lead to it.
Make sure your team has placed links in commonly used landing pages and announcements, or that emails are sent whenever the information is refreshed.
This may be one of the hardest conditions to evaluate. It’s an art form that far too many data-oriented people are ill-equipped to deliver on. Receiving a giant table of data is overwhelming.
Good reports are not only interesting to look at, they tell a clear story.
To be an effective communication tool that showcases progress and successes, reports that you share with external teams or up to your leadership need to be easy to understand. Executives are busy and don’t have the time to figure out what the point is.
A consumable report will have 5 key characteristics:
Straightforward top-line summaries showcasing the main data points
Highlights that direct the audience to the information that is most relevant to them
Clear guidance on how to use filters and controls (related to #2)
Intentional and organized layout so it reads left to right or top to bottom
Appealing visual grouping of relevant information through the use of colors, spacing, or borders
As much as we’d like every report we build to show live data, that’s not always feasible.
The trick is to ensure information is received in a timeframe that allows you to take action on the insights it provides.
If you cannot get data in a timely manner, we recommend a best practice of timestamping your data tables and charts with “as of month/year.”
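That timestamp can be generated consistently with a tiny helper. This is a sketch assuming the month/year format recommended above:

```python
from datetime import date

def as_of_label(d: date) -> str:
    """Build an 'as of month/year' caption for a table or chart."""
    return f"Data as of {d.strftime('%B %Y')}"

print(as_of_label(date(2024, 3, 31)))  # → "Data as of March 2024"
```

Generating the label from the actual refresh date, rather than typing it by hand, keeps stale captions from undermining trust in the report.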
To help ensure a quick turnaround on your data, take the following actions:
Push data owners to supply source information as quickly as possible
Have automated refreshes to decrease turnaround time
Commit your team to a consistent data delivery date
Once your reports are accurate, actionable, and accessible, you will be able to make better decisions in less time, with more confidence.
You’ll also have a tool to communicate progress and success across your organization at your fingertips. Invest in better reporting and enjoy the returns in saved time and budget!