The presentation of data is such a huge part of reporting performance for a campaign – there is the assumption that numbers == truth. All you have to do is let them do the talking. Right?
I’ve been on the right & wrong end of a graph over the last decade – here are some of my favourite learnings from working with data.
Presentation is Key
It is no secret that the type of chart makes a real difference; pie, scatter, and stacked bar charts each tell the story differently (more below), but working with the data can be more subtle than that.
Take a look at these. Notice anything?
All four of these graphs are reporting on exactly the same data – did you get it? If so, good work. If not, then you’re just like the vast majority of people reading this post.
Let me explain:
- “Standard” daily report of sessions for the month – most likely the way you’ll view it normally
- A cumulative view of the data – i.e. rather than displaying the value on each day, it is added on top of the previous total. Day 1, day 1 + day 2, day 1 + day 2 + day 3, etc.
- Similar to #1, except the Y axis is far, far taller.
- A weekly view of the same data – except the final point isn’t the full 7 days.
I’ve seen data presented in all four of these ways – some intentional, some by accident – yet all give a different impression of what has happened. The cumulative view, for example, doesn’t mean a lot without a target or a comparative data series (i.e. comparing against last year). Some reporting software changes the Y axis, defaulting to a different maximum or sometimes not zeroing the Y axis at all. What does this mean? It can make the differences seem greater or lesser depending on how you change it.
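The four views above can be produced from one series in a few lines. This is a minimal sketch using pandas with invented session figures – the date range and values are placeholders, not real campaign data:

```python
import pandas as pd
import numpy as np

# Hypothetical month of daily session counts (values are illustrative only).
rng = pd.date_range("2024-03-01", "2024-03-29", freq="D")
sessions = pd.Series(100 + np.arange(len(rng)) * 3, index=rng, name="sessions")

daily = sessions                       # view 1: the "standard" daily report
cumulative = sessions.cumsum()         # view 2: each day stacked on the total so far
weekly = sessions.resample("W").sum()  # view 4: weekly bins – the final bin
                                       # covers fewer than 7 days here

print(cumulative.iloc[-1])  # the month total so far
print(weekly.iloc[-1])      # the partial final week
```

Note that even though the daily series rises every single day, the weekly view’s last point is lower than the one before it, purely because the final bin is short – exactly the misleading “dip” described in #4.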
It is very possible to over-report something – to look at data too frequently. To some, this may sound strange; we have all likely encountered someone who hasn’t reviewed data enough. Worse, perhaps, is checking back more often than is useful.
To illustrate, let us look at a time sequence of visibility. Ignore over-reporting at the moment, instead focus on WWYD (What Would You Do) at each point.
For starters, an optimistic all-time-high for organic visibility (as reported by Sistrix):
Perhaps time to celebrate, send a congratulatory email – maybe even put your feet up? Humour me a little: you can see the time range along the bottom isn’t finished yet, so this is obviously a loaded question.
Either way, it is easy to see how some people might decide that they’re winning right about now.
Easy come, easy go? A fairly sharp slump undoing all the work so far, and in such a short space of time.
Arguably, you reassign budgets and time to investigate what went wrong and try to turn things around. Unfortunately, this isn’t over yet.
A painful few weeks sees an all-time-low in organic visibility – which I can tell you has a direct impact on traffic & revenue.
The point I’m making here is that it’s quite easy to know what you should have done in retrospect, but looking ahead is hard. This is an extreme example, as it happened over weeks and the data was acted on. But when looked at in this fashion, your perception of a campaign’s performance is influenced by what has happened, not by what is going to happen.
You could glance at this data hundreds of different times and read it totally differently.
Now, imagine that daily, hourly even. That’s enough to drive you to the frayed edge of sanity.
How often should you review data?
To make a real impact, you need to review your data over a time frame roughly proportional to how long it’ll take you to impact performance on that element. I don’t mean that if it takes one month to make a change, you need to review monthly (it’s tougher than that), but, for example, reviewing rankings daily is, in the vast majority of circumstances, too frequent. You can’t link actions to ranking changes accurately enough, so making daily changes is akin to thrashing about and wildly hoping something sticks.
Data With Context
Here, through some short examples, we’ll cover some more ways that context is crucial for reporting.
Performance Against Target
One of the elements glaringly absent from reports that rely solely on Google Analytics is targets. It’s strong at reporting data against what happened before – with segments, filters, etc. – but if you assume that being up year on year or month on month is “mission accomplished”, you may be missing the point of reporting.
+240% Month on Month
Looks like a strong enough increase – ignore for the moment that you don’t have the absolute change (more on that below) – but consider these scenarios.
- This time last year saw a seasonal peak with an 800% MoM increase – we saw a peak too, just around 4x smaller
- OR, whilst we have a 240% increase, to see a Return on Investment we need something nearer 600%
Again, the name of the game is providing reporting which is credible and means something to the business. You can opt to show green arrows and provide some optimistic commentary, but what would be better is to frame the metrics against the business needs (or a more representative frame of reference) and THEN reflect.
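The framing above is trivial to compute once you know what the target actually is. Here is a small sketch with invented numbers – the session counts, last year’s peak, and the ROI threshold are all hypothetical placeholders for whatever your business actually needs:

```python
# Hypothetical figures to show why "+240% MoM" needs framing (all values invented).
prev_month, this_month = 1_000, 3_400
mom_change = (this_month - prev_month) / prev_month * 100  # +240% MoM

last_year_peak_change = 800  # % MoM at the same seasonal peak last year
roi_target_change = 600      # % MoM needed to see a return on investment

print(f"MoM change: +{mom_change:.0f}%")
print(f"vs last year's peak: {mom_change - last_year_peak_change:+.0f} points")
print(f"vs ROI target: {mom_change - roi_target_change:+.0f} points")
```

The same +240% reads as a win in isolation and as a shortfall against either frame of reference – which is the whole point of reporting against a target.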
Relative & Absolute Change
How you illustrate change is crucial. There’s a fine line between the sublime and the ridiculous, and the context of the change is what separates them.
Looks like a great leap? Except:
+1,000%, + 100 Users
So, technically the increase is 1,000%, but a large percentage of a small number is still… a small number. We love reports with green numbers across them, but including these kinds of figures risks removing all credibility from the rest of the report.
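One simple habit is to always report the relative and absolute change side by side, so neither can mislead on its own. A minimal sketch, using the toy numbers from the example above (the user counts are invented):

```python
# Report relative and absolute change together so neither misleads alone.
def describe_change(before: int, after: int) -> str:
    absolute = after - before
    relative = absolute / before * 100
    return f"{relative:+,.0f}% ({absolute:+,} users)"

print(describe_change(10, 110))         # a huge % hiding a tiny absolute change
print(describe_change(10_000, 12_000))  # a modest % on a meaningful base
```

The first call produces the “+1,000% (+100 users)” case from above; the second, a far less dramatic “+20%”, represents twenty times more actual users.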
Quick-Fire Chart Tips
Finally, here are some quick-fire tips I’ve picked up for visualising data. There are many more out there but stick to these where possible.
- Pie charts are basically horrible – and any more than 5 segments should be illegal.
- A cumulative graph without a comparative data set or a target line is useless, bin it.
- Change is good – scorecards & tables present data well, but are you showing the change – trending up or trending down?
- Less is often more – are those gridlines & extra data labels helping or hindering? Delete them & review – do you still get the picture?
- Use trendlines to illustrate change in a metric over time, NOT comparing unrelated data.
- Never, ever, ever use 3D charts.
- If you have to use a pie chart, use it to show proportions, not quantities.
Some may disagree, and even insist that a pie chart can be worth it (maybe for representing mobile device split, or gender, etc.) – ultimately, a lot of this is subjective, and people read & react to charts differently.
- Be careful about how you present your graphed data, resist the urge to tamper with the scales of the axis or the reporting duration to massage your data. Find a version which tells the story & stick to it, don’t change as it feels convenient.
- Don’t over-report on data. If you can’t effect change within a day, don’t check it daily – you’ll just go mad.
- Report key metrics against a target figure which matters, not arbitrary ones.
- Understand when to use relative & absolute change. A change on a small base (under 100, say) should not be presented as a percentage if you can help it. More importantly, understand when the proportion of the change is meaningful and when it isn’t.
- Pick the right chart for the job, avoid peer pressure to add pie charts – stay in school, don’t do drugs.
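A few of these tips are directly actionable in code. This is a minimal matplotlib sketch – the session figures are invented – showing a zeroed Y axis, restrained gridlines, and stripped chart junk:

```python
import matplotlib
matplotlib.use("Agg")  # headless backend so this runs anywhere
import matplotlib.pyplot as plt

sessions = [120, 135, 128, 150, 161, 149, 170]  # invented daily figures

fig, ax = plt.subplots()
ax.plot(range(len(sessions)), sessions)
ax.set_ylim(bottom=0)         # zero the Y axis – no exaggerated swings
ax.grid(axis="y", alpha=0.3)  # light gridlines only where they help
ax.spines[["top", "right"]].set_visible(False)  # drop unnecessary chart furniture
ax.set_xlabel("Day")
ax.set_ylabel("Sessions")
fig.savefig("sessions.png")
```

With the axis pinned at zero, the day-to-day variation reads as the modest wobble it is, rather than the dramatic swings an auto-scaled axis would imply.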
This was just a quick journey through some of my key reporting/data tips I’ve picked up over time. Got any more? I’d love to hear from you in the comments below.