Being Data-Driven

5 Steps To Becoming A Data-Driven Organisation

It’s a seriously tense atmosphere. Even for an above-average-height guy, I’m feeling pretty small right now. Every pair of eyes in the room is either on me or on the VP of Product’s increasingly flushed face.

I’m suddenly aware of how much noise the struggling AC is making. The whir of the fan in the overheating Windows laptop at the end of the conference table is especially loud. I try not to look at it. I can’t break eye contact with everyone in the room. Stand tall.

I know I’m correct. I need to stand by what I’ve just said. We need to accept it so we can move forward. Everyone else in the room, apart from her, knows I’m correct. It’s all over their faces. All I’ve done is guide them down a discovery path to answer a few questions – I didn’t actually tell them anything they hadn’t told themselves.

How Naive I Am…

Two minutes ago, I called out the VP of Product in front of her entire team. Through a series of questions, we established they aren’t a data-driven organisation. Something the VP of Product passionately protested. So I went on to point out all of the previous product decisions they’d made, and how the data available to them at the time actually pointed to a different path from the one they followed.

I’m here to try and help this company integrate data into their decision-making process. But I’ve just unwittingly accused a self-proclaimed “Data Driven Evangelist” of ignoring her data at every major decision.

(I will later learn from an inside contact that the VP of Product was previously responsible for planning and implementing the organisation’s “Data Driven Strategy”, a project which contributed to her promotion to VP of Product.)

Unsurprisingly, I’m swiftly (but politely) removed from the boardroom due to ‘an early end to the meeting’. Less than 5 minutes after declaring this organisation is not currently data-driven, I’m in an Uber and on my way back to the office!

You won’t be surprised to hear I didn’t make a sale of our Predictive Analytics platform. But I did leave with strong validation for my questioning framework to uncover whether an organisation is really data-driven!

Here is a condensed version of the questions I asked in that boardroom, and what to do about each point.

1. Do You Have Solid Data & Analytics Foundations Throughout The Whole Organisation?

Too many organisations approach the technology of analytics “after the fact”. Retroactively adding any technology into an existing organisation is an expensive and daunting task.

Because of this, many organisations will find patchy and inconsistent coverage across different projects and teams. This means answers will have gaps and be incomplete at particular points in the customer lifecycle.

This can be made easier if you’ve chosen tools with modern APIs (this is certainly the use case we depend on at Trakio to consume data), but in larger companies that have evolved over time, this isn’t always the case.

Does the following sound like your organisation?

The fairly new digital marketing team have a solid event tracking stack in place, pushing all usage events from the new iOS app into a data layer such as Segment.com. They can analyse app usage in multiple charting tools, like Mixpanel or KISSmetrics. They also invested in a modern marketing automation platform with a popular API, meaning it was quick and cheap to integrate it with the same charting tools.

This makes it easy for the marketing team to generate a weekly team report, and every team presentation or meeting includes graphs and charts of their actual usage and engagement data.
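
To make that concrete, here’s a minimal sketch of what that event tracking might look like using Segment’s server-side Python library (the iOS SDK follows the same identify/track pattern). The user ID, traits and event names here are hypothetical:

```python
import analytics  # Segment's analytics-python library

analytics.write_key = "YOUR_WRITE_KEY"  # hypothetical placeholder key

# Identify the customer once, using the same user ID that the CRM and
# support desk should also use; this is what makes joins possible later.
analytics.identify("u-42", {
    "email": "jane@example.com",
    "plan": "free-trial",
})

# Track a usage event with some descriptive properties.
analytics.track("u-42", "Report Exported", {
    "report_type": "weekly-engagement",
    "format": "pdf",
})

analytics.flush()  # send any queued events before the script exits
```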

However, the same company is still running on an older CRM setup for their sales reps. There’s no easy way to connect customer records across the Lead -> Opportunity -> Deal pipeline. Also, their self-hosted support desk is no longer supported by the original developers, and any kind of API integration is out of the question.

This means data & analytics coverage of the customer journey during pre-sale, onboarding and technical troubleshooting isn’t complete.

There isn’t a clear and consistent record of each customer journey. That single record is what we call a System Of Record, and it’s essential to establish this repository if you want to become a truly data-driven organisation.

  • You Pass If: Every customer touchpoint is tracked in every department.
  • You Get Bonus Points If: All of the data from different departments is connected together on a UUID (Unique User ID) to build a central System Of Record (see the sketch below).
  • How To Fix It: Include analytics & data strategy at the beginning of every new project. If you’re retroactively upgrading an existing team or department, separate out a small portion and run a pilot on the new, modern system. After ironing out any teething problems, slowly transition technology and people over to the new system.
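
To illustrate the bonus point above, here’s a minimal sketch (in Python, with hypothetical record formats) of what building a System Of Record on a shared user ID can look like: each tool exports its records keyed on the same UUID, and they merge into one chronological journey per customer.

```python
from collections import defaultdict

# Hypothetical exports from three disconnected tools, all keyed on the
# same unique user ID.
app_events = [
    {"user_id": "u-42", "ts": "2016-03-01", "source": "app", "event": "Signed Up"},
    {"user_id": "u-42", "ts": "2016-03-04", "source": "app", "event": "Created Project"},
]
crm_activities = [
    {"user_id": "u-42", "ts": "2016-03-02", "source": "crm", "event": "Demo Call Booked"},
]
support_tickets = [
    {"user_id": "u-42", "ts": "2016-03-05", "source": "support", "event": "Ticket Opened"},
]

def build_system_of_record(*sources):
    """Merge per-tool records into one chronological timeline per customer."""
    timeline = defaultdict(list)
    for source in sources:
        for record in source:
            timeline[record["user_id"]].append(record)
    for events in timeline.values():
        events.sort(key=lambda r: r["ts"])  # order each journey by time
    return dict(timeline)

record = build_system_of_record(app_events, crm_activities, support_tickets)
for event in record["u-42"]:
    print(event["ts"], event["source"], event["event"])
```

In a real organisation the sources would be API exports or a warehouse table rather than in-memory lists, but the principle is the same: no join is possible unless every tool stores the same user ID.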

2. Does Every Employee Have Access To Data?

If an organisation wants to be data driven, each employee in the company needs to have access to data and insights – with low technical friction.

Remember that to be data-driven, we want people to use data to guide decisions, to use data to add credibility to presentations, and to build processes for sustained and repeatable growth.

Perhaps you’ve already provided a few shared logins to the heads of each department, and you think you’ve got this one covered. But how many people do you think actually use that login? Do they understand what each report, graph and metric really means? Can they manipulate the settings and filters to translate their questions into answers?

Analytics is technical, complicated and often daunting. This is a simple fact that is often overlooked by technically-minded people in the organisation.

I’m not just talking about the girl who can write SQL queries; even the intern who takes Google Analytics screenshots each week has probably forgotten just how much like hieroglyphics a lot of those reports look to a sales rep, or a support rep, or the CEO.

Providing access to data for everyone is much more than giving everyone a login to Google Analytics!

It’s about translating data into insights in multiple layers of increasing difficulty, allowing employees to digest data at a level they are comfortable with.

It’s about using multiple tools suited to multiple audiences. The underlying data is (hopefully!) the same consistent central database, but different UIs and tools will work better for different people and different questions.

It’s about providing self-paced training for all employees who want to empower themselves to make more data-driven decisions in their job. This doesn’t need to be deep training in R and Python… cover the basics that are overlooked. Does everyone really understand how to read the Mailchimp newsletter reports?

  • You Pass If: Every employee, from interns to CEO, has self-service access to data via tools that are suitable to their experience and comfort level
  • You Get Bonus Points If: You’ve established an internal training resource for employees to upskill on the various tools and processes.
  • How To Fix It: Many analytics vendors provide certification courses for their tools (free or paid). Employees are often motivated to complete these, especially if it’s a popular tool that makes them more valuable to your organisation and more employable in other organisations. Start with a pilot program, offer limited spots on one of these courses, and measure the uptake and completion rates. Start monitoring usage levels of your existing analytics tools, and for tools with low engagement from particular segments, find alternative tools and ways to query that data.

3. Is There A Dedicated Engineering Resource For Data Tasks?

As I discussed earlier, installing analytics retroactively can be a daunting task. But even once analytics is installed, a company needs to stay agile. New experiments may need to be run that were never anticipated.

Marketing teams often use external design contractors or self-service website tools to remain as independent as possible. Using Unbounce or Instapage, for example, means that marketers can publish new landing pages without having to submit HTML/CSS work to the engineering team.

But when it comes to analytics or data, there’s rarely any option other than going through engineering.

What you need to be careful of is that such tasks are given their due priority by the engineering team. Shipping a new feature often seems much more exciting than reading through a spreadsheet and setting up 18 new tracking events in the iOS app.

If executives are the only people able to get their data/analytics tasks pushed through engineering, the rest of the organisation will start avoiding these data bottlenecks altogether. Marketers will get lax about using data. Sales won’t wait for that report before booking the next sales call.

Simply put – people will avoid using data because data engineering creates a bottleneck that they don’t want in their projects.

This doesn’t mean all data engineering tasks should be bullied through ahead of product or sales engineering… but it does mean that a dedicated engineering resource should be put in place, just as you would with a sales engineer or quality assurance engineer.

  • You Pass If: There is a dedicated engineer (or engineering team) for data/analytics, and everyone in the company knows the process to submit tasks
  • You Get Bonus Points If: Each team has their own internal resource, physically sitting within their team for maximum bandwidth on collaboration.
  • How To Fix It: The size of this resource should be proportionate to the organisation, but it needs to be dedicated. So if you have a team of 5 engineers, 1 of them becomes the dedicated data engineer. This means any time a data engineering task comes in, she immediately moves on to that task.

4. Are Teams Empowered To Build Their Own Data & Analytics Projects?

Just as everyone should have access to data, and access to a technical resource, people should also have the freedom to innovate with the way data is used and handled.

This sounds like a somewhat “fluffy” item, but it’s extremely important, especially as an organisation grows in size.

I’m a huge advocate of using multiple analytics tools in the same organisation to suit different purposes and different people’s preferences. However, it’s basically impossible for these tools to get into the organisation if each team isn’t given the freedom and independence (and budgets!) to test these projects themselves.

For example, no one in engineering would try to tell the marketing team which marketing automation platform to use for their customer acquisition emails. Likewise, the engineering team wouldn’t think of running their choice of server-monitoring tool past the sales team.

If teams are given independence, they can very quickly (and therefore cheaply) experiment with new analytics tools that are close to their team’s function. Where a tool proves itself and can roll out to other departments, that rollout happens from a solid base of experience.

Example: a marketing team starts using a new tool to build advanced customer segments. They use these segments to drive more accurate lifecycle emails, and therefore increase conversions. Later, support may surface these segments next to customer profiles on inbound support tickets, and set up more advanced automated rules to route different tickets to different support teams. Later still, the sales team may map these segments onto their historical deal pipeline so that they can learn which closed deals became the most active users, and try to replicate this sales process on future deals.

None of this would have been possible if the marketing team weren’t empowered to quickly test this new analytics tool just within their department.
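
As a rough sketch of the support-routing idea above (the segment names, queue names and lookup tables are all hypothetical; in practice the segment would come from the shared customer data store rather than a hard-coded dict):

```python
# Hypothetical mapping from customer ID to the segment computed by the
# marketing team's new tool.
SEGMENT_BY_USER = {
    "u-42": "power-user",
    "u-77": "trial",
}

# Routing rules owned by the support team.
ROUTING_RULES = {
    "power-user": "senior-support-queue",
    "trial": "onboarding-queue",
}

def route_ticket(user_id: str, default_queue: str = "general-queue") -> str:
    """Pick a support queue based on the customer's marketing segment."""
    segment = SEGMENT_BY_USER.get(user_id)
    return ROUTING_RULES.get(segment, default_queue)

print(route_ticket("u-42"))  # senior-support-queue
print(route_ticket("u-99"))  # general-queue (unknown customer)
```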

One word of warning: be careful not to create more data silos. Make it clear that any new data tracked should eventually have a path into the organisation’s central data repository. This means buying into tools with modern developer APIs and ensuring data is kept clean.

  • You Pass If: Every department has the freedom to sign up to new analytics tools (free trials and pilots) without bureaucracy.
  • You Get Bonus Points If: There is a dedicated Innovation Budget, with a much more lightweight approval and due diligence process, to fund projects past the free trial or pilot stage, perhaps for the first year.
  • How To Fix It: Document the process (in a central doc or wiki) for how a team can try a new analytics tool. This includes the process for connecting to their data sources (i.e. who to email for an API key to the marketing automation data or the app clickstream data) as well as budget approval limits. Encourage experimentation and celebrate failures as much as successes.

5. Is Your Data & Analytics Strategy Clearly Defined Before Each New Experiment?

Running a data experiment means following good scientific principles, as well as following sound business strategic planning.

In practice, however, this isn’t always how I’ve seen things happen.

For example, you might want to increase the conversion rate of your free trials to paid users. Marketing might reach out to an external copywriter and ask her to rewrite all 4 of their lifecycle emails. She sends over the new copy, and marketing put it live as the new lifecycle emails. After a month, Marketing present at an internal meeting on how the new emails improved (or otherwise) conversion rates.

On the surface, this might seem like a pretty data-driven experiment. They measured the conversion rate first and then measured the new rate afterwards. The difference (increase or decrease) was the result.

However, this is not how a truly data-driven organisation would operate, and it comes back to the initial analytics & data strategy. A correct approach would have been:

  1. Measure current conversion rate
  2. Set a target to achieve for the new conversion rate. The target should be decided by the company’s overall goals, industry benchmarks, and some judgement.
  3. Record other factors that may influence the experiment. For example, will a new feature be shipped by engineering 2 weeks into the experiment?
  4. Set up A/B tests (single or multivariate) for each email and instruct copywriters to work in short, iterative cycles, so that they can react to the results.
  5. Ensure copywriters have access to the performance data, and have the training to understand the reports, so they know which emails are performing better.
  6. After 1 month (4 weekly iterations), assess whether the conversion rate is close to target, and whether the last 2 iterations are producing significant increases or diminishing returns (a sketch of this check appears below).
  7. End the experiment, or extend by 2 weeks.

The above should be planned before any work is started and any copywriter contractors are contacted.

This is because marketing may realise that they do not have any more logins to their email program, or that they need to set up a new permissions type for their conversion rate analytics tool, or that A/B testing isn’t enabled on their plan… or any number of things that need prior planning before the experiment kicks off.

And even more importantly, without a clear target conversion rate, we’d have no idea how long to run the experiment for, or whether the experiment was a success against the company’s overall goals.
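
As an illustration of the “significant increases” check in step 6, here’s a minimal sketch of a two-proportion z-test in Python. I’m not prescribing a particular statistical test here, and the numbers below are made up; treat this as one reasonable way to ask whether a change in conversion rate is real or just noise:

```python
from math import erf, sqrt

def conversion_p_value(conv_a: int, total_a: int,
                       conv_b: int, total_b: int) -> float:
    """Two-sided p-value from a two-proportion z-test."""
    p_a, p_b = conv_a / total_a, conv_b / total_b
    pooled = (conv_a + conv_b) / (total_a + total_b)
    se = sqrt(pooled * (1 - pooled) * (1 / total_a + 1 / total_b))
    z = (p_b - p_a) / se
    # Convert the z-score to a p-value via the standard normal CDF.
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

# Hypothetical results: 40 of 1,000 trials converted on the old emails,
# 61 of 1,000 on the rewritten ones.
p = conversion_p_value(40, 1000, 61, 1000)
print(f"p = {p:.4f}")  # roughly 0.03; below 0.05, so likely a real lift
```

If the p-value stays high after an iteration, that’s a signal of diminishing returns and a reason to end the experiment rather than extend it.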

  • You Pass If: Every new experiment has a clearly defined data strategy
  • You Get Bonus Points If: Every previous experiment’s data strategy is available on your central wiki, so that new employees can learn from previous experiments.
  • How To Fix It: Create internal documentation to educate on the ingredients of a good data strategy, and set up additional training resources for employees who want to do a deep dive. Encourage each team to present the results (and process) of previous experiments to other departments – even the experiments that failed, so that everyone can learn from the process and suggest improvements for the future.

Published by

Liam Gooding

Liam is the cofounder and CEO of Trakio. Previously an engineer, he writes about growing subscription companies using data-driven techniques and inside glimpses into Trakio's own growth journey. He wrote a book, "Growth Pirate!", which discusses data-driven growth strategies for startups.