What if we told you that building an optimization powerhouse into your company only takes 5 steps? We spoke with Richard Eckles, one of the growth leads at a large video-sharing platform, about his journey through introducing A/B testing into the organization. It takes time and effort, but a strong internal testing mindset can take any user-focused business a long way.
Bring everyone on board.
When Richard first joined, A/B testing was just starting to become a big practice in the industry. It was being touted as a “magic marketing tool,” and third-party solutions were growing in popularity. Richard always cared about experimentation: as a non-ecommerce app, the platform needs “high engagement and retention to survive,” so running tests that move these levers is what boosts growth.
However, he needed internal approval to implement A/B testing, and 90% of the battle was selling the idea to the rest of the company. The biggest hurdle was convincing senior management. Startups often aren’t afraid to dive right into testing, and that’s because of buy-in from the CEO. The goal is to get the green light from as high up in the company as possible, but in larger organizations, CEOs and executives are difficult to reach.
Richard used case studies, researched success stories from industry leaders (e.g., Airbnb), and drew on his previous company experiences to prove his point. He brought his findings to the table through several company-wide presentations and many internal conversations. There was continuous talk about A/B testing but no actual implementation, so he created and promoted an infographic called the “7 Step Process for Effective A/B Testing” to raise awareness.
After a lot of perseverance and personal investment, Richard finally succeeded — there now exists a growth & optimization team that sits within the user product tribe.
Have a plan. Build company culture.
“It’s all about the people and the process. Everyone should be involved, from analysts to designers to developers.”
Getting the go-ahead for testing is an amazing first step, but the journey’s only just begun. In order for A/B testing to work effectively, a company culture around testing needs to be developed and maintained. Setting up a strong framework and testing roadmap will help to ensure smoother implementation down the road. There are certain CRO management tools available, but some teams may prefer to track through Kanban boards or spreadsheets. Flexibility is also key — having a plan doesn’t necessarily mean determining exactly what the next ten tests are going to be. Instead, teams should spend time on choosing certain focus areas and prioritizing them. Check out some best practices for roadmapping from various experts in the industry.
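As a concrete sketch of that kind of prioritization, the snippet below scores test ideas with the ICE framework (Impact, Confidence, Ease, each rated 1–10). ICE is one common industry convention, not something Richard prescribes, and the candidate tests here are invented for illustration.

```python
# Hypothetical test ideas scored with the ICE framework (Impact,
# Confidence, Ease, each rated 1-10). The names and ratings are
# illustrative, not taken from the interview.
candidates = [
    {"test": "Simplify the signup flow", "impact": 8, "confidence": 6, "ease": 4},
    {"test": "New video thumbnail layout", "impact": 6, "confidence": 7, "ease": 8},
    {"test": "Onboarding progress checklist", "impact": 7, "confidence": 5, "ease": 6},
]

# ICE score = average of the three ratings; higher means "run this sooner".
for c in candidates:
    c["ice"] = (c["impact"] + c["confidence"] + c["ease"]) / 3

# Print the roadmap in priority order.
for c in sorted(candidates, key=lambda c: c["ice"], reverse=True):
    print(f"{c['ice']:.1f}  {c['test']}")
```

A spreadsheet or Kanban column works just as well; the point is that a simple, shared scoring rule makes the roadmap debatable without locking in the next ten tests.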
“You need to have that kind of culture shift where testing becomes a part of everything you do. There’s this idea of chipping away at your audience and your products, and having a truly data-driven product roadmap. That’s another shift altogether.”
Richard suggested an interesting idea for promoting testing internally. Make it a game. Gamifying experimentation is a great way to achieve engagement at all levels within the company. For example, teams might have leaderboards for guessing the winning variation ahead of time. A/B testing requires active participation, and a little healthy competition can go a long way.
Data and design aren’t mutually exclusive.
“Sometimes, designers may feel like they’re having their creative arm removed by being data driven.”
Data is meant to enhance design decisions, and Richard strongly believes that “user researchers are the link between design and growth.” Qualitative tests with a small sample size can’t accurately represent the entire customer base, so why settle for an incomplete data set? User research is a smart way to uncover which variations to test against a baseline and to develop a strong hypothesis, but it’s risky to rely on that data alone for full-scale releases. A/B testing on larger live audiences lets real user behavior drive UX decisions in real time.
Richard delves deeper into creating a balance with his own post: Why Design and Data Must Work Together.
Steady and constant testing wins the race.
With A/B testing, everyone wants to see solid results. Unfortunately, it’s hard to answer the question: “How exactly does a test affect my bottom line?” For example, an e-commerce app alters a Call-To-Action button and the variation successfully increases conversions by 5%. It’s certainly a positive signal, and a decision should be made based on that data. However, it’s not enough to just stop there.
“You’re not gonna change a call to action on a button and suddenly start seeing a massive spike in revenue. Life just isn’t that kind.”
That’s why successful A/B testing requires high volume. Companies need to be running “50 tests a month, not only one or two.” Booking.com has capitalized on this mantra, and it has produced great results for them. Even though Richard already achieved company buy-in through presentations and education, he still needs to demonstrate the success of high-tempo testing.
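To make the volume argument concrete, here is a minimal sketch of a two-proportion z-test on an illustrative CTA experiment, using only the Python standard library. The conversion counts are invented for the example; the point is that even a visible lift can fail to reach statistical significance at modest traffic, which is one reason a single test rarely tells the whole story.

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Z-score and two-sided p-value for conversions in control (a) vs variant (b)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under the null hypothesis (no real difference).
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Made-up numbers: control converts at 5.00%, variant at 5.25% (a 5% relative lift).
z, p = two_proportion_z(conv_a=500, n_a=10_000, conv_b=525, n_b=10_000)
print(f"z = {z:.2f}, p = {p:.3f}")
```

With 10,000 visitors per arm, this 5% relative lift is nowhere near significant, so you either need far more traffic per test or many more tests to learn at a useful pace.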
With high-tempo testing comes the inevitable failure here and there. There might not be a “right answer” for a long time, but companies can’t be afraid to fail. It’s fine to be proven wrong, as long as you learn from it and pick yourself up. Eventually, you stop failing and get it right.
“That’s what A/B testing is all about. It’s about learning quickly what works and what doesn’t work. Fail fast, learn fast, and let your users guide your decisions — not the other way round.”