Testing Mind Map Series: How to Think Like a CRO Pro (Part 8)
Interview with Deborah O’Malley of GuessTheTest
Data is the new currency of business. If you aren’t actively using data to challenge assumptions and see if your hypotheses hold up, you are operating at a disadvantage.
One way to do that is through testing. But it's not always easy to know which tests to run and what the results mean. And even with a solid understanding of CRO, there's always a chance that biases creep into your testing and invalidate the results.
Fortunately, you’re not alone in this.
The Testing Mind Map series will help get you in the right mindset and teach you how to execute tests with confidence.
In this interview, we chat with Deborah O’Malley from GuessTheTest, who ran her first optimization experiment at the age of eight. She shares her best resources for running experiments and outlines proven strategies to guide your next tests.
Deborah, tell us about yourself. What inspired you to get into testing & optimization?
When I was just eight years old, I unknowingly ran my first optimization experiment!
For my grade-school science experiment, I cut out different shapes and colors of construction paper, posted them on a board, and asked people which version they saw first.
Little did I know, this would be my first eye-tracking experiment…
Fast forward many years later, I ended up earning a master’s degree with a specialization in eye-tracking technology!
My research work looked at how to best design attention-grabbing ads. The findings have been published in six peer-reviewed journal articles and are a source of inspiration to digital marketers worldwide.
Upon completing my master’s, I started doing User Experience (UX) testing with a local UX firm.
As part of my training, the firm recommended I check out a resource called WhichTestWon.
So, I dutifully signed up and started taking the weekly A/B tests. Little did I know, this small act would change the course of my career…
Years later, WhichTestWon advertised a position opening for a senior content creator.
I was lucky enough to land the job!
And, in doing so, I moved from eye tracking and UX into the incredible world of CRO and A/B testing.
Here, I knew I had finally found my place.
I loved the whole-brained approach of using data to make evidence-based decisions while still applying creativity to design and test new variants.
Although WhichTestWon was a great experience, there were many bumps in the road.
At its height, the company was acquired and rebranded, and soon after, completely dissolved — leaving marketers in the lurch.
It was devastating for many. I saw it as my duty to fill the gap.
So, GuessTheTest was born.
As the founder of GuessTheTest, my aim is to provide experimenters with valuable A/B test case studies and helpful resources to inform, inspire, and validate testing success.
Having produced, analyzed, run, and published thousands of client A/B test case studies, I've gained a rich and unique perspective on what works and what doesn't in CRO and A/B testing.
And I’m driven to help teach and share that knowledge with other optimizers.
How many years have you been optimizing for? What’s the one resource you recommend to aspiring testers & optimizers?
I’ve had an optimization mindset probably from the moment I was born.
As far back as I can remember, I’ve seen flaws and have come up with creative ideas to make things better, more efficient, and more effective.
But formally, my optimization career started in 2009 when I graduated from my master’s and landed my first career-related UX job.
As much as I don’t like to admit it to all those younger millennials out there, I’ve been doing optimization work for over a decade… and counting.
As for the sole testing resource I’d recommend, there are lots of great ones out there.
But if I had to choose just one, I’d have to say GuessTheTest. Although I do admit, I am a bit biased. 🙂
Answer in 5 words or less: What is the discipline of optimization to you?
A data-driven experimentation process.
What are the top 3 things people MUST understand before they start optimizing?
- A/B testing is the gold standard of optimization, but it must be done properly to get valid, reliable results.
- Your hypothesis is the backbone of a solid A/B test. Start by defining a "smart hypothesis" that clearly identifies the target audience, the conversion problem, and the suggested solution; states the anticipated outcome; and, of course, defines the test's conversion goal.
- There is such a thing as a failed test, but there’s no such thing as a testing failure — if you learn something from the experiment.
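Deborah's first point — that a test must be run properly to yield valid, reliable results — comes down to checking statistical significance before declaring a winner. As an illustration only (not a method GuessTheTest prescribes), here is a minimal sketch of a standard two-proportion z-test in Python; the function name and the sample numbers are hypothetical:

```python
import math

def ab_test_significance(conv_a, n_a, conv_b, n_b):
    """Two-sided two-proportion z-test for an A/B test.

    conv_a / n_a: conversions and visitors for the control (A)
    conv_b / n_b: conversions and visitors for the variant (B)
    Returns (relative_lift, p_value).
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under the null hypothesis of no difference
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal distribution
    p_value = math.erfc(abs(z) / math.sqrt(2))
    lift = (p_b - p_a) / p_a
    return lift, p_value

# Hypothetical example: 2.0% vs. 2.5% conversion on 10,000 visitors each
lift, p = ab_test_significance(200, 10_000, 250, 10_000)
print(f"lift: {lift:.1%}, p-value: {p:.4f}")
```

A p-value below your chosen threshold (commonly 0.05) suggests the difference is unlikely to be chance alone — which is what separates a valid test result from noise.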
How do you treat qualitative & quantitative data so it can tell you an unbiased story?
Merging qualitative and quantitative data together is an incredibly valuable, but highly under-utilized tactic in CRO and A/B testing.
Most experimenters rely only on analytics or conversion data to drive decisions, paying little attention to anything else.
My inclination is a little different.
I like to start with qualitative data and then merge it with quantitative data.
Doing so takes more time and effort, but gives a fuller, more holistic view of the audience, their needs, and what it will take to improve conversions.
Perhaps this bias comes from the fact that I got my start in the UX world which is by and large qualitative.
But some of the best experiments I've ever been part of, or have seen, start with qualitative findings — typically from UX studies — delve into the analytics data, and then validate the results with A/B testing.
When you merge both qualitative and quantitative data and achieve similar findings with both approaches, you know you’re onto something very solid.
What kind of learning program have you set up for your optimization team? And why did you take this specific approach?
Building a culture of experimentation and camaraderie is really important.
Over the years, I’ve helped infuse learning across a lot of organizations.
Right now, for one testing agency, I’m running “Testing Roundtable Discussions” to help share test wins, create interactive discussions around testing ideas, and disseminate testing knowledge.
I’ve also created tutorials and training videos that cover everything from setting up and running A/B tests properly to analyzing data and results.
I see knowledge sharing as key, especially for new hires or people in roles outside testing — like project management, dev, or design — who might not be as versed in the subject matter.
True optimization can only be achieved with a knowledgeable, informed team.
What is the most annoying optimization myth you wish would go away?
That you shouldn’t copy competitors.
Your competitors can be a great source of inspiration.
True, what works well for one site and one audience may not directly apply to your site with your audience.
But you won’t know until you test.
Use competitor case study evidence — with proven results — to help direct your testing efforts. Then optimize on the success.
Download the infographic above to use when inspiration becomes hard to find!
Hopefully, our interview with Deborah will help guide your conversion rate optimization strategy in the right direction! What advice resonated most with you?
Be sure to stay tuned for our next interview with a CRO expert who takes us through even more advanced strategies! And if you haven’t already, check out our interviews with Gursimran Gujral of OptiPhoenix, Haley Carpenter of Speero, Rishi Rawat of Frictionless Commerce, Sina Fak of ConversionAdvocates, Eden Bidani of Green Light Copy, Jakub Linowski of GoodUI, and Shiva Manjunath of Speero!