Testing Mind Map Series: How to Think Like a CRO Pro (Part 64)
Interview with Jarrah Hemmant
Every couple of weeks, we get up close and personal with some of the brightest minds in the CRO and experimentation community.
We’re on a mission to discover what lies behind their success: to get real answers to your toughest questions, share hidden gems and unique insights you won’t find in the books, and condense years of real-world experience into actionable tactics and strategies.
This week, we’re chatting with Jarrah Hemmant, CRO lead at Precis Digital.
Jarrah, tell us about yourself. What inspired you to get into testing & optimization?
It was a pure fluke, and an absolute shout-out here goes to my previous manager, mentor, friend, and one of the most inspiring CRO leaders out there (in my humble opinion – he did turn me into the CRO nerd that I am today!) – Ryan Webb. He first reached out to me about a CRO specialist position when my CV crossed his desk almost 10 years ago. At the time I was in a more general marketing role and looking to relocate, contacting all sorts of agencies and marketing companies in the area with no clear idea of what I wanted. I had no idea what CRO was and had to Google it, but by the time I’d read through the description and started working on the case, I absolutely knew I had hit the jackpot in what I was searching for in my career. It had everything I enjoyed doing – research and analysis mixed with concrete action and results.
Thankfully, Ryan thought the same, and here I am 10 years later. Moral of the story – don’t turn away positions that don’t fit the mould or are something you’ve never heard of before – it might end up being your calling!
How many years have you been testing for?
Since 2015, so nearly 10 years now.
What’s the one resource you recommend to aspiring testers & optimizers?
I personally think CRO excels as a field because it has so many varied experts and different opinions, strategies and tactics that you can explore and grow with. So I would always suggest exploring a wide variety of resources that are out there, and keeping your critical thinking hat on to draw your own conclusions.
Saying that, I love Tom Van Den Berg’s CRO Weekly Newsletter – he does a lot of the hard stuff I never seem to find the time for, and crawls the internet for the latest posts and updates in our industry. I really love sitting down first thing in the morning when it lands in my inbox and taking a few moments to read through some of his finds.
Answer in 5 words or less: What is the discipline of optimization to you?
It’s got to be the classic oldie for me – Test and Learn.
What are the top 3 things people MUST understand before they start optimizing?
There are a lot of things people really should understand and be doing as part of optimisation. If I were to pick three to share, they would be:
- Build a good structure and document everything – Have an easy-to-follow naming convention for your experimentation, create clear plans, outline goals and your measurement plan, document your test setup and results, and outline what the action/next step was from the learnings. Spending just a fraction of your time setting this up is worth every penny – as you grow and scale it helps you keep track of your progress, go back and revisit old insights and experiments, and helps to onboard new stakeholders and develop your CRO culture easily.
- Different segments can react differently to your variations – Take time to break down your overall results into various segments. Are there some clear differences that need to be addressed? Have discussions and make plans if you do see discrepancies – can your dev team and website handle having different experiences for mobile and desktop, or different markets for example? Have you got personalisation set up to target different groups? If not, what compromises will you need to make when deciding on results?
- Create a measurement plan before each test – It’s not enough to run analysis prior to testing; you also want the ability to analyse your results thoroughly afterwards. Make sure you have a good measurement plan before you start your test and have agreed on all the secondary and smaller metrics beyond your primary KPI that you might want to use in your post-analysis (and make sure they are tracking too before you start). Tests often ‘fail’ or are inconclusive, but a good measurement plan, plus the ability to analyse results in depth with an integrated set of tools (both quantitative and qualitative), will ensure you are still learning from your test to help iterate and move on to the next one.
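The per-segment breakdown described above can be illustrated with a minimal sketch. This is not from the interview – the segments, traffic numbers, and the choice of a two-proportion z-test are all hypothetical assumptions for illustration – but it shows the basic idea of testing each segment's lift separately rather than trusting only the blended result:

```python
import math
from statistics import NormalDist

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for the difference between two conversion rates.

    Returns (absolute lift of variant over control, p-value).
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under the null hypothesis of no difference
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return p_b - p_a, p_value

# Hypothetical per-segment results:
# (conversions_control, visitors_control, conversions_variant, visitors_variant)
segments = {
    "mobile":  (230, 5000, 290, 5000),
    "desktop": (410, 5000, 405, 5000),
}

for name, (ca, na, cb, nb) in segments.items():
    lift, p = two_proportion_z(ca, na, cb, nb)
    print(f"{name}: lift {lift:+.2%}, p = {p:.3f}")
```

One caveat worth keeping in mind: the more segments you slice, the more likely a "significant" difference in one of them appears by chance, so segment-level findings are usually best treated as hypotheses to validate in the next iteration rather than final conclusions.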
How do you treat qualitative & quantitative data to minimize bias?
Variety as much as possible – in both tools as well as customer and stakeholder perspectives. Collect and analyse data points from different research practices to help create a more holistic interpretation. And use multiple perspectives from both customers and internal stakeholders to minimise biases when drawing up conclusions.
I also try to be aware of my own biases when going through the research – sometimes making a note of my assumptions and questioning myself regularly, and talking to other people to sense-check and get their perspectives.
How (to you) is experimentation different from CRO?
I suppose, in the context of our particular industry, experimentation is part of the overall CRO process – making sure you have rigorous validation and experimentation in place to expand your knowledge and learnings and back up your research. Experimentation as an approach has been around a lot longer than CRO; within CRO it is simply about adapting its core principles to our vertical and making the most of them, just as any other field would.
That’s why it’s worthwhile for CRO specialists and experimentation experts to pay attention to experimentation in other areas and industries – keeping an eye on trends and new innovations that could be applied to how we work within our own.
Talk to us about some of the unique experiments you’ve run over the years.
My favourites are always the simple and obvious quick wins – CTA changes, headline copy fixes, adding in reassurance and reviews – things that have an immediate impact as part of initial testing. It’s also always fun testing new features and running feature flags, and something I highly recommend should be part of your experimentation programme.
When it comes to uniqueness, one experiment stands out, though it wasn’t a one-off test but a collection of experiments and iterations we did over the course of a year to optimise, tweak and learn as much as we could about our users on the client’s product pages. The first test we ran was an MVT of a few different layout combinations, and it generated such interesting and mixed results across different segments. We got so many new learnings from it that it fed into a whole host of new tests and iterations as we began to take a more segmented approach to our product page layouts. The final result ended up being an evolved page, the product of all these test variations over the course of the year.
A completely random test – and far removed from the usual expectation of increasing conversion rate – was when we were helping a client try to increase opt-ins for their cookie policy. The client had implemented their new cookie banner and seen a drastic and shocking decrease in acceptance rates. We did some digging and user testing and found that the layout (the ‘cancel all’ button stacked on top of the ‘accept all’) was causing quite a few people to accidentally click the top button instinctively before processing the options. We tested a new layout that placed the CTAs side by side to add clarity and visibility to the choices, and this noticeably improved the client’s overall opt-in rate while still allowing users to make an informed choice.
Cheers for reading! If you’ve caught the CRO bug… you’re in good company here. Be sure to check back often – we have fresh interviews dropping twice a month.
And if you’re in the mood for a binge read, have a gander at our earlier interviews with Gursimran Gujral, Haley Carpenter, Rishi Rawat, Sina Fak, Eden Bidani, Jakub Linowski, Shiva Manjunath, Deborah O’Malley, Andra Baragan, Rich Page, Ruben de Boer, Abi Hough, Alex Birkett, John Ostrowski, Ryan Levander, Ryan Thomas, Bhavik Patel, Siobhan Solberg, Tim Mehta, Rommil Santiago, Steph Le Prevost, Nils Koppelmann, Danielle Schwolow, Kevin Szpak, Marianne Stjernvall, Christoph Böcker, Max Bradley, Samuel Hess, Riccardo Vandra, Lukas Petrauskas, Gabriela Florea, Sean Clanchy, Ryan Webb, Tracy Laranjo, Lucia van den Brink, LeAnn Reyes, Lucrezia Platé, Daniel Jones, May Chin, Kyle Hearnshaw, Gerda Vogt-Thomas, Melanie Kyrklund, Sahil Patel, Lucas Vos, David Sanchez del Real, Oliver Kenyon, David Stepien, Maria Luiza de Lange, Callum Dreniw, Shirley Lee, Rúben Marinheiro, Lorik Mullaademi, Sergio Simarro Villalba, Georgiana Hunter-Cozens, Asmir Muminovic, Edd Saunders, Marc Uitterhoeve, Zander Aycock, Eduardo Marconi Pinheiro Lima, Linda Bustos, Marouscha Dorenbos, Cristina Molina, and Tim Donets.