Testing Mind Map Series: How to Think Like a CRO Pro (Part 25)
Interview with Marianne Stjernvall
In this interview we chat with industry insider Marianne Stjernvall, who has been doing CRO for ten years. She guides us through what works versus what doesn’t, and why we should all make sure our data is clean, accurate, and up-to-date.
Marianne reminds us that there are no shortcuts when it comes to experimentation — you have to put in the hard work and maintain a high level of rigor. But that’s what makes it fun!
After reading this interview, you’ll be armed with the knowledge you need to take your optimization program to the next level. You’ll also understand why data quality is so important and how experimentation can help you make better choices that align with your customers’ wants and needs.
Let’s dive in…
Marianne, tell us about yourself. What inspired you to get into testing & optimization?
For me it was passion. And also a bit of good luck. I was finishing my studies in computer and systems science when I was contacted by an agency that needed a CRO specialist. I knew I loved the tech world, but I didn’t want to continue as a full-time coder. I loved people and business too much. There were about five CRO specialists in Sweden at the time, so you can imagine I didn’t know much about it. I fell in love with it at first sight — the combination of coding, analyzing, understanding human-computer interaction, and of course understanding what needs to be done, and more importantly what shouldn’t be done.
For how many years have you been optimizing?
In January 2023 I’m celebrating 10 years in the field of CRO. It’s crazy to think about, and it’s gone by so fast. I’ve had the privilege of working and connecting with some of the best in the field, being able to grow and never feeling like the smartest person in the room.
What’s the one resource you recommend to aspiring testers & optimizers?
There are a lot of good resources, CXL being one of them, of course. I do recommend that highly. But what you really need to do — and put most of your time into — is executing. In the field of experimentation, you have to execute a lot of A/B tests to learn. Each and every one of them comes with its own challenges, and by exposing yourself to those challenges you will learn the most.
Answer in 5 words or less: What is the discipline of optimization to you?
Knowing, not guessing.
What are the top 3 things people MUST understand before they start optimizing?
- An A/B test is simple to showcase. A, B, and the winner was… But the actual field and execution of each and every test with all of its building blocks is very complex.
- The result of your optimization program will only be as good as your data quality.
- The point of optimization is not for you to become the hero. It’s for the organization to have a data-driven way of executing on its customers’ touchpoints and enabling the organization to make the best choices for both parties.
How do you treat qualitative & quantitative data so it tells an unbiased story?
When you’ve been within the same organization for a long time, getting unbiased data gets harder and harder. Even though you know your users best, you are also fed information from the organization and biased “truths” that might not be real.
For this reason, I always recommend doing a full audit of your experimentation program once every 6 months. This audit should be done by an agency not connected to your everyday work. It will provide you with the fresh eyes you need in order to create the best hypotheses.
That said, it doesn’t really matter how you treat your qualitative and quantitative data — you will be biased at some point, and that’s OK, as long as you’re aware of it and take measures to “get out of it”.
What is the most annoying optimization myth you wish would go away?
“Can you tell me some A/B tests that always work?”.
I understand, being in the field for a long time you of course have good knowledge about what has worked and what hasn’t.
But the thing is that this doesn’t carry over to the results of A/B tests. It depends on your brand, your customer, and your product. We’ve even seen that the same theories and tests don’t give similar results within the same industry, because each and every test is dependent on its context.
So there are no quick wins or shortcuts in experimentation — you actually have to do the work. And let’s be honest, that’s also the fun part!
Download the infographic above and add it to your swipe file for a little dose of inspiration when you’re feeling stuck!
Marianne’s insights into the world of CRO have been nothing short of refreshing and eye-opening. We hope this interview has given you valuable advice on how to experiment more effectively.
What advice resonated most with you?
Check back twice a month for upcoming interviews! And if you haven’t already, check out our past interviews with CRO legends Gursimran Gujral, Haley Carpenter, Rishi Rawat, Sina Fak, Eden Bidani, Jakub Linowski, Shiva Manjunath, Deborah O’Malley, Andra Baragan, Rich Page, Ruben de Boer, Abi Hough, Alex Birkett, John Ostrowski, Ryan Levander, Ryan Thomas, Bhavik Patel, Siobhan Solberg, Tim Mehta, Rommil Santiago, Steph Le Prevost, Nils Koppelmann, Danielle Schwolow, and our latest with Kevin Szpak.