How To Integrate Hotjar Heatmaps & Surveys With Google Optimize And Always Win At A/B Testing
A/B testing is not rocket science. Strip it down to its bare essentials, and these experiments are nothing more than changing an element of a landing page and seeing how people react to it.
When you put it this way, virtually anyone can come up with an A/B testing hypothesis. Careful; we’re not saying that hypothesis will be any good. We’re merely stating the fact that, at first glance, A/B testing may look like something anyone can do.
You test your hypothesis. You compare results. You declare winners and losers. Come on, how hard can that be!?
Well, take it from someone who had this exact same mentality when they first started – when it comes to A/B testing, nothing is ever as it seems.
Losing or inconclusive tests may, in fact, be winners when analyzed correctly. On the other hand, winning experiments may turn out to be statistically insignificant. Last but not least, coming up with hypotheses that are worth your while is a nightmare when you’ve got no clue where to start.
Fortunately, that’s not the case anymore. Our A/B testing reservoir is always filled to the top now. By the end of this post, yours should be too.
You see, over time, we’ve learned some A/B tests give you growth, others give you insights that feed your future successful experiments. And, of course, some give you both.
To understand where those insights come from, this post will walk you through:
- How to harness the power of heatmaps during your A/B tests
- How to utilize surveys for extracting valuable split-testing insights.
How to Go Beyond the Numbers of Your A/B Test
If you’ve read any of our previous posts, you know we’re crazy about data analytics and numbers. However, numbers without context (understanding what motivates people, what they desire, what keeps them from reaching their goals, etc.) don’t mean much.
It’s one of the reasons why we shy away from saying things such as data-driven decisions. Instead, we use the term data-informed decisions. That’s because context always leads the way.
For this reason, we never activate a split test without equipping it with tools capable of recording as much wisdom as possible. These tools are heatmaps and surveys.
Both tactics are mainly used outside A/B tests. However, we found them hugely valuable once we started throwing them into the mix while experiments were running.
Running Heatmaps & Surveys on the Control & Variations of the A/B test
Assume you’ve got the following experiment lined up in your A/B testing pipeline:
- A homepage where you’ll test the PAS copy formula (Problem – Agitation – Solution) is your control: www.website.com/homepage
- Your variation is another version of that homepage where you test the AIDA copy formula (Attention – Interest – Desire – Action): www.website.com/homepage-1
Obviously, we’re hoping to see an improvement in the conversion rate. However, it’s also extremely valuable to understand how these two different copywriting strategies impact:
- What objections prospects raise in each scenario
- How they skim through the website after reading the copy
- What questions pop into their minds while reading
We need to find out the answer to all these questions for each version of the split-test independently.
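To make that per-variant analysis possible, it helps to tag each visitor's session with the variant they saw. Here's a minimal sketch, assuming the example URLs above and a standard Hotjar snippet already on the page; the `variantForPath` helper and the event labels are hypothetical names, and the forwarding uses Hotjar's Events API (`hj('event', …)`):

```javascript
// Hypothetical helper: derive a variant label from the URL path, matching
// the example control and variation pages above.
function variantForPath(pathname) {
  if (pathname === '/homepage-1') return 'variation_aida';
  if (pathname === '/homepage') return 'control_pas';
  return null; // not part of the experiment
}

// In the browser, forward the label to Hotjar as an event so survey and
// heatmap data can later be segmented per variant.
if (typeof window !== 'undefined' && typeof window.hj === 'function') {
  const label = variantForPath(window.location.pathname);
  if (label) window.hj('event', label);
}
```

If your A/B testing tool already exposes the assigned variant (as Google Optimize does), you'd use that instead of the URL path, but the idea is the same: one label per variant, attached to every session.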
How is this helping me?
Going down the path of assumptions for the sake of example – let’s say the variation mentioned above brings you a 30% sales increase (which, in real life, would be ab-so-lu-te-ly AMAZING!!!). But you don’t want to stop there. You want to push for an even more significant uplift.
So, for the next A/B test, you’ll try to improve the copy of this page even further – while keeping in line with the AIDA formula that worked wonders the first time.
Wouldn’t it be helpful if you had some sort of a neural link to what goes on in your prospect’s head while reading the text? Well, guess what? Those surveys and heatmaps you implemented – that’s exactly what they’re for.
Survey responses tell you exactly what your customers want and how you can improve your offering based on that. Meanwhile, heatmaps reveal how your customers interact with the website. Analyze them carefully and try to spot potential distractions. Are your prospects seeing what you want them to see? Are they confused by certain design elements? Find out the answers to these questions and construct your next variation accordingly.
Big Deal Mention
Captain Obvious here, but making sure you understand this is essential: the heatmaps and surveys need to run simultaneously with the A/B test!
Because that way you know for a fact the differences in customer insights come from the differences between the pages.
Everything else is the same. The market context. The homogeneity of audiences across the variations. The traffic characteristics.
The A/B testing tool takes care of keeping these variables constant across variations.
Cool. I want in on this too! How do I implement it?
The bad news here is implementing these two tactics into your A/B tests requires a bit of work. The good news is we’ll walk you through it while sharing our best practices.
First things first. There are 2 A/B testing implementation scenarios. The first scenario is when you use the same URL for both the control and the variation. The second one is when different URLs are in use.
Each scenario requires a different type of implementation.
For now, we’ll tackle the implementation of surveys and heatmaps when the URLs for the control and variation are different.
Setting Up The Survey
There are many tools you can use to implement your A/B testing surveys. We’ll show you how to do it in Hotjar, but rest assured that there isn’t much of a difference if you’re using other survey software.
If you run split-tests where control and variations sit on the same URL, please check our post on implementing Hotjar surveys when running A/B tests on the same URL.
Back to our scenario now. The first thing to do is to choose a Popover type of survey. This type of survey is best used when you want to understand more about your audience without pulling their attention away from the page.
Don’t even think about having more than two or three questions inside your survey. Its purpose is to give you a better understanding of the challenges users face when navigating your variants. This is their chance to express their frustrations.
Here’s an example of a two-question survey:
As you can see, these are simple, yes-or-no, close-ended questions. Now is not the time or place for open-ended questions that force your prospect to switch to a System 2 mindset (if you’re not familiar with Daniel Kahneman’s work, we highly recommend you read Thinking, Fast and Slow).
Our next best-practice for setting up your survey takes us to the fifth tab – Targeting. Here’s where you’ll pick the list of devices you want to target and the pages that will incorporate your survey.
Let’s move on to the sixth tab – Behaviour. Our advice here is to show the survey on exit intent for desktop users. Meanwhile, mobile visitors will see it as a pop-up 45 seconds after landing on your page.
Note: Make sure to set up two different surveys if you plan on making this type of differentiation.
Last but not least, make sure you tick the box that instructs Hotjar to only show the survey once. The last thing you want to do is irritate your prospects.
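Hotjar's Behaviour tab implements all of this for you, but if you're curious what the logic amounts to, here's a sketch in plain JavaScript. The `homepage_survey` trigger name is a hypothetical placeholder (it would have to match a trigger you configure in the survey's Targeting tab), and `hj('trigger', …)` assumes the Hotjar snippet is loaded:

```javascript
const SURVEY_TRIGGER = 'homepage_survey'; // hypothetical trigger name
const SHOWN_KEY = 'survey_shown';

// Rough device check based on the user-agent string.
function isMobile(userAgent) {
  return /Mobi|Android|iPhone|iPad/i.test(userAgent);
}

// Fire the survey at most once per visitor, per the "show once" setting.
function fireSurveyOnce() {
  if (localStorage.getItem(SHOWN_KEY)) return; // already shown
  localStorage.setItem(SHOWN_KEY, '1');
  if (typeof window.hj === 'function') window.hj('trigger', SURVEY_TRIGGER);
}

if (typeof document !== 'undefined') {
  if (isMobile(navigator.userAgent)) {
    // Mobile: pop up 45 seconds after landing.
    setTimeout(fireSurveyOnce, 45000);
  } else {
    // Desktop: exit intent – cursor leaves the viewport toward the top bar.
    document.addEventListener('mouseleave', function (e) {
      if (e.clientY <= 0) fireSurveyOnce();
    });
  }
}
```

Again, this is only an illustration of the behaviour you're configuring in Hotjar's UI, not something you need to ship yourself.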
Voila! Your survey is all ready and waiting to feed you insights.
Get Familiar With Hotjar’s New Continuous Heatmaps
Hotjar upped its game with its new Continuous Heatmaps. Having to set up heatmaps manually is now a thing of the past.
It gets even better: you no longer have to worry about any special implementation if you’re running A/B tests where the control and variation share the same URL.
Regardless of how you’ve set the split-test up, once the experiment is over, you can analyze the results in Hotjar by following the instructions below:
- Type in the URL you’re interested in and select an appropriate time frame.
- Click Add Filter and scroll down to Google Optimize integration
- Copy and paste your Experiment ID (or choose it from the list)
- To see the heatmaps associated, select one of the variants (named “1” in our example) or the control page (named “0” in our example).
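If you ever want to double-check which variant a visitor was bucketed into (and where those "0" and "1" names come from), Google Optimize exposes the assignment client-side through `window.google_optimize`. A minimal sketch, with `YOUR_EXPERIMENT_ID` as a placeholder for your own experiment's ID:

```javascript
// Return the variant index Google Optimize assigned to this visitor:
// '0' is the original (control) and '1' the first variant – the same
// names Hotjar shows in its Google Optimize filter.
function assignedVariant(optimize, experimentId) {
  return optimize ? optimize.get(experimentId) : undefined;
}

if (typeof window !== 'undefined') {
  const EXPERIMENT_ID = 'YOUR_EXPERIMENT_ID'; // placeholder
  console.log('variant:', assignedVariant(window.google_optimize, EXPERIMENT_ID));
}
```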
That wasn’t so complicated, was it?
PS: As you may already know, Hotjar is deprecating manual heatmaps starting April 2022.
Want even more context on what your users are doing and why they’re doing it? Make sure to check Recordings for each of your split-test variants to see what users click, tap, or scroll.
Hotjar already has Google Optimize integration in place. This means all that’s left for you to do is go to Filters and select Google Optimize at the bottom of the list.
Choose the variant you want to study, set the relevant time interval, and filter the table by recording relevance.
There you have it! You’ve now got a direct line between you and your users’ on-site behaviour.
At the end of the day, there’s nothing more satisfying than showing off the results of a successful A/B test. With surveys and heatmaps on your side, you’ll be racking them up on a regular basis.
Since each split-test reveals different insights and feeds us new ideas, we find it incredibly helpful to keep separate reports for each of them. Drop us your email address and we’ll send you a copy of a sample report so you always have a record of why you implemented a certain change, what you thought it would achieve, and what actually happened.
What’s more, if you’d like help with your conversion optimization strategy, advertising efforts, or app & web analytics, book a free strategy call or drop us an email and let’s see how we can take your business to the next level.