Whether your best ideas come to you in the shower, at the gym, or in the middle of the night, sometimes you want to run a quick A/B test to see if a given idea will help you hit your marketing targets.
The urge to split test is real for many Unbounce customers, including Norway-based digital agency ConversionLab, which works with client Campaign Monitor.
This agency’s founder, Rolf Inge Holden (Finge), typically delivers awesome results with high-performing landing pages and popups for major brands. But recently his agency tried an experiment we wanted to share because of the potential it could have for your paid search campaigns, too.
The Test Hypothesis
If you haven’t already heard of San Francisco-based Campaign Monitor, they make it easy to create, send, and optimize email marketing campaigns. Tasked with running especially effective PPC landing pages for the brand, Finge had a hypothesis:
If we match copy on a landing page dynamically with the exact verb used as a keyword in someone’s original search query, we imagine we’ll achieve higher perceived relevance for a visitor and (thereby) a greater chance of conversion.
In other words, the agency wondered whether the precise verb someone uses in their Google search shapes how they think about doing something with a product, and whether seeing that exact same verb on the landing page would increase conversions.
In the case of email marketing, for example, if a prospect typed “design on-brand emails” into Google, ‘design’ is the exact verb they’d see in the headline and CTAs on the resulting landing page (vs. ‘build’, ‘create’, or another alternative). The agency wanted to carry the exact verb through for relevance, no matter what the prospect typed into the search bar, while the rest of the headline stayed the same.
The question was: would a dynamic copy swap actually increase conversions?
Setting up a valid test
To run this test properly, ConversionLab had to consider a few table-stakes factors: namely, the required sample size and test duration (to understand whether the results they’d achieve were statistically significant).
In terms of sample size, the agency confirmed the brand could drive enough traffic to the landing page variations to ensure a meaningful test. Combined traffic to variants A and B totaled 1,274 visitors, and, in terms of duration, they would run the variants for a full 77 days for the data to properly cook.
Next, it was time to determine how the experiment would play out on the landing page. To accomplish the dynamic aspect of the idea, the agency used Unbounce’s Dynamic Text Replacement (DTR) feature on Campaign Monitor’s landing page. DTR lets you swap out text on your landing page to match the keyword a prospect actually used in their search.
Below you can see a few samples of what the variants could have looked like once the keywords from search were pulled in (“create” was the default verb whenever a parameter couldn’t be pulled in):
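In Unbounce, DTR is configured right in the builder, so there’s no code to write. But if you’re curious about the underlying mechanics, here’s a minimal sketch of equivalent logic in TypeScript, assuming the ad’s final URL passes the matched verb along as a query parameter. The parameter name, allowed verbs, and headline template below are our hypothetical choices for illustration, not Campaign Monitor’s actual setup:

```typescript
// Sketch of DTR-style verb swapping for a landing page headline.
// Assumed URL shape: https://example.com/lp?verb=design

const ALLOWED_VERBS = ["design", "create", "build", "send"]; // hypothetical list
const DEFAULT_VERB = "create"; // fallback when no parameter comes through

function getVerb(href: string): string {
  const raw = new URL(href).searchParams.get("verb")?.toLowerCase();
  // Only accept verbs we've approved copy for; otherwise use the default.
  return raw && ALLOWED_VERBS.includes(raw) ? raw : DEFAULT_VERB;
}

function renderHeadline(template: string, verb: string): string {
  // Capitalize the verb when it leads the headline.
  const lead = verb.charAt(0).toUpperCase() + verb.slice(1);
  return template.replace(/\{verb\}/g, lead);
}

// Hypothetical headline template with a {verb} placeholder.
const headline = renderHeadline(
  "{verb} on-brand emails in minutes",
  getVerb(window.location.href)
);
document.querySelector("h1")!.textContent = headline;
```

The allow-list matters: swapping in only verbs you’ve actually written copy for keeps a stray search query from producing a nonsensical headline.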
What were the results?
When the test concluded at 77 days (Oct 31, 2017 to Jan 16, 2018), Campaign Monitor saw a 31.4% lift in conversions using the variant in which the verb changed dynamically. In this case, a conversion was a signup for a trial of their software, and the test achieved 100% statistical significance with more than 100 conversions per variant.
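If you want to sanity-check significance claims on your own tests, the classic tool is a two-proportion z-test. Here’s a minimal TypeScript sketch; the per-variant visitor and conversion counts below are hypothetical placeholders (an even split of the 1,274 visitors, with conversion counts echoing a roughly 31% lift), since the exact breakdown wasn’t published:

```typescript
// Two-proportion z-test: is variant B's conversion rate significantly
// higher than variant A's?
function zScore(visitorsA: number, convA: number, visitorsB: number, convB: number): number {
  const pA = convA / visitorsA;
  const pB = convB / visitorsB;
  const pooled = (convA + convB) / (visitorsA + visitorsB);
  const se = Math.sqrt(pooled * (1 - pooled) * (1 / visitorsA + 1 / visitorsB));
  return (pB - pA) / se;
}

// Standard normal CDF via the Abramowitz & Stegun 7.1.26 erf approximation.
function normalCdf(z: number): number {
  const x = Math.abs(z) / Math.SQRT2;
  const t = 1 / (1 + 0.3275911 * x);
  const poly = t * (0.254829592 + t * (-0.284496736 + t * (1.421413741 +
    t * (-1.453152027 + t * 1.061405429))));
  const erf = 1 - poly * Math.exp(-x * x);
  return 0.5 * (1 + (z < 0 ? -erf : erf));
}

// Hypothetical counts: 637 visitors per variant, 100 vs. 131 conversions.
const z = zScore(637, 100, 637, 131);
console.log(`z = ${z.toFixed(2)}, confidence ≈ ${(normalCdf(z) * 100).toFixed(1)}%`);
// -> z = 2.25, confidence ≈ 98.8% (one-sided)
```

Anything above the usual ~95% bar is generally considered a trustworthy result; below that, keep the test running.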
What these A/B test results mean
In the case of this campaign, the landing page variations (samples shown above) prompt visitors to click through to a second page where they start their trial of Campaign Monitor. The tracked conversion goal (measured outside of Unbounce reporting) was signups on this second page after clicking through from the landing page.
This experiment ultimately helped Campaign Monitor understand that the verb someone uses in search can indeed help increase signups.
The results of this test tell us that when a brand mirrors an initial search query as precisely as possible from ad to landing page, the visitor perceives the page as relevant to their needs and is thereby more primed to click through to the next phase of the journey and, ultimately, convert.
Message match for the win!
Here’s Finge on the impact the test had on the future of their agency’s approach:
“Our hypothesis was that a verb defines HOW you solve a challenge; i.e. do you design an email campaign or do you create it? And if we could meet the visitor’s definition of solving their problem we would have a greater chance of converting a visit to a signup. The uplift was higher than we had anticipated! When you consider that this relevance also improves Quality Score in AdWords due to closer message match, it’s fair to say that we will be using DTR in every possible way forwards.”
Interested in A/B testing your own campaigns?
Whether you work at a SaaS company like Campaign Monitor, or have a product people might search for using several different verbs, swapping out copy in your headlines could be an A/B test worth trying for yourself.
Using the same type of hypothesis format we shared above, and the help of an A/B testing calculator (for determining your duration and sample size), you can set up variants of your landing pages to pair with your ads and see whether you can convert more visitors.
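If you’d like to ballpark those numbers before you even open a calculator, Lehr’s rule of thumb gives a quick per-variant sample size for roughly 80% power at a 5% significance level. A small sketch, with a hypothetical baseline rate, target lift, and traffic level:

```typescript
// Lehr's rule of thumb: n per variant ≈ 16 * p(1 - p) / delta^2,
// where p is the baseline conversion rate and delta is the absolute
// lift (in percentage points) you want to be able to detect.
function sampleSizePerVariant(baselineRate: number, absoluteLift: number): number {
  const variance = baselineRate * (1 - baselineRate);
  return Math.ceil((16 * variance) / (absoluteLift * absoluteLift));
}

// How long the test needs to run at a given traffic level.
function daysToRun(perVariant: number, variants: number, dailyVisitors: number): number {
  return Math.ceil((perVariant * variants) / dailyVisitors);
}

// Hypothetical example: 15% baseline, detecting a 5-point absolute lift
// (15% -> 20%) with 2 variants and ~50 visitors a day.
const n = sampleSizePerVariant(0.15, 0.05); // ≈ 816 per variant
console.log(`${n} visitors per variant, ~${daysToRun(n, 2, 50)} days to run`);
```

The takeaway: smaller expected lifts demand dramatically more traffic, which is why confirming sample size up front (as ConversionLab did) matters.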
ConversionLab’s test isn’t a catch-all best practice to be applied blindly across your campaigns, but it could inspire you to try Dynamic Text Replacement on your landing pages and see whether carrying through search terms and intent makes a difference for you.