How to Increase Lead Gen Conversions With A/B Testing

DIGITAL MARKETING STRATEGY

One of my favorite things about working in demand generation is the proof. Everything we do as content marketers can and should relate back to a trackable key metric. Whether it be click-through rates, open rates, conversion rates, or cost per conversion, there’s always something we can and should measure to determine success.

As a digital agency, we get a lot of questions when it comes to creating a solid lead generation strategy. Clients looking to build gated content come to us to create the best strategy for gathering visitor info in the most frictionless way possible to boost conversions.

There will always be generalized best practices to follow, but since every industry, audience, and content type is different, A/B testing becomes crucial to proof-based marketing and measuring success. But first …
What is proof-based marketing?

Proof-based marketing is defining a tangible key performance indicator (KPI) or metric that will be tracked and used to determine the success of a campaign. In an increasingly digital marketing world where nearly everything can be tracked, defining KPIs should be anchored in data.

The only surefire way to find out if your campaigns are actually working is to determine the success metric before you even begin! Doing this will tell you the true ROI of the blood, sweat, and tears put into strategizing, building, and executing your campaign.

Put simply, it’s the only type of marketing we should be doing, ever. No more guessing. End of story.

Learn how you can improve your lead generation forms with A/B testing to increase conversions and see true ROI.

A/B testing for proof-based marketing

From the start of any campaign, most clients’ main goal seems to be “I need more leads!” But once you dive into the process of building a spiffy new asset to generate them, you realize there are a million and one options for accomplishing this goal. What you really want to get to the bottom of is which option is the most effective.

For example, when considering lead generation forms for gated content, should you pre-populate as many fields on a landing page with the visitor’s information as you can? At first glance you might say “Yes, of course, pre-populating forms increases conversions by up to 30 percent. That’s old news by now.” But, on second thought you might consider, “Hmm … will that make us look like creepers and cause visitors to leave?”

Before any knowledgeable marketer answers that question, they should first acknowledge that it depends on their audience. That’s why A/B testing is the best method to learn as much as you possibly can about them and their preferences. With proof-based marketing, you can easily take the guessing out of any decision. Here’s an example of a recent A/B test we performed.

Our A/B test for lead generation form placement

The question at hand: For gated interactive content, where should we place the lead generation form to capture visitor information (e.g., contact info or profiling questions)? Options to consider included:

  • Placing the form at the beginning before visitors enter the interactive experience.
  • Placing the form at the end before they receive their results.
  • Interrupting the content journey and asking for info in the middle in order to continue.
  • Or splitting it up: asking for an email address up front and then the rest at the end before they receive results.

The answer should always be: Let’s test and find out!  

We executed an A/B test on an interactive quiz developed for one of our clients to determine the best place to put a form that captures lead information. I’ve outlined our findings below.

A/B test setup

Champion: The initial version (A) of the quiz asked for visitors’ first and last names, email, and company name at the end. The form was required in order to receive quiz results and related content.

Challenger: The second version (B) of the quiz split up the form and asked for email up front before visitors could access the first quiz question. At the end of the quiz, it asked for first and last names and company name before showing the results.
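How you split traffic between a Champion and a Challenger depends on the testing tool you use, but the underlying idea is a deterministic 50/50 assignment so a returning visitor always sees the same version. Here’s a minimal Python sketch of that idea; the visitor ID and function name are illustrative, not pulled from our actual setup.

```python
import hashlib

def assign_variant(visitor_id: str) -> str:
    """Deterministically bucket a visitor into Champion (A) or Challenger (B).

    Hashing a stable identifier (e.g., a cookie value) gives an even 50/50
    split, and the same visitor lands in the same variant on every visit.
    """
    digest = hashlib.sha256(visitor_id.encode("utf-8")).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

# Example: the assigned variant is stable for a given visitor
print(assign_variant("visitor-cookie-1234"))
```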

Our hypotheses:

  • Asking for email up front decreases the number of fields to fill in at the end of the quiz, which in turn reduces friction associated with submitting the form. This would result in a higher percentage of form submits from those who started the quiz.
  • Respondents providing their email up front feel more invested in their quiz experience and will be more inclined to finish to receive their results.
  • Capturing respondents’ emails up front will provide the opportunity to follow up with them to share their results (if they completed all the questions) or to nurture them further.    

Our results:

  • Of those who started the quiz, 34 percent submitted the form at the end of version B and fully converted, versus 27 percent for version A.
  • Of those who made it to the form at the end of the quiz, 54 percent filled out all the form fields in version B and went on to view their results page, versus 45 percent for version A.
  • We received an email address from 41 percent of quiz respondents in version B, versus only 19 percent from version A.

After running the A/B test for over a month and receiving just under 6,000 visitors, we found that our hypotheses held true. Out of all respondents who started the quiz, the Challenger version (B) outperformed the Champion version (A) in the percentage of respondents who submitted the form at the end of the quiz, resulting in a true conversion.

We did see a larger drop-off in version B at the point where we asked for email up front. But once respondents got through that point and provided their email address, a higher percentage of them completed the quiz than in version A, meaning they gave us the rest of their information at the end to see their final results.
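Before declaring a winner on a lift like this, it’s worth checking that the difference is statistically meaningful and not just noise. A two-proportion z-test is one common way to do that. The sketch below is illustrative only: it assumes an even split of roughly 3,000 quiz starters per version and back-calculates conversion counts from the reported 27 and 34 percent rates, rather than using our raw data.

```python
# Quick significance check on a conversion lift, using statsmodels.
# Counts are illustrative: ~3,000 starters per variant, back-calculated
# from the reported 27% (A) and 34% (B) conversion rates.
from statsmodels.stats.proportion import proportions_ztest

conversions = [810, 1020]   # version A, version B form submits
starters = [3000, 3000]     # visitors who started the quiz in each version

z_stat, p_value = proportions_ztest(conversions, starters)
print(f"z = {z_stat:.2f}, p = {p_value:.4f}")
# A p-value well below 0.05 suggests the lift is unlikely to be random noise.
```

With counts in that ballpark, the test comes out well below the usual 0.05 threshold; of course, the real check should always be run on your actual raw numbers.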

In the end, we were able to prove that the Challenger version was the more effective of the two. Not only did splitting the form increase the percentage of true conversions, but it also allowed us to capture more email addresses for further nurture follow-ups and engagement.

The true purpose of A/B testing

Defining a tangible key metric up front that can be measured to track the success of a campaign is a requirement for surviving in a digital, proof-based marketing world. Equally important is setting a single objective or focus for your campaign before kicking it off.

The key metric we chose to measure success for our interactive quiz campaign was the percentage of conversions from those who started the quiz. Everything we did came back to that data point.
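However your analytics tool reports it, that key metric is just simple funnel math on event counts. A minimal sketch, using hypothetical counts rather than our actual data:

```python
# Funnel math behind the key metric (counts are hypothetical, for illustration).
quiz_starts = 2980      # respondents who started the quiz
reached_form = 1900     # respondents who reached the end-of-quiz form
form_submits = 1010     # respondents who submitted the form (true conversions)

true_conversion_rate = form_submits / quiz_starts      # the key metric we tracked
form_completion_rate = form_submits / reached_form     # secondary diagnostic

print(f"True conversion rate: {true_conversion_rate:.1%}")
print(f"Form completion rate: {form_completion_rate:.1%}")
```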

In developing your own methods for A/B testing, here are some other elements you might consider examining:

  • Placement of the form on a static landing page (a lightbox pop-up window versus static on page; above the fold versus below the fold; left side versus right side)
  • Copy used for the CTA button on a form
  • Number of fields on a form (although in my experience, less is more)
  • Headline used on a landing page
  • Amount of copy used on a landing page (long form versus short form)

The next time you’re ready to kick off a new campaign, be sure to define exactly how you’ll measure success—or what the client defines as success. I promise you, setting it at the start will help you out every step of the way as you plan, build, and report on your campaign.

What A/B tests have you done recently that helped a campaign or answered a question debated internally? Comment below with your thoughts and experience with A/B testing and proof-based marketing.
