Brought to you by Noko, the time tracking and invoicing app that your team will love.

Increase conversion rates and avoid common A/B testing mistakes: an interview with Patrick McKenzie (patio11)

Today’s interview is with one of the most pragmatic and successful conversion rate optimizers in the industry. After A/B testing Bingo Card Creator to near-perfection, Patrick McKenzie has learned what does (and, more importantly, doesn’t) work in the complex field of turning visitors into paying customers. You may know him from his blog or as patio11 on Hacker News or Twitter, where he regularly answers questions and shares advice from his years of experience.

Patrick speaking to a BaconBiz Conf 2013 attendee

Now let’s get started with this interview!

What do you focus on most when working to increase your conversion rate, and what are some “easy wins” you can almost always implement?

It depends on the particular context for the company. Generally you want to take a look at the business prior to just firing wildly. For example, grab documentation and screenshots of the current pages for:

  1. The home page
  2. The plans and pricing page
  3. The signup page
  4. The landing pages which have the highest spend associated with them, if there is paid advertising going on

It is generally also a good idea to get an idea of what your current numbers or historical numbers are, so that you know what you’re trying to beat. You can, of course, just do A/B testing without knowing what your current conversion rate is, but I like knowing the lay of the land first.

Alright, so we know where we are. Now what do we target first? Clear errors. Everyone always, always, always has them prior to starting optimization. Clear errors are:

  1. Lack of persuasive copy in key places (for example: literally having “Plans & Pricing” as the H1 on the plans and pricing page). You should replace that with benefits-focused copy: anchor the prices on that page, stomp an objection, or otherwise try to get people to click through to the signup form.
  2. Calls to action which suck, like “Submit” or “Signup.” Replace them with more descriptive calls to action (CTAs) which promise value, not pain. One formula which works well: verb, benefit, immediacy. “Get Started Now” is a classic for a reason. “Start Tracking Your Time Now” is another good example. 😉
  3. Particularly on home pages, it is often possible for a high-quality visual design to distract the visitor from clicking on the CTA. You’ll know how to address this when you see the context of the page, but in general, you want the CTA button to be the obvious next step.

What are some of the biggest missteps you’ve seen people make when attempting to increase their conversion rate?

Many people, including myself, are guilty of just throwing stuff at the wall to see what sticks. While undirected hill-climbing WILL ACTUALLY WORK (surprisingly!), you’ll grow your metrics much faster if you have a theory of the mind of your visitor/customer, develop hypotheses as to why they are behaving the way they are, and design experiments to invalidate those hypotheses where you’re wrong. Then, when you start to get data suggesting that some of your answers to the “why” question may not be wrong, you will have a much easier time developing “hows” to optimize.

For example: One of my most effective optimizations by percentage ever was for Bingo Card Creator. It had a lot of moving parts to it, but the upshot was that it brought in a lot of customers in a non-traditional segment for me. Only after I realized that (after digging into data) did I figure out the easy, “obvious” two character (!) A/B test which ended up being worth tens of thousands of dollars.

When making a change focused on moving visitors “down the funnel” (from seeing your sales page to actually signing up), how do you differentiate between actual results and “false positives” (such as an increase in traffic) after you’ve made the change?

It is important, when possible, to test the entire freaking funnel. I know a lot of folks, particularly those with less traffic, will try to bootstrap by testing for a proxy conversion. (For example, testing for a click on a button on the home page rather than for a free trial signup.)

That will get you faster results, but you run the risk of getting bad results quickly. Example: I remember a time when a particular consulting client which makes a bug tracker accidentally became really, really efficient at convincing jealous girlfriends to bug their boyfriends’ cars, via a redesign which cut out all description of what the product actually did. That did good things for the early numbers in the funnel but did not exactly move the needle for their B2B software developer productivity application.

One way you can sanity check this sort of thing is to save what A/B tests someone has seen somewhere associated with their client accounts, and periodically review them. You can do this quantitatively (via e.g. cohort analysis of old A/B tests) but at low to moderate scale/sophistication sometimes just browsing through a single page in your admin is enough to flag the problem.
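As a minimal sketch of the record-keeping described above (all names here — `record_assignment`, `variants_seen`, the test names — are illustrative, not from any particular tool):

```python
# Sketch: record which A/B test variants an account was exposed to, so you
# can later review (in an admin page or a cohort query) whether a "winning"
# variant actually produces real customers. Purely illustrative structure.
from collections import defaultdict

# account_id -> {test_name: variant seen}
assignments = defaultdict(dict)

def record_assignment(account_id, test_name, variant):
    """Store the variant this account saw, keyed by test name."""
    assignments[account_id][test_name] = variant

def variants_seen(account_id):
    """What a single-account admin page would display."""
    return dict(assignments[account_id])

record_assignment("acct-1", "homepage_headline", "A")
record_assignment("acct-1", "signup_cta", "B")
assert variants_seen("acct-1") == {"homepage_headline": "A", "signup_cta": "B"}
```

In a real application this dictionary would be a column or join table on the account record, so the history survives across sessions.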

For example, if you have Alternative A which is beating the stuffing out of Alternative B in an A/B test (with the goal to get people to the signup page), and you look at the most recent 100 accounts which sign up and see that they signed up under ABBABABBBABBABBAA…, you would quickly conclude that the test isn’t as useful as you might naively think it is.
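That eyeball check can be automated as a quick tally, sketched below using the example sequence from the interview (the variable names are my own):

```python
# Sketch of the sanity check above: tally which variant each recent signup
# saw. If A is crushing B on the upstream click metric but signups arrive
# in roughly equal numbers, the upstream "win" is not reaching the signup.
from collections import Counter

# Variant seen by each of the most recent signups, newest last.
recent_signups = "ABBABABBBABBABBAA"

counts = Counter(recent_signups)
share_a = counts["A"] / sum(counts.values())
print(counts, f"A's share of signups: {share_a:.0%}")
```

A share near 50% here, despite A's big upstream lead, is exactly the red flag the admin-page review is meant to surface.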

When do you think is the right time to transition from adjusting a single version of your content to performing A/B testing?

This is a great and under-appreciated question. If you just want a quick rule of thumb: if you’re A/B testing for a free conversion, like a trial or email submission, you’ll want to have 3,000+ visitors a month to the page directly upstream of that conversion. If you don’t have that, work on acquisition first.

More broadly: I’d focus on A/B testing when you have a product which works, you have customers for it, your customers are giving you qualitative and quantitative evidence that it is creating lots of value for them, and you’ve got some leeway to invest in the future. If you won’t make rent next month unless you get 10 new customers, then don’t do A/B testing — hustle, bang down doors, and find those 10 new customers. But if you’re at $20k a month and want to get to $25k, starting A/B testing is a GREAT idea.

What are some of the most common pitfalls you’ve encountered when running A/B tests?

Overwhelmingly the most common problems are related to wanting to do A/B testing but not actually sitting down and running 1+ tests a week every single week. If you can’t commit to that for at least two months, don’t do A/B testing. (Don’t tell me you don’t have enough time — I did this for YEARS when I was working 90+ hour weeks at the day job. It literally takes 5 minutes to implement A/B tests to change a call to action or headline.)

A more common problem when doing client work: oftentimes people want to declare a winner as soon as one alternative looks like it is beating the other, without waiting for the numbers to actually come back. Some A/B testing tools even encourage the user to do this (looking at you, Optimizely) because it’s such a common user wish. If you do this, you’re not experimenting; you’re just reading chicken entrails and hoping to gain insights from them.
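“Waiting for the numbers to come back” usually means running the test to a pre-planned sample size and only then checking significance. A minimal sketch of that final check, using a two-proportion z-test (my example numbers, not from the interview):

```python
# Sketch: two-sided p-value for a two-proportion z-test. Only call a winner
# once the test has reached its PLANNED sample size AND the p-value is
# small (e.g. < 0.05); checking early and stopping on a good-looking
# number is the "chicken entrails" failure mode.
import math

def two_proportion_p_value(conv_a, n_a, conv_b, n_b):
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value via the normal CDF (expressed with math.erf).
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

# Example: 50 vs. 30 conversions out of 1,000 visitors each.
p = two_proportion_p_value(50, 1000, 30, 1000)
print(f"p-value: {p:.3f}")
```

Hosted tools compute this for you; the point is that the decision rule (planned sample size first, then the p-value) lives outside the formula.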

What tools would you recommend for performing A/B testing?

Is the person who will “own” A/B testing in the organization technical? If so, use whatever tool works server-side on your tech stack. I like A/Bingo, obviously, because I wrote it to be the perfect A/B testing tool for Rails developers who were exactly like me in every way. 😉
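A/Bingo itself is a Ruby/Rails library, but the core server-side idea is easy to sketch in any language: deterministically bucket each visitor by hashing their identity together with the test name, so the same visitor always sees the same variant without storing per-visitor state. (This is a common technique, not necessarily A/Bingo’s exact internals.)

```python
# Sketch of deterministic server-side bucketing: hash (test name, visitor
# id) and use the result to pick a variant. The same visitor always lands
# in the same bucket, and different tests split visitors independently.
import hashlib

def choose_variant(visitor_id, test_name, variants=("A", "B")):
    digest = hashlib.md5(f"{test_name}:{visitor_id}".encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

v = choose_variant("visitor-42", "signup_cta")
assert v == choose_variant("visitor-42", "signup_cta")  # stable across requests
```

Because assignment is a pure function of the inputs, it works on any tech stack and needs no cookie or database row until you record a conversion.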

If they’re not technical, two good options: Visual Website Optimizer and Optimizely. The two are roughly similar in terms of reliability, reporting, and UX; I personally prefer VWO by a smidgen. I’ll caution you that, like many cloud analytics tools, it is sometimes difficult to reconcile their numbers with ones collected by other tools or your own code. That said, it’s worth it to let your marketing manager or copywriter A/B test headlines without bugging engineering several times a day.


We hope you enjoyed this interview! Do you have any burning questions or a topic you need to read about? Drop us a line; we’d love to hear how we can help your business!
