When It Comes to Messaging, Nothing Compares to A/B Testing

A/B Subject Testing | May 14, 2018 | By: Blake Stenning

More organizations are adopting A/B testing to compare audience response to two versions of a messaging element. Here, association leaders share how this data analysis informs their decision making and improves member satisfaction.

When Joseph Cephas, vice president of communications for the American Society of Interior Designers, began planning a member recruitment campaign, he decided that before committing to a big social media ad buy, he would first test two different call-to-action buttons on Facebook and LinkedIn during a two-week period. It proved to be a smart move.

Cephas initially assumed that version A, a listing of tangible member benefits, would see a higher response rate than version B, a simple aspirational headline that read, “You Belong.”

By the end of the test period, the results were clear and surprising: Version B’s more personal approach drew a significantly higher response rate. Although his initial assumption was disproven, Cephas was hardly disappointed: ASID had gained valuable, evidence-based knowledge of its audiences’ preferences. With results in hand, he confidently invested in a large media buy using an appeal he initially thought wouldn’t work.

For those who are new to the concept, A/B or split testing measures two variants of a single element (such as an email subject line) over a period of time to determine which version users prefer. It is becoming an increasingly important tool for associations seeking insight into members’ preferences and looking to refine content and benefits to stay relevant.
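At its core, the comparison behind a split test is just two response rates side by side. A minimal sketch, with entirely hypothetical numbers:

```python
def better_variant(responses_a, sends_a, responses_b, sends_b):
    """Return which variant ('A' or 'B') earned the higher response rate."""
    rate_a = responses_a / sends_a
    rate_b = responses_b / sends_b
    return "A" if rate_a >= rate_b else "B"

# Hypothetical two-week tallies: variant B edges out variant A.
winner = better_variant(120, 5000, 155, 5000)  # "B"
```

Raw rates alone can mislead on small samples, which is why practitioners also check whether the gap is statistically significant before acting on it.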

The following are examples of how A/B testing has helped association professionals improve the overall end-user experience.

Learning what matters. At the Association for Accessible Medicines (AAM), Erica Klinger, director of marketing, and Rachel Schwartz, director of communications, used A/B testing to determine why a specific advocacy campaign on their website was seeing low conversion rates. After testing five different graphic images, they discovered that none made a significant difference in social media click rates. They then tested the page’s content by comparing the current messaging (A) against a more streamlined version (B). As version B’s click rates shot up markedly, it became evident that for AAM’s members, message matters most. With this insight, they refined the landing page’s content to better align with members’ needs. “Why guess when you can test?” says Klinger.


Testing platforms too. As associations become more experienced with A/B testing emails, some are expanding beyond content. Samantha Ridner, a web specialist with the National Association of Colleges and Employers, currently uses two different email service providers (ESPs) and is testing an identical survey invitation on both platforms to see if one outperforms the other, based on total open rates. First, the recipient list is split evenly between current and new members; then half will be sent from ESP A (a shared IP server) and the remainder from ESP B (a dedicated IP server). The objective is to see whether there is a measurable enough difference to determine which provider is the better fit for the association’s needs.
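NACE’s setup can be sketched as a stratified split: each membership segment (current vs. new) is divided evenly, so neither ESP’s list skews toward one group. The field names and the even current/new mix below are assumptions for illustration:

```python
import random

def stratified_split(recipients, seed=42):
    """Split each segment (current vs. new members) evenly across two ESPs."""
    rng = random.Random(seed)  # fixed seed keeps this sketch reproducible
    esp_a, esp_b = [], []
    for segment in ("current", "new"):
        pool = [r for r in recipients if r["status"] == segment]
        rng.shuffle(pool)           # randomize within the segment
        half = len(pool) // 2
        esp_a.extend(pool[:half])   # e.g., the shared IP server
        esp_b.extend(pool[half:])   # e.g., the dedicated IP server
    return esp_a, esp_b

recipients = [{"id": i, "status": "current" if i % 2 else "new"} for i in range(100)]
list_a, list_b = stratified_split(recipients)
```

Stratifying first matters: a naive 50/50 cut could put most new members on one platform, confounding deliverability differences with audience differences.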

Growing a community. Joe DeLisle, membership and council relations manager at AMGA, formerly the American Medical Group Association, has been using A/B testing for the past year for his association’s member onboarding campaign. Over a five-week period, new community members receive a weekly email that focuses on a specific member benefit, such as using the listserv or creating a user profile. To ensure a random sampling, DeLisle splits the list in half based on member ID numbers (even numbers receive test A; odd receive test B). Throughout the campaign, messaging is continually refined to best reflect members’ preferences. Results from week one help shape the subject line for week two, and so on until the campaign is completed.
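DeLisle’s parity split can be sketched in a few lines (the field name is an assumption):

```python
def split_by_member_id(members):
    """Assign members to group A (even IDs) and group B (odd IDs)."""
    group_a = [m for m in members if m["member_id"] % 2 == 0]
    group_b = [m for m in members if m["member_id"] % 2 == 1]
    return group_a, group_b

members = [{"member_id": i} for i in range(1, 11)]
group_a, group_b = split_by_member_id(members)
```

Parity is convenient and stable: a member stays in the same group across all five weeks, so week-to-week results are comparable. The trade-off is that it is only pseudo-random; if ID numbers correlate with join date, a shuffled assignment would avoid that hidden bias.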

A/B Advice

One of the biggest challenges when planning a new A/B test is deciding what to measure. Just because everything can be tested does not mean all elements are equally relevant. “Two questions can easily become 10,” says Dave Rahmoeller, director of digital campaigns at the American Speech-Language-Hearing Association. His A/B testing strategies focus on longer-term goals: he first speculates about what issues might be important six months from now, then develops methodologies to guide him toward those answers. Rahmoeller stresses the importance of being clear about what you’re trying to test at the outset, then carefully analyzing the results to ensure they are statistically significant.
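One common way to check significance for two conversion rates is a two-proportion z-test; the article doesn’t say which method Rahmoeller’s team uses, and the numbers below are hypothetical:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """z-statistic comparing two conversion rates, using a pooled standard error."""
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (conv_b / n_b - conv_a / n_a) / se

# Hypothetical: 40 of 1,000 recipients convert on A vs. 62 of 1,000 on B.
z = two_proportion_z(40, 1000, 62, 1000)
significant = abs(z) > 1.96  # roughly a 95% confidence threshold
```

A gap that looks large in percentage terms can still fall below the threshold on a small list, which is exactly the trap the careful-analysis step is meant to catch.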

For associations that are just beginning to use A/B testing, Amy Hager, CAE, IOM, director of ConsensusDocs programs at Associated General Contractors of America, offers this advice: “Start small. Change one word, but don’t change the meaning of the subject line.” Hager recently began testing email marketing subject lines and quickly found that simplifying the message led to increased engagement from her association’s members.

Whether it’s to glean a deeper understanding of audience inclinations, optimize messaging over the course of a campaign, or improve internal decision making with evidence-based data, A/B testing is paying off for associations that invest the effort to use it effectively.


Blake Stenning

Blake Stenning, MAM, is director of communications for PHADA in Washington, DC, and immediate past chair of ASAE’s Communication Professionals Advisory Council.