Community Recruitment Best Practices


“Well begun is half done.” Nowhere is this truer than in creating a community. Do a great job of recruiting the right community members and you will gain powerful insights. Do a poor job and fail to recruit your key target audience and you will gain none.

Building a successful research community requires forethought and careful planning. Communities should be an extension of your brand and fertile ground for building deeper relationships through rich discourse and continuous feedback. However, engaging your target audience in a community is part art, part science.

We believe in the power of science, so we did some research to underpin our best practices.

To help build vibrant, viable communities we offer four research-driven best practices:

  1. Be open about who you are and what you want;
  2. Give your community members something they will benefit from;
  3. Let them know their voice will be heard and valued;
  4. Let them join and answer using whatever device they like.

In this blog, we unpack each of these recommendations and outline the research that supports them.

Be open about who you are and what you want

Should an insight community be branded? Or should the sponsor be blinded? While not quite as intense a question as Hamlet’s “to be or not to be,” it is nonetheless an important concern that can impact your ability to recruit and retain community members.

Some argue that branding can bias a community, because people are conscious of the brand and bring their feelings about it to the research. Others suggest that if you hide the brand you are not being transparent with people and you are not enabling members to build the kind of connection they have when they know who wants their feedback.

Of course, there is no absolute black or white, right or wrong here; most likely, all of those things are true to an extent. But it is possible to do an empirical assessment of the effect of branding on intent to participate, and solve part of this puzzle.

To determine the effect of branding we conducted a study with a sample of affluent investors. Affluent investors are hard to recruit. They tend to be busy and invited to do many things, so they are a perfect test case. If branding makes it harder or easier to recruit, it can have significant financial implications.

So, we conducted a short (6-minute) survey with a sample of 608 affluent investors. We split the sample into two groups: half saw an invitation to join an unbranded insight community; the other half saw an invitation to join a branded community. The invitations were identical, except that the branded one mentioned that a financial company they knew was sponsoring the community.

We found that join rates were 25% higher amongst those who saw the branded invitation. Even more importantly, the join rate was 31% higher amongst those hard-to-reach people with investable assets of $500,000 or more.
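A split-sample recruitment test like this can be checked for statistical significance with a standard two-proportion z-test. The sketch below uses only the Python standard library; the raw join counts (80 and 100 joins per 304 invitations) are hypothetical, chosen only to reproduce a 25% relative lift, and are not the study's actual figures.

```python
import math

def two_proportion_z(joins_a, n_a, joins_b, n_b):
    """Two-proportion z-statistic for a split-sample invitation test."""
    p_a, p_b = joins_a / n_a, joins_b / n_b
    # Pooled proportion under the null hypothesis of equal join rates
    p_pool = (joins_a + joins_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical counts for a 608-person split-sample test:
# 80/304 joined from the unbranded invite, 100/304 from the branded one.
z = two_proportion_z(80, 304, 100, 304)
lift = (100 / 304) / (80 / 304) - 1
print(f"relative lift: {lift:.0%}, z = {z:.2f}")  # → relative lift: 25%, z = 1.78
```

With these assumed counts the lift is large but the z-statistic sits just under the conventional 1.96 threshold, which is a useful reminder that relative lifts need sample sizes behind them before drawing conclusions.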

We also asked “how often would you be willing to participate in online discussions and/or surveys with this type of community?” Fascinatingly, those exposed to the branded invitation were significantly more likely to be willing to participate more often.

Clearly, being upfront about who you are and what you want is vital to the success and vitality of your community.

Give your community members something they will benefit from

There is a common assumption that people join communities to earn money, or at least for the chance to win a prize draw. But that’s not entirely true. In fact, paying people to participate in research has surprisingly little effect.

In “Incentives in Web Studies: Methodological Issues and a Review,”(1) Göritz conducted a meta-analysis of the effectiveness of incentives. She concluded “material incentives increase response and decrease drop-out” but “the combined effect of incentives on response and retention is still small.”

She also reminded readers that “using material incentives is only one option to influence data quality and quantity. We should not forget about other possibly response-enhancing techniques such as personalization, pre-notification, deadlines, reminders, offering result summaries and altruistic appeal.”

This meta-analysis reveals that adding incentives does make a difference to response rates, but the effect is fairly modest.
We did some of our own testing on the effects of incentives on members of our market communities and found that paying $1 for completing a short survey versus paying fifty cents had no impact on the response rate. Across a test spanning 153 studies, we found the response rate was 30% at a dollar and 31% at fifty cents.

It seems that what people value is not really the honorarium. So, what do they value? Let’s return to our affluent investors for a clue. In that study, we found that join rates and participation are motivated by a thirst for knowledge first and monetary incentives a distant second.

We conducted a TURF (Total Unduplicated Reach and Frequency) analysis to determine the maximum reach of our offers. This analysis identifies the combination of incentives that produces the greatest “reach,” meaning the combination that appeals to the greatest number of investors.

On its own, offering to give people “exclusive learning materials about investment products or strategies” was the most effective tool for attracting affluent investors. The second biggest driver of willingness to join was the “ability to win cash prizes for each activity that I partake in,” but it was a distant second.
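For readers unfamiliar with TURF, the core of the analysis can be sketched as a greedy search: repeatedly pick the offer that reaches the most respondents not yet covered by an earlier pick. The offer names and respondent sets below are made up purely for illustration; they are not the study data.

```python
def greedy_turf(appeal, max_offers):
    """Greedy TURF sketch: pick offers that maximize unduplicated reach.

    appeal: dict mapping offer name -> set of respondent ids it appeals to
    max_offers: how many offers to select
    Returns a list of (offer, cumulative_reach) picks.
    """
    reached = set()
    picks = []
    remaining = dict(appeal)
    for _ in range(max_offers):
        # Choose the offer adding the most respondents not yet reached
        best = max(remaining, key=lambda o: len(remaining[o] - reached))
        if not remaining[best] - reached:
            break  # no offer adds anyone new
        reached |= remaining.pop(best)
        picks.append((best, len(reached)))
    return picks

# Hypothetical appeal data for four offers (respondent ids 0-9)
appeal = {
    "exclusive learning materials": {0, 1, 2, 3, 4, 5},
    "cash prize draws":             {4, 5, 6, 7},
    "see survey results":           {0, 1, 8},
    "charity donation":             {2, 9},
}
print(greedy_turf(appeal, 3))
# → [('exclusive learning materials', 6), ('cash prize draws', 8),
#    ('see survey results', 9)]
```

Note that an offer with broad raw appeal can still add little incremental reach if the people it attracts are already covered, which is exactly why the cash offer lands second despite overlapping the learning-materials audience.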

This tells us that it is important to offer your members something they will value. Some sort of potential monetary return can help, but it is of secondary importance.

This desire to connect and learn is very consistent with other research we’ve conducted on respondent motivation. An earlier study found that 95% of respondents agreed “I enjoy learning about new things and products when I do surveys.”

What your members will find valuable will vary with the nature of your community, so it is worth giving it careful consideration and perhaps even testing alternative strategies before embarking on a full recruit.

Let them know their voice will be heard and valued

In addition to learning, we have found that assuring respondents their voice will be heard is a powerful motivator. Insight community members want a dialogue—they want to hear and be heard.

We previously found among the general population that 87% agree “I feel like I am being a trusted advisor when I provide feedback to a company on their products” and that 89% agree they feel like they are “doing my part as a good consumer and citizen when I provide feedback.”

Amongst the affluent investors, half indicated they would be motivated to join by knowing that “my voice” is “heard” (49%) and they would be seen as a “trusted advisor” (48%).

We surveyed over 80,000 of Maru/Matchbox’s market community members in the US, UK and Canada on their satisfaction with being part of the community. We found that, after the quality and topic of the survey, the biggest driver of respondent satisfaction is feeling “the input you provide is valued.”

Dialogue and respect are important to running a successful community, just as they are in any relationship. Newsletters, member portals and in-survey feedback show community members that their input is valued.

They want to feel like they are “doing their part.” And you need to tell them that they are, by providing feedback on what you have learned and what you have done as a result. Doing so on an ongoing basis is vital to the continued vibrancy of the community.

Let them join and answer using whatever device they like

Smartphones and tablets are so ubiquitous that we no longer question whether research needs to be mobile-friendly. It has to be, or you simply won’t reach everyone in your sample base.

Millennials and Hispanics—two harder-to-reach parts of the population—over-index on mobile use. In fact, 22% of Hispanics access the internet only through their smartphones—something that is also true for one in eight Americans overall.(2)

So if mobile is a must, the question becomes what’s the best way to reach someone on their mobile—through an app or through a web browser? To find out, we ran a test.

We asked a sample of 350 people with investments whether they would be willing to join an insight community about investing. When we asked if they would join one that required an app, the number willing to join shrank by a sixth. When we asked if they would do surveys through the app, the number shrank dramatically, ending up three quarters smaller than the group originally willing to join the community. The population shrinks even further when we look at people who would keep the app front and center on their device: the majority said the app would end up stowed away in a subfolder or at the end of their app list.
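To make that funnel concrete, here is the arithmetic applied to a hypothetical base of 100 willing joiners; the absolute base is an assumption, and only the fractions (“shrank by a sixth,” “three quarters smaller”) come from the study.

```python
# Hypothetical app-requirement funnel, starting from 100 people
# willing to join the community.
base = 100
join_with_app = base * (1 - 1 / 6)    # willing to join if an app is required
surveys_in_app = base * (1 - 3 / 4)   # willing to do surveys through the app
print(round(join_with_app), round(surveys_in_app))  # → 83 25
```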

Those willing to join, download and use the app were demographically very unrepresentative of the investors as a whole. They were much younger and more likely to be male, have kids at home, and be wealthier and employed full time.

What people want is choice—the ability to do the survey on whatever device they happen to be on when they want to complete it. Nine in ten (92%) agree “I love having the freedom to answer a survey on whatever device I want.” And eight in ten (78%) agreed “I don’t like being forced to do a survey on a specific type of device.”

This is consistent with earlier research we conducted looking at what happens when you require people to use a specific type of device, be it desktop, smartphone or tablet. When we asked people to switch devices (if they did not start the survey on that specific device) the vast majority either tried to continue to do the survey on the device they were on or immediately quit and never returned.

People want choice, and delivering surveys through an app does not provide that.

Conclusion

We’ve gathered together a considerable body of research on which to base these recommendations for building vibrant, representative communities. Hopefully, after reviewing these findings you’ll better understand why we recommend:

  1. Be open about who you are and what you want;
  2. Give your community members something they will benefit from;
  3. Let them know their voice will be heard and valued;
  4. Let them join and answer using whatever device they like.

We hope these recommendations help you get started on building a vibrant community that drives valuable decision-making.

———————————————————————————————————————-

  1. Göritz, A.S. (2006) Incentives in Web Studies: Methodological Issues and a Review, International Journal of Internet Science, 1, 58-70.
  2. Mobile Fact Sheet, Pew Research Center: Internet & Technology.