Thursday, January 15, 2009

Why your screener isn't working

I get that not every researcher wants to (or has time to) recruit her own participants. Recruiting always seems like an ideal thing to outsource to someone else. As the researcher, you want to spend your time designing, doing, and analyzing research.

So, you find an agency to do the recruiting. Some are very appealing: They're cheap, they're quick, and they have big databases of people. You send requirements, they send a list of people they've scheduled.

How do you get the most out of an agency doing the recruiting? Write a great screener -- and test it. How do you get a great screener? Here are a few tips.

Seven screener best practices

  1. Focus questions on the behavior you want to see in the test. For example, for a hotel reservations website, you might want to know: Does the person book his own travel online? For a website for a hospital network, the behavior might be: Does the person have a condition we treat? Is the person looking for treatment?

  2. Limit the number of questions. If the question does not qualify or disqualify a respondent for the study, take the question out. If you want to collect information besides the selection criteria, develop a background questionnaire for the people selected for the study.

  3. Think about how you're going to use the data collected from the screener. Are you going to compare user groups based on the answers to screener questions? For example, if you're asking in your screener for people who are novices, intermediates, and experts with your product, are you actually going to have a large enough sample of participants to compare the data you collect in the usability test? If not, don't put requirements in your screener for specific numbers of participants with those qualities. Instead, ask for a mix.

  4. Avoid Yes/No responses. This is difficult to do, but worthwhile. With Yes/No questions, it's easy for respondents to guess the "right" answer that gets them into the study. In combination, a series of gamed Yes/No responses can make a respondent look like he fits your profile when he really doesn't.

  5. Ask open-ended questions if at all possible. This gets respondents to volunteer information in answer to a real question rather than picking the "right" choice from a list of options that the recruiter reads to them. You can give the recruiter the choices you think people will come up with and a pick list for the recruiter to use to note the data. But the recruiter should not read the list to the respondent. For example, on the hospital website, you might ask, "Tell me about your health right now. What were the last three things you visited a doctor for?"

  6. Avoid using number of hours or frequency as a measure of or a proxy for expertise. I was looking for tech-savvy people for one study. One respondent told us she spent 60 hours a week on the Web. When she got into the lab, it was clear she didn't know how to use a browser. When I asked her what she does on the Web, she said the lab computer didn't look like hers at all: on hers, she starts in a place where she clicks on a picture that brings up her favorite game. It turns out her son-in-law had set up a series of shortcuts on her desktop. She knew the games were on the Web, but that was all she knew about the Web.

  7. Watch the insider jargon. If you use industry or company terms for the products or services you want to test, you may prime respondents for what you're looking for and lead them to give the "right" answer. Again, open-ended questions can help here. This is where you start looking at your product from the user's point of view.

Need help developing a screener? Need help with doing recruiting? Contact me about recruiting services my company offers. We've got a great process and a 90% show rate.


  1. This is fantastic feedback for anyone looking to screen research participants. My company has been considering hiring a recruiting firm to handle our external study participants, but my biggest fear is getting the "right" people for the task. Your advice is right-on for making sure that we bring in people who actually fit the profile based on our tasks and not on demographics.

  2. I agree open-ended questions in a screener are best.

    But one reason some usability professionals use 'yes/no' questions is that they don't have confidence that the external recruiters can effectively assess what an acceptable open-ended answer would be.

    In some cases, they may find that asking a 'yes/no' question is the safer approach.

    How would you handle this concern?

  3. I enjoyed your article, but thought that one thing that was missing was the need to screen out candidates who are shy or inarticulate. You learn a lot less from these kinds of candidates in a test. You could make this judgement from the way they answer the open-ended questions. I've written about this in "Writing the perfect participant screener."