
The importance of research that’s human-centered—and adaptable

If you’ve ever tried to change a habit—or convince someone else to do so—you know firsthand that behavior change is hard. Because we like the hard stuff at KDHRC, a large part of what we do every day is helping our partners inform audiences about important health concerns, disrupt current behaviors, and foster a new set of actions. In other words: changing people’s actions for the better.

Such campaigns have many moving parts and requirements, including an understanding of the target audience, effective messaging, and innovative strategies to deliver that messaging in places where it will be seen and internalized. Good campaigns also require continuous improvement to optimize their impact. This is where human-centered research becomes indispensable.

Human-centered research optimizes a campaign's effectiveness by infusing it with people's perceptions, needs, and the contexts of their lives. Each of our projects includes input-gathering from the intended audience(s), subject matter experts, or other stakeholders to optimize the program's impact or integrate continuous improvement. We ask our intended audience key questions that can provide actionable information about whether and to what degree content is moving knowledge, attitudes, and intentions. We also regularly gather input on content delivery: Is the content being understood? Is it persuasive and clear?

The methods for collecting input vary based on the budget, the timing, the goals of the project, and what is most comfortable for those participating. For example, we conduct multiple focus groups and interviews with teens and young adults to obtain insights on vaping and anti-vaping message development that can help us further tailor campaigns for "The Real Cost," FDA's ongoing tobacco prevention campaign aimed at teens. As another example, for a cochlear implants program, we conducted a usability study among older adults to understand the efficacy of website content, adopting a "talk aloud" methodology to learn how they navigated the material and why they did or did not click through various portions of the content. Most projects are multi-modal, requiring multiple approaches to collect data and analyze them in such a way that the insights can be easily applied back to the campaign.

However, designing anything for humans comes with a caveat: you'd better be ready to change it. People are unpredictable, and we all bring our own unique set of assumptions to everything we do, whether we are answering a question or designing it. This means that researchers must be ready to adjust even the most perfectly planned focus group, survey, or panel discussion to meet people where they are.

Ready, set, adapt

At KDHRC, we pride ourselves on being adaptable researchers. We are interested not only in optimizing campaigns but also in improving the process of improvement itself. That means digging deep to make sure we are using the right words and asking the right questions to elicit authentic and consistent responses, then recalibrating the questions accordingly. It also means adapting research modalities to fit changing circumstances. Of course, the greatest test of adaptability came last year, when, along with the rest of humanity, we had to quickly pivot in-person focus groups and workshops to Zoom calls and virtual events.

One of the ways we adjust and improve our research is to test survey questions before we deploy them. We assemble small groups and ask each person to step through the survey while we watch their screen for hesitations or changes to their answers. In real time, we ask what made them hesitate or change their response. This “survey of a survey” helps us understand which questions might be too ambiguous to garner consistent responses.

Sometimes it's not the questions that need to change, but the tools. In developing a playbook for college students with lupus, we assembled an advisory panel of college students to learn more about the types of programs that would work best on campus. When several of the students on the advisory panel missed their scheduled 1:1 phone calls, we shifted to semi-structured interviews over text, a methodology that provided rich and useful insights.

Getting to the right questions in the right format starts with the awareness that "you don't know what you don't know." It's not always obvious what the right question is, and our inquiry won't become truly useful until we find out. Human-centered research takes many factors into consideration, from the words we use to frame a question to the tools through which it is asked. This is how we get to the insights that help a campaign resonate with its audience and make real behavior change possible.
