Abstract
Social scientists often employ direct observation to study human behavior, but a health researcher who proposes it may face considerable skepticism from colleagues. Concerns about reactivity lead many to question the validity of observational data. Because few studies have measured reactivity, evidence to evaluate this concern is limited. The authors report results from their systematic measurement of reactivity during a Peruvian malaria prevention study. In sixty observations over nine months, observers recorded all behavior they perceived as potentially reactive. The authors then assessed reactivity using iterative coding and analysis. Although they documented 339 reactivity episodes, only two involved behaviors related to study objectives. These findings are consistent with prior research and provide additional evidence that reactivity, though common, need not bias study results. The authors suggest strategies for assessing reactivity that can help reassure skeptics and reinforce the validity of observational data.
Field | Value |
---|---|
Original language | English (US) |
Pages (from-to) | 3-25 |
Number of pages | 23 |
Journal | Field Methods |
Volume | 21 |
Issue number | 1 |
DOIs | |
State | Published - 2009 |
Keywords
- Data validity
- Direct observation
- Malaria
- Peru
- Reactivity
- Triangulation
ASJC Scopus subject areas
- Anthropology