READING, LESSON 3: Cognitive Aspects of Survey Methodology
SUMMARY
Since its initiation in the early 1980s, research into cognitive aspects of survey methods (CASM) has
made considerable progress in illuminating the cognitive and communicative processes underlying
survey responding. This article reviews key themes and developments, notes strengths and shortcomings, and places the contributions to this special issue in this context. Copyright © 2007 John Wiley & Sons, Ltd.
N. Schwarz
1984). Respondents first need to interpret the question to understand what is meant and to
determine which information they ought to provide. If the question is an attitude question,
they may either retrieve a previously formed attitude judgment from memory, or they may
form a judgment on the spot, based on whatever relevant information is accessible at that
point in time. While survey researchers have typically hoped for the former, the latter is far
more likely, consistent with current research into attitude construction in social psychology
(Schwarz, in press; Smith & Conrey, 2007). If the question is a behavioural question,
respondents need to identify the behaviour of interest and recall relevant information.
Survey researchers have typically hoped that respondents do so by reviewing the reference
period specified in the question (such as last week or last month) to identify relevant
instances that can be counted. Unless the behaviour is rare and important, such
an enumeration strategy is rarely used; instead, respondents resort to a variety of estimation
and inference strategies (Menon, 1994). Once a private judgment is formed in respondents' minds, they have to communicate it to the researcher. To do so, they may need to format their
judgment to fit the response alternatives provided as part of the question. Moreover,
respondents may wish to edit their response before they communicate it, due to influences of
social desirability and situational adequacy. Performance of each of these tasks is highly
context dependent and often profoundly shaped by the research instrument. The resulting
contextual influences are usually referred to as response effects in the survey literature.
As Ongena and Dijkstra (2007, this issue) note, this widely shared conceptualisation of
the components of the survey response process is exclusively respondent-focused, as is the
bulk of CASM research. Yet the prototypical survey interview, conducted in person or on
the phone, involves the collaboration of an interviewer and a respondent and is presumably as much affected by the interviewer's as by the respondent's performance. Historically,
analyses of interviewer behaviour and its influence on survey data were a key feature of
survey methods research (for a review see Cannell & Kahn, 1968), with an emphasis on the
interpersonal aspects of survey interviews. This approach lost much of its popularity when
comprehensive meta-analyses of response effects, conducted in the early 1970s, indicated
that the influence of task characteristics dwarfed the influence of interviewer and
respondent characteristics (Sudman & Bradburn, 1974). In response to this observation,
survey methodologists turned increasingly to the investigation of the tasks presented by a
survey question. This emphasis provided a natural point of contact for cognitive
psychologists and dominates CASM research, as the contributions to this issue illustrate.
In parallel, qualitative studies in the traditions of ethnography and discourse analysis
illuminated complications that arise in survey interviews (Gerber, 1999; Maynard,
Houtkoop-Steenstra, Schaeffer, & Van der Zouwen, 2002; Suchman & Jordan, 1990),
although many researchers found it difficult to derive general principles from the rich and
compelling case studies presented. Ongena and Dijkstra (2007, this issue) connect these
respondent-focused and interaction-focused research traditions and outline a model that
links the tasks faced by interviewers and respondents. The emerging issues present a
promising topic for experimental psychologists interested in collaborative cognition
(Baltes & Staudinger, 1996).
comprehension and judgment. Not surprisingly, survey researchers have always worried whether respondents understand their questions as intended (for early discussions see Belson, 1968,
1981; Cantril, 1944; Payne, 1951). Accordingly, they devised numerous guidelines for
good survey questions, emphasising the need to avoid complicated wordings and
unfamiliar terms (Bradburn, Sudman, & Wansink, 2004). Although this advice is sound, it
misses that question comprehension involves more than an understanding of the literal
meaning of the utterance. When asked, "What have you done today?" respondents will certainly understand the words, yet they may nevertheless not know on which behaviours they are to report. Should they report that they took a shower, for example, or is the researcher not interested in this information? To provide a meaningful answer, respondents need to infer the questioner's intentions, that is, the pragmatic meaning of the question. To
do so, they rely on the tacit assumptions that govern the conduct of conversations in daily
life (Grice, 1975) and draw on the context of the utterance to infer the intended meaning.
In surveys and laboratory experiments, the researcher's contributions to the conversation
are not limited to explicit instructions and the questions asked. Instead, they include
apparently formal aspects of questionnaire design, from the choice of response
alternatives to the formal characteristics of scales and the graphical lay-out of
questionnaires. Numerous studies showed that respondents draw on such characteristics
to infer the pragmatic meaning of questions in ways that can be conceptualised in terms of
Grice's (1975) logic of conversation (for reviews see Clark & Schober, 1992; Schwarz, 1996). Even the researcher's affiliation, gleaned from the letterhead, or the title of a survey
can affect how respondents interpret the questions asked, as Galesic and Tourangeau (2007, this issue) illustrate. In their study, respondents answered identical questions about workplace behaviour presented as part of a "Sexual Harassment Survey" conducted for "Women Against Sexual Harassment" or as part of a "Work Atmosphere Survey" conducted for a "Work Environment Institute". As expected, respondents perceived the same behaviours as more likely to represent sexual harassment when they were presented as part of a sexual harassment survey rather than a work atmosphere survey; after all, why else would they have been included? Moreover, once labelled as sexual harassment, they rated these
behaviours as more bothersome and reported that they experienced them with a higher
frequency. These findings highlight how contextual variables influence the pragmatic
interpretation of questions, with downstream effects on frequency estimates. Processes of
this type presumably underlie the observation that different surveys arrive at markedly
different prevalence estimates, as Galesic and Tourangeau (2007, this issue) note in their
review.
Deliberate reliance on contextual information is particularly likely when respondents
have no opportunity to ask for clarification or to ground their understanding in an
unconstrained exchange with the interviewer (Schober, 1999). This is the case under the
self-administered conditions of mail and web surveys, where nobody is available
to be asked, or when a well-trained interviewer responds, "Whatever it means to you." From a conversational perspective, such whatever-it-means-to-you responses are nonsensical; after all, it is the questioner who has to decide what he or she wants to
know. Nevertheless, they are a standard feature of interviewer training in survey research,
where an assumed need for question standardisation trumps other concerns (Fowler &
Mangione, 1990). But as Suchman and Jordan (1990) noted, what needs to be standardised
is question meaning, not question wording per se. This is particularly apparent when the
question pertains to factual matters, where respondents' understanding of what, for example, counts as "furniture" may differ from the wisdom of the government agency that
defined this category of household purchases. In an influential series of studies, Conrad and
Schober (for a review see Schober & Conrad, 2002) provided compelling evidence that a
liberalisation of the standardisation requirement improves data quality under these
conditions, although the same may not apply to attitude questions (see Sudman et al., 1996,
for a discussion). In the present issue, they extend this work from personal interviews to
web surveys and observe that respondents answer more accurately when they can request
clarifications or when clarifications are automatically provided by the program if
respondents take a long time to answer (Conrad, Schober, & Coiner, 2007, this issue).
These design features avoid burdening the questionnaire with numerous definitions for
respondents who may not need them. The tricky question, of course, is how to determine
when a respondent is likely to need clarification. At present, we know little about how respondents decide that they need it, or about how interviewers decide to offer clarifications when their instructions permit it (see Ongena & Dijkstra, 2007, this issue). Metacognitive experiences of processing difficulty are likely to play an important role in respondents'
decision, but have so far not received attention in CASM research. A systematic
exploration of these issues will fill an important gap in the discussion of standardised
interviewing.
To identify question comprehension problems at the pretesting stage, CASM researchers
developed a rich arsenal of methods that are routinely applied in cognitive laboratories at
survey research centres and statistical agencies (for reviews see the contributions in
Schwarz & Sudman, 1996). Most widely used are cognitive interviews that combine
elements of concurrent or retrospective think-aloud procedures with paraphrasing tasks
(DeMaio & Rothgeb, 1996), as well as detailed analyses of interviews conducted under
field conditions (Fowler & Cannell, 1996). From the perspective of survey practitioners,
the development of improved pretesting procedures is often considered the most important
and fruitful contribution of CASM research, although the contributions of this work to
basic theorising have so far been limited.
that minor irritations are part of what they are to report on, given that major annoyances are
unlikely to occur on a daily basis. When asked how often they have been angry last week,
however, they infer that the researcher is interested in more serious instances of anger; after all, they can hardly be expected to remember all the minor irritations of the
week. This shift in the inferred pragmatic meaning of the question results in higher
frequency reports for short than for long reference periods, reflecting the differential actual
frequency of minor versus major episodes of anger (Winkielman, Knauper, & Schwarz,
1998). Accordingly, mismatches between reports pertaining to reference periods of
differential length are not exclusively due to memory processes and it is often difficult to
disentangle the effects of question interpretation and forgetting.
When behavioural reports are conceptualised in the overall framework of the question
answering process, it becomes apparent that memory processes are only one of the
determinants of their accuracy. As already noted, respondents may report on a behaviour
that does not match what the researcher had in mind (e.g. Schober & Conrad, 2002) and
contextual variables may influence question interpretation with downstream effects on
frequency reports (Galesic & Tourangeau, 2007, this issue; Winkielman et al., 1998).
Moreover, respondents may hesitate to report that they do engage in undesirable
behaviours or fail to engage in desirable ones. The extent to which such socially desirable
responding (for a review see DeMaio, 1984) reflects deliberate misreporting or a
self-serving reconstruction of what one must have done is the topic of some controversy
and it is often difficult to determine whether respondents lie, manage to see themselves in a
positive light or merely rely on cultural norms and generic self-knowledge in
reconstructing their past behaviour (Dunning, 2001; Kunda, 1999; Ross, 1989). Stocke and Stark's (2007, this issue) analysis of reported voting behaviour bears on this ambiguity.
Consistent with previous studies, they observe that respondents over-report voting behaviour, and increasingly so as more time elapses since election day. More importantly,
the influence of temporal distance is more pronounced for respondents with high political
involvement. Stocke and Stark suggest that these respondents hold stronger political
participation norms and hence are subject to stronger social desirability bias. This bias is
assumed to exert its strongest influence under conditions of poor memory for one's actual
behaviour, consistent with the observation that self-serving reconstructions are most likely
under conditions of ambiguity (Kunda, 1999). If respondents solely cared about their
self-presentation in the interview, they would presumably also over-report when they are
aware of their absence from the ballot box. While motivational interpretations of such
findings are plausible, it is difficult to rule out a purely cognitive account. Being unable to
recall with any certainty whether they voted in the last election or not, respondents may
draw on their usual behaviour to infer what they must have done (Ross, 1989). Those with
higher political involvement presumably participate in elections more regularly and hence
are also more likely to infer that they must have done so in the last election.
A TWO-WAY BRIDGE?
The initiators of the first CASM conferences hoped to build a two-way bridge between
cognitive psychology and survey methods to facilitate an exchange that would advance
basic research and improve survey practice. In the two decades since these conferences,
this bridge has seen considerable traffic. However, much of this traffic has been from
psychology to survey methods. This is not surprising. Survey research offers a method, not
a substantive body of theorising about human cognition and behaviour. Hence, the contact
Brown, N. R., Williams, R. L., Barker, E. T., & Galambos, N. L. (2007). Estimating frequencies of emotions and actions: A web-based diary study. Applied Cognitive Psychology, 21, 259–276 (this issue). DOI: 10.1002/acp.1303
Cannell, C. F., Fisher, G., & Bakker, T. (1965). Reporting on hospitalization in the Health Interview Survey. Vital and Health Statistics (PHS Publication No. 1000, Series 2, No. 6). Washington, DC: US Government Printing Office.
Cannell, C. F., & Kahn, R. L. (1968). Interviewing. In G. Lindzey, & E. Aronson (Eds.),
The handbook of social psychology (Vol. 2). Reading, MA: Addison-Wesley.
Cannell, C. F., Marquis, K. H., & Laurent, A. (1977). A summary of studies of interviewing
methodology. Vital and Health Statistics, Series 2, No. 69 (DHEW Publication No. HRA 77-1343).
Washington, DC: Government Printing Office.
Cantril, H. (1944). Gauging public opinion. Princeton, N.J.: Princeton University Press.
Chessa, A. G., & Holleman, B. C. (2007). Answering attitudinal questions: Modelling the response process underlying contrastive questions. Applied Cognitive Psychology, 21, 203–225 (this issue). DOI: 10.1002/acp.1337
Clark, H. H., & Schober, M. F. (1992). Asking questions and influencing answers. In J. M. Tanur (Ed.), Questions about questions (pp. 15–48). New York: Russell Sage.
Conrad, F. G., Schober, M. F., & Coiner, T. (2007). Bringing features of human dialogue to web surveys. Applied Cognitive Psychology, 21, 165–187 (this issue). DOI: 10.1002/acp.1335
DeMaio, T. J. (1984). Social desirability and survey measurement: A review. In C. F. Turner, & E. Martin (Eds.), Surveying subjective phenomena (Vol. 2, pp. 257–281). New York: Russell Sage.
DeMaio, T. J., & Rothgeb, J. M. (1996). Cognitive interviewing techniques: In the lab and in the field. In N. Schwarz, & S. Sudman (Eds.), Answering questions: Methodology for determining cognitive and communicative processes in survey research (pp. 177–196). San Francisco: Jossey-Bass.
Dunning, D. (2001). On the motives underlying social cognition. In A. Tesser, & N. Schwarz (Eds.), Blackwell handbook of social psychology: Intraindividual processes (pp. 348–374). Oxford, UK: Blackwell.
Eagly, A. H., & Chaiken, S. (1993). The psychology of attitudes. Fort Worth, TX: Harcourt Brace
Jovanovich College.
Fowler, F. J., & Cannell, C. F. (1996). Using behavioral coding to identify cognitive problems with survey questions. In N. Schwarz, & S. Sudman (Eds.), Answering questions: Methodology for determining cognitive and communicative processes in survey research (pp. 15–36). San Francisco: Jossey-Bass.
Fowler, F. J., & Mangione, T. W. (1990). Standardized survey interviewing: Minimizing interviewer
related error. Newbury Park, CA: Sage.
Freedman, D., Thornton, A., Camburn, D., Alwin, D., & Young-DeMarco, L. (1988). The life history calendar: A technique for collecting retrospective data. Sociological Methodology, 18, 37–68.
Galesic, M., & Tourangeau, R. (2007). What is sexual harassment? It depends on who asks! Framing effects on survey responses. Applied Cognitive Psychology, 21, 189–202 (this issue). DOI: 10.1002/acp.1336
Gawronski, B. (Ed.). (in press). What is an attitude? Special issue of Social Cognition.
Gerber, E. R. (1999). The view from anthropology: Ethnography and the cognitive interview. In M. Sirken, D. Hermann, S. Schechter, N. Schwarz, J. Tanur, & R. Tourangeau (Eds.), Cognition and survey research (pp. 217–234). New York: Wiley.
Grice, H. P. (1975). Logic and conversation. In P. Cole, & J. L. Morgan (Eds.), Syntax and semantics: Speech acts (Vol. 3, pp. 41–58). New York: Academic Press.
Hardin, C. D., & Higgins, E. T. (1996). Shared reality: How social verification makes the subjective objective. In R. M. Sorrentino, & E. T. Higgins (Eds.), Handbook of motivation and cognition: The interpersonal context (Vol. 3, pp. 28–84). New York: Guilford.
Hippler, H. J., & Schwarz, N. (1986). Not forbidding isn't allowing: The cognitive basis of the forbid-allow asymmetry. Public Opinion Quarterly, 50, 87–96.
Hippler, H. J., Schwarz, N., & Sudman, S. (Eds.). (1987). Social information processing and survey
methodology. New York: Springer Verlag.
Jabine, T. B., Straf, M. L., Tanur, J. M., & Tourangeau, R. (Eds.). (1984). Cognitive aspects of survey
methodology: Building a bridge between disciplines. Washington, DC: National Academy Press.
Jobe, J., & Loftus, E. (Eds.). (1991). Cognitive aspects of survey methodology. Special issue of Applied Cognitive Psychology, 5, 173–296.
Krosnick, J. A., & Fabrigar, L. R. (in press). The handbook of questionnaire design. New York, NY:
Oxford University Press.
Kunda, Z. (1999). Social cognition. Cambridge, MA: MIT Press.
Lord, C. G., & Lepper, M. R. (1999). Attitude representation theory. Advances in Experimental Social Psychology, 31, 265–343.
Lyberg, L., Biemer, P., Collins, M., DeLeeuw, E., Dippo, C., Schwarz, N., & Trewin, D. (Eds.).
(1997). Survey measurement and process quality. Chichester, UK: Wiley.
Mathiowetz, N. A., & Duncan, G. J. (1988). Out of work, out of mind: Response errors in retrospective reports of unemployment. Journal of Business and Economic Statistics, 6, 221–229.
Maynard, D., Houtkoop-Steenstra, H., Schaeffer, N. C., & van der Zouwen, J. (Eds.). (2002).
Standardization and tacit knowledge: Interaction and practice in the survey interview. New York:
John Wiley & Sons.
Menon, G. (1994). Judgments of behavioral frequencies: Memory search and retrieval strategies. In N. Schwarz, & S. Sudman (Eds.), Autobiographical memory and the validity of retrospective reports (pp. 161–172). New York: Springer Verlag.
Nowak, A., Szamrej, J., & Latane, B. (1990). From private attitude to public opinion: A dynamic theory of social impact. Psychological Review, 97, 362–376.
O'Muircheartaigh, C. (1999). CASM: Success, failure, and potential. In M. Sirken, D. Hermann, S. Schechter, N. Schwarz, J. Tanur, & R. Tourangeau (Eds.), Cognition and survey research (pp. 39–63). New York: Wiley.
Ongena, Y. P., & Dijkstra, W. (2007). A model of cognitive processes and conversational principles in survey interview interaction. Applied Cognitive Psychology, 21, 145–163 (this issue). DOI: 10.1002/acp.1334
Payne, S. L. (1951). The art of asking questions. Princeton: Princeton University Press.
Presser, S., Rothgeb, J. M., Couper, M. P., Lessler, J. T., Martin, E., Martin, J., & Singer, E. (Eds.).
(2004). Methods for testing and evaluating survey questionnaires. New York: Wiley.
Robinson, M. D., & Clore, G. L. (2002). Belief and feeling: Evidence for an accessibility model of emotional self-report. Psychological Bulletin, 128, 934–960.
Ross, M. (1989). The relation of implicit theories to the construction of personal histories. Psychological Review, 96, 341–357.
Rugg, D. (1941). Experiments in wording questions. Public Opinion Quarterly, 5, 91–92.
Schober, M. F. (1999). Making sense of questions: An interactional approach. In M. Sirken, D. Hermann, S. Schechter, N. Schwarz, J. Tanur, & R. Tourangeau (Eds.), Cognition and survey research (pp. 77–94). New York: Wiley.
Schober, M. F., & Conrad, F. G. (2002). A collaborative view of standardized survey interviews. In D. Maynard, H. Houtkoop-Steenstra, N. C. Schaeffer, & J. van der Zouwen (Eds.), Standardization and tacit knowledge: Interaction and practice in the survey interview (pp. 67–94). New York: John Wiley & Sons.
Schuman, H., & Presser, S. (1981). Questions and answers in attitude surveys. New York: Academic Press.
Schuman, H., Rieger, C., & Gaidys, V. (1994). Collective memories in the United States and Lithuania. In N. Schwarz, & S. Sudman (Eds.), Autobiographical memory and the validity of retrospective reports (pp. 313–334). New York: Springer Verlag.
Schwarz, N. (1996). Cognition and communication: Judgmental biases, research methods and the
logic of conversation. Hillsdale, NJ: Erlbaum.
Schwarz, N. (2003). Culture-sensitive context effects: A challenge for cross-cultural surveys. In J. Harkness, F. van de Vijver, & P. Ph. Mohler (Eds.), Cross-cultural survey methods (pp. 93–100). New York: Wiley.
Schwarz, N. (in press). Attitude construction: Evaluation in context. Social Cognition.
Schwarz, N., & Bless, H. (1992). Constructing reality and its alternatives: Assimilation and contrast effects in social judgment. In L. L. Martin, & A. Tesser (Eds.), The construction of social judgment (pp. 217–245). Hillsdale, NJ: Erlbaum.
Schwarz, N., & Bless, H. (in press). Mental construal processes: The inclusion/exclusion model. In
D. A. Stapel, & J. Suls (Eds.), Assimilation and contrast in social psychology. Philadelphia, PA:
Psychology Press.
Schwarz, N., & Bohner, G. (2001). The construction of attitudes. In A. Tesser, & N. Schwarz (Eds.), Blackwell handbook of social psychology: Intraindividual processes (pp. 436–457). Oxford, UK: Blackwell Publishers.
Schwarz, N., Hippler, H. J., Deutsch, B., & Strack, F. (1985). Response scales: Effects of category range on reported behavior and subsequent judgments. Public Opinion Quarterly, 49, 388–395.
Schwarz, N., Park, D., Knauper, B., & Sudman, S. (Eds.). (1999). Cognition, aging, and self-reports.
Philadelphia, PA: Psychology Press.
Schwarz, N., & Sudman, S. (Eds.). (1992). Context effects in social and psychological research. New
York: Springer Verlag.
Schwarz, N., & Sudman, S. (Eds.). (1994). Autobiographical memory and the validity of retrospective reports. New York: Springer Verlag.
Schwarz, N., & Sudman, S. (Eds.). (1996). Answering questions: Methodology for determining
cognitive and communicative processes in survey research. San Francisco: Jossey-Bass.
Shum, M. S., & Rips, L. J. (1999). The respondent's confession: Autobiographical memory in the context of surveys. In M. Sirken, D. Hermann, S. Schechter, N. Schwarz, J. Tanur, & R. Tourangeau (Eds.), Cognition and survey research (pp. 95–109). New York: Wiley.
Sirken, M., Hermann, D., Schechter, S., Schwarz, N., Tanur, J., & Tourangeau, R. (Eds.). (1999).
Cognition and survey research. New York: Wiley.
Smith, E. R., & Conrey, F. R. (2007). Mental representations are states not things: Implications for implicit and explicit measurement. In B. Wittenbrink, & N. Schwarz (Eds.), Implicit measures of attitudes: Progress and controversies (pp. 247–264). New York: Guilford.
Stocke, V., & Stark, T. (2007). Political involvement and memory failure as interdependent determinants of vote overreporting. Applied Cognitive Psychology, 21, 239–257 (this issue). DOI: 10.1002/acp.1339
Strack, F., & Martin, L. (1987). Thinking, judging, and communicating: A process account of context effects in attitude surveys. In H. J. Hippler, N. Schwarz, & S. Sudman (Eds.), Social information processing and survey methodology (pp. 123–148). New York: Springer Verlag.
Suchman, L., & Jordan, B. (1990). Interactional troubles in face-to-face interviews. Journal of the American Statistical Association, 85, 232–241.
Sudman, S., & Bradburn, N. M. (1974). Response effects in surveys: A review and synthesis. Chicago: Aldine.
Sudman, S., Bradburn, N., & Schwarz, N. (1996). Thinking about answers: The application of cognitive processes to survey methodology. San Francisco, CA: Jossey-Bass.
Tanur, J. M. (Ed.). (1992). Questions about questions. New York: Russell Sage.
Tourangeau, R. (1984). Cognitive science and survey methods: A cognitive perspective. In T. Jabine, M. Straf, J. Tanur, & R. Tourangeau (Eds.), Cognitive aspects of survey methodology: Building a bridge between disciplines (pp. 73–100). Washington, DC: National Academy Press.
Tourangeau, R. (1992). Attitudes as memory structures: Belief sampling and context effects. In N. Schwarz, & S. Sudman (Eds.), Context effects in social and psychological research (pp. 35–47). New York: Springer Verlag.
Tourangeau, R. (1999). Context effects on answers in attitude questions. In M. Sirken, D. Hermann, S. Schechter, N. Schwarz, J. Tanur, & R. Tourangeau (Eds.), Cognition and survey research (pp. 111–132). New York: Wiley.
Tourangeau, R., & Rasinski, K. A. (1988). Cognitive processes underlying context effects in attitude measurement. Psychological Bulletin, 103, 299–314.
Tourangeau, R., Rips, L. J., & Rasinski, K. (2000). The psychology of survey response. Cambridge:
Cambridge University Press.
van der Vaart, W., & Glasner, T. (2007). Applying a timeline as a recall aid in a telephone survey: A record check study. Applied Cognitive Psychology, 21, 227–238 (this issue). DOI: 10.1002/acp.1338
Wentland, E. J., & Smith, K. W. (1993). Survey responses: An evaluation of their validity. San
Diego, CA: Academic Press.
Winkielman, P., Knauper, B., & Schwarz, N. (1998). Looking back at anger: Reference periods change the interpretation of (emotion) frequency questions. Journal of Personality and Social Psychology, 75, 719–728.
Wyer, R. S., & Carlston, D. E. (1979). Social cognition, inference, and attribution. Hillsdale, NJ:
Erlbaum.