Pepper, D.; Hodgen, J.; Lamesoo, K.; Kõiv, P.; Tolboom, J. (2016)
Publisher: Taylor & Francis
Languages: English
Types: Article
Subjects: Cognitive interviewing, think aloud, validation, assessment, PISA, international, self-efficacy, mathematics
Cognitive interviewing (CI) provides a method of systematically collecting validity evidence of response processes for questionnaire items. CI involves a range of techniques for prompting individuals to verbalise their responses to items. One such technique is concurrent verbalisation, as developed in Think Aloud Protocol (TAP). This article investigates the value of the technique for validating questionnaire items administered to young people in international surveys. To date, the literature on TAP has focused on allaying concerns about reactivity – whether response processes are affected by thinking aloud. This article investigates another concern, namely the completeness of concurrent verbalisations – the extent to which respondents verbalise their response processes.

An independent, exploratory validation of the PISA assessment of student self-efficacy in mathematics by a small international team of researchers using CI with concurrent verbalisation in four education systems (England, Estonia, Hong Kong, and the Netherlands) provided the basis for this investigation. The researchers found that students generally thought aloud in response to each of the items, thereby providing validity evidence of response processes varying within and between the education systems, but that practical steps could be taken to increase the completeness of concurrent verbalisations in future validations.
