
Examining the differences in reporting behavior among East Asian students with different language and immigrant background using international large-scale assessment data

Publication at Faculty of Education | 2022

Abstract

International large-scale assessment studies in education collect, besides information about student achievement, a variety of data concerning student attitudes and school-related experiences through student questionnaires. Such data are growing in importance for both educational policy and practice, and they often serve as a basis for comparing different groups of students.

However, in comprehensive questionnaire surveys using self-report items, students might differ in (a) the way they use rating scales (e.g., two students with objectively different levels of a given concept might place themselves identically on a rating scale) and (b) the amount of effort they put into responding to the survey (e.g., students might skip questions or answer inconsistently). These inaccuracies might lead to erroneous conclusions about the level of the target indicators in different cultures and among different groups of students.

Methodological approaches that would help identify such differences in student reporting behavior are being sought. These include the overclaiming technique (OCT; Paulhus et al., 2003), response style analysis (Baumgartner & Steenkamp, 2001), and student survey effort measures such as item non-response (Zamarro et al., 2019).

Using these and other approaches, previous studies have documented differences in reporting behavior between different countries and world regions (e.g., He & van de Vijver, 2016; Vonkova, Papajoanu, & Stipek, 2018) and different groups of students within countries (e.g., Vonkova & Hrabak, 2015). One of the regions that draws the attention of researchers focusing on student reporting behavior is East Asia.

In this region, students tend to score high on achievement in the PISA studies but low on motivation (He & van de Vijver, 2016), a paradoxical finding that could be caused by the specific reporting behavior of East Asian students. The results of some previous studies using the OCT and response style (RS) analysis have indeed suggested the existence of specific patterns in reporting behavior among these countries/economies (e.g., He & van de Vijver, 2016; Vonkova, Papajoanu, & Stipek, 2018).

However, there is still limited knowledge concerning the differences in reporting behavior, as measured using multiple methodological approaches, between groups of East Asian students distinguished by the language they speak at home and by their immigrant background. In this study, we aim to address this gap.

Specifically, our research question is: What are the differences in reporting behavior measures (as identified using the overclaiming technique, response style analysis, and item non-response analysis) between students with different language and immigrant backgrounds in East Asian countries/economies? In the study, we use the PISA 2012 student questionnaire data. Our first analysis focuses on the East Asia region (Chinese Taipei, Hong Kong, Japan, Korea, Macao, Shanghai).

For each student, we computed the reporting behavior measures using three methodological approaches. First, we used the identification of response styles independent of item content (Buckley, 2009).

Using student responses to all 4-point Likert scale items in the student questionnaire, we computed (a) the acquiescence response style (ARS), the tendency to choose the strongly agree response category; (b) the disacquiescence response style (DARS), the tendency to choose the strongly disagree category; and (c) the extreme response style (ERS), the tendency to choose either extreme category (the sum of ARS and DARS). Using student responses to the five pairs of the most highly correlated 4-point Likert scale items in the student questionnaire, we further computed (d) noncontingent responding (NCR), the tendency to respond inconsistently to similar questions.
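For illustration, the following is a minimal sketch of how such content-independent indices could be computed from a student-by-item matrix of 4-point Likert responses. The data, column names, category coding, and the single item pair are hypothetical and do not reflect PISA's actual variable layout:

```python
import pandas as pd

# Hypothetical toy data: rows = students, columns = 4-point Likert items,
# coded 1 = "strongly disagree" ... 4 = "strongly agree" (coding assumed here).
likert = pd.DataFrame({
    "q1": [4, 2, 4, 1],
    "q2": [4, 3, 1, 1],
    "q3": [3, 2, 4, 2],
})

n_items = likert.shape[1]

# (a) ARS: share of "strongly agree" (category 4) responses across all items.
ars = (likert == 4).sum(axis=1) / n_items
# (b) DARS: share of "strongly disagree" (category 1) responses.
dars = (likert == 1).sum(axis=1) / n_items
# (c) ERS: share of either extreme category, i.e., the sum of ARS and DARS.
ers = ars + dars

# (d) NCR: mean absolute within-pair difference for highly correlated item
# pairs (one illustrative pair here; the study uses five such pairs).
pairs = [("q1", "q2")]
ncr = sum((likert[a] - likert[b]).abs() for a, b in pairs) / len(pairs)

print(pd.DataFrame({"ARS": ars, "DARS": dars, "ERS": ers, "NCR": ncr}))
```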

Second, we used the overclaiming technique (Vonkova, Papajoanu, & Stipek, 2018), which is based on students' reported familiarity with each item from a list, where some items are existing concepts from a particular field of knowledge and some are non-existing (foil) items. In PISA 2012, the field of knowledge the OCT was applied to was mathematics.

Using these data, we computed (a) the proportion of hits (PH; the proportion of real items claimed), (b) the proportion of false alarms (PFA; the proportion of non-existing items claimed), (c) the index of accuracy (IA = PH - PFA), and (d) the index of exaggeration (IE = (PH + PFA) / 2). Third, we computed a measure of student survey effort (Zamarro et al., 2019), questionnaire item non-response (NRS), defined as the percentage of all student questionnaire items to which a student did not respond.
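A sketch of these computations on hypothetical data follows; the item names, response coding, and missing-value convention are illustrative, not PISA's actual variables:

```python
import numpy as np
import pandas as pd

# Hypothetical familiarity responses: 1 = student claims to know the concept,
# 0 = does not. Real items are existing mathematics concepts; foils are not.
# (Item names are illustrative placeholders.)
real = pd.DataFrame({"polygon": [1, 1, 0], "exponential_function": [1, 0, 1]})
foil = pd.DataFrame({"foil_concept_1": [1, 0, 0], "foil_concept_2": [0, 0, 0]})

# (a) PH: proportion of real items claimed as known.
ph = real.mean(axis=1)
# (b) PFA: proportion of non-existing items claimed as known.
pfa = foil.mean(axis=1)
# (c) IA = PH - PFA: familiarity corrected for overclaiming.
ia = ph - pfa
# (d) IE = (PH + PFA) / 2: overall tendency to claim familiarity.
ie = (ph + pfa) / 2

# Survey effort: NRS as the percentage of questionnaire items left
# unanswered (NaN marks a skipped item in this toy example).
questionnaire = pd.DataFrame({"q1": [4, np.nan, 2], "q2": [np.nan, np.nan, 3]})
nrs = questionnaire.isna().mean(axis=1) * 100

print(pd.DataFrame({"PH": ph, "PFA": pfa, "IA": ia, "IE": ie, "NRS": nrs}))
```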

As for student background variables, we used information concerning the language of the test administered to the student, the language the student speaks at home, the student's country of birth, and the countries of birth of the student's parents. The results of our first analysis show that there are differences in reporting behavior among East Asian countries/economies.

We also show that student background (country of birth, parents' countries of birth, language spoken at home) has a significant effect on reporting behavior. The analysis revealed, for example, that students in East Asian countries with an immigrant background (i.e., students who were born, or whose parents were born, outside the country of the test), on average, differ in the values of the reporting behavior measures from those without an immigrant background.

Also, there are, on average, differences in reporting behavior between students living outside East Asian countries who speak an East Asian language at home or were born in East Asia (i.e., who have an East Asian background) and students living in East Asian countries. These differences in reporting behavior between individual East Asian countries, and between groups of students within these countries (based on their immigrant and language background), merit further attention, as they might hinder the accuracy of comparisons based on student self-report data.

Further analyses should include more student characteristics (e.g., demographic, socioeconomic) and examine whether these might serve as moderating variables. A closer examination of other world regions with respect to differences in reporting behavior between groups of students with different immigrant, language, and socioeconomic backgrounds might also bring novel insights and enhance our understanding of how reporting behavior affects the results of self-report questionnaire surveys.