
The impact of linguistic similarity on cross-cultural comparability of students’ perceptions of teaching quality

Educational Assessment, Evaluation and Accountability

Abstract

Valid cross-country comparisons of student learning and of pivotal factors contributing to it, such as teaching quality, offer the possibility of learning from outstandingly effective educational systems across the world and of improving learning in classrooms by providing policy-relevant information. Yet it often remains unclear whether the instruments used in international large-scale assessments work similarly across different cultural and linguistic groups and can therefore be used to compare them. Using PISA 2012 data, we investigated the comparability of three teaching quality dimensions, namely student support, classroom management, and cognitive activation, with a recently developed psychometric approach, the alignment method. Focusing on 15 countries grouped into five linguistic clusters, we then assessed the impact of linguistic similarity on data comparability. Our main findings are that (1) the comparability of teaching quality measures is limited when linguistically diverse countries are compared; (2) the level of comparability varies across dimensions; and (3) linguistic similarity considerably enhances the degree of comparability, except across the Chinese-speaking countries. Our study illustrates new and more flexible ways to test for data comparability and underlines the importance of considering cultural and linguistic differences when comparing teaching-related measures across groups. We discuss possible sources of limited data comparability and implications for comparative educational research.
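The alignment approach referred to in the abstract (Asparouhov & Muthén 2014) does not impose exact equality of item parameters across groups; instead, it starts from the configural model and searches for group factor means and variances under which the group-specific loadings and intercepts become as similar as possible. The following Python sketch is a minimal illustration of that optimization step under a simplified identification (reference group mean fixed at 0, variance at 1); it is not the Mplus procedure used in the article, and all loadings, intercepts, and sample sizes are invented for demonstration.

```python
# Minimal, illustrative sketch of the alignment optimization
# (Asparouhov & Muthén, 2014). Not the Mplus implementation used in the
# article; all numeric inputs below are invented for demonstration only.

import numpy as np
from scipy.optimize import minimize

# Configural estimates for P items in G groups (rows = groups, cols = items),
# e.g. group-wise CFA loadings/intercepts for one teaching quality dimension.
loadings = np.array([[0.80, 0.75, 0.90, 0.70],
                     [0.85, 0.70, 0.95, 0.65],
                     [0.78, 0.80, 0.88, 0.72]])
intercepts = np.array([[2.10, 2.40, 1.90, 2.60],
                       [2.30, 2.35, 2.00, 2.50],
                       [2.05, 2.45, 1.85, 2.65]])
group_n = np.array([1500, 1200, 1800])        # group sample sizes (invented)
G, P = loadings.shape


def clf(x, eps=0.01):
    """Component loss function f(x) = sqrt(sqrt(x^2 + eps))."""
    return np.sqrt(np.sqrt(x ** 2 + eps))


def total_loss(params):
    # params hold alpha_2..alpha_G and log(psi_2)..log(psi_G);
    # group 1 is fixed at alpha = 0, psi = 1 for identification (simplified).
    alpha = np.concatenate(([0.0], params[:G - 1]))
    psi = np.concatenate(([1.0], np.exp(params[G - 1:])))
    # Aligned parameters: lambda' = lambda / sqrt(psi),
    #                     nu'     = nu - alpha * lambda / sqrt(psi)
    lam_aligned = loadings / np.sqrt(psi)[:, None]
    nu_aligned = intercepts - alpha[:, None] * lam_aligned
    # Sum sample-size-weighted component losses over all item-wise group pairs.
    loss = 0.0
    for g1 in range(G):
        for g2 in range(g1 + 1, G):
            w = np.sqrt(group_n[g1] * group_n[g2])
            loss += w * clf(lam_aligned[g1] - lam_aligned[g2]).sum()
            loss += w * clf(nu_aligned[g1] - nu_aligned[g2]).sum()
    return loss


result = minimize(total_loss, np.zeros(2 * (G - 1)), method="Nelder-Mead")
alpha_hat = np.concatenate(([0.0], result.x[:G - 1]))
psi_hat = np.concatenate(([1.0], np.exp(result.x[G - 1:])))
print("Estimated factor means:    ", np.round(alpha_hat, 3))
print("Estimated factor variances:", np.round(psi_hat, 3))
```

In the full procedure, this optimization is followed by item-by-item checks of which groups share approximately invariant parameters; diagnostics of that kind underlie the comparability judgments discussed in the article.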


Notes

  1. Spain was not chosen since there are five different language versions for the autonomous Spanish communities (OECD 2014).

  2. In PISA 2012, China was represented by separate educational systems. Hong Kong was not chosen for our study since the language of instruction is English for a major part of the student population (OECD 2014). Since Shanghai, Macao, and Taipei were treated as separate educational systems in PISA 2012, we treat them as “countries” in our study for simplicity, even though they should more precisely be referred to as cities/educational systems.

  3. Given the two-stage random sampling of schools (stage 1) and students within schools (stage 2), PISA data do not provide information at the classroom level (Scherer et al. 2016).

References

  • Asparouhov, T., & Muthén, B. O. (2014). Multiple-group factor analysis alignment. Structural Equation Modeling: A Multidisciplinary Journal, 21, 495–508.

  • Baumert, J., Kunter, M., Blum, W., Brunner, M., Voss, T., Jordan, A., Klusmann, U., Krauss, S., Neubrand, M., & Tsai, Y. M. (2010). Teachers’ mathematical knowledge, cognitive activation in the classroom, and student progress. American Educational Research Journal, 47, 133–180.

  • Brophy, J. E. (2000). Teaching. Brussels: International Academy of Education.

  • Brown, T. A. (2015). Confirmatory factor analysis for applied research (2nd ed.). New York: The Guilford Press.

  • Byrne, B. M., & van de Vijver, F. J. R. (2017). The maximum likelihood alignment approach to testing for approximate measurement invariance: a paradigmatic cross-cultural application. Psicothema, 29, 539–551.

  • Byrne, B. M., Shavelson, R. J., & Muthén, B. O. (1989). Testing for the equivalence of factor covariance and mean structures - the issue of partial measurement invariance. Psychological Bulletin, 105, 456–466.

  • Caro, D. H., Lenkeit, J., & Kyriakides, L. (2016). Teaching strategies and differential effectiveness across learning contexts: evidence from PISA 2012. Studies in Educational Evaluation, 49, 30–41.

  • Çetin, B. (2010). Cross-cultural structural parameter invariance on PISA 2006 student questionnaires. Eurasian Journal of Educational Research, 38, 71–89.

  • Cheung, F. M., van de Vijver, F. J. R., & Leong, F. T. L. (2011). Toward a new approach to the study of personality in culture. The American Psychologist, 66, 593–603.

  • Condon, L., Ferrando, P. J., & Demestre, J. (2006). A note on some item characteristics related to acquiescent responding. Personality & Individual Differences, 40, 403–407.

  • Davidov, E., Meuleman, B., Cieciuch, J., Schmidt, P., & Billiet, J. (2014). Measurement equivalence in cross-national research. Annual Review of Sociology, 40, 55–75.

  • De Roover, K., Vermunt, J. K., Timmerman, M. E., & Ceulemans, E. (2017). Mixture simultaneous factor analysis for capturing differences in latent variables between higher level units of multilevel data. Structural Equation Modeling: A Multidisciplinary Journal, 24, 506–523.

  • Deci, E. L., & Ryan, R. M. (1996). Intrinsic motivation and self-determination in human behavior. New York: Plenum.

  • Desa, D. (2014). Evaluating measurement invariance of TALIS 2013 complex scales: comparison between continuous and categorical multiple-group confirmatory factor analyses. OECD Education Working Papers: Vol. 103. Paris: OECD Publishing.

  • Dietrich, J., Dicke, A.-L., Kracke, B., & Noack, P. (2015). Teacher support and its influence on students’ intrinsic value and effort: dimensional comparison effects across subjects. Learning and Instruction, 39, 45–54.

  • Fauth, B., Decristan, J., Rieser, S., Klieme, E., & Büttner, G. (2014). Student ratings of teaching quality in primary school: dimensions and prediction of student outcomes. Learning and Instruction, 29, 1–9.

  • Fischer, H. E., Labudde, P., Neumann, K., & Viiri, J. (2014). Quality of instruction in physics: comparing Finland, Germany and Switzerland. Münster: Waxmann.

  • Flake, J. K., & McCoach, D. B. (2018). An investigation of the alignment method with polytomous indicators under conditions of partial measurement invariance. Structural Equation Modeling: A Multidisciplinary Journal, 25, 56–70.

  • Gläser-Zikuda, M., Harring, M., & Rohlfs, C. (Eds.). (2017). Handbuch Schulpädagogik [Handbook of school pedagogy]. Stuttgart: UTB; Waxmann.

  • Hattie, J. A. (2009). Visible learning: a synthesis of over 800 meta-analyses relating to achievement. New York: Routledge.

  • He, J., & Kubacka, K. (2015). Data comparability in the teaching and learning international survey (TALIS) 2008 and 2013. OECD Education Working Papers: Vol. 124. Paris: OECD Publishing.

  • He, J., & van de Vijver, F. J. R. (2016). Correcting for scale usage differences among Latin American countries, Portugal, and Spain in PISA. Revista Electrónica de Investigación y Evaluación Educativa, 22.

  • He, J., Buchholz, J., & Klieme, E. (2017). Effects of anchoring vignettes on comparability and predictive validity of student self-reports in 64 cultures. Journal of Cross-Cultural Psychology, 48, 319–334.

  • Klieme, E., Pauli, C., & Reusser, K. (2009). The Pythagoras study. In T. Janik & T. Seidel (Eds.), The power of video studies in investigating teaching and learning in the classroom (pp. 137–160). Münster: Waxmann.

  • Klusmann, U., Kunter, M., Trautwein, U., Lüdtke, O., & Baumert, J. (2008). Teachers’ occupational well-being and quality of instruction: the important role of self-regulatory patterns. Journal of Educational Psychology, 100, 702–715.

  • Kounin, J. S. (1970). Discipline and group management in classrooms. New York: Holt, Rinehart & Winston.

  • Kramsch, C. (1998). Language and culture. Oxford: Oxford University Press.

  • Kuger, S., Klieme, E., Lüdtke, O., Schiepe-Tiska, A., & Reiss, K. (2017). Mathematikunterricht und Schülerleistung in der Sekundarstufe: Zur Validität von Schülerbefragungen in Schulleistungsstudien [Mathematics teaching and student achievement in secondary education: on the validity of student surveys in school achievement studies]. Zeitschrift für Erziehungswissenschaft, 20, 61–98.

  • Kunter, M., Baumert, J., & Köller, O. (2007). Effective classroom management and the development of subject-related interest. Learning and Instruction, 17, 494–509.

  • Lafontaine, D., Dupont, V., Jaegers, D., & Schillings, P. (2018). Self-concept in reading: subcomponents, cross-cultural invariance and relationships with reading achievement in an international context (PIRLS 2011). Submitted to Studies in Educational Evaluation.

  • Lee, J. (2012). Conducting cognitive interviews in cross-national settings. Assessment, 21.

  • Lipowsky, F., Rakoczy, K., Pauli, C., Drollinger-Vetter, B., Klieme, E., & Reusser, K. (2009). Quality of geometry instruction and its short-term impact on students’ understanding of the Pythagorean Theorem. Learning and Instruction, 19, 527–537.

  • Lomazzi, V. (2018). Using alignment optimization to test the measurement invariance of gender role attitudes in 59 countries. Methods, Data, Analyses: A Journal for Quantitative Methods and Survey Methodology, 12, 77–103.

  • Lüdtke, O., Robitzsch, A., Trautwein, U., & Kunter, M. (2009). Assessing the impact of learning environments: how to use student ratings of classroom or school characteristics in multilevel modelling. Contemporary Educational Psychology, 34, 120–131.

  • Miller, K., Mont, D., Maitland, A., Altman, B., & Madans, J. (2011). Results of a cross-national structured cognitive interviewing protocol to test measures of disability. Quality & Quantity, 45, 801–815.

  • Munck, I., Barber, C., & Torney-Purta, J. (2017). Measurement invariance in comparing attitudes toward immigrants among youth across Europe in 1999 and 2009: the alignment method applied to IEA CIVED and ICCS. Sociological Methods & Research.

  • Muthén, B. O., & Asparouhov, T. (2012). Bayesian structural equation modeling: a more flexible representation of substantive theory. Psychological Methods, 17, 313–335.

  • Muthén, B. O., & Asparouhov, T. (2014). IRT studies of many groups: the alignment method. Frontiers in Psychology, 5, 978.

  • Muthén, B. O., & Asparouhov, T. (2018). Recent methods for the study of measurement invariance with many groups: alignment and random effects. Sociological Methods & Research, 47, 637–664.

  • Muthén, L. K., & Muthén, B. O. (1998–2012). Mplus user’s guide (7th ed.). Los Angeles, CA: Muthén & Muthén.

  • Nilsen, T., & Gustafsson, J.-E. (2016). Teacher quality, instructional quality and student outcomes: Relationships across countries, cohorts and time. IEA research for education, a series of in-depth analyses based on data of the International Association for the Evaluation of Educational Achievement (IEA). Cham: Springer International Publishing.

  • OECD. (2013). PISA 2012 results: ready to learn (volume III) – students’ engagement, drive and self-beliefs. Paris: OECD Publishing.

  • OECD. (2014). PISA 2012 technical report. Paris: OECD Publishing.

  • OECD. (2015). Codebook for PISA 2012 Main Study Student Questionnaire. Paris: OECD Publishing.

  • Pinger, P., Rakoczy, K., Besser, M., & Klieme, E. (2017). Interplay of formative assessment and instructional quality—interactive effects on students’ mathematics achievement. Learning Environments Research, 47, 133.

  • Praetorius, A.-K., Pauli, C., Reusser, K., Rakoczy, K., & Klieme, E. (2014). One lesson is all you need? Stability of instructional quality across lessons. Learning and Instruction, 31, 2–12.

  • Praetorius, A.-K, Klieme, E., Bell, C.A., Qi, Y., Witherspoon, W., & Opfer, D. (2018a). Country conceptualizations of teaching quality in TALIS Video: Identifying similarities and differences. Paper presentation at the annual meeting of the American Educational Research Association, New York, NY.

  • Praetorius, A.-K., Klieme, E., Herbert, B., & Pinger, P. (2018b). Generic dimensions of teaching quality: the German framework of Three Basic Dimensions. ZDM, 47, 97.

  • Rutkowski, L., & Svetina, D. (2014). Assessing the hypothesis of measurement invariance in the context of large-scale international surveys. Educational and Psychological Measurement, 74, 31–57.

  • Scherer, R., Nilsen, T., & Jansen, M. (2016). Evaluating individual students’ perceptions of instructional quality: an investigation of their factor structure, measurement invariance, and relations to educational outcomes. Frontiers in Psychology, 7, 110.

  • Schulz, W. (2003). Validating questionnaire constructs in international studies: two examples from PISA 2000. Paper presentation at the annual meeting of the American Educational Research Association, Chicago.

  • Schulz, W. (2005). Testing parameter invariance for questionnaire indices using confirmatory factor analysis and item response theory. Paper presentation at the annual meeting of the American Educational Research Association, San Francisco.

  • Seidel, T., & Shavelson, R. J. (2007). Teaching effectiveness research in the past decade: the role of theory and research design in disentangling meta-analysis results. Review of Educational Research, 77, 454–499.

  • Täht, K., & Must, O. (2013). Comparability of educational achievement and learning attitudes across nations. Educational Research and Evaluation, 19, 19–38.

  • van de Grift, W. J. C. M. (2014). Measuring teaching quality in several European countries. School Effectiveness and School Improvement, 25, 295–311.

  • van de Vijver, F. J. R. (2018a). Capturing bias in structural equation modeling. In E. Davidov, P. Schmidt, J. Billiet, & B. Meuleman (Eds.), Cross-cultural analysis: methods and applications. New York: Routledge.

  • van de Vijver, F. J. R. (2018b). Talk at the OECD-GESIS seminar: translating and adapting instruments in large-scale assessments. Paris.

  • van de Vijver, F. J. R., & He, J. (2016). Bias assessment and prevention in non-cognitive outcome measures in PISA questionnaires. In S. Kuger, E. Klieme, N. Jude, & D. Kaplan (Eds.), Methodology of educational measurement and assessment. Assessing contexts of learning: an international perspective. Cham: Springer International Publishing.

  • van de Vijver, F. J. R., & Leung, K. (1997). Methods and data analysis of comparative research. Thousand Oaks: Sage.

  • van de Vijver, F. J. R., & Tanzer, N. K. (2004). Bias and equivalence in cross-cultural assessment: an overview. Revue Européenne de Psychologie Appliquée, 54, 119–135.

  • Walberg, H. J., & Paik, S. J. (2000). Effective educational practices. Educational practices series. Brussels: International Academy of Education.

  • Willis, G. B., & Miller, K. (2011). Cross-cultural cognitive interviewing: seeking comparability and enhancing understanding. Field Methods, 23, 331–341.

  • Yi, H. Y., & Lee, Y. (2017). A latent profile analysis and structural equation modeling of the instructional quality of mathematics classrooms based on the PISA 2012 results of Korea and Singapore. Asia Pacific Education Review, 18, 23–39.

Author information

Corresponding author

Correspondence to Jessica Fischer.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Fischer, J., Praetorius, AK. & Klieme, E. The impact of linguistic similarity on cross-cultural comparability of students’ perceptions of teaching quality. Educ Asse Eval Acc 31, 201–220 (2019). https://doi.org/10.1007/s11092-019-09295-7

  • DOI: https://doi.org/10.1007/s11092-019-09295-7
