Research Article

INVESTIGATING THE WORDING EFFECT IN SCALES BASED ON DIFFERENT DIMENSION REDUCTION TECHNIQUES

Year 2022, Volume: 35 Issue: 1, 44 - 67, 28.04.2022
https://doi.org/10.19171/uefad.1033284

Abstract

This study examines the dimensionality of data obtained from a multidimensional scale in which the numbers of positively and negatively worded items are not balanced, using item response theory, DETECT, and factor analysis. For this purpose, a scale developed to measure parents' perceptions of their children's school was used. The study group consisted of 1,388 parents. The dimensionality of the resulting dataset was examined with item response theory, DETECT, and factor analyses. All three methods consistently showed that the negatively worded items formed a separate cluster. This finding indicates that the relations among the negatively worded items cannot be explained solely by the trait the scale is intended to measure; the direction of item wording also shapes those relations. Regardless of which dimension reduction technique was used, the results evidenced a strong wording effect in the dataset. Based on these results, recommendations are offered to researchers and practitioners on how to examine the dimensionality of data obtained from scales that include both positively and negatively worded items.
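
As a rough illustration of how such a three-pronged dimensionality check can be carried out, the R sketch below uses packages already cited in the references: mirt (Chalmers, 2012) for the multidimensional graded response models and sirt (Robitzsch, 2018) for confirmatory DETECT, with the psych package standing in for the factor analysis step. This is a minimal sketch, not the authors' actual code; the data file, the item names, and the positive/negative item partition are hypothetical placeholders.

library(mirt)    # multidimensional IRT (Chalmers, 2012)
library(sirt)    # DETECT analyses (Robitzsch, 2018)
library(psych)   # exploratory factor analysis (illustrative choice)

# Hypothetical input: one row per parent, one column per Likert item.
responses <- read.csv("parent_perceptions.csv")
neg_items <- c("item05", "item09", "item12")   # hypothetical reverse-worded items
item_cluster <- ifelse(names(responses) %in% neg_items, 2, 1)

# 1) Exploratory factor analysis: do the negative items load on their own factor?
#    (oblimin rotation also requires the GPArotation package)
efa <- fa(responses, nfactors = 2, rotate = "oblimin", fm = "ml")
print(efa$loadings, cutoff = 0.30)

# 2) Graded response models: does a two-dimensional solution fit better than one?
grm_1d <- mirt(responses, 1, itemtype = "graded", verbose = FALSE)
grm_2d <- mirt(responses, 2, itemtype = "graded", verbose = FALSE)
anova(grm_1d, grm_2d)   # lower AIC/BIC for the 2D model suggests multidimensionality

# 3) Confirmatory DETECT: test the positive/negative item partition directly,
#    conditioning on the total score.
detect_res <- conf.detect(data = responses,
                          score = rowSums(responses),
                          itemcluster = item_cluster)
detect_res   # DETECT values above roughly 1.0 are usually read as strong multidimensionality

Under this setup, a two-dimensional model that clearly outperforms the unidimensional one, a factor defined mainly by the negatively worded items, and a confirmatory DETECT value above about 1.0 would all converge on the kind of wording effect reported in the study.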

References

  • Ackerman, T. A. (1992). A didactic explanation of item bias, item impact, and item validity from a multidimensional perspective. Journal of Educational Measurement, 29, 67-91.
  • Ackerman, T. A., Gierl, M. A., & Walker, C. M. (2003). Using multidimensional item response theory to evaluate educational and psychological tests. Educational Measurement: Issues and Practice, 22, 37-53.
  • Baker, F. B. (2001). The basics of item response theory. ERIC Clearinghouse on Assessment and Evaluation.
  • Barnette, J. J. (2000). Effects of stem and Likert response option reversals on survey internal consistency: If you feel the need, there is a better alternative to using those negatively worded stems. Educational and Psychological Measurement, 60(3), 361-370.
  • Brown, T. A. (2006). Confirmatory factor analysis for applied research. The Guilford Press.
  • Büyüköztürk, Ş. (2011). Sosyal bilimler için veri analizi el kitabı. Pegem Akademi.
  • Chalmers, R. P. (2012). mirt: A multidimensional item response theory package for the R environment. Journal of Statistical Software, 48(6), 1-29.
  • Checa, I., & Espejo, B. (2018). Method effects associated with reversed items in the 29 items Spanish version of Ryff’s Well-Being Scales. Neuropsychiatry, 8(5), 1533-1540.
  • Chen, Y., Rendina-Gobioff, G., & Dedrick, R. F. (2007). Detecting effects of positively and negatively worded items on a self-concept scale for third and sixth grade elementary students. Paper presented at the 52nd Annual Meeting of the Florida Educational Research Association.
  • Curry, K. A., & Holter, A. (2019). The influence of parent social networks on parent perceptions and motivation for involvement. Urban Education, 54(4), 535-563.
  • Davison, M. (1977). On a metric, unidimensional unfolding model for attitudinal and developmental data. Psychometrika, 42, 523-548.
  • De Ayala, R. J. (1994). The influence of multidimensionality on the graded response model. Applied Psychological Measurement, 18(2), 155-170.
  • De Ayala, R. J. (2009). The theory and practice of item response theory. The Guilford Press.
  • DiStefano, C., & Motl, R. W. (2009). Personality correlates of method effects due to negatively worded items on the Rosenberg Self-Esteem scale. Personality and Individual Differences, 46, 309–313.
  • Embretson, S. E., & Reise, S. P. (2000). Item response theory for psychologists. Lawrence Erlbaum Associates, Inc.
  • Gable, R., Murphy, C. A., Hall, C., & Clark, A. E. (1986). The development of the pilot form of the Parent Attitudes toward School Effectiveness (PATSE) Questionnaire. Paper presented at the annual meeting of the American Educational Research Association, San Francisco.
  • Gu, H., Wen, Z., & Fan, X. (2015). The impact of wording effect on reliability and validity of the Core Self-Evaluation Scale (CSES): A bi-factor perspective. Personality and Individual Differences, 83, 142-147.
  • Ho, R. (2006). Handbook of univariate and multivariate data analysis and interpretation with SPSS. Taylor & Francis Group.
  • Hyland, P., Boduszek, D., Dhingra, K., Shevlin, M., & Egan, A. (2014). A bifactor approach to modelling the Rosenberg Self Esteem Scale. Personality and Individual Differences, 66, 188-192.
  • İlhan, M., & Güler, N. (2017). The number of response categories and the reverse scored item problem in Likert-type scales: A study with the Rasch model. Journal of Measurement and Evaluation in Education and Psychology, 8(3), 321-343.
  • Immekus, J., & Imbrie, P. K. (2008). Dimensionality assessment using the full information item bifactor analysis for graded response data: An illustration with the State Metacognitive Inventory. Educational and Psychological Measurement, 68(4), 695-709.
  • Kim, H. R. (1994). New techniques for the dimensionality assessment of standardized test data (Doctoral dissertation, University of Illinois at Urbana-Champaign).
  • Knight, R. G., Chisholm, B. J., Marsh, N. V., & Godfrey, H. P. D. (1988). Some normative, reliability and factor analytic data for the Revised UCLA Loneliness Scale. Journal of Clinical Psychology, 44(2), 203-206.
  • Kula-Kartal, S., & Kutlu, Ö. (2020). Analyzing the dimensionality of Academic Motivation Scale based on the item response theory models. Eurasian Journal of Educational Research, 86, 157–174.
  • Kula-Kartal, S., & Mor-Dirlik, E. (2021). Examining the dimensionality and monotonicity of an attitude dataset based on the item response theory models. International Journal of Assessment Tools in Education, 8(2), 296-309.
  • Kula-Kartal, S. (2021). Examining scale items in terms of method effects based on the bifactor item response theory model. Kastamonu Education Journal, 29(1), 201-209.
  • Li, Y., Jiao, H., & Lissitz, R. W. (2012). Applying multidimensional item response theory models in validating test dimensionality: An example of K–12 large scale science assessment. Journal of Applied Testing Technology, 13(2), 2-27.
  • Liu, J. (2007). Comparing multidimensional and unidimensional computer adaptive strategies in psychological and health assessment (Doctoral dissertation). Retrieved from http://d-scholarship.pitt.edu/.
  • Matlock, K. L., Turner, R. C., & Gitchel, W. D. (2018). A study of reverse worded matched item pairs using the generalized partial credit and nominal response models. Educational and Psychological Measurement, 78(1), 103-127.
  • Melnick, S. A., & Fiene, R. (1989). Parent attitudes toward school effectiveness in the Harrisburg City School District's Elementary Division: Final report. Pennsylvania State University, Capitol Campus, Middletown.
  • Merritt, S.M. (2012). The two-factor solution to Allen and Meyer’s (1990) affective commitment scale: Effects of negatively worded items. Journal of Business and Psychology, 27(4), 421-436.
  • Paek, I., & Cole, K. (2020). Using R for item response theory model applications. Routledge Taylor & Francis Group.
  • Paulhus, D. L. (1991). Measurement and control of response bias. In J. P. Robinson, P. R. Shaver, & L. S. Wrightsman (Eds.), Measures of personality and social psychological attitudes (Vol. 1, pp. 17-59). San Diego, CA: Academic Press.
  • Pilotte, W. J., & Gable, R. K. (1990). The impact of positive and negative item stems on the validity of a computer anxiety scale. Educational and Psychological Measurement, 50, 603-610.
  • Reckase, M. D. (2009). Multidimensional item response theory. Springer.
  • Reise, S. P., & Revicki, D. A. (2015). Handbook of item response theory modeling. Taylor & Francis Group.
  • Rennie, K. M. (1997). Exploratory and confirmatory rotation strategies in exploratory factor analysis. Paper presented at the annual meeting of the Southwest Educational Research Association.
  • Robitzsch, A. (2018). sirt: Supplementary item response theory models. R package version 2.4-20. https://CRAN.R-project.org/package=sirt
  • Roszkowski, M.J., & Soven, M. (2010). Shifting gears: Consequences of including two negatively worded items in the middle of a positively worded questionnaire. Assessment & Evaluation in Higher Education, 35(1), 113-130.
  • Roussos, L. A., & Özbek, Ö. Y. (2006). Formulation of the DETECT population parameter and evaluation of DETECT estimator bias. Journal of Educational Measurement, 43(3), 215–243.
  • Salazar, M. S. (2015). The dilemma of combining positive and negative items in scales. Psicothema, 27(2), 192-199.
  • Schriesheim, C. A., Eisenbach, R. J., & Hill, K. D. (1991). The effect of negation and polar opposite item reversals on questionnaire reliability and validity: An experimental investigation. Educational and Psychological Measurement, 51(1), 67-78.
  • Spector, P. E., Van Katwyk, P. T., Brannick, M. T., & Chen, P. Y. (1997). When two factors don’t reflect two constructs: How item characteristics can produce artifactual factors. Journal of Management, 23(5), 659-677.
  • Stevens, J. (2002). Applied multivariate statistics for the social sciences (4th ed.). Lawrence Erlbaum Associates.
  • Stout, W., Habing, B., Douglas, J., Kim, H. R., Roussos, L., & Zhang, J. (1996). Conditional covariance-based nonparametric multidimensionality assessment. Applied Psychological Measurement, 20(4), 331-354.
  • Stout, W., Nandakumar, R., & Habing, B. (1996). Analysis of latent dimensionality of dichotomously and polytomously scored test data. Behaviormetrika, 23(1), 37-65.
  • Suarez-Alvarez, J., Pedrosa, I., Lozano, L. M., Garcia-Cueto, E., Cuesta, M., & Muniz, J. (2018). Using reversed items in Likert scales: A questionable practice. Psicothema, 30(2), 149-158.
  • Şeker, H. (2011). Developing a questionnaire on attitude towards school. Learning Environments Research, 14(3), 241-261.
  • Tate, R. (2003). A comparison of selected empirical methods for assessing the structure of responses to test items. Applied Psychological Measurement, 27, 159–203.
  • Tavşancıl, E. (2010). Tutumların ölçülmesi ve SPSS ile veri analizi. Nobel Yayın Dağıtım.
  • Thissen, D., & Wainer, H. (2001). Test scoring. Lawrence Erlbaum Associates, Inc.
  • Tomas, J. M., & Oliver, A. (1999). Rosenberg's self‐esteem scale: Two factors or method effects. Structural Equation Modeling: A Multidisciplinary Journal, 6(1), 84-98.
  • Van Schuur, W. H., & Kiers, H. A. L. (1994). Why factor analysis often is the incorrect model for analyzing bipolar concepts, and what model to use instead. Applied Psychological Measurement, 18(2), 97-110.
  • Wang, J., Siegal, H. A., Falck, R. S., & Carlson, R. G. (2001). Factorial structure of Rosenberg's Self-Esteem Scale among crack-cocaine drug users. Structural Equation Modeling, 8(2), 275-286.
  • Weijters, B., & Baumgartner, H. (2012). Misresponse to reversed and negated items in surveys: A review. Journal of Marketing Research, 49(5), 737-747.
  • Wong, N., Rindfleisch, A., & Burroughs, J. E. (2003). Do reverse-worded items confound measures in cross-cultural consumer research? The case of the Material Values Scale. Journal of Consumer Research, 30(1), 72-91.
  • Zhang, B., Luo, J., Chen, Y., Roberts, B., & Drasgow, F. (2020). The road less traveled: A cross-cultural study of the negative wording factor in multidimensional scales. https://doi.org/10.31234/osf.io/2psyq
  • Zhang, J., & Stout, W. (1999). The theoretical DETECT index of dimensionality and its application to approximate simple structure. Psychometrika, 64(2), 213-249.

ÖLÇEKLERDE İFADE ETKİSİNİN FARKLI BOYUTLULUK ANALİZLERİYLE İNCELENMESİ


Abstract

The aim of this study is to examine the dimensionality of data obtained from a multidimensional scale in which the numbers of positively and negatively worded items are not balanced, using multidimensional item response theory, DETECT, and factor analysis. For this purpose, a scale developed to reveal parents' perceptions of their children's school was used. The study group included 1,388 parents. The dimensionality of the data obtained from the administration of the scale was examined with factor analysis, multidimensional item response theory, and DETECT analysis. Across these three dimensionality analyses, the negatively worded items consistently formed a separate dimension. Accordingly, it can be stated that the relations among the negatively worded items cannot be explained solely by the trait measured by the scale, and that the direction of item wording also affects those relations. The findings showed a strong wording effect that emerged in the dataset under every technique. Based on these findings, recommendations are offered to researchers and practitioners on examining the dimensionality of data obtained from scales that combine positively and negatively worded items.


Details

Primary Language Turkish
Subjects Other Fields of Education
Journal Section Articles
Authors

Seval Kula Kartal 0000-0002-3018-6972

Eren Can Aybek 0000-0003-3040-2337

Metin Yaşar 0000-0002-7854-1494

Publication Date April 28, 2022
Submission Date December 8, 2021
Published in Issue Year 2022 Volume: 35 Issue: 1

Cite

APA Kula Kartal, S., Aybek, E. C., & Yaşar, M. (2022). ÖLÇEKLERDE İFADE ETKİSİNİN FARKLI BOYUTLULUK ANALİZLERİYLE İNCELENMESİ. Journal of Uludag University Faculty of Education, 35(1), 44-67. https://doi.org/10.19171/uefad.1033284