Abstract
Cognitive flexibility is the ability to evaluate events from a broad perspective rather than rigidly. The adaptation of the Cognitive Flexibility Inventory (CFI) to different cultures and its growing citation count in recent years indicate that the instrument is current and widely used, so investigating its psychometric properties with different techniques is both necessary and important. This study aimed to rescale the CFI items and model parameters according to item response theory (IRT) and to inspect whether CFI items function differently across gender. Research data were collected from 750 undergraduate students (564 female, 186 male) via Google Forms. To decide on the best model for the cognitive flexibility construct, bifactor, multidimensional, and unidimensional IRT models were compared using item-fit and global model-data fit statistics, and the graded response model was used to calibrate the CFI items. The IRT analyses indicated that the bifactor model fit the CFI best. Differential item functioning across gender was then inspected with several methods, and expert opinions were obtained for items flagged as potentially biased. The experts reached consensus only on the conclusion that item 12 showed gender bias. Gender bias was present under the 5-point Likert-type scaling; a new adaptation using a 7-point Likert type should be implemented and gender bias re-examined.
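The graded response model used to calibrate the CFI items treats each Likert item as a set of ordered cumulative curves. The sketch below, a minimal illustration rather than the study's calibration, computes category response probabilities for one hypothetical 5-point item; the discrimination and threshold values are made up for the example, not estimates from the CFI data.

```python
import math

def grm_category_probs(theta, a, b):
    """Graded response model: probabilities of each response category.

    theta: latent trait level of the respondent
    a: item discrimination parameter
    b: ordered list of K-1 category thresholds for a K-category item
    """
    # Cumulative probability of responding in category k or higher,
    # bounded by 1 (lowest category or higher) and 0 (above the highest)
    cum = [1.0]
    cum += [1.0 / (1.0 + math.exp(-a * (theta - bk))) for bk in b]
    cum += [0.0]
    # Each category probability is the difference of adjacent cumulative curves
    return [cum[k] - cum[k + 1] for k in range(len(b) + 1)]

# Illustrative (hypothetical) parameters for one 5-point Likert item
probs = grm_category_probs(theta=0.0, a=1.8, b=[-1.5, -0.5, 0.4, 1.3])
```

Because the thresholds are ordered, the cumulative curves are nested and every category probability is positive; the five probabilities sum to 1 for any trait level.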
Data Availability
The data, supplementary, and appendix files of this study are openly available on the Open Science Framework page (https://osf.io/ehqnj/files/).
Funding
The authors did not receive a specific grant for the submitted work from any funding agency in the commercial or not-for-profit sectors.
Contributions
Volkan Avşar: introduction, data collection, ethics approval, method, results, discussion, limitations. Fulya Barış Pekmezci: study design, method, data analysis, results, discussion, limitations.
Ethics declarations
Ethics Approval
The study design complies with the ethical standards of the Declaration of Helsinki. Ethics approval was obtained from the Ethics Committee of the Social Sciences and Humanities at Recep Tayyip Erdoğan University (letter dated 08.03.2022, numbered 2022/39).
Conflict of Interest
The authors declare no competing interests.
About this article
Cite this article
Avşar, V., Pekmezci, F.B. Rescaling of Cognitive Flexibility Inventory by Criticism of Turkish Adaptation Form. J Cogn Ther 16, 682–709 (2023). https://doi.org/10.1007/s41811-023-00188-8