Abstract
Nonparametric item response models provide a flexible framework for psychological and educational measurement. Douglas (Psychometrika 66(4):531–540, 2001) established asymptotic identifiability for a class of models with nonparametric response functions for long assessments. Nevertheless, the model class examined in Douglas (2001) excludes several popular parametric item response models. This limitation can hinder applications in which nonparametric and parametric models are compared, such as evaluating model goodness of fit. To address this issue, we consider an extended nonparametric model class that encompasses most parametric models and establish its asymptotic identifiability. The results bridge parametric and nonparametric item response models and provide a solid theoretical foundation for applying nonparametric item response models to assessments with many items.
References
Chen, Y., Li, X., Liu, J., & Ying, Z. (2021). Item response theory—a statistical framework for educational and psychological measurement. arXiv preprint arXiv:2108.08604.
Douglas, J. (1997). Joint consistency of nonparametric item characteristic curve and ability estimation. Psychometrika, 62, 7–28.
Douglas, J. A. (2001). Asymptotic identifiability of nonparametric item response models. Psychometrika, 66(4), 531–540.
Douglas, J., & Cohen, A. (2001). Nonparametric item response function estimation for assessing parametric model fit. Applied Psychological Measurement, 25(3), 234–243.
Falk, C. F., & Cai, L. (2016). Maximum marginal likelihood estimation of a monotonic polynomial generalized partial credit model with applications to multiple group analysis. Psychometrika, 81, 434–460.
Johnson, M. S. (2007). Modeling dichotomous item responses with free-knot splines. Computational Statistics & Data Analysis, 51(9), 4178–4192.
Lee, Y.-S., Wollack, J. A., & Douglas, J. (2009). On the use of nonparametric item characteristic curve estimation techniques for checking parametric model fit. Educational and Psychological Measurement, 69(2), 181–197.
Mikhailov, V. G. (1994). On a refinement of the central limit theorem for sums of independent random indicators. Theory of Probability & Its Applications, 38(3), 479–489.
Peress, M. (2012). Identification of a semiparametric item response model. Psychometrika, 77, 223–243.
Ramsay, J. O. (1991). Kernel smoothing approaches to nonparametric item characteristic curve estimation. Psychometrika, 56(4), 611–630.
Ramsay, J. O., & Abrahamowicz, M. (1989). Binomial regression with monotone splines: A psychometric application. Journal of the American Statistical Association, 84(408), 906–915.
Ramsay, J. O., & Winsberg, S. (1991). Maximum marginal likelihood estimation for semiparametric item analysis. Psychometrika, 56(3), 365–379.
Sijtsma, K. (1998). Methodology review: Nonparametric IRT approaches to the analysis of dichotomous item scores. Applied Psychological Measurement, 22(1), 3–31.
Sijtsma, K., & Molenaar, I. W. (2002). Introduction to nonparametric item response theory. SAGE Publications, Inc.
Van der Linden, W. J. (2018). Handbook of item response theory. CRC Press.
Winsberg, S., Thissen, D., & Wainer, H. (1984). Fitting item characteristic curves with spline functions. ETS Research Report Series, 1984(2), i–14.
Acknowledgements
The author would like to thank Editor-in-Chief Dr. Sandip Sinharay, an Associate Editor, and a referee for their valuable comments and suggestions.
Appendix
We present all the lemmas in Section A and provide their proofs in Section B. The proofs of propositions are provided in Section C.
A. Lemmas
Lemma 1
Consider an integer k such that \(\theta _k\in (\alpha ,\beta )\). There exist constants \(\tilde{C}_{\alpha \beta }>0\) and \(N_{\alpha \beta }\), independent of (n, i), such that when \(n \geqslant N_{\alpha \beta }\),
Lemma 2
Under the conditions of Theorem 2, there exist constants \(\tilde{C}_{\alpha \beta ,1}\) and \(\tilde{C}_{\alpha \beta ,2}\), independent of (n, i), such that
Lemma 3
(Hoeffding inequality for bounded variables) For any (n, i) and \(m>0\),
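The displayed inequality of Lemma 3 was not recovered in this version. As a sketch, the classical Hoeffding bound for an average \(\bar{Y}\) of n independent [0, 1]-valued variables states \(P(|\bar{Y}-E\bar{Y}|\geqslant t)\leqslant 2e^{-2nt^2}\); the small simulation below (parameter values n, p, t chosen only for illustration) checks that the empirical exceedance frequency falls below this bound.

```python
import math
import random

# Classical Hoeffding bound for [0, 1]-valued variables (illustrative form;
# the exact displayed instance in the paper was not recovered from the source):
#   P(|Ybar - E[Ybar]| >= t) <= 2 * exp(-2 * n * t**2)
random.seed(0)
n, p, t = 500, 0.3, 0.05   # sample size, Bernoulli success probability, deviation
trials = 2000

exceed = 0
for _ in range(trials):
    ybar = sum(random.random() < p for _ in range(n)) / n
    if abs(ybar - p) >= t:
        exceed += 1

empirical = exceed / trials
bound = 2 * math.exp(-2 * n * t ** 2)
print(empirical, bound)  # the empirical frequency stays below the bound
```
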
B. Proofs of Lemmas
B1. Proof of Lemma 1
Let \(\mu _{\theta }=\sum _{j\ne i} P_{n,j}(\theta )\) and \(\sigma ^2_{\theta }=\sum _{j\ne i} P_{n,j}(\theta )(1-P_{n,j}(\theta ))\), which are the mean and variance of \((n-1)\bar{Y}_{n,-i}\) conditional on \(\Theta =\theta \), respectively. By applying a bound on the normal approximation for the distribution of a sum of independent Bernoulli variables (Mikhailov, 1994), we obtain
for some universal constant c.
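The displayed inequality preceding this remark was not recovered in this version. Mikhailov's (1994) refinement is of Berry–Esseen type; a sketch of the standard form, written in the notation \(\mu_\theta\), \(\sigma_\theta\) defined above (the exact constant and formulation in the paper may differ), is:

```latex
% Berry--Esseen-type bound for a sum of independent Bernoulli variables
% (sketch; the exact displayed inequality was not recovered from the source).
\sup_{x \in \mathbb{R}}
\left| P\!\left( (n-1)\bar{Y}_{n,-i} \leqslant \mu_\theta + x\,\sigma_\theta
       \,\middle|\, \Theta = \theta \right) - \Phi(x) \right|
\leqslant \frac{c}{\sigma_\theta}
```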
Define the intervals \(I_{n,1/2}=(\theta _k-n^{-1/2}, \theta _k+n^{-1/2})\) and \(\tilde{I}_{n,1/2}=I_{n,1/2}\cap (\alpha /2,(1+\beta )/2)\). For \(\theta \in \tilde{I}_{n,1/2}\) and \(\theta _k \in (\alpha ,\beta )\),
where in the second inequality, \({M}_{\alpha \beta ,2}\) is a constant independent of (n, i) by Condition 3. Moreover, by Condition 4, there exist positive constants \(L_{\alpha \beta } \) and \(U_{\alpha \beta } \), independent of (n, i), such that
Therefore, by (12), (13), and (14), for \(\theta \in \tilde{I}_{n,1/2}\),
where \(\tilde{C}_{\alpha \beta }>0\) is a constant independent of (n, i). Therefore,
Since \(\theta _k\in (\alpha ,\beta )\), there exists \(N_{\alpha \beta }\), independent of i, such that \(\tilde{I}_{n,1/2}=I_{n,1/2}\) when \(n\geqslant N_{\alpha \beta }\). Then by (15),
B2. Proof of Lemma 2
To prove Lemma 2, we note that
We next derive an upper bound of \(P\left( 0<\Theta <\delta \mid \mathcal {E}_{n,k}\right) \); an analogous upper bound for \(P\left( 1-\delta<\Theta <1 \mid \mathcal {E}_{n,k}\right) \) follows from the same argument.
By \(P(0<\Theta<\delta \mid \mathcal {E}_{n,k})= P(0<\Theta <\delta ,\ \mathcal {E}_{n,k})/P(\mathcal {E}_{n,k})\), and the lower bound of \(P(\mathcal {E}_{n,k})\) in Lemma 1, it suffices to derive an upper bound of \(P(0<\Theta <\delta ,\ \mathcal {E}_{n,k})\) below. In particular,
By \(\bar{P}_{n,-i}(\theta _k)=k/(n-1)\),
where we define \(\bar{\kappa }_{-i}=\sum _{j\ne i} \kappa _j/(n-1)\). By \(0<\alpha /2<\alpha<\theta _k<\beta \) and Conditions 3 and 4, we have \(\bar{P}_{n,-i}(\theta _k) \geqslant \bar{P}_{n,-i}(\alpha )\) and \(\bar{P}_{n,-i}(\alpha /2)\geqslant \bar{\kappa }_{-i}.\) Therefore,
where the second equality is obtained by the intermediate value theorem with \(\tilde{\alpha }\in (\alpha /2,\alpha )\), and the last inequality is obtained by Condition 3 with \(\tilde{m}_{\alpha }\) a constant independent of (n, i). Let \(\tilde{C}_{\alpha } = \tilde{m}_{\alpha } \alpha /2\). By Condition 4, there exists \(\delta _{\alpha }>0\) such that for \(\theta <\delta \leqslant \delta _{\alpha }\),
Combining (17), (18), and (19),
where the last inequality follows by Lemma 3. By the above inequality and (16),
where the second inequality is obtained by Lemma 1. The same argument yields the corresponding upper bound for \(P (1-\delta<\Theta <1 \mid \mathcal {E}_{n,k} )\). Lemma 2 is proved.
C. Proofs of Propositions
C1. Proof of Proposition 1
Let \(x=\Phi ^{-1}(\theta )\). We can equivalently write \(P'_{n,i}(\theta )=h_{n,i}(x)\), where
Note that \(\theta \downarrow 0\) and \(\theta \uparrow 1\) correspond to \(x \rightarrow -\infty \) and \(x\rightarrow +\infty \), respectively. Let \(p_{+}=\lim _{\theta \rightarrow 1}P_{n,i}'(\theta )=\lim _{x\rightarrow +\infty } h_{n,i}(x)\) and \(p_{-}=\lim _{\theta \rightarrow 0}P_{n,i}'(\theta )=\lim _{x\rightarrow -\infty } h_{n,i}(x)\). As \(h_{n,i}(x)=0\) when \(a_{n,i}=0\), it suffices to consider \(a_{n,i}\ne 0\) below. In particular,
This suggests that \(P'_{n,i}(\theta )\) cannot be uniformly bounded on (0, 1) when \(a_{n,i}^2\ne 1\) or \( b_{n,i}\ne 0. \)
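As a numerical illustration, assume the two-parameter normal-ogive parameterization \(P_{n,i}(\theta )=\Phi (a_{n,i}(\Phi ^{-1}(\theta )-b_{n,i}))\) (an assumption here, since the displayed form of \(h_{n,i}\) was not recovered from the source); the change of variables then gives \(h_{n,i}(x)=a\,\phi (a(x-b))/\phi (x)\). The sketch below checks that \(h\) is identically 1 when \(a=1, b=0\), and diverges in a tail for the sample values \(a=1/2\) and \(b=1\):

```python
import math

def phi(x):
    """Standard normal density."""
    return math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)

def h(x, a, b):
    """P'(theta) expressed in x = Phi^{-1}(theta), under the ASSUMED
    normal-ogive form P(theta) = Phi(a * (x - b))."""
    return a * phi(a * (x - b)) / phi(x)

# a = 1, b = 0: h is identically 1, so P'(theta) is uniformly bounded.
print(h(0.0, 1.0, 0.0), h(5.0, 1.0, 0.0))

# a = 1/2, b = 0: h(x) = 0.5 * exp(3 * x**2 / 8) -> infinity as |x| grows,
# so P' is unbounded near the boundary of (0, 1).
print(h(10.0, 0.5, 0.0))

# a = 1, b = 1: h(x) = exp(x - 1/2) -> infinity as x -> +infinity.
print(h(10.0, 1.0, 1.0))
```
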
On the other hand, when \(\theta \in [\alpha , \beta ]\), \(x\in [\Phi ^{-1}(\alpha ), \Phi ^{-1}(\beta )]\) with the two end points bounded away from \(-\infty \) and \(+\infty \). Therefore, when \(a_{n,i}\in [1/C, C]\) and \(|b_{n,i}|\in [1/C', C']\), there exist \(0< m_{\alpha \beta }< M_{\alpha \beta } <\infty \) such that \(P'_{n,i}(\theta ) = h_{n,i}(x) \in (m_{\alpha \beta }, M_{\alpha \beta })\). Thus Condition 3 is satisfied.
We next prove that Condition 4 is also satisfied. When \(\epsilon \in (0,1/2)\), \(\Phi ^{-1}(\epsilon )<0\), so we set \(l_{\epsilon }=\Phi \left[ C_a\Phi ^{-1}(\epsilon ) - C_b\right] .\) When \(\theta \in [0,l_{\epsilon }]\),
When \(\epsilon \in (1/2,1)\), \(\Phi ^{-1}(\epsilon )>0\), and we set \(l_{\epsilon }=\Phi \left[ C_a^{-1}\Phi ^{-1}(\epsilon ) - C_b\right] .\) Then \(P_{n,i}(\theta )\leqslant \epsilon \) follows as in the case above. By a symmetric argument, we can construct \(u_{\epsilon }\) so that \(1-P_{n,i}(\theta )\leqslant \epsilon \) for \(\theta \in [u_{\epsilon },1]\). Hence, Condition 4 is satisfied.
C2. Proof of Proposition 2
Let \(x=\Phi ^{-1}(\theta )\). We can equivalently write \(P'_{n,i}(\theta )=h_{n,i}(x)\), where
which is obtained by
Note that \(\theta \downarrow 0\) and \(\theta \uparrow 1\) correspond to \(x \rightarrow -\infty \) and \(x\rightarrow +\infty \), respectively. When \(a_{n,i}=0\), \(h_{n,i}(x)=0\). When \(a_{n,i}\ne 0\), both \(e^{a_{n,i}(x-b_{n,i})}+e^{-a_{n,i}(x-b_{n,i})}\) and \(e^{x^2/2}\) tend to \(+\infty \) as \(|x|\rightarrow +\infty \). Moreover, since the quadratic exponent diverges faster than the linear one, \(e^{x^2/2}/[e^{a_{n,i}(x-b_{n,i})}+e^{-a_{n,i}(x-b_{n,i})}]\rightarrow +\infty \). Thus, \(h_{n,i}(x)\rightarrow +\infty \).
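The divergence claim can be checked numerically. The sketch below evaluates the ratio \(e^{x^2/2}/[e^{a(x-b)}+e^{-a(x-b)}]\) for the illustrative values \(a=2\), \(b=0\) (chosen only for illustration; any fixed \(a\ne 0\), \(b\) behaves the same way):

```python
import math

def ratio(x, a, b):
    """e^{x^2/2} / (e^{a(x-b)} + e^{-a(x-b)}): the quadratic exponent in the
    numerator dominates the linear exponents in the denominator."""
    u = a * (x - b)
    return math.exp(0.5 * x * x) / (math.exp(u) + math.exp(-u))

# The ratio grows without bound as |x| increases, in either direction.
for x in (2.0, 5.0, 10.0, -10.0):
    print(x, ratio(x, 2.0, 0.0))
```
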
On the other hand, when \(\theta \in [\alpha , \beta ]\), \(x\in [\Phi ^{-1}(\alpha ), \Phi ^{-1}(\beta )]\) with the two end points bounded away from \(-\infty \) and \(+\infty \). Therefore, under the conditions in (ii) of Proposition 2, there exist \(0< m_{\alpha \beta }< M_{\alpha \beta } <\infty \) such that \(P'_{n,i}(\theta ) = h_{n,i}(x) \in (m_{\alpha \beta }, M_{\alpha \beta })\). Thus, Condition 3 is satisfied.
We next prove that Condition 4 is also satisfied. When \(\epsilon \) is small enough that \(g^{-1}(\epsilon /C_{c,d})<0\), we set \(l_{\epsilon }=\Phi \left[ C_ag^{-1}(\epsilon /C_{c,d}) - C_b\right] .\) Then for \(\theta \in [0,l_{\epsilon }]\),
where inequality (i1) is obtained by \(a_{n,i}C_a\geqslant 1 > 0\) and \(|b_{n,i}|-C_b\leqslant 0\). When \(\epsilon \) is large enough that \(g^{-1}(\epsilon /C_{c,d})>0\), we set \(l_{\epsilon }=\Phi \left[ C_a^{-1}g^{-1}(\epsilon /C_{c,d}) - C_b\right] \!,\) and \(P_{n,i}(\theta )-c_{n,i}\leqslant \epsilon \) follows similarly. By a symmetric argument, we can construct \(u_{\epsilon }\) so that \(d_{n,i}-P_{n,i}(\theta ) \leqslant \epsilon \) for \(\theta \in [u_{\epsilon },1]\). Hence, Condition 4 is satisfied.
He, Y. Extended Asymptotic Identifiability of Nonparametric Item Response Models. Psychometrika (2024). https://doi.org/10.1007/s11336-024-09972-7