Abstract
Many examinations, including competitive and entrance examinations that students apply for, are conducted every year. In most cases these examinations consist of objective, multiple-choice questions; such tests are administered and scored on a computer, so their evaluation is straightforward. Because these systems handle multiple-choice questions only, however, there is still no facility for answering and evaluating descriptive questions. Automating the assessment of descriptive responses so that students' answer sheets can be evaluated effectively would be very helpful for academic institutions. This study proposes a new method for evaluating students' short, descriptive answers using natural language processing (NLP) algorithms. In this system, a staff member prepares a model answer sheet and a keyword dataset for the examination; these datasets are kept in data storage, and students enter their answers on the examination page. The system then computes results automatically using NLP algorithms, applying preprocessing techniques to the students' responses before the assessment.
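The abstract does not give implementation details, but the workflow it describes (preprocess the student's response, then score it against a staff-defined keyword dataset) can be illustrated with a minimal sketch. The stop-word list, the `keyword_score` function, and the example keywords below are all hypothetical stand-ins, not the authors' actual method:

```python
import re

# Hypothetical, abbreviated stop-word list; a real system would use a fuller set.
STOP_WORDS = {"the", "a", "an", "is", "are", "of", "to", "and", "in", "that", "it"}

def preprocess(text):
    """Lowercase, strip punctuation, tokenize, and drop stop words."""
    tokens = re.findall(r"[a-z0-9]+", text.lower())
    return [t for t in tokens if t not in STOP_WORDS]

def keyword_score(student_answer, keywords):
    """Fraction of staff-defined keywords found in the preprocessed answer."""
    answer_terms = set(preprocess(student_answer))
    matched = [k for k in keywords if k.lower() in answer_terms]
    return len(matched) / len(keywords) if keywords else 0.0

# Example: a staff keyword set for a question on photosynthesis.
keywords = ["chlorophyll", "sunlight", "glucose", "oxygen"]
answer = "Plants use sunlight and chlorophyll to produce glucose."
print(round(keyword_score(answer, keywords), 2))  # 0.75
```

A production system would likely extend this with stemming or lemmatization and semantic similarity measures, so that paraphrased answers are not penalized for missing exact keyword matches.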
© 2021 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.
Cite this paper
Rambola, R.K., Bansal, A., Savaliya, P., Sharma, V., Joshi, S. (2021). Development of Novel Evaluating Practices for Subjective Answers Using Natural Language Processing. In: Singh Pundir, A.K., Yadav, A., Das, S. (eds) Recent Trends in Communication and Intelligent Systems. Algorithms for Intelligent Systems. Springer, Singapore. https://doi.org/10.1007/978-981-16-0167-5_21
Publisher Name: Springer, Singapore
Print ISBN: 978-981-16-0166-8
Online ISBN: 978-981-16-0167-5
eBook Packages: Intelligent Technologies and Robotics (R0)