The 2013 International Computer and Information Literacy Study (ICILS) showed that female students demonstrated higher achievement in computer and information literacy (CIL) than male students in 12 of the 14 countries considered, with an average difference of 19 scale points (one-fifth of a standard deviation) across those 12 countries. An analysis of differential item functioning indicated that female students generally performed relatively better on tasks involving communication, design, and creativity, while male students generally performed relatively better on more technical tasks and those concerned with security. Female students took a little longer to complete the test than male students; this may have contributed to their higher scores. While there were few differences between female and male students’ basic information and communications technologies (ICT) self-efficacy, on average, male students recorded higher specialized ICT self-efficacy than female students in all 14 countries, and the difference was moderate to large in 12 of the 14 countries. General ICT self-efficacy was positively associated with CIL achievement to a similar extent for female and male students in all 14 countries. Advanced ICT self-efficacy, however, was less strongly and less consistently related to CIL achievement.
As noted in Chap. 1, many large-scale assessments in a range of countries have reported that, on average, female students achieve higher scores than male students on computer, digital, or ICT literacy assessments (the terminology varies but the constructs are similar). These results differ from what might be expected, given the preponderance of males working in information technology and enrolled in computer science courses. They also differ from the self-reported competencies recorded in the early stages of the introduction of computer technology to schools (Cooper 2006; Volman and van Eck 2001). Punter et al. (2017) suggested that there has been a change in the relative performance of female and male students that has accompanied a broader societal shift in computer use, from technically oriented tasks toward applications for information management and communication that make use of the internet. They argued that the performance of female and male students on different types of task should be investigated. We begin this chapter with an overview of the gender differences reported in the ICILS 2013 international report (Fraillon et al. 2014), and then summarize some detailed analyses of differences between female and male students overall and on different types of task, as well as reported differences in self-efficacy.
3.2 Gender Differences in Overall Performance
As reported in the ICILS 2013 international report (Fraillon et al. 2014), the performance of female students was substantially higher than that of male students in 12 out of the 14 ICILS 2013 countries for which adequate data were collected (Table 3.1). The size of the difference ranged from small in the Czech Republic (12 scale points) to moderate in the Republic of Korea (38 scale points). In the remaining two countries (Thailand and Turkey; in both these countries achievement levels were very low), the differences were negligible.
3.3 Gender Differences in Specific Skills
The probability of responding correctly to an item is generally assumed to depend only on a student’s ability and not on any other characteristic of the student, such as gender. If an item is easier for a male student than for a female student of the same ability, the item shows differential item functioning (DIF) and will, in general, advantage male students. The DIF estimates sum to zero over all items; the sum over a particular group of items need not be zero, however, and can thus reveal that some types of items are easier for male students and others for female students, after taking their ability into account. Items that display large DIF are usually excluded from the measurement scale when ability estimates are calculated. It is not feasible to remove every item that shows any DIF, however, so most remaining items exhibit smaller levels of DIF. DIF values for female students were estimated for each of the items in the ICILS 2013 CIL assessment within each of the CIL strands, and the estimates were summed over each group of items (Table 3.2).
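The logic of summing DIF estimates by item group can be sketched as follows. The DIF values here are invented for illustration only (the actual ICILS estimates appear in Table 3.2), and the strand labels are placeholders rather than the study's exact strand names.

```python
# Hypothetical per-item DIF estimates for female students
# (positive = relatively easier for female students of a given ability;
# negative = relatively easier for male students). Invented numbers.
dif_by_strand = {
    "creating and transforming information": [0.15, 0.22, 0.08],
    "managing information":                  [0.05, -0.03, 0.01],
    "technical knowledge and safety":        [-0.18, -0.14, -0.16],
}

# Over the full item set, the DIF estimates sum to (approximately) zero...
total = sum(d for items in dif_by_strand.values() for d in items)

# ...but the sums within item groups need not be zero, revealing which
# kinds of item favour which group once ability is taken into account.
group_sums = {strand: sum(items) for strand, items in dif_by_strand.items()}

print(f"total DIF: {total:+.2f}")
for strand, s in group_sums.items():
    print(f"  {strand}: {s:+.2f}")
```

In this toy example the creation items favour female students (positive group sum) and the technical items favour male students (negative group sum), even though the test as a whole is balanced.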
On average, female students performed better than male students of the same ability when asked to create information and, to a lesser extent, when asked to transform information. Male students outperformed female students of the same ability on items that required knowledge about and understanding of computer use, and on items that concerned using information safely and securely.
These findings agree with those reported in Punter et al. (2017), who examined item bias using different methods; they concluded that overall, ICILS 2013 items showed little gender DIF.
The ICILS 2013 test consisted of three types of items: multiple-choice items, constructed-response items, and large tasks. The large tasks asked students to create an information product, such as a poster, presentation, or website. For instance, students might be asked to use a simple website builder to plan and create a webpage, or to use online database tools to select and adapt information in order to create an information sheet for their peers. DIF was also explored for these item types (Table 3.3). Large tasks were found to be relatively easier for female students, while constructed-response and, to a lesser extent, multiple-choice items were relatively easier for male students. This pattern held within each of the strands of CIL.
Individual assessment items that favored female students generally required skills involving communication, design, and creativity. In comparison, those items favoring male students generally required less creative skills, but more technical skills and greater knowledge of security issues, such as knowing the purpose of a captcha and recognizing spam emails (Table 3.4).
3.4 Gender Differences in CIL Self-efficacy
To examine self-efficacy in ICILS 2013, students were asked to report how well they could do each of the following general CIL skills:
Search for and find a file on a computer;
Edit digital photographs or other graphic images;
Create or edit documents (for example assignments for school);
Search for and find information needed on the internet;
Create a multimedia presentation (with sound, pictures, or video); and
Upload text, images, or video to an online profile.
In ICILS 2013, student responses to this set of items were combined into a self-efficacy scale for basic CIL skills. The scale was constructed to have a mean of 50 and a standard deviation of 10.
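The final step of this kind of scale construction can be illustrated with a simple linear rescaling. The sketch below is a simplification: the ICILS scales were derived from item response modelling with sampling weights, and only the last transformation to a mean of 50 and standard deviation of 10 is shown here, using invented raw scores.

```python
import statistics

def rescale(scores, target_mean=50.0, target_sd=10.0):
    """Linearly transform raw scores to a given mean and standard deviation.

    Simplified sketch: the actual ICILS scales come from IRT modelling with
    sampling weights; this shows only the final linear rescaling step.
    """
    m = statistics.mean(scores)
    sd = statistics.stdev(scores)
    return [target_mean + target_sd * (x - m) / sd for x in scores]

raw = [2, 4, 5, 7, 9, 12]  # hypothetical raw self-efficacy scores
scaled = rescale(raw)

print(round(statistics.mean(scaled), 6))   # 50.0
print(round(statistics.stdev(scaled), 6))  # 10.0
```

Because the transformation is linear, the ordering of students and all correlations with other variables are unchanged; only the reporting metric is fixed.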
Female students reported significantly higher levels of general self-efficacy, on average, than male students in six countries (Table 3.5). In Chile and the Republic of Korea, the differences were significant but small, while in the Russian Federation, Croatia, Australia, and Thailand, the differences were negligible (although statistically significant). In the remaining eight countries there were no significant gender differences.
Similarly, in ICILS 2013, students were also asked to rate the level of their skills for a set of specialized CIL skills, and a self-efficacy scale for specialized CIL skills was constructed (again with a mean of 50 and a standard deviation of 10). The specialized skills were:
Use software to find and get rid of viruses;
Create a database (for example, using [Microsoft Access®]);
Build or edit a webpage;
Change the settings on the computer to improve the way it operates or to fix problems;
Use a spreadsheet to do calculations, store data, or plot a graph;
Create a computer program or macro (for example, in [Basic, Visual Basic]); and
Set up a computer network.
In contrast to the findings for general CIL skills, on average, male students showed higher self-efficacy when rating their ability in specialized CIL skills than female students in all 14 countries (Table 3.6), and the gender differences were much larger. The size of this difference was large in Germany, Norway, the Slovak Republic, the Czech Republic, Poland, Slovenia, and Lithuania, and moderate in Croatia, Australia, Turkey, the Russian Federation, and the Republic of Korea. Only in Chile and Thailand were the differences rated as small.
In order to examine the association of students’ CIL with ICT self-efficacy beliefs for this report, we computed correlation coefficients for each ICILS country by gender for basic skills (Table 3.7) and for specialized skills (Table 3.8), and calculated Cohen’s d to provide an estimate of the strength of the association. Self-efficacy in basic skills was found to be strongly positively related to achievement for male students in six countries (Australia, Chile, Croatia, the Republic of Korea, the Slovak Republic, and Turkey) and for female students in four countries (the Republic of Korea, Lithuania, the Slovak Republic, and Turkey). In most other countries the association was found to be moderate, while the effect was small for female students in the Czech Republic and Germany. This finding is in contrast to previous studies that have suggested that self-efficacy is not related to performance in CIL (for example, Siddiq et al. 2016).
Self-efficacy in specialized skills, however, was less consistently and less strongly related to CIL achievement (Table 3.8). While a number of the correlations for both male and female students reached statistical significance, the relationship was only found to be of moderate strength for males in Turkey. The strength of the relationship in all other countries was insubstantial.
These differences were noted in the ICILS 2013 international report (Fraillon et al. 2014). The report explains that the finding is not unexpected given the nature of the CIL assessment construct, which is framed around computer and information literacy skills that are not necessarily related to the more technical skills described in the specialized skills construct. Punter et al. (2017) also investigated ICT self-efficacy differences between male and female students, and concluded that the differences may arise as males tend to overestimate their abilities while females tend to underestimate their abilities.
3.5 Gender Differences in Time Taken to Respond to the Test
Another consistent finding in ICILS 2013 across all 14 countries was that male students spent less time responding to the test items, on average, than female students. On average, female students spent one to four seconds longer on each item than male students (Table 3.9).
Germany, the Republic of Korea, and Slovenia had relatively larger gender differences in the time taken to respond to items and also larger differences between male and female students’ average performance on the assessment (Table 3.9). Thailand, Lithuania, and the Russian Federation recorded much smaller (though still statistically significant) differences in average response times between male and female students, but varied somewhat in the size of their gender differences in achievement: the difference was small in Lithuania (17 points) and the Russian Federation (13 points), and non-significant in Thailand (see Table 3.1). These results suggest that item response times may be a factor in the stronger average performance of female students on the ICILS 2013 CIL assessment. Female students’ longer response times may reflect more careful and thoughtful responses, rather than less familiarity or confidence with the material, or a need for more time to identify the correct response, as is often the case in other assessments.
Research question RQ1 (Sect. 1.4) asked: What is the magnitude of the difference between male and female students in measured computer literacy overall, and for particular types of items?
The findings of ICILS 2013 clearly indicated that, on average, female students achieved higher scores for CIL than male students. This difference was statistically significant in 12 of the 14 countries considered, and averaged 19 scale points (or one-fifth of a standard deviation) across the countries reported here.
Within this overall pattern, we found that differential item functioning analyses indicated that female students generally performed relatively better on tasks that involved communication, design, and creativity skills. In contrast, male students generally performed relatively better on more technical tasks and those concerned with security, such as knowing the purpose of a captcha and recognizing spam emails. In addition, female students took a little longer to complete the test than male students; each item took students an average time of 35 seconds to complete, and female students took between one and four seconds longer to respond to items than male students.
Research question RQ2 (Sect. 1.4) asked: To what extent do female and male students differ in computer self-efficacy overall, and in particular aspects of computing?
We found few differences worthy of note between female and male students’ basic ICT self-efficacy. Differences were significant in only six countries, and of small size in two of these countries. However, on average, male students recorded higher specialized ICT self-efficacy than female students in all 14 countries, and the difference was moderate to large in 12 of the 14 countries. General ICT self-efficacy was positively associated with CIL achievement to a similar extent for female and male students in all 14 countries. Advanced ICT self-efficacy, however, was less strongly and less consistently related to CIL achievement.
References
Cooper, J. (2006). The digital divide: The special case of gender. Journal of Computer Assisted Learning, 22, 320–334. https://doi.org/10.1111/j.1365-2729.2006.00185.x
Fraillon, J., Ainley, J., Schulz, W., Friedman, T., & Gebhardt, E. (2014). Preparing for life in a digital age: The IEA International Computer and Information Literacy Study international report. Cham, Switzerland: Springer. https://www.springer.com/gp/book/9783319142210
Punter, R., Meelissen, M., & Glas, C. (2017). Gender differences in computer and information literacy: An exploration of the performances of girls and boys in ICILS 2013. European Educational Research Journal, 16(6), 762–780. https://doi.org/10.1177/1474904116672468
Siddiq, F., Hatlevik, O. E., Olsen, R. V., Throndsen, I., & Scherer, R. (2016). Taking a future perspective by learning from the past—A systematic review of assessment instruments that aim to measure primary and secondary school students’ ICT literacy. Educational Research Review, 19, 58–84. https://doi.org/10.1016/j.edurev.2016.05.002
Volman, M., & van Eck, E. (2001). Gender equity and information technology in education: The second decade. Review of Educational Research, 71, 613–634. https://doi.org/10.3102/00346543071004613
© 2019 International Association for the Evaluation of Educational Achievement (IEA)
Gebhardt, E., Thomson, S., Ainley, J., Hillman, K. (2019). Student Achievement and Beliefs Related to Computer and Information Literacy. In: Gender Differences in Computer and Information Literacy. IEA Research for Education, vol 8. Springer, Cham. https://doi.org/10.1007/978-3-030-26203-7_3
Print ISBN: 978-3-030-26202-0
Online ISBN: 978-3-030-26203-7