Indicator system for managing science, technology and innovation in universities
The formulation of standardized measurement indicators of science, technology and innovation at the international, regional and institutional levels remains a continuing need. Although there are various schools of thought and different ways of obtaining information for measurement, one of the most promising approaches today in the development of measuring instruments is the use of the researcher's Curriculum Vitae. The objective of this research is to design a system of indicators to measure the performance of science, technology and innovation in universities. The proposal includes a specific analysis for the definition of each indicator, the mathematical procedure for its calculation, aggregation levels and time periods, as well as its meaning and usefulness. The study draws on documentary analysis of the theoretical and conceptual references that support the proposal in the Latin American context. Furthermore, an empirical survey method is proposed to assess specific contexts in the institution under study. As a result, a system of indicators adjusted to the characteristics of university institutions and to current trends in the Latin American region is designed. The use and analysis of these indicators allow us to establish patterns, trends and regularities in the organization that favor institutional knowledge management of science, technology and innovation processes, and deliver adequate information management and institutional knowledge for decision-making.
Keywords: Metric indicators; Science, technology and innovation; Curriculum vitae; Curricular information system; Information and knowledge management
Much of the effort of science itself focuses on developing appropriate indicators that reflect standardized measurement of scientific and technological activities at the regional and international levels. The calculation of inputs, however, is a task more closely related to economics, statistics and administration, fields with largely standardized methodologies and procedures worldwide. By contrast, the theoretical-methodological concepts of science on which indicator formulation rests make measuring science and technology a complex and difficult undertaking (Albornoz 2007; Chavarro et al. 2014; Moravcsik 1986; Spinak 2001; Peralta et al. 2015; Sancho 2003). Measurement techniques for research results have existed for only a few decades and are not completely consolidated. Bibliometrics has set excellent standards, such as patent metrics and scientometric indicators that are classified and applied to different situations, but issues remain pending for the accurate measurement of results at the institutional level, adapted to regional peculiarities, along with the use of other sources of information to establish measurement indicators (Rodrígues and Mello 2016; Spinak 2001).
In essence, scientific results, the knowledge generated, their impact and benefits to society are very difficult to quantify. However, the study of scientific literature (books, articles, reports, patents, new products, etc.) gives an approximate measure of results. It is usual to assess performance and productivity through the number of publications and citations in specialized, international, refereed and indexed journals. This practice can accurately reflect the work and quality of certain areas or fields such as physics, chemistry and biomedicine. But in other specialties and fields of application (such as in the social sciences) results and differentiated products are distributed through channels that are not always scientific journals with broad international impact (González and Molina 2009).
In bibliometrics, relevant methods have been established, as well as indicators and patterns to follow in the application of measurement tools, using scientific publications and traditional citation indexes which have been constantly improving (Peralta et al. 2015). From another perspective, innovative proposals can be found that use alternative information sources for the application of indicators, such as the Curriculum Vitae (CV) (Báez et al. 2008; Sempere and Rey-Rocha 2003; Rey-Rocha et al. 2006; Barandiarán and D'Onofrio 2013; Solís et al. 2010; Picinin et al. 2016). This approach reaffirms the need to develop Scientific Information Systems (SIS) to facilitate access to information related to the scientific results of research groups, institutions and regions, and to establish important parameters in the development of indicators adjusted to regional particularities and institutional realities (Cañibano and Bozeman 2009; Navarro et al. 2016).
SIS using the CV of the researcher as a source of information are called Curricular Information Systems and may have an institutional, national or regional level of aggregation. This type of computer system favorably influences the development of measuring instruments, complements quantitative analysis based on scientific publications and offers possibilities for normalization at the institutional and regional levels (Barandiarán and D'Onofrio 2013; Díaz et al. 2016). The CV has become a source of information that favors science, technology and innovation measurement and which can be supplemented by other sources of information such as surveys, bibliographic databases and patents. Despite this, CV standardization at field level is insufficient (Martín and Rey-Rocha 2009; Navarrete et al. 2005). However, significant progress has been made in metric resources in the Latin American region and in the integration of Curricular Information Systems. The following representative examples in the Hispanic world may be mentioned: Andalusia's Scientific Information System (its Spanish acronym SICA) and the Latin American and Caribbean CV project in Science and Technology (its acronym in Spanish: CvLAC) (Ríos et al. 2016; Ríos and Santana 2001).
In this context, Cuba, like any other nation, needs to improve its regulations, national policies and data sources, as well as the design and scope of its scientific indicators, adjusted to the new potential of the Latin American region. The challenge for quantitative studies of science is to go beyond a mere quantitative approach and to influence the process of strategic decision-making designed to promote, consolidate or improve scientific activity assessment in the country (Arencibia 2012; Chía and Escalona 2009). The Cuban university sector, as in other Latin American nations, is the main producer and disseminator of knowledge in society. Consequently, the application of tools to manage science and technology in these institutions becomes a determining factor in promoting scientific production and its management in other institutions within the region (Arencibia et al. 2012; Barandiarán and D'Onofrio 2013; Miguel et al. 2006).
In this sense, there are still some gaps in the measurement of science and technology, such as the need to know the level of specialization in several topic areas and the structural dimension of disciplinary and interdisciplinary phenomena of scientific results, among other outstanding issues (Arencibia et al. 2013).
The present research takes place within this context and addresses part of the problem, namely the measurement and design of indicators tailored to data sources. The overall objective is to design a system of indicators to measure the performance of science, technology and innovation in universities. The proposal includes a specific analysis of the definition of each indicator, the procedure for its calculation, its mathematical expression, aggregation levels and temporality, as well as its meaning and usefulness.
The use and analysis of these indicators will allow patterns, trends and regularities in the organization's knowledge to be established, favoring the management of the institution's science, technology and innovation processes, as well as an adequate level of institutional knowledge and information management for strategic, operational and functional decision-making in the organization.
This paper takes as its starting point a documentary analysis of important methodological and conceptual referents, internationally recognized and specific to the Latin American context. The main manuals consulted were the Frascati Manual (2002), the Canberra Manual (1995), the Manual of Bogota (2005), the Manual of Lisbon (2007) and the Manual of Santiago (2007) (Organización para la Cooperación y el Desarrollo Económico [OCDE] 1995, 2003; Red Iberoamericana de Indicadores de Ciencia y Tecnología [RICYT] 2007, 2009). In addition, the so-called Manual of Buenos Aires was consulted, conceived with a view to using the researcher's CV as a source of information for the construction of indicators of the trajectories of scientific and technological researchers (D'Onofrio et al. 2010).
The proposal uses as a tool the Information Management and Institutional Knowledge System at Pinar del Río University (CV-UPR), developed by its Information, Knowledge and Technology Management Group (proGINTEC). The curricular structure of the platform is adjusted to the characteristics of the institution and to national regulations. To manage the researcher's CV, the CV-UPR system uses the structural foundations established by CvLAC, also known as Curriculum Lattes. This regional platform is widely used in Latin American countries, so its structural premises favor normalizing CV fields to generate measurement indicators (Díaz et al. 2016). In addition, the survey is used as an empirical method, with the questionnaire as its instrument, to obtain information on the science and technology processes observed in the institution. A questionnaire was applied to researchers who coordinate research projects, aiming to examine in depth the characteristics of the results obtained and their interdisciplinary relationships. The population was composed of researchers from the university who are responsible for coordinating research projects. For this study, the list of research projects in the period 2011–2014 was taken as the source. A population of 33 researchers was identified and the questionnaire was applied to all of them. The Statistical Package for the Social Sciences (SPSS, version 11.5, 2004) was used for data processing, and Mindjet MindManager (version 8.0.217) was used to create diagrams visualizing the structures of variables and indicators.
The indicators obtained were grouped into six variables with common measurement objectives. This structure allows specific analysis of certain activities related to science and technology management and, at the same time, comparison of the metric values of the different variables. The variables cover the institutional research process, from academic-research results and scientific publishing to institutional visibility at the territorial and international levels. Each group of indicators describes the dimensions of its variable, aimed at identifying specific patterns in measuring science and technology at the institutional level which characterize institutional knowledge in its various dimensions. The values of the indicators can be compared to establish relationships in the behavior of each variable. In this way, the science and technology process can be characterized more comprehensively at the institutional level.
- Category: Personal characteristics
  - Subcategory: according to sex
  - Subcategory: according to age
- Category: Level of training of researchers
- Category: Teaching and management activities of researchers
- Category: Typology of researchers according to their scientific production
- Category: Academic and research trajectory of the researcher
- Category: Institutional production
- Category: Characteristics of publication in scientific journals (divided into two subcategories characterizing the journal publication process)
  - Subcategory: Productivity and source of publications
  - Subcategory: Quality and authorship of publications
- Category: Research projects
- Category: Teaching activities
- Category: Research activities
- Category: Collaboration in scientific publications
- Category: Institutional collaborations
- Category: Support for research
- Category: Training and advisory activities
- Category: Relevance of publications in scientific journals in the territory
- Category: Training and advisory activities
- Category: Visibility of scientific results
Results and discussion
The proposed system of indicators characterizes a group of activities within the institution, linking researchers' behavior with the institutional environment. The interest is therefore not focused on obtaining specific values, but rather on the possibilities offered by the contrasts and comparisons between observations, approaches and analyses of the variables that describe the process of science, technology and innovation through the study of scientific and academic results. In this way, the analysis performed by applying the indicator system can be interpreted as measuring institutional capacity for the generation, dissemination and evaluation of institutional knowledge.
Each indicator was identified with a denomination and a number with respect to the variable to which it belongs. A specific analysis was made of the definition of each indicator, the procedure for its calculation, its mathematical expression, its meaning and usefulness, and its level of aggregation and temporality (Rivero 2016). These aspects favor the implementation of this measurement system as a tool for managing the science and technology process at the institutional level. The Electronic Supplementary Material of this article summarizes these aspects and the specifics of each indicator.
Indicator 13: Researchers who have scientific publications in various areas of knowledge selects researchers whose scientific publications report results classified in several knowledge areas. To obtain this measurement, the results of the researcher's scientific publications (in various formats) are classified from the items in their CV.
The CV-UPR system uses the taxonomic classification of the Organisation for Economic Co-operation and Development (OCDE, for its acronym in Spanish). This classification of scientific knowledge has featured in the main internationally established manuals as a methodological tool for science and technology measurement. Its greatest influence is in European countries, but it has also been widely used in Latin America. Among its benefits is a more harmonious treatment of the social science disciplines, which allows a closer approximation to social reality (Red Internacional de Fuentes de Información y Conocimiento para la Gestión de la Ciencia y la Tecnología e Innovación [Red ScienTI] 2004). Assessment of results classified into different areas of science can identify interdisciplinarity and transdisciplinarity processes in science, at least preliminarily (Elleby and Ingwersen 2010). The study of this aspect is proposed through the researcher's classification of the items in their CV, specifically by selecting OCDE classification patterns (Hjørland and Albrechtsen 1995). It is therefore possible in the CV-UPR to assign various areas or disciplines of knowledge to the same scientific result in order to identify interdisciplinary intersections.
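The selection behind Indicator 13 can be sketched as follows. This is a minimal illustration, not the CV-UPR implementation: the record layout and the OECD area labels are assumptions for the example, and a real CV system would read these classifications from its curricular database.

```python
# Hypothetical sketch of Indicator 13: flag researchers whose CV items
# are classified under more than one OECD knowledge area.
# The field names ("researcher", "areas") and sample data are illustrative.

def interdisciplinary_researchers(publications):
    """publications: list of dicts {"researcher": str, "areas": set of OECD areas}.
    Returns the set of researchers publishing in several knowledge areas."""
    areas_by_researcher = {}
    for pub in publications:
        # Accumulate every area assigned to any of the researcher's results.
        areas_by_researcher.setdefault(pub["researcher"], set()).update(pub["areas"])
    # Indicator 13 selects those with more than one distinct area.
    return {r for r, areas in areas_by_researcher.items() if len(areas) > 1}

pubs = [
    {"researcher": "A", "areas": {"Natural Sciences"}},
    {"researcher": "A", "areas": {"Social Sciences", "Humanities"}},
    {"researcher": "B", "areas": {"Engineering"}},
]
print(interdisciplinary_researchers(pubs))  # {'A'}
```

Because the CV-UPR allows several areas per scientific result, the same accumulation over a researcher's CV items surfaces interdisciplinary intersections directly.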
To enrich the analysis of this variable, the history of the researcher is studied together with the academic and research institution relationship. The average index of research performance and the average academic performance index are two indicators to evaluate researcher performance in relation to their research and academic results in a specific period. Furthermore, they can be calculated at the individual, group or institutional levels (D’Onofrio et al. 2010).
Indicator 14: Average index of research performance refers to the average of activities carried out by the researcher in their field of scientific research over a certain period. The interpretation of this type of indicator requires data collection over a given period and the measurement of the growth rate at least annually. It is particularly useful to compare this indicator over an accumulated 5-year period. From the mathematical point of view, as the sum in the numerator increases, so does the index. The denominator is conditioned by the number of years analyzed and, as this is constant for each researcher, any increase or decrease is due only to the sum in the numerator. The most productive researchers will have a high index, related to the quantity produced rather than to quality. For this reason, we suggest comparing this indicator with the percentages of publication in high-impact journals among the Variable II indicators.
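The calculation just described (sum of research activities in the numerator, number of years in the denominator) can be sketched as below. The function name and the activity counts are illustrative assumptions, not the paper's notation.

```python
# Minimal sketch of Indicator 14 as described in the text: the sum of
# research activities recorded in the period, divided by the number of
# years in that period. Sample counts are hypothetical.

def average_research_performance(activities_per_year):
    """activities_per_year: dict mapping year -> number of research activities."""
    years = len(activities_per_year)
    # The denominator is constant for a fixed window, so the index moves
    # only with the accumulated activity in the numerator.
    return sum(activities_per_year.values()) / years if years else 0.0

record = {2011: 3, 2012: 5, 2013: 4, 2014: 4}  # accumulated 4-year window
print(average_research_performance(record))  # 4.0
```

As the text notes, a high value reflects quantity rather than quality, which is why it should be read alongside the publication-quality indicators of Variable II.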
From another perspective, Indicator 15: Average academic performance index refers to the researcher's activities in the educational sphere over time, namely activities in teaching undergraduate and graduate students, averaged over a defined period. With the implementation of these two indicators, the researcher's history of academic and research activities is linked within the same time window. The analysis combines the two performances: the teaching activities carried out by the researcher during the same period in which results are obtained through scientific research. The minimum and maximum standard values of these indicators depend on the number of researchers in the institution, the number of years accumulated in the period selected by the analyst and the total scientific production of the institution. Based on these parameters, a default value is set to limit the maximum value attained by the researcher, in order to balance the two performance indices.
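The joint analysis of Indicators 14 and 15 over the same time window can be sketched as follows. All inputs are hypothetical examples; the averaging rule is the one stated verbally above, and the comparison step merely illustrates contrasting the two indices.

```python
# Hedged sketch of comparing the research index (Indicator 14) and the
# academic index (Indicator 15) over the same accumulated window.
# Activity counts below are invented for illustration.

def average_index(activities_per_year):
    """Average activities per year over the selected window."""
    years = len(activities_per_year)
    return sum(activities_per_year.values()) / years if years else 0.0

research = {2011: 3, 2012: 5, 2013: 4, 2014: 4}   # research activities per year
teaching = {2011: 6, 2012: 6, 2013: 4, 2014: 4}   # undergraduate/graduate teaching

r_index = average_index(research)
a_index = average_index(teaching)
# Contrasting the two indices in the same window links the researcher's
# academic and research trajectories, as described in the text.
print(r_index, a_index)  # 4.0 5.0
```

A ceiling on either index, set from the institutional parameters mentioned above, could then be applied before the two values are balanced against each other.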
The proposed system of indicators allows precise monitoring of the results of an institution's research activity, in close interaction with its academic activity. Knowing the results of any given period from the calculation of this indicator system is essential for managing the science, technology and innovation process in any university. The analysis and interpretation of the results reveal the research and academic strengths and weaknesses of the organization, aspects that will inform strategic improvement, action plans, measurement criteria and institutional policies in the short, medium and long term.
Sources of reliable, standardized and accessible data to optimize measurement processes of scientific results are a requirement for a university. This study considers teacher-researcher CV data to manage the process of science, technology and innovation. The proposed indicators system is a working tool for the measurement, analysis and forecasting of scientific results in keeping with the characteristics of this type of institution.
Our thanks to the professors of the Information, Knowledge and Technologies Management Group (proGINTEC) of Pinar del Río University, to the professors who collaborated by updating their CVs, and to those who translated the article into English.
- Albornoz, M. (2007). La RICYT: Resultados y desafíos pendientes. Ponencia presentada en el VII Congreso Iberoamericano de Indicadores de Ciencia y Tecnología. Red Iberoamericana de Indicadores de Ciencia y Tecnología (RICYT) y Fundación de Amparo a la Investigación del Estado de Sao Paulo (FAPESP). Sao Paulo, Brazil.
- Arencibia, J. R. (2012). Sistematicidad en la evaluación de la actividad científica desde una perspectiva cienciométrica. Acimed, 23(3), 215–218.
- Arencibia, J. R., Corera, E., Chinchilla, Z., & de Moya-Anegón, F. (2013). Relaciones intersectoriales, producción científica y políticas nacionales para el desarrollo de la investigación: un estudio de caso sobre Cuba 2003–2007. Revista Cubana de Información en Ciencias de la Salud, 24(3), 111–157.
- Arencibia, J. R., Vega, R. L., Araújo, J. A., Corera, E., & de Moya-Anegón, F. (2012). Hitos de la ciencia cubana en el siglo XXI, una revisión a partir de los trabajos más citados en Scopus en el período 2001–2005. Acimed, 23(1), 45–58.
- Barandiarán, S., & D'Onofrio, M. G. (2013). Construcción y aplicación de una tipología de perfiles de diversidad profesional de los investigadores argentinos: aportes al Manual de Buenos Aires. Ponencia presentada en el IX Congreso de Indicadores de Ciencia y Tecnología de la RICYT. Bogotá, Colombia.
- Chía, J., & Escalona, C. I. (2009). La medición del impacto de la ciencia, la tecnología y la innovación en Cuba: Análisis de una experiencia. Revista CTS, 13(5), 83–96.
- D'Onofrio, M. G., Solís, F., Tignino, M. V., & Cabrera, E. (2010). Indicadores de trayectorias de los investigadores iberoamericanos: Avances del Manual de Buenos Aires y resultados de su validación técnica. Informe de la Red de Indicadores de Ciencia y Tecnología Iberoamericana e Interamericana (RICYT). Elaboración del Manual de Buenos Aires. http://www.ricyt.org/manuales/doc_view/144-indicadores-de-trayectorias-de-los-investigadores-iberoamericanos-avances-del-manual-de-buenos-aires-y-resultados-de-su-validacion-tecnica. Accessed 11 March 2015.
- Díaz, M., Peña, D. A., Rodríguez, R. J., & Carrillo-Calvet, H. (2016). Sistemas curriculares para la gestión de información y conocimiento institucional. Estudio de caso. Revista General de Información y Documentación, 26(1), 11–24.
- Fernández, M. T., Gómez, I., & Sebastián, J. (1998). La cooperación científica de los países de América Latina a través de indicadores bibliométricos. Interciencia, 23(6), 328–337.
- González, M. V., & Molina, M. (2009). La evaluación de la ciencia: revisión de sus indicadores. Revista Contribuciones a las Ciencias Sociales, noviembre. http://www.eumed.net/rev/cccss/06/ggmp.htlm. Accessed 16 December 2014.
- Milanés, Y. (2016). Evaluación multidimensional de la investigación. Análisis micro en la Universidad de Granada durante el período 2009–2013. Universidad de Granada. Tesis Doctorales. ISBN: 978-84-9125-583-3. http://hdl.handle.net/10481/42894. Accessed 24 March 2016.
- Navarrete, J., Santa, S., Rios, C., González, A., de Moya, F., Banqueri, J., & Solis, F. (2005). Sistema de Información Científica de Andalucía (Spain). Un modelo para la gestión de la ciencia y tecnología. Revista CENIC Ciencias Biológicas, 36(Especial).
- Navarro, C., Vidal, A., González de Dios, J., & Aleixandre, R. (2016). Comunicación científica (XXXII). Cómo hacer un currículum vítae. Revista Acta Pediátrica Esp, 74, 3–4.
- Organización para la Cooperación y el Desarrollo Económico [OCDE]. (1995). Manual on the measurement of human resources devoted to S&T "Canberra Manual".
- Organización para la Cooperación y el Desarrollo Económico [OCDE]. (2003). Manual de Frascati, 2002. Definiciones y convenciones básicas. http://www.edutecne.utn.edu.ar/ocde/frascati-03-30-34.pdf. Accessed 4 May 2009.
- Peralta, M. J., Frías, M., & Chaviano, O. G. (2015). Criterios, clasificaciones y tendencias de los indicadores bibliométricos en la evaluación de la ciencia. Revista Cubana de Información en Ciencias de la Salud, 26(3), 290–309.
- Picinin, C. T., Pilatti, L. A., Kovaleski, J. L., Graeml, A. R., & Pedroso, B. (2016). Comparison of performance of researchers recipients of CNPq productivity grants in the field of Brazilian production engineering. Scientometrics, 109, 855–870. https://doi.org/10.1007/s11192-016-2070-7.
- Piedra, Y., & Martínez, A. (2007). Producción científica. Ciencias de la Información, 3(38), 33–38.
- Red Iberoamericana de Indicadores de Ciencia y Tecnología [RICYT]. (2007). Programa Iberoamericano de Ciencia y Tecnología para el Desarrollo (CYTED). Manual de indicadores de internacionalización de la ciencia y la tecnología, Manual de Santiago.
- Red Iberoamericana de Indicadores de Ciencia y Tecnología [RICYT]. (2009). Manual de Lisboa. Pautas para la interpretación de los datos estadísticos disponibles y la construcción de indicadores referidos a la transición de Iberoamérica hacia la Sociedad de la Información.
- Red Internacional de Fuentes de Información y Conocimiento para la Gestión de la Ciencia y la Tecnología e Innovación [Red ScienTI]. (2004). Normalización de Clasificaciones. III Reunión de Coordinación Regional de la Red ScienTI. Buenos Aires, Argentina.
- Ríos, C., Navarrete, J., Santa, S., Solis, F., Fernández, J. A., & Chaichio, J. A. (2006). Sistema de Información Científica de Andalucía: Una herramienta para la evaluación y gestión de los resultados de la actividad científica. Actas del 8vo Congreso Nacional de Bibliotecología y Ciencias de la Información. Cartagena de Indias, Colombia.
- Rivero, S. (2016). Sistema de indicadores para la gestión de la ciencia y la tecnología en la Universidad de Pinar del Río (Cuba), mediante la utilización del Curriculum Vitae del investigador como fuente principal de información. Universidad de Granada. Tesis Doctorales. ISBN: 978-84-9125-583-3. http://digibug.ugr.es/handle/10481/43331. Accessed 24 March 2016.
- Sancho, R. (2003). Versión española de la sexta edición del Manual de Frascati: Propuesta de norma práctica para encuestas de investigación y desarrollo experimental. http://redc.revistas.csic.es/index.php/redc/article/viewFile/200/255. Accessed 14 May 2009.
- Sempere, J. R., & Rey-Rocha, J. (2003). El currículum vitae y la encuesta como fuentes de datos para la obtención de indicadores de la actividad científica de los investigadores. https://www.researchgate.net/publication/242594292_El_'Curriculum_Vitae'_y_la_Encuesta_como_fuente_de_datos_para_la_obtencion_de_indicadores_de_actividad_cientifica_de_los_investigadores. Accessed 16 December 2016.
- Solís, F. M., Milanés, Y., & Navarrete, J. (2010). Evaluación de la investigación científica. El caso de Andalucía. Revista Fuentes, 10, 83–100.
- Spinak, E. (2001). Indicadores cienciométricos. Revista Acimed, 9(Suppl), 42–49.
Open Access. This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.