Introduction

Much of the effort within science itself focuses on developing appropriate indicators that allow standardized measurement of scientific and technological activities at regional and international levels. The calculation of inputs, however, is a task more closely related to economics, statistics and administration, disciplines that already have largely standardized methodologies and procedures worldwide. In contrast, the theoretical and methodological concepts used to formulate science and technology indicators make this a complex and difficult undertaking (Albornoz 2007; Chavarro et al. 2014; Moravcsik 1986; Spinak 2001; Peralta et al. 2015; Sancho 2003). Techniques for measuring research results have existed for only a few decades and are not yet fully consolidated. Bibliometrics has set excellent standards, such as patent metrics and scientometric indicators classified and applied to different situations, but issues remain in accurately measuring results at the institutional level, adapting measurement to regional peculiarities, and incorporating other sources of information into the construction of indicators (Rodrígues and Mello 2016; Spinak 2001).

In essence, scientific results, the knowledge generated, and their impact and benefits to society are very difficult to quantify. However, the study of scientific literature (books, articles, reports, patents, new products, etc.) gives an approximate measure of results. It is usual to assess performance and productivity through the number of publications and citations in specialized, international, refereed and indexed journals. This practice can accurately reflect the work and quality of certain areas or fields such as physics, chemistry and biomedicine. In other specialties and fields of application (such as the social sciences), however, results and differentiated products are distributed through channels that are not always scientific journals with broad international impact (González and Molina 2009).

In bibliometrics, relevant methods have been established, as well as indicators and patterns to follow in the application of measurement tools, using scientific publications and traditional citation indexes which have been constantly improving (Peralta et al. 2015). From another perspective, innovative proposals can be found that use alternative information sources for the application of indicators, such as the Curriculum Vitae (CV) (Báez et al. 2008; Sempere and Rey-Rocha 2003; Rey-Rocha et al. 2006; Barandiarán and D´Onofrio 2013; Solís et al. 2010; Picinin et al. 2016). This approach reaffirms the need to develop Scientific Information Systems (SIS) to facilitate access to information related to the scientific results of research groups, institutions and regions to establish important parameters in the development of indicators adjusted to regional particularities and institutional realities (Cañibano and Bozeman 2009; Navarro et al. 2016).

SIS using the researcher's CV as a source of information are called Curricular Information Systems and may operate at an institutional, national or regional level of aggregation. This type of computer system favorably influences the development of measuring instruments, complements quantitative analysis based on scientific publications and offers possibilities for normalization at institutional and regional levels (Barandiarán and D'Onofrio 2013; Díaz et al. 2016). The CV has become a source of information that favors the measurement of science, technology and innovation and can be supplemented by other sources such as surveys, bibliographic databases and patents. Despite this, CV standardization at the field level is still insufficient (Martín and Rey-Rocha 2009; Navarrete et al. 2005). However, significant progress has been made in metric resources in the Latin American region and in the integration of Curricular Information Systems. Two representative examples in the Hispanic world may be mentioned: Andalusia's Scientific Information System (SICA, for its Spanish acronym) and the Latin American and Caribbean CV project in Science and Technology (CvLAC, for its acronym in Spanish) (Ríos et al. 2016; Ríos and Santana 2001).

In this context, Cuba, like any other nation, needs to improve regulations, national policies and data sources, as well as the design and scope of its scientific indicators, adjusted to the new potential of the Latin American region. The challenge for quantitative studies of science is to go beyond a mere quantitative approach and to influence the process of strategic decision-making designed to promote, consolidate or improve the assessment of scientific activity in the country (Arencibia 2012; Chía and Escalona 2009). The Cuban university sector, as in other Latin American nations, is the main producer and disseminator of knowledge in society. Consequently, the application of tools to manage science and technology in these institutions becomes a determining factor in promoting scientific production and its management in other institutions within the region (Arencibia et al. 2012; Barandiarán and D'Onofrio 2013; Miguel et al. 2006).

In this sense, there are still some gaps in the measurement of science and technology, such as the need to know the level of specialization in several topic areas and the structural dimension of disciplinary and interdisciplinary phenomena of scientific results, among other outstanding issues (Arencibia et al. 2013).

The present research takes place within this context and addresses part of the problem: the measurement and design of indicators tailored to the available data sources. The overall objective is to design a system of indicators to measure the performance of science, technology and innovation in universities. The proposal includes a specific analysis of the definition of each indicator, the procedure for its calculation, its mathematical expression, its aggregation levels and temporality, as well as its meaning and usefulness.

The use and analysis of these indicators will make it possible to establish patterns, trends and regularities in the organization of institutional knowledge, favoring the management of the institution's science, technology and innovation processes, as well as an adequate level of institutional knowledge and information management for strategic, operational and functional decision-making in the organization.

Methodology

This paper takes as its starting point a documentary analysis of important methodological and conceptual references recognized internationally and, specifically, in the Latin American context. The main manuals consulted were the Frascati Manual (2002), the Canberra Manual (1995), the Manual of Bogota (2005), the Manual of Lisbon (2007) and the Manual of Santiago (2007) (Organización para la Cooperación y el Desarrollo Económico [OCDE] 1995, 2003; Red Iberoamericana de Indicadores de Ciencia y Tecnología [RICYT] 2007, 2009). In addition, the so-called Manual of Buenos Aires was consulted; it was conceived with a view to using the researcher's CV as a source of first-hand information for the construction of indicators of the trajectories of scientific and technological researchers (D'Onofrio et al. 2010).

The proposal uses as a tool the Information Management and Institutional Knowledge System at Pinar del Río University (CV-UPR), developed by its Information, Knowledge and Technology Management Group (proGINTEC). The curricular structure of the platform is adjusted to the characteristics of the institution and to national regulations. To manage the researcher's CV, the CV-UPR system uses the structural foundations established by the CvLAC, also known as Curriculum Lattes. This regional platform is widely used in Latin American countries, so its structural premises favor the normalization of CV fields to generate measurement indicators (Díaz et al. 2016). In addition, the survey is used as an empirical method, with the questionnaire as the instrument to obtain information on the science and technology processes observed in the institution. A questionnaire was applied to researchers who coordinate research projects, aiming to examine in greater depth the characteristics of the results obtained and their interdisciplinary relationships. The population was composed of the university researchers responsible for coordinating research projects; the list of research projects in the period 2011–2014 was taken as the source. A population of 33 researchers was identified and the questionnaire was applied to all of them. The Statistical Package for the Social Sciences (SPSS, version 11.5, 2004) was used for data processing, and Mindjet MindManager (version 8.0.217) was used to create diagrams visualizing the structures of variables and indicators.

The indicators obtained were grouped into six variables with common measurement objectives. This structure allows specific analysis of certain activities related to science and technology management and, at the same time, comparison of the metric values of the different variables. The variables cover the institutional research process, from academic-research results and scientific publishing to institutional visibility at territorial and international levels. Each group of indicators describes the dimensions of its variable and is aimed at identifying specific patterns in measuring science and technology at the institutional level that characterize institutional knowledge in its various dimensions. The values of the indicators can be compared to establish relationships in the behavior of each variable. In this way, the science and technology process can be characterized more comprehensively at the institutional level.

Variable I: Characterization of researchers, as its name suggests, characterizes researchers on the basis of the scientific results evaluated in the institution. The parameters characterizing researchers, and their behaviour over time, help us to understand favorable or unfavorable trends in the scientific results of the institution. From this perspective, the measurement analysis focuses on the researchers and the assessment of their performance, their different activities and the aspects that distinguish them. The goal of this measure is to focus on the relationship between the evaluation of researcher performance and the institution they belong to. Although this type of assessment is complicated when based on a statistical approach, it can be balanced with other types of qualitative analysis and other personnel management tools within the institutional management framework (Wildgaard 2016). The variable is structured according to the following categories and subcategories:

  • Category: Personal characteristics:

    • Subcategory: according to sex

    • Subcategory: according to age

  • Category: Level of training of researchers.

  • Category: Teaching and management activities of researchers

  • Category: Typology of researchers according to their scientific production

  • Category: Academic and research trajectory of the researcher.

Variable II: Scientific and technological production looks at specific aspects of this type of production in the institution. The grouping of the categories is based on the concept of the institution's scientific and technological production. This covers scientific publication, results of research projects, participation in scientific events, patents and registrations obtained and other activities of institutional relevance (Piedra and Martínez 2007). Major types of institutional scientific and technological results are easily identified in researchers' CV data. Indicators can therefore be obtained that reflect institutional and personal performance in the production of scientific and technological knowledge. The advantage of the CV format as a source of information for measuring research results has been exploited in other studies at institutional or regional levels (Barandiarán and D'Onofrio 2013; Dietz et al. 2000; Milanés 2016). Accordingly, Variable II is divided into the following categories and subcategories:

  • Category: Institutional Production

  • Category: Characteristics of publication in scientific journals: divided into two subcategories aimed at characterizing the process of publishing in scientific journals

    • Subcategory: Productivity and source of publications

    • Subcategory: Quality and authorship of publications.

  • Category: Research projects

Variable III: Academic and research trajectory complements the previous variable by focusing on the impact that scientific research has on the development of institutional academic activities. This feature, typical of university institutions, requires accurate information on academic and research processes in order to assess institutional performance while balancing these two aspects, both highly relevant for university excellence. Given this close relationship, institutional knowledge is in constant interaction with academic training and the development of scientific knowledge. This third variable is structured into two categories:

  • Category: Teaching activities.

  • Category: Research activities.

Variable IV: Dynamics and scientific collaboration allows the study of the interaction between researchers and institutions in obtaining science and technology results, an aspect that expresses the level of institutional socialization and dissemination of scientific knowledge. It is particularly beneficial to merge CV data to analyze the different ways in which collaboration achieves scientific results, as reflected in the researcher's mobility history. It is common for mobility to increase scientific production (Sandström 2009; Gaughan 2009). This fourth variable is divided into three categories:

  • Category: Collaboration in scientific publications

  • Category: Institutional collaborations

  • Category: Support for research

Variable V: Territorial visibility focuses on the local impact of the institution. One way to enrich the process of measuring science and technology management in universities is to highlight the strategic role and influence they have on local and national development. This mission of the university, reaching out to local and national communities, justifies the need for measurement standards that enhance scientific results and visibility at the national level. The author-affiliation approach, together with the analysis of the researcher's CV, is a commendable way to interpret scientific collaboration at institutional and regional levels, as it encourages the analysis and interpretation of the results obtained (Moed and Halevi 2014). From this perspective, this variable is composed of four main categories:

  • Category: Awards

  • Category: Projects

  • Category: Training activities and advice

  • Category: Relevance of publications in scientific journals in the territory.

Variable VI: International visibility is an approach to measuring the internationalization of science at the institutional level and allows the international visibility of the institution to be assessed in any given period, as a result of the researchers' performance in international cooperation activities. It is necessary to consult the results of the Variable IV indicators to deepen the analysis of scientific results arising from research grants and interaction with international universities (Cañibano et al. 2010). This last variable is divided into the following categories:

  • Category: Awards

  • Category: Projects

  • Category: Training activities and advice

  • Category: Visibility of scientific results

Results and discussion

The proposed system of indicators characterizes a group of activities within the institution, linking the researchers' behavior with the institutional environment. The interest is therefore not focused on obtaining specific values, but rather on the possibilities offered by the contrasts and comparisons between observations, approaches and analyses of the variables that describe the process of science, technology and innovation, through the study of scientific and academic results. In this way, the analysis that can be performed by applying the indicator system can be interpreted as a measure of institutional capacity for the generation, dissemination and evaluation of institutional knowledge.

Each indicator was identified with a denomination and a number with respect to the variable to which it belongs. A specific analysis was made of the definition of each indicator, the procedure for its calculation, its mathematical expression, its level of aggregation and temporality, and its meaning and usefulness (Rivero 2016). These aspects favor the implementation of this measurement system as a tool for managing the science and technology process at the institutional level. The Electronic Supplementary Material of this article contains a summary of these aspects and the specifics of each indicator.
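As a minimal illustration of how these descriptive fields could be stored in a curricular information system, the following Python sketch defines a hypothetical indicator record; the field names and the sample entry are assumptions for illustration and do not reproduce the definitions given in the Electronic Supplementary Material.

```python
from dataclasses import dataclass

@dataclass
class Indicator:
    """Hypothetical record describing one indicator of the proposed system."""
    variable: int        # variable it belongs to (Variables I-VI -> 1-6)
    number: int          # denomination number within the system, e.g. 14
    name: str            # denomination of the indicator
    definition: str      # what the indicator measures
    aggregation: str     # "individual", "group" or "institutional"
    temporality: str     # e.g. "annual" or "accumulated 5-year period"

# Illustrative entry, paraphrased from the text (not taken from the ESM):
indicator_14 = Indicator(
    variable=1,
    number=14,
    name="Average index of research performance",
    definition="Average of research activities carried out per year in a period",
    aggregation="individual",
    temporality="accumulated 5-year period",
)
```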

Figure 1 shows the set of 15 indicators used to characterize the researchers working in the period chosen by the evaluator. From the generational point of view (age and institutional entry dates), it is possible to analyze the length of time researchers have spent in the institution and also to evaluate the researchers' training and their degree of involvement in teaching or management activities related to science and technology. In this dimension of analysis, the researchers are classified according to their productivity levels in scientific journals and the areas of knowledge in which they publish.

Fig. 1 Indicator structure. Variable I

Indicator 13: Researchers who have scientific publications in various areas of knowledge identifies researchers whose publications report results classified into several knowledge areas. To obtain this measurement, the researcher's scientific publications (in their various formats) are classified from the items in their CV.

The CV-UPR system uses the taxonomy of the Organization for Economic Co-operation and Development (OCDE, for its acronym in Spanish). This classification of scientific knowledge has featured in the main internationally established manuals as a methodological tool for science and technology measurement. Its greatest influence is in European countries, but it has also been widely used in Latin America. Highlighted among its benefits is a more harmonious treatment of the social science disciplines, which allows a closer approximation to social reality (Red Internacional de Fuentes de Información y Conocimiento para la Gestión de la Ciencia y la Tecnología e Innovación [Red ScienTI] 2004). Assessing results classified into different areas of science can identify, at least preliminarily, the interdisciplinarity and transdisciplinarity processes of science (Elleby and Ingwersen 2010). The study of this aspect is proposed through the researcher's classification of the items in their CV, specifically by selecting OCDE classification patterns (Hjørland and Albrechtsen 1995). It is therefore possible in the CV-UPR to assign various areas or disciplines of knowledge to the same scientific result in order to identify interdisciplinary intersections.
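As a sketch of how Indicator 13 could be derived from CV items tagged with OCDE knowledge areas, the following Python fragment groups publications by researcher and flags those who publish in more than one area; the record layout and area labels are illustrative assumptions, not the CV-UPR data model.

```python
from collections import defaultdict

# Hypothetical CV items: each publication carries one or more OCDE
# knowledge-area labels assigned by the researcher in the system.
publications = [
    {"researcher": "R01", "title": "Paper A", "areas": {"Natural Sciences"}},
    {"researcher": "R01", "title": "Paper B", "areas": {"Social Sciences", "Humanities"}},
    {"researcher": "R02", "title": "Paper C", "areas": {"Engineering and Technology"}},
]

# Accumulate the set of areas in which each researcher has published.
areas_by_researcher = defaultdict(set)
for pub in publications:
    areas_by_researcher[pub["researcher"]].update(pub["areas"])

# Indicator 13: researchers whose publications span more than one area.
multi_area = [r for r, areas in areas_by_researcher.items() if len(areas) > 1]
print(f"Researchers with publications in several areas: {len(multi_area)}")  # -> 1
```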

To enrich the analysis of this variable, the researcher's history is studied together with their academic and research relationship with the institution. The average index of research performance and the average academic performance index are two indicators used to evaluate researcher performance in relation to research and academic results in a specific period. Furthermore, they can be calculated at the individual, group or institutional level (D'Onofrio et al. 2010).

Indicator 14: Average index of research performance refers to the average number of activities carried out by the researcher in their field of scientific research over a certain period. The interpretation of this type of indicator requires data collection over a given period and the measurement of the growth rate at least annually. It is quite feasible to compare this indicator over an accumulated 5-year period. From the mathematical point of view, as the sum in the numerator increases, so does the result. The denominator is determined by the number of years analyzed and, as this is constant for each researcher, any increase or decrease is due only to the sum in the numerator. The most productive researchers will have a high index, related to the amount produced and not to quality. For this reason, we suggest comparing this indicator with the percentages of publication in high-impact journals among the Variable II indicators.
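A possible formalization of this indicator, using notation introduced here for illustration rather than taken from the source, is:

\[
\overline{RPI}_i = \frac{\sum_{t \in P} a_{i,t}}{\lvert P \rvert}
\]

where \(a_{i,t}\) is the number of research activities recorded in researcher \(i\)'s CV in year \(t\) and \(\lvert P \rvert\) is the number of years in the analysis period \(P\); since \(\lvert P \rvert\) is the same for every researcher in a given analysis, variation in the index depends only on the numerator.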

From another perspective, Indicator 15: Average academic performance index refers to the researcher's activities in the educational sphere over time, namely activities in teaching undergraduate and graduate students, averaged over a defined period. With the implementation of these two indicators, the researcher's history of academic and research activities is linked in the same time window. The analysis combines the two performances: the teaching activities carried out by the researcher during the same period in which results are obtained through scientific research. The minimum and maximum standard values of these indicators depend on the number of researchers in the institution, the number of years accumulated in the period selected by the analyst and the total scientific production of the institution. Based on these parameters, a default value is set to limit the maximum value attained by a researcher in order to balance the two performance indices.
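Under the same assumptions, a minimal Python sketch of how Indicators 14 and 15 could be computed from per-year activity counts extracted from the CV (the data and the field layout are hypothetical):

```python
def average_index(activities_per_year):
    """Average number of activities per year over the analysis period.

    `activities_per_year` maps each year of the period to the number of
    activities (research or teaching) recorded in the researcher's CV, so
    the number of keys is the constant denominator discussed above.
    """
    years = len(activities_per_year)
    return sum(activities_per_year.values()) / years if years else 0.0

# Hypothetical CV extract for one researcher, period 2011-2014.
research = {2011: 3, 2012: 5, 2013: 2, 2014: 4}   # input for Indicator 14
teaching = {2011: 6, 2012: 6, 2013: 7, 2014: 5}   # input for Indicator 15

print(average_index(research))  # 3.5 -> average index of research performance
print(average_index(teaching))  # 6.0 -> average academic performance index
```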

The second variable groups a total of 14 indicators, covering in the first instance scientific production in its various types, with particular emphasis on research projects and publication in scientific journals (see Fig. 2). Traditional bibliometric indicators are applied and combined, exploiting the benefits of using the CV as a source of information (Arencibia et al. 2013; Fernández et al. 1998; Peralta et al. 2015). For example, Indicator 22: Origin of the publication identifies the origin of the scientific journal where results are published, while Indicator 23, on impact levels, analyzes the databases in which the journal is indexed. For more information, see the Electronic Supplementary Material of this article. During the design of the indicators and their contextualization within the institution under study, differences in the classification of certain scientific results were addressed. The questionnaire technique allowed a deeper examination of research projects whose results interact with various scientific disciplines; 80% of project coordinators agreed with this characterization. Indicator 29: Research projects with results in several areas of knowledge is designed with this purpose in mind and takes into consideration the field classification of research projects in the researcher's CV. This aspect of the measurement must be supplemented by in-depth analysis and discussion within the discourse communities of researchers grouped into projects specialized in each area of science (Hjørland and Albrechtsen 1995).
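As an illustration of how Indicators 22 and 23 might read journal metadata from CV records, the following sketch classifies publications by journal origin and by an assumed tier of indexing databases; the tiers and field names are illustrative, and the actual criteria are those described in the Electronic Supplementary Material.

```python
# Illustrative sketch for Indicators 22 and 23, assuming each publication
# record in the CV stores the journal's country and its indexing databases;
# the tiers below are assumptions, not the system's actual criteria.
publications = [
    {"journal": "Journal X", "country": "Cuba", "indexed_in": {"SciELO"}},
    {"journal": "Journal Y", "country": "Spain", "indexed_in": {"Scopus", "Web of Science"}},
]

def origin(pub, home_country="Cuba"):
    """Indicator 22: national or foreign origin of the journal."""
    return "national" if pub["country"] == home_country else "foreign"

def impact_level(pub):
    """Indicator 23: rough impact level inferred from indexing databases."""
    if {"Web of Science", "Scopus"} & pub["indexed_in"]:
        return "high impact"
    if "SciELO" in pub["indexed_in"]:
        return "regional impact"
    return "other databases"

for pub in publications:
    print(pub["journal"], origin(pub), impact_level(pub))
```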

Fig. 2 Indicator structure. Variable II

The third variable (see Fig. 3) focuses on measuring the relationship between teaching and research activities in greater depth. It groups a set of 12 indicators that interact in the measurement of undergraduate activity, graduate studies and scientific research. Preliminary indicators provide a measure of the impact of the research process on the development of the academic activities of the institution.

Fig. 3 Indicator structure. Variable III

The indicators grouped into Variable IV concentrate on measuring institutional and author collaboration in obtaining shared scientific results. The proposal can be complemented by determining collaboration in the academic and research activities detailed in the researcher's CV, thus balancing collaboration among researchers with collaboration among institutions (see Fig. 4).

Fig. 4 Indicator structure. Variable IV

The last two variables contrast influence at the regional level with the visibility of the institution at the international level. Nine indicators are grouped under the regional perspective; they relate to territorial and national impact, scientific awards, research projects linked directly to identified national or regional priorities, and the participation of the institution in postgraduate training in the territory in which it operates (see Fig. 5). The relevance of publication in scientific journals for the territorial role of the university is also considered.

Fig. 5 Indicator structure. Variable V

From the perspective of international visibility, nine indicators with a similar structure are proposed, but set in the context of university internationalization (see Fig. 6). The indicators explore scientific and technical consulting activities, publication in scientific journals, advising on graduate academic research, and interaction in the financing and co-authorship of scientific research projects, aspects that can be identified in the CV and that make scientific results visible internationally.

Fig. 6 Indicator structure. Variable VI

Concluding remarks

The proposed system of indicators allows precise monitoring of the results of an institution's research activity, in close interaction with its academic activity. Knowing the results in any given period through the calculation of this indicator system is essential for managing the science, technology and innovation process in any university. The analysis and interpretation of the results reveal the research and academic strengths and weaknesses of the organization, aspects that will inform the institution's strategic improvement plans, action plans, measurement criteria and policies in the short, medium and long term.

Sources of reliable, standardized and accessible data that optimize the measurement of scientific results are a requirement for a university. This study uses teacher-researcher CV data to manage the process of science, technology and innovation. The proposed indicator system is a working tool for the measurement, analysis and forecasting of scientific results, in keeping with the characteristics of this type of institution.