
Science in Transition Reduced to Practice


Abstract

In the true spirit of Dewey and pragmatism, knowledge, insights and experience have to be translated into interventions and actions. Only when knowledge is ‘reduced to practice’ will its social robustness and value be determined. In light of the conclusions of the previous chapter, to have more impact and to live up to our promise to society, we have to reflect on how our science is organized and how it could be improved. From these reflections, several interventions in the practice of research have been proposed. When we, the Science in Transition Team, started to make public our critical accounts of the practice of science, I was given the ‘friendly advice’ by influential older scientists to first clean up the mess in my own institution, instead of pointing to others and to the system. As a matter of fact, that is what we have been doing at University Medical Centre Utrecht (UMC Utrecht) since 2009. In this chapter I present a brief outline of our actions ‘on the ground’ at UMC Utrecht and some early actions to promote these activities abroad.

UMC Utrecht is a large academic medical centre. To give an impression, the key figures for 2009 that I typically showed in my introductory talks about UMC Utrecht are depicted below. I am happy to see that the slide did not show JIFs, numbers of citations, the position on the Shanghai Ranking or the amount of grant money won. At that time, some 1200 researchers were working on a PhD thesis project, a number that had increased enormously over the preceding 15 years.

figure a

The Matrix

In September 2008 in the USA, Lehman Brothers was not bailed out and collapsed. Because of this and other ominous signs in the months before, we realized that the financial crisis was imminent and was going to hit major banks and financial institutions in Europe as well. We anticipated a serious collapse of the economy in Europe, and in our country too, when the new composition of the Board of UMC Utrecht took office on the first of January 2009. The chairman, Professor Jan Kimpen, was a paediatrician who, before joining the Board, had been chairman of the Wilhelmina Children’s Hospital, a division of UMC Utrecht. The third Board member, Herman Bol, came from the financial sector. I had left Sanquin in 2004 and had chaired the department of immunology and the Division of Laboratory and Pharmacy since 2005. After a couple of months, we started to work on a new five-year strategy. We held discussions with our regional partners, our partners in the university, corporate partners and patient-advocacy groups. We, or in fact our staff, evaluated the two strategies of the past decade and looked at potentially interesting examples of institutional research strategies abroad. The conclusions were quite interesting and refreshing. Since 2000, our UMC had been organized as a collection of divisions, small hospitals each based on a set of related medical disciplines such as internal medicine, surgery, paediatrics, neurology and psychiatry, gynaecology, cardiology and pulmonology. A number of divisions were about enabling methods and technologies, such as laboratory sciences, epidemiology, medical imaging, radiology, molecular biology and clinical genetics. The divisions were very well organised and performed very well according to finance- and production-related key performance indicators (KPIs). The institute because of that had an excellent financial position. The divisions had their own overall strategies and goals, which were discussed yearly with the Board.
For research, there had been a top-down formulated five-year strategy, which the organization had experienced as very nice but abstract, lacking very concrete milestones and goals. As division management was held accountable for staying within their budgets, the incentives for entrepreneurship were low and collaboration across ‘the borders’ of divisions was problematic. There was a wide gap between basic pre-clinical research, most of it done in a semi-separate building, and the more clinically oriented research in the hospital. In interviews, people complained that this was inefficient for research and innovation of care, but also for the daily delivery of clinical care. This downside of the governance model had been consciously weighed against its advantages in 2000 by our predecessors on the Board, but as the institute had come out of an unstable financial situation in 1998, sound financial results were the first priority, and it was hoped that the organizational issues could be mitigated by wise leadership in the divisions. In our opinion, based on the evaluation of the past ten years, this had become increasingly problematic and required an intervention to facilitate and incentivize the necessary collaborations between the divisions, both in clinical care and in research and innovation.

In light of this, after much deliberation, we decided to aim for a maximum of six large strategic research programmes that should be goal-oriented and connect the relevant classical disciplines and divisions. The programmes would thus by definition be multidisciplinary, bringing the more fundamental, pre-clinical work into the context of the relevant clinical departments or the extramural domains of prevention and public health. We anticipated that the programmes would be quite large, but they should still be focussed on a small number of concrete short-term and long-term public health or clinical targets. These programmes, we emphasized, should truly aim for impact in science and society. A small group of professors drafted ‘terms of reference’ to provide guidance for the writing of proposals, and also broadly defined criteria for quality and feasibility to be used once choices had to be made. Based on this groundwork, we invited our professionals to present ideas for strategic programmes.

In a one-day session early in 2010, with forty senior colleagues in the room, we democratically picked the 22 best of the more than sixty proposals that had been submitted. They were merged into six major disease-oriented programmes that each covered, for their domain, the whole spectrum of basic, applied and clinical research from the disciplines involved. For example, in Personal Cancer Care, researchers from epidemiology, medical oncology, molecular genetics, surgery and radiology participate, but also representatives of patient advocacy groups and other stakeholders. Eventually, it was realized that we needed rehabilitation sciences and bioethics in such programmes as well. Because the strategy embraced an interactive loop, beyond the linear model, the Board decided to call the new strategy UMCU 3.0.

The Innovation Loop

figure b

It is a bit beyond the scope of this book, but most critical for the change management we had engaged in were obviously the managerial and administrative problems that come with building such multidisciplinary programmes in a matrix organization with ten divisions that are disciplinary with respect to science and medicine. Building this type of programme required collaboration and discussion across the classical boundaries of basic and applied, preclinical and clinical, and between the different clinical disciplines that were organized in divisions with strong classical structures. Clinical disciplines are far more distinct than disciplines and fields of research. They are linked to years of professional medical training and clinical work, with strong and often distinct patterns of socialization for their professionals. In the case of oncology, professionals from internal medicine, pathology, surgery, radiotherapy, geriatrics and rehabilitation medicine need to work together seamlessly to achieve optimal patient care. As I discussed with regard to research, here too the professional hierarchies in which the professionals are socialized are covertly or explicitly at play. In an academic (university) medical centre, the ‘field’ of research is intertwined with ‘the field’ of the medical profession, each with its own power struggles and stratification, as described in Chap. 3 for research. With Bourdieu’s Distinction in the back of my mind, I often realized that surgeons and internal medicine doctors are very different people indeed (Bourdieu, 2010). The complexity of this dual world of science and medicine should not be underestimated.

Because of this, successfully building consortia to form strategic programmes requires real leadership and brinkmanship from many senior professionals. Negotiations between programmes and divisions about the choices to be made regarding research topics and clinical work, about investments, and about joint decisions on human resource management and the hiring and promotion of personnel, were complex. These issues of ‘alignment to the higher purpose’* are classical and abundantly discussed in the literature on innovation, R&D and research management in research-intensive industries (* I borrowed ‘the higher purpose’ from Manfred Kets de Vries, who advised us on management issues in 2015). Even in institutions and corporations like Philips or pharmaceutical companies, where problem- and product-oriented research is normal, and despite a much stronger corporate identity and values shared with the overall corporate goals, these interactions pose managerial challenges. It literally took years for us, the institute, to get used to the new organizational scheme. During my time as director of research at Sanquin, given its mission, our research had for a large part to be directed at the development of products and services related to the development and safety of blood products. In those days I read the literature about managing top professionals, and now and then at UMC Utrecht I went back to some of those books, such as Maister’s True Professionalism and Third Generation R&D by Roussel, Saad and Erickson (Maister, 1997; Roussel, Saad, & Erickson, 1991). Interestingly, Mirko Noordegraaf and Paul Boselie, colleagues from the Utrecht School of Governance, showed interest in 2012 in studying our management intervention as a real-life case in their long-term research programme on public management.
The key question in that programme is how public organizations, and private organizations with a public task, deal with current social issues, how they shape their public responsibilities and how they deliver public value. Also, through Noordegraaf’s contacts in an EU project, I was invited to present our case in Bologna in April 2012 to a thousand (!) representatives of hospital management from the Italian region of Emilia-Romagna. Later, in September 2014, I gave my talk at a meeting with the Karolinska and Stockholm hospital system. Noordegraaf and Boselie joined forces with Margriet Schneider to establish the Utrecht University Focus Area Professional Performance in 2015. Margriet Schneider was then chair of the Division of Internal Medicine and later that year became Chair of the Board of UMC Utrecht.

The next five-year strategy was initiated in 2014, when Mirjam van Velthuizen-Lormans, who already had a long career at UMC Utrecht behind her, became a Board member. With this strategy, which started in January 2015, we took it to the next level. The ‘innovation loop’ was made fully interactive, engaging with regional, national and international academic and non-academic partners outside the walls of the institute. The strategy was therefore appropriately called ‘Connecting U’. When I proudly presented the strategy at the 2014 Christmas party for our retired colleagues, a lady in the front row loudly remarked that she thought it was a nice name, but perhaps better suited to an Utrecht public transport company. I agreed of course with a big smile, but politely retorted that we felt it was also very appropriate, and nice, for an academic hospital serving and connecting with the greater Utrecht region.

figure c

How Do You Want to Be Judged?

For over fifteen years already, there had been a strong focus on research in our institute. In line with what was pretty normal in those days, the more fundamental pre-clinical science was regarded as the best, as measured by the JIF of the venue of publication. Publication output, citations, JIF and the top 10% of journals of the field, and in addition high-profile personal grants and the amount of grant money brought in, were used to score the research performance of the divisions. Every three years, these metrics were used to determine the number of professors each of the divisions was entitled to have. As the total number of professorships was capped, this was a zero-sum game in which, every three years, some divisions lost and some gained. Fortunately, only a limited fraction of intramural money was allocated to the divisions based on these indicators, in addition to monetary rewards for the number of PhDs awarded. As we have seen in Chap. 3, this had been common practice since the 2000s in the Dutch and the wider European and international research landscape. UMC Utrecht in that respect was not at all atypical. Under this strategy we did very well in publications, numbers of PhDs and grant money brought in. With our new strategy a year or two underway, however, we had to admit that this incentive and reward system was not aligned with the different forms of science and academic output in the six multidisciplinary programmes. In fact, we also realized that our research evaluation system did not acknowledge top professionals and clinicians engaged in more practical, patient-driven research, where journal impact factors are lower and no prestigious personal career grants are to be won. The more your research was to the left of the innovation loop, the better your chances were for high-JIF publications and thus academic promotion. Of course, there were exceptions.
For clinicians who performed extremely well and were scarce because they practised and taught surgery at stellar levels, after fierce debates in academic appointment advisory committees, their publication lists and grants won were regarded as less important than their professional academic performance and impact.

The Higher Purpose

In 2014, the need to change the system of research evaluation forced itself upon us at UMC Utrecht. This was a couple of years after we had changed the organization of our research environment. For us this was quite logical, conceptually and in time, but, to be honest, it had not been planned in 2010. This struck me again in January 2017, at a Washington DC bookstore-restaurant, during a breakfast meeting with Paul Wouters and Dan Sarewitz. The three of us were attending a special one-day meeting on incentives and rewards organized by METRICS Stanford. Sarewitz made it quite clear that he was not much into the problem of metrics, but he had been thinking for decades about the organization of science and how to effectively change it. Sarewitz is well known for his critical, well-informed pieces about the science system in Nature, his book Frontiers of Illusion, his book chapters and his opinionated, excellent long read ‘Saving Science’ (Sarewitz, 1996, 2016). His work has focused on the politics of science and how all kinds of forces and powers keep science from living up to its promise to optimally contribute to society and the good life. He is highly critical of those who pursue intellectual interests in ‘blue skies’ research with reference to the linear model of innovation and value-free inquiry. It is an endless frontier, but in his analysis one with a lot of illusion indeed.

Sarewitz asked me why and how we had been able to agree on and then implement a new system of research evaluation at UMC Utrecht. I told him the story of our intervention in UMCU 3.0 and explained that it was thus a logical consequence of our strategy. It was the diversity of goals and academic roles defined in the six strategic programmes that after a couple of years forced us to implement a research evaluation system that matched these goals and the ‘higher purpose’ of UMC Utrecht. We had assigned this task to a group of mid-career researchers and clinical professionals chaired by Marieke Schuurmans, professor of Nursing Science, with secretarial support from Rinze Benedictus, who was by then already quite an expert on incentives and rewards. In August 2015 we invited them to start on the question ‘How do you want to be judged and evaluated?’ After six months they presented a more inclusive and less metrics-driven evaluation protocol. The result was a very open and generic scheme which allowed us to honour the pluriform excellence related to the diversity of academic roles in the system. Not only the papers published or funding obtained had to be considered, but also results being used and applied closer to practice, by peers or by users and stakeholders themselves. Think of application in the clinic, in medical products and technical appliances via private partners, in treatment advice by the Health Council, in the organisation of health care in the region, or in policy making of any kind. In addition, a lot of emphasis was placed ex ante on ‘how’ the research was organised in order to enhance its potential impact beforehand. For instance, we asked whether there had been early engagement of stakeholders. The scheme and its implementation were not uncontested. Some warned that it would come at a cost to the quality of our research and would hurt basic science and the reputation of our institute.
‘It very much depends on how, and by whom, “research quality” is defined’, was my response. Of course, although we all believed we were moving in the right direction, I understood the issue very well. The risk of a first-mover disadvantage posed a serious and realistic worry in those days, when DORA was barely known and there was a global addiction amongst academics and university administrators to the JIF, h-indexes and the Shanghai Ranking. Even in 2020, when a lot has happened regarding incentives and rewards, nationally and internationally, this is understandably still the worry most frequently aired by young research professionals.

The worry about basic science, as we have seen in the previous chapters, is of all times. Here I refer to Stokes’s Pasteur’s Quadrant, where the concept of ‘user-inspired basic science’ is explained as the kind of research most researchers in basic science do (Stokes, 1997).

figure d

User-inspired basic science takes on problems in the context of a larger problem in a given practice and investigates ‘blind spots’ and missing links in knowledge and understanding in that particular field. As we have seen, basic science has a higher standing than applied science, even with the public, and this still feels like a problem for the investigator. In a typical early-evening show that until recently ran on Dutch TV and was famous for a host with boundless admiration for scientists, we often saw the invited scientist first explain how terribly fundamental the work is, to demonstrate its scientific quality, in order then to proudly explain how it can be used to solve a clinical or social need. Even our recent Nobelist, the synthetic organic chemist Ben Feringa, who started his career at Shell Research, did not escape this knee-jerk reflex when, in 2016, he explained his prize-winning work on the Dutch evening news as totally ‘blue skies’, but a moment later proudly explained that his molecular motors may one day be used to direct medicines to the right spot in the body of patients, among other applications in practice.

How to Make the Right Choices

One day, at my job in the department of immunology of UMC Utrecht, I got a phone call from my sister telling me that my brother, who was more than ten years my senior, had suffered a very serious stroke and was in hospital. I went to see him at the hospital near where he lived. He was in very bad shape. It was a devastating sight. He was paralysed on his left side, but the most terrible thing was that he could not speak and probably had serious cognitive problems. He was moved to a well-known rehabilitation centre in the heart of Amsterdam. During visits we sat in a common room with a view of the Vondelpark. His ability to move his left leg and arm returned pretty quickly. His speech did not return, and communicating with him during visits and ever thereafter was very difficult, which frustrated him enormously. Looking around at the facility and its ambiance, shocked also by the sight of relatively young patients and their visitors, I read the information leaflets about the rehabilitation therapy my brother was receiving. I could not help thinking of the enormous investments made over the years in research on the pathogenesis of stroke, involving numerous PhD positions, sophisticated animal models, laboratory equipment, large expensive devices and the most innovative molecular and imaging technologies. As the vast majority of patients survive stroke but badly need medical rehabilitation for the recovery of speech, mobility and cognition, the low academic priority given to, and the very modest investments in, research and development of rehabilitation and mobility, I realised, were a disgrace.

After a few years I became the dean, was confronted with this problem in my own UMC Utrecht, and later realised it had been noted at that time by the national Health Council. Because of the reward system, its metrics and its definitions of excellence, the rehabilitation sciences were suffering. Typical career advice to young MDs therefore was: ‘go for a PhD on a topic of “hard science” such as molecular pathogenesis. It has more esteem, and gives better papers and a better CV, than working on applied problems of mobility and rehabilitation.’ To be sure, such problems caused by ‘the system’ are nobody’s fault. People, even highly educated people, ‘read the system’, behave according to the system and adapt strategically to seek possibilities of advantage for themselves and their set-up. I could fill many pages with similar problems of agenda setting being distorted by the incentive and reward system. Think of molecular cancer biology versus research on living with the adverse effects of chemotherapy, total immune ablation, radiotherapy and a bone marrow transplantation. The tumour is hopefully gone, but the patient is still there, struggling with her poor quality of life.

The Call for Health from Society

At that time, general resistance was rapidly rising against the dominant idea that in the public sphere literally all public services should be left to private parties, in our case the market of health care. The classical economist’s idea was that this competition would ‘automatically’ result in more efficient and cost-effective services, compared to the situation in which non-profit semi-government organizations offered these services. This neoliberalism (and globalization), with its focus on the mechanisms of the (international) corporate markets and competition steered by shareholder value or the principles of New Public Management, turned out, however, not to work for schools and higher education, nor for health care. Apparently, these services are not typical consumer products, but rather common goods, essential for the quality of social life and of the public sphere in civil society and democracy, which must be regulated and provided through government. In the Netherlands, but in a similar vein also in the wider EU, politicians, both liberal-conservatives and social-democrats, realized the downsides that the politics of the Third Way had had. In our country this has since 1994 been designated ‘Paarse Politiek’. Science in Transition did not put all of the blame for the problems in academia on the politicians and government; that was thought to be too easy. We realized and showed, as discussed in Chap. 3, that academia and academics, but also administrators in universities and other academic and funding institutions, had quite willingly adjusted their strategies and practices to these neoliberal policies.

In October 2016, the Netherlands Health Council, in response to a request from the Minister of Health, Welfare and Sport, published an advisory report on how to improve the impact of the research done in the eight University Medical Centres on the Dutch health care system. In the letter of request to the Health Council, to our pleasant surprise, the minister cited the analysis of Science in Transition on ‘how metrics shapes science’, asking whether that would not be a problem. The committee installed by the Health Council that produced the above-mentioned advice was clear: to a great extent, research is not driven by the needs of public health or of the care and cure system, but is too much driven by parameters of esteem, clearly related to the metrics used in academia. The report specifically pointed out the fields with high societal and clinical relevance that got too little research attention and investment in the current system. These included most notably public health and prevention, and research to improve the health care system focused on problems in the regions around the UMCs and on national issues. The mismatch between investments in biomedical research and the disease burden at patient and societal level has over the years been regularly reported in The Lancet and the BMJ. The novelty was that it was now causally linked to the perverse effects of the incentive and reward system. The Council, citing the relevant international literature, understood that researchers make strategic choices about the field and the topic of their research, and that these choices were increasingly based on the chances of building a resume, mainly with particular journal articles, to get credit and esteem from peers, required for the next round of grants. The current JIF-dominated metrics game, the committee concluded, steers researchers away from the fields where they are closer to patients in the wards and away from citizens in the region.
It seemed as if the idea was, in my words, ‘the further away from the patient, the cleverer you are’. The Boards of the UMCs were not all that amused by the Health Council’s advice, and in a knee-jerk reflex, which made it to the front page of a national newspaper, it was rebutted that the Council did not show respect for ‘the beautiful basic research with high international visibility that is being done in UMCs, which is the basis for excellence in Dutch health research’. Initially the usual evasive and defensive voices played the Council’s critique down, saying that there was not a problem with regional collaboration at all. After some months of discussion, though, it was realized that the Health Council and the Minister were to be taken very seriously. It was a problem for patients, the public and society at large, and thus action was required from the UMCs, since most of their research was paid for by tax money. With Professor Albert Scherpbier, the Dean of UMC Maastricht, in the lead, we got a group of national experts together to compose an action plan for the UMCs, for which we consulted virtually every stakeholder in society. As a result, a bold plan was designed in response to the Health Council’s advice.

https://www.nfu.nl/img/pdf/19.5200_Research_and_Innovation_with_and_for_the_healthy_region.pdf

The plan basically was to adjust the research agenda to better respond to societal needs with regard to public health, prevention and clinical care. A clear shift towards more regional and national societal impact was one of the major aims. The UMCs committed themselves to setting up a regional network around each UMC to deliberate on the most urgent problems and on how to work together, through research and social action, to improve cure, care, health and welfare in the region. This transition was not going to be easy; as both the Council and the UMCs realized, it would not happen without explicitly changing the incentive and reward system for research and researchers in the UMCs. Proper incentives and rewards are required to acknowledge the diversity of research excellence, for instance among researchers in public health doing quantitative social research and work on lifestyle, nutrition, and quality of life in mental disease. In our own backyard, the larger Utrecht region, in line with this advice, we had already invested in setting up this type of regional collaboration for the treatment of highly complex, rare cancer types with the four hospitals (https://www.umcutrecht.nl/nl/ziekenhuis/regionaal-academisch-kankercentrum-utrecht-raku), and had initiated a round table, Gezond Utrecht, that brought together all health care providers in the region (https://bestuurstafelgezondutrecht.nl).

During the COVID-19 crisis in the spring of 2020, this type of non-competitive collaboration was enforced top-down and became highly visible when health care, and in particular clinical ward and ICU capacity, personal protective equipment and testing for COVID-19, had to be organized nationally and regionally. With respect to the need to improve collaboration and dismiss market-type competition between cure, care and health providers, it became clear that COVID-19 had the most devastating effect on the elderly who were not in hospital but in dedicated care homes. In the first two months of the pandemic, the international focus was on ICU and hospital care and cure, while medical care in care homes got little attention. We were thus confronted with the hierarchy in the medical profession between cure and care, but also with the insufficient ‘scientific’ appreciation of the research and innovation potential of fields like geriatrics, rehabilitation and preventive medicine, despite their immense social impact in our ageing populations, and not only in times of Corona. We now hear a serious call to rethink this policy, but will it last after COVID-19 is under control? Do we, in ‘the cold phase’, then still want to invest and pay more to have better and more reliable availability of medicines and to be better prepared for the ‘hot phase’ of the next pandemic?

Science in Transition Abroad

We reassured our public that our initiative and most of our agenda were part of a larger emerging international movement. In all our talks we therefore took great pains to point to some of the high-profile initiatives already ongoing abroad in biomedical research. Most of them, as discussed in Chap. 3, were mainly focused on quality with respect to the design, clinical impact and reporting of research, and not so much on change of the system. With two of those initiatives we got connected in 2015. I think this was good for visibility and crucial for our credibility, nationally as well as internationally. Apparently, as I had learned in the field of AIDS research 30 years before, in the field of meta-science too, although its funding was at least three orders of magnitude smaller and it was less well defined as a research field, one had to spot the ‘right people’ and their international network to connect with, in order to enhance impact. I will pay due credit to two very different men.

EQUATOR, Meeting a True Pioneer: Doug Altman (1948–2018)

On February 19, 2014, a small-format symposium was held at UMC Utrecht under the title Science in Transition. I was asked by the organiser, Carl Moons, to say a few words of warm welcome as an introduction to Doug Altman, as a Dean is expected to do. It was a month after the publication of the high-profile series of papers in The Lancet (discussed in Chap. 3) of which Doug Altman was a major initiator and author. Altman was one of the co-founders of the EQUATOR network for reliability in health research and had written major papers since the beginning of the century on quality issues in biomedical research and its reporting. Some of his early papers in the BMJ are classics that are still widely read. In the midst of the launch of Science in Transition, and knowing his strong position on science, I did not give my courteous Dean’s intro, but a strong pitch on ‘how science went wrong and what should be done about it’. Altman, a bit surprised at first, but then feeling free to speak up, gave his talk very British but passionately. The next year, when he obtained an Honorary Doctorate from Utrecht University, we met again. Doug was a very nice, soft-spoken scientist who was really worried about the quality of science and not fond of a lot of attention and being in the spotlight. I still think, though, that he was truly pleased with ‘the honour bestowed upon him’ by our University.

I believe that as a result of these informative interactions during dinners before and after the University Dies Natalis of March 2015, Doug invited me to speak at the Reward/Equator Conference in Edinburgh, September 2015. This was a meeting organized by the group of authors of the Lancet papers of January 2014. (P. Glasziou, 2014; Paul Glasziou et al., 2014; Macleod et al., 2014) They were mostly biostatisticians and methodologists, most of whom were present. In fact, many major players from all over the globe who actively worked to improve biomedical science, at journals, funders and universities, were present. I was the only one that day arguing for a systems approach to the problem from the 'Dean's Perspective'. My call was that we needed to break free from that perverse credit cycle, but that in order to do so we needed to engage the people who have power in the system: university administrators, deans, board members of the Royal Societies, patient advocates, charities and government funders. Unfortunately, too few of them were to be found in the audience that day.

METRICS, the Relentless John Ioannidis

From the speaker's podium he could not be overlooked, seated attentively in the front row, dressed in his habitual spotless white summer suit, often complete with a bright red tie. John Ioannidis (1965) is C.F. Rehnborg Chair in Disease Prevention, Professor of Medicine, of Health Research and Policy, of Biomedical Data Science, and of Statistics, and co-Director of the Meta-Research Innovation Centre at Stanford. Ioannidis is a Greek-American physician-scientist and writer who has made contributions to evidence-based medicine, epidemiology, and clinical research. Ioannidis is nowadays well known for his studies of scientific research itself, primarily in clinical medicine and the social sciences. He writes on his Stanford webpage: 'Some of my most influential papers in terms of citations are those addressing issues of reproducibility, replication validity, biases in biomedical research and other fields, research synthesis methods, extensions of meta-analysis, genome-wide association studies and agnostic evaluation of associations, and validity of randomized trials and observational research.' We all know and keep citing his famous paper 'Why Most Published Research Findings are False'. (Ioannidis, 2005) Since 2010 he has published continuously, at a dazzling speed, on that same subject in different fields and from different angles. He was involved in the Equator initiative and established METRICS in 2014. (https://metrics.stanford.edu/about-us) METRICS is a meta-research and innovation centre at Stanford. Ioannidis is recognized worldwide, together with Doug Altman, Richard Smith and some others, as one of the scientists who started the debate about issues of quality and reproducibility in biomedical research. Until the COVID-19 pandemic, he travelled almost continuously around the world to deliver his passionate presentations.
In the spring of 2015, I was formally invited to become a METRICS affiliate and to give a talk about Science in Transition at the METRICS inaugural conference in November of that year on the Stanford Campus. As in the good old days of my AIDS research team, we had organized some visits in the Bay Area the day before the meeting. At Berkeley we met with the Vice Chancellor for Research Christopher McKee and his policy advisors, and with representatives of the Center for Science, Technology, Medicine & Society, a group of people involved in actions to improve the relationship between science and society. The latter were enthusiastic, the former more critical and even a bit cynical in reaction to my short pitch on how to improve science. 'You think you can change a system?' Leaving the campus, we came across a number of reserved parking spots for Nobel Laureates. It was a bit weird, as we had just been talking about the skewed appreciation of different types of science. Here I have to admit that at UMC Utrecht I used to have the privilege of reserved parking close to my office, which came with membership of the board. So I should be quiet.

figure e

At the new UCSF campus we met with Ron Vale, a pioneer who was about to found ASAPbio (Accelerating Science and Publication in Biology), promoting the use of preprints and an open and transparent peer-review process. Ron did believe we can change a system. Downtown San Francisco, we talked with Paul Volberding and his team, who were involved in an interesting novel funding scheme (RAP) at UCSF to 'foster collaborative, novel, or preliminary research activity, and to further institutional research strategic goals'. In the first decades of the AIDS epidemic, Paul had been a well-known pioneer in the organization of clinical care and anti-viral therapy and had engaged deeply with the gay community. We only knew each other's names from those days, but still it helped to make the connection.

California Dreamin’

A bit gloomy from these encounters, we drove down Highway 1 from San Francisco, via Half Moon Bay and then through the foothills to Palo Alto. It was clear that the project of Science in Transition had a very long way to go. My colleagues Rinze Benedictus and Susanne van Weelden had carefully observed the various responses to my 'elevator pitches' in the meetings that day. They did what was to be expected of them and during the ride provided a critical analysis of the different reactions we had gotten. The higher in office, the more evasive the responses appeared to be, which of course made perfect sense, given the reputational and financial interests linked to the reward system we were all in. This made it very clear that we were up against a major force. Fortunately, the foothills in the magical afternoon Californian light to the left and the sight of the incoming rolling clouds above the cold ocean to the right cheered us up at least a bit. We reassured ourselves that we had to keep the ball rolling, that we were doing a good thing for science and mankind. Sometimes, 'on a winter's day', you need these maybe naïve idealistic moments to keep you going. It's a shame, though, that I could not locate the oceanfront hippie-style restaurant that I remembered, or thought I remembered, where we liked to go during my sabbatical at DNAX in 1994.

The METRICS conference was hosted in a venue at the heart of Stanford Campus, with its bright sunny sky and the skyline of the Foothills in the background. With its many well-groomed outdoor sports accommodations, Stanford Campus misleadingly looks like a Spanish holiday and golf resort. At the time of my sabbatical at DNAX, then an amazing academic-style small biotech institute, the age of biotech and internet companies had just started. I saw again how misleading and seductive the leisurely appearance of Palo Alto, like most of Silicon Valley and its people, is. In Boston, New York and Chicago, you can tell from the way the cities look and how people behave in public spaces how tough life must be, but for some mysterious reason not so in Silicon Valley. Yet Stanford University and the biotech and fintech companies are the engine of the most competitive region in science and innovation in the world. They don't show it, but people are very eager and work long hours, for most of them with long daily commutes from across the bay, where housing is affordable. The right place, I would say, to discuss the perversities and adverse effects of hyper-competition for social and professional credit to obtain research grants, or investments from venture capital companies, in order to make it to the next round. This is the world of science that Steven Shapin described and analysed in his The Scientific Life: A Moral History of a Late Modern Vocation. (Shapin, 2008)

The meeting was a warm bath, vibrant and full of positive energy. Everything and everyone was in tune, aiming in some way or another to improve the practice of science and inquiry. To be honest, at that time I did not realize how much experience, knowledge and involvement had been brought together in that meeting. There was a lot on methods, design and reporting, but fortunately the program was much broader in its approach. We also heard talks about preregistration, about animal studies and about education of representatives from patient advocacy groups, which reminded me very much of the AIDS advocacy we had seen in the 1980s. It was not only about biomedical research, as Jelte Wicherts from Tilburg University and Brian Nosek spoke about the ongoing actions with respect to reproducibility in the field of psychology. Nosek is the founder of the Centre for Open Science (COS) (https://cos.io/about/mission/), and only later did I realize that Brian Nosek's talk was my first conscious encounter with the broader movement of open data, data sharing and reproducibility in the frame of Open Science. I did my 'Change the Incentive and Rewards' pitch, going through the Credibility Cycle and our ongoing initiative to implement a novel evaluation system.

At the meeting, Monya Baker of the journal Nature had expressed her interest in our actions at UMC Utrecht and wanted to stay informed. Rinze Benedictus met with her at the meeting and kept her up to date. When the evaluation scheme and our CV portfolio were accepted for use at our institute, we wrote a small piece about it for Nature, telling the story but also discussing the problems that we had seen and still anticipated for the process of implementation. We were happy with this piece since it clearly signalled that this type of action, changing an important aspect of the system, can be taken at the institute level. The article came out in October 2016 and was picked up by Dutch newspapers, probably because it was in Nature, which in light of our mission was paradoxical since it said that a paper in Nature is not per se top-class science. But anyhow, it had impact because it reached a large public. (Benedictus & Miedema, 2016) Through these international contacts we set up an exchange and collaboration with Ulrich (Ulli) Dirnagl and his team, who had set up QUEST as part of the Berlin Institute of Health (BIH). The BIH and its QUEST Center are focused on improving and transforming biomedical research, in analogy to the Science in Transition movement, with emphasis on reproducibility and translational medicine but also on changing the recognition and rewards system, working closely with Equator and METRICS Stanford.

Academic Rewards and Professional Incentives

At the METRICS meeting, we decided to work jointly on the problem of incentives and rewards. We focussed on the criteria applied in the career advancement system in medical schools. Steven Goodman, David Moher and I took up that task, working towards a workshop on Academic Rewards and Professional Incentives. David Moher and colleagues did almost all of the work. Luckily, the committee working on this in my institute had delivered in the first quarter of 2016, and our evaluation scheme was included.

figure f

We decided to invite a select group of participants, most of whom were happy to take part and could make it to Washington DC on Monday, January 23rd, just around the corner from where Donald J. Trump had been inaugurated as president of the United States the Friday before.

We were most happy to welcome among the participants: Michael Lauer (NIH); Marcia McNutt (National Academy of Sciences); Jeremy Berg (Editor-in-Chief, Science); Robert Harrington (Chair of Medicine, Stanford); James Wilsdon (University of Sheffield); Paul Wouters (CWTS, Leiden); René Von Schomberg (Team Leader, Science Policy, European Commission); Paula Stephan (Georgia State University); Ulrich Dirnagl (Charité – Universitätsmedizin Berlin); Chonnettia Jones (Director of Insight and Analysis, Wellcome Trust); Malcolm MacLeod (Professor of Neurology and Translational Neuroscience, University of Edinburgh); Sally Morton (Dean of Science, Virginia Tech, Blacksburg); Deborah Zarin (Director, ClinicalTrials.gov); Alastair Buchan (Dean of the Medical School, Oxford); Trish Groves (BMJ, BMJ Open); and Stuart Buck (Laura and John Arnold Foundation). John Ioannidis chaired the meeting.

Guess what we talked about during the pre-workshop dinner that Sunday night. With this presidency, what did the future have in store for the world, the US and US science, the EPA and the NIH? Uncertainty and great worries prevailed. The meeting was productive in the unique sense that many different perspectives on the problems of the current practice of science were exchanged. The methodologists, the bibliometricians, the Open Access advocates and the people focused on the systemic problems were in the same room.

A paper written by Moher et al. to share the information and insights discussed at the meeting was published in early 2018. (Moher et al., 2018) The abstract of the paper was clearly a call for action:

Assessment of researchers is necessary for decisions of hiring, promotion, and tenure. A burgeoning number of scientific leaders believe the current system of faculty incentives and rewards is misaligned with the needs of society and disconnected from the evidence about the causes of the reproducibility crisis and suboptimal quality of the scientific publication record. To address this issue, particularly for the clinical and life sciences, we convened a 22-member expert panel workshop in Washington, DC, in January 2017. Twenty-two academic leaders, funders, and scientists participated in the meeting. As background for the meeting, we completed a selective literature review of 22 key documents critiquing the current incentive system. From each document, we extracted how the authors perceived the problems of assessing science and scientists, the unintended consequences of maintaining the status quo for assessing scientists, and details of their proposed solutions. The resulting table was used as a seed for participant discussion. This resulted in six principles for assessing scientists and associated research and policy implications. We hope the content of this paper will serve as a basis for establishing best practices and redesigning the current approaches to assessing scientists by the many players involved in that process.

The Future of Science in Transition

In the meantime, in the spring of 2015, we had decided to explore the future, if any, and invited a few persons with visibility and authority in the field of science and society to join our core team of Science in Transition. Frank Huisman, because of heavy duties, could no longer contribute. These workshops were the basis for the program of the Third Symposium, held in March 2016, again at the Royal Society. The problems and opportunities of the academy, the position of PhDs, the small-scale colleges and our relationship with society and the publics were the main themes. James Wilsdon, a long-time key opinion leader in the UK and Europe, was the major guest speaker, who presented the findings and recommendations of 'The Metric Tide'. (Wilsdon, 2016) The symposium was quite optimistic in tone, given the actions that were already ongoing in the field, but it was clear that in the next phase 'academic leadership' should step up and act. We concluded with a discussion on a typically Dutch experiment in the spirit of the Science Shops of the 1970s: the National Science Agenda. This much debated initiative from Jet Bussemaker, the Minister of Higher Education, sought to invite proposals from the public about issues for scientific inquiry. As of this writing, after 16,000 proposals and allocation of the first rounds of money, the funding agency is still struggling with how to deal with it and how to continue in the years ahead. The question obviously is whether this was the right framework. Engaging the publics is critical, but that should go beyond an inventory or wish list of all kinds of questions for research (like 'why is the sky blue?'). As argued above, engagement is not to be thought of as interests at the individual level, but as a social action of publics focused on problems that have social and political priority, which may change over time.

figure g

Sarah de Rijcke, photo taken by Bart van Overbeeke

From 2016, the agenda and activities of Science in Transition resonated very much with Open Science in the Netherlands and the EU, where experts wrote excellent reports on the main issues related to implementation of Open Science in the member states. At several institutes in the member states and around the world, actions were started, especially regarding Recognition and Rewards and the use of meaningful metrics. The responsible administrators of universities and funders often turned for advice to experts in the field of research evaluation like Sarah de Rijcke, a senior staff member and now a full professor and Scientific Director at CWTS Leiden, who had joined our 'team' in the spring of 2015. Sarah, co-author of the Leiden Manifesto, had by then specialized for many years in social studies of research evaluation and had as group leader been engaged in many large international EU projects. She had already started a large research effort to empirically evaluate interventions in the incentives and rewards system. This included our intervention at UMC Utrecht, with Rinze Benedictus as one of the PhD students. Sarah, with James Wilsdon, in 2019 established a new high-profile institute, the Research on Research Institute, with international support from major relevant parties. (http://researchonresearch.org) Sarah is an internationally recognized expert involved in international outreach activities related to the use and meaning of metrics and science management and policies. Finally, and promising for the future of science, at 'my own' UMC Utrecht a group of four young female PhDs took the initiative to launch Young Science in Transition, with highly relevant and visible activities: 'How young researchers can re-shape the evaluation of their work. Looking beyond bibliometrics to evaluate success.'

https://www.natureindex.com/news-blog/how-young-researchers-can-re-shape-research-evaluation-universities


Rights and permissions

Open Access This chapter is licensed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license and indicate if changes were made.

The images or other third party material in this chapter are included in the chapter's Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the chapter's Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder.

Copyright information

© 2022 The Author(s)

Cite this chapter

Miedema, F. (2022). Science in Transition Reduced to Practice. In: Open Science: the Very Idea. Springer, Dordrecht. https://doi.org/10.1007/978-94-024-2115-6_6