The last week of June is eagerly anticipated by editors as it marks the release of the Impact Factor from Clarivate Analytics. As is the case every year, some journals have seen an increase in their rankings, while others have experienced a decline, and a few have remained stable. However, what truly fascinates me is observing the discourse on social media.

Last year, when the majority of journals experienced an upward trend in their numbers, many editors took to advertising their journals’ success in climbing the rankings, although these increases were often unrelated to the journals’ quality. This year, however, as anyone familiar with the calculation of the Impact Factor could have predicted, numerous journals have witnessed a decline. With the hype surrounding COVID-19 fading, citations are following a similar trajectory. Consequently, editors have remained relatively quiet, and bombastic announcements have been noticeably absent from social media.

As we have emphasized for years, EJNMMI does not adhere to an impact factor strategy but instead prioritizes the scientific quality of manuscripts. Editors, reviewers, and authors are human and prone to errors, but we are doing our best to maintain a consistently high standard. Our goal is quality, coupled with serving the scientific community. We are proud to be a well-known journal in nuclear medicine, attracting a substantial number of authors and readers from all five continents. Each year, the European Journal of Nuclear Medicine and Molecular Imaging engages an ever-growing number of scientists who enhance the quality and level of discourse within our pages.

You may have noticed an increasing number of editorial commentaries being published, offering readers diverse perspectives and analyses of papers featured in EJNMMI and other pertinent journals in the field. We encourage all of you to openly comment on the papers you read, especially if you disagree with the authors’ viewpoints, and contribute to the enrichment of our scientific community.

EJNMMI has also introduced Collections, which are series of papers focused on specific topics. Original articles, systematic reviews, and editorials are grouped together based on their subject matter, providing readers with an up-to-date and dynamic view of relevant subjects. It is worth noting that Collections can be linked within the EJNMMI family of journals, broadening the range of perspectives offered to readers.

In conclusion, I would like to extend my gratitude once again to our readers, authors, reviewers, and editors for the tremendous work you contribute each day to uphold the excellence that EJNMMI deserves.

For those who may not be familiar with the topic, I will provide a brief comment on the Impact Factor.

The Impact Factor has become a significant metric, often regarded as a measure of success and prestige for academic journals. Initially intended to gauge the importance and reach of scientific publications, the Impact Factor has shaped the landscape of scientific writing.

One limitation of the Impact Factor is its exclusive focus on citation counts. This metric overlooks other aspects that contribute to the quality and impact of scientific research: the originality of a study, its methodological rigor, and its potential practical applications are all disregarded. Consequently, it fosters publication bias, encouraging researchers to pursue topics with broad appeal while neglecting potentially groundbreaking but less popular areas of study.

Furthermore, the Impact Factor fails to capture the dynamic nature of scientific fields. It is calculated only from the citations received in a given year by the articles a journal published in the two preceding years. Although other indices exist to address this short citation window, none is considered as relevant as the Impact Factor.
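For readers less familiar with the mechanics, the calculation itself is straightforward. For a given year $Y$, the two-year Impact Factor is the ratio

$$\mathrm{IF}_{Y} = \frac{\text{citations received in year } Y \text{ to items published in } Y-1 \text{ and } Y-2}{\text{number of citable items published in } Y-1 \text{ and } Y-2}$$

To take a purely hypothetical example, a journal whose 400 citable items from the two preceding years attract 1,000 citations this year would be assigned an Impact Factor of 2.5; the figures are invented for illustration only, and the point is simply how narrow the citation window is.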

Unfortunately, the Impact Factor has become susceptible to manipulation, leading to unethical practices in scientific writing. The pressure to publish in high-Impact Factor journals has fueled the proliferation of practices such as salami publishing, selective citation, and even fraudulent behavior. These practices compromise the integrity of scientific publishing as they prioritize quantity over quality and hinder the progress of knowledge.

While the Impact Factor has served as a useful tool for assessing research influence in the past, it is imperative that we embrace a more comprehensive and multidimensional evaluation system. Such a system should consider various aspects, including research quality, societal impact, collaboration, and open access availability. Complementary metrics that capture social media attention and online engagement should also be used to provide a broader perspective on research impact.

One initiative worth considering in this regard, supported by the European Commission, is the Coalition for Advancing Research Assessment (CoARA). For further information, please visit https://coara.eu.