Dear Editor,

We write in response to the letter by Schaffalitzky de Muckadell and Strom [1] on behalf of Pharmacosmos, the manufacturer of the iron (III) isomaltoside 1000 product. The letter raises a number of criticisms of the methods used in our article "Evaluation of the Reported Rates of Severe Hypersensitivity Reactions Associated with Ferric Carboxymaltose and Iron (III) Isomaltoside 1000 in Europe Based on Data from EudraVigilance and VigiBase™ between 2014 and 2017" [2]. We appreciate the manufacturer's interest in the details of the published analysis. However, we wholly disagree with the claim that the article was incorrect, scientifically invalid or misleading. In the following, we address the specific criticisms raised, none of which, in our view, alters the analysis or its findings, and in doing so we reaffirm the methods and conclusions of our article.

  1.

    Our article used a pharmaco-epidemiological methodology to assess reporting rates with respect to exposure, using real-world data, including sales data, to assess drug exposure. This methodology has been published previously [3] and is used in periodic safety update reports, which are highly valued by regulatory authorities. While the article very clearly articulates the potential limitations of methodologies that rely upon spontaneous adverse event (AE) reporting, such analyses do provide valuable insights when potential biases are anticipated and when the results are used together with other sources of data to form part of the overall mosaic of insights from prospective randomized controlled studies, epidemiological analyses, case studies, and the like. We agree with the intent of the European Medicines Agency (EMA) quote [4] that relying solely upon the number of reported AEs, without considering exposure, known reporting biases or other known confounders, is inappropriate. Indeed, our article clearly states that "the presented results do not allow a conclusion to be drawn about the absolute and relative risk for severe hypersensitivity reactions (HSRs) associated with ferric carboxymaltose (FCM) and iron (III) isomaltoside 1000." Nevertheless, our analysis mitigates a number of known and potential biases: it applies a uniform method of normalizing for patient exposure (a schematic illustration of this normalization is given after point 5 below) and addresses other potential sources of bias, such as time trends in AE reporting, both in the article itself and in the points that follow.

  2.

    The letter by Schaffalitzky de Muckadell and Strom [1] suggests that the analysis period may have introduced a bias. This assumption refers to time trends in AE reporting, also called the "Weber effect," sometimes described as "AE reporting peaks at the end of the second year after a regulatory authority approves a drug" [5]. We discuss the impact of time trends in AE reporting in our article, and the study period was chosen to represent a time window in which both products had been on the market for substantially more than 3 years. The most recent market event, namely the EMA referral on intravenous irons, would have affected both products simultaneously.


    Furthermore, it has been published that, after the first years of use, there is a tendency for only serious adverse drug reactions (ADRs) to be reported [6], and our article focused on severe HSRs in a period in which both products were past their first years of use. Our analysis also provided reporting rate ratios by year. The last year of analysis (2017), more than 4 years after the most recent launches of the products in the majority of countries, also showed a higher reporting rate for iron (III) isomaltoside 1000 than for FCM.


    The letter [1] presents a graph summarizing a calculation based on Swissmedic data for FCM for the period 2010–2013 [7] and on VigiBase and EudraVigilance data for iron (III) isomaltoside 1000 for the period 2014–2017, reportedly taken from our article. It should be noted, however, that the Swissmedic data were generated via Standardized MedDRA® Queries, which apply a wider definition of anaphylactic reactions and are therefore not identical to the four MedDRA® preferred terms used in the article [2]; this invalidates the comparison in the letter [1]. Furthermore, when applying the same consistent methodology to Swissmedic data, we find reporting rates for FCM similar to those described in the article, namely 0.53 per 100,000 defined daily doses (DDDs) in the period 1 January 2010 to 31 December 2013 and 0.61 per 100,000 DDDs in the period 1 January 2014 to 31 December 2017 (unpublished data), which speaks against any analysis-period bias for severe hypersensitivity.

  3.

    The letter [1] criticizes the restriction of the MedDRA® search to four terms. In fact, the objective and focus of the study was specifically to assess reporting rates for severe HSRs, as clearly stated in the article. In contrast to clinical trial data, the use of real-world data introduces variability and heterogeneity, and, for this reason, a narrow definition was used to minimize other sources of heterogeneity with respect to anaphylactic/anaphylactoid reactions. For both studied products, anaphylactic/anaphylactoid reactions are specifically listed as serious and potentially fatal reactions in the warnings section (4.4) of the respective summaries of product characteristics. HSRs are generally serious adverse events but vary in severity. In order to focus on the more severe reactions with substantial consequences, including fatality, we narrowed the search to severe reactions. "Anaphylactoid/anaphylactic reactions" is also a specific term listed in the table of ADRs for both products. For these reasons, we defined severe hypersensitivity as anaphylactic reaction, anaphylactic shock, anaphylactoid reaction or anaphylactoid shock and used the corresponding MedDRA® preferred terms for the extraction of data for both products.

    Broadening the search terms does not necessarily reduce the risk of bias, as other factors, such as reduced specificity of the search terms, may become relevant. It is therefore difficult to comment on the calculation presented in the letter [1] for a broader definition of hypersensitivity events, as the underlying exposure data were not provided, which invalidates any comparison.

  4.

    The letter [1] suggests, without providing quantitative reasoning, that the use of iron (III) isomaltoside 1000 would be underestimated. In fact, for both substances, 100, 500, and 1000 mg vials are available and were included in the analysis. As presented in the article, analysis of the administered doses reported in EudraVigilance did not reveal noticeable differences. The assumption in the letter that including exposure to specific low-dose formulations of iron (III) isomaltoside 1000 results in overestimation of the reporting frequency for iron (III) isomaltoside 1000 can therefore not be substantiated.

    The IQVIA MIDAS data were chosen to provide a consistent basis across geographies and products. We are aware that individual company sales data can differ from MIDAS because of individual manufacturers' sales-reporting preferences, but a consistent database covering all products (and not specific to any one product or class) was used precisely to enable analyses unbiased by such effects. It should be noted that, because of this consistency and lack of company bias, MIDAS data are widely used in both market research and pharmacoepidemiology, for example by the World Health Organization (http://www.who.int and references therein).

  5.

    The assertion in the letter [1] that relevant literature and clinical trial findings were ignored is untrue. In fact, the available literature was discussed (e.g., Aksan et al. [8], Bager et al. [9] and Mulder et al. [10]). However, randomized controlled trials and real-world data provide complementary perspectives, as acknowledged by the US FDA in their recent strategic framework [11].


    Specifically, in this context, clinical trial settings are limited by the fact that anaphylactic/anaphylactoid reactions and shocks are, fortunately, very rare events. An extremely large number of subjects would be required to achieve sufficient statistical power for these adverse effects in a trial (a rough illustration of the numbers involved is given after point 5 below). Given the rarity of these events, we therefore analysed post-marketing spontaneous ADR reports from the real world, which provide a much larger sample size and exposure on which to base our analysis.


    With respect to the references named in the letter [1], it is difficult to see how they support the Pharmacosmos case: one reference compares iron (III) isomaltoside 1000 with iron sucrose and not with FCM [12]; another is a meta-analysis and provides no head-to-head data [13]. The results of the PHOSPHARE studies [14] comparing FCM and iron (III) isomaltoside 1000 are, to the best of our knowledge, not yet published in full in a peer-reviewed journal, as the citation refers to an abstract to be presented in March 2019. Hence, we cannot comment on how these studies would address the sample-size challenge for rare events. In conclusion, the referenced studies only confirm the need for further comparative studies of severe HSRs associated with FCM and iron (III) isomaltoside 1000, as stated in the discussion of our article.
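To make the exposure normalization referred to under points 1 and 2 explicit, the reporting rate used throughout our article can be written schematically as

    reporting rate per 100,000 DDDs = (number of reported severe HSRs / number of DDDs sold) × 100,000.

As a purely hypothetical illustration (the figures below are not data from the article), 50 reports against 10,000,000 DDDs sold would correspond to (50 / 10,000,000) × 100,000 = 0.5 per 100,000 DDDs; it is this normalization by exposure, rather than the raw count of reports, that places the two products on a common scale.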
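As a rough illustration of the sample-size point raised under point 5 (the incidence assumed here is for illustration only and is not a figure from the article): if the true incidence of a severe HSR were of the order of 1 per 100,000 administrations, then, by the usual "rule of three," approximately 3 × 100,000 = 300,000 exposed patients would be needed merely to observe at least one event with 95% probability, and a formally powered head-to-head comparison of two products would require considerably more. This is why post-marketing data, with their far larger exposure denominators, add value for such rare events.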

The use of real-world data provides valuable information and contributes unique insights that support a broader and more accurate assessment of the true nature of illness and the impact of therapies. The results of our analysis are consistent with the other insights described in our article.

While we are convinced of the validity of our study and its findings utilizing real-world data, we value different perspectives and are eager to engage in a spirited and evidence-based debate about the interpretation of the data and the magnitude of effect that was demonstrated. However, we vigorously refute any assertion that impugns the motivation behind the analysis that was performed. Its goal was to contribute relevant, valid insights, based on rigorous and accredited scientific methods, that broaden the evidence for the benefit of patients. The article was clear that the study was funded by Vifor and that the IQVIA authors were part of the study team. While IQVIA received funding for the study, IQVIA employees receive their salaries from IQVIA; they received no remuneration from Vifor, and it is improper to suggest otherwise. Furthermore, to suggest that our work is a form of "concealed promotion" is wholly inaccurate and scurrilous, and demeans the spirit of vigorous debate on which peer review and the scientific method are based.