European Exposure and Vulnerability Models: State-of-The-Practice, Challenges and Future Directions

Open Access
Part of the Springer Tracts in Civil Engineering book series (SPRTRCIENG)

7.1 Introduction

Initiated as a Joint Research Activity of the European Commission’s Horizon 2020 Project SERA, a European Seismic Risk Model (ESRM20) (Crowley et al. 2019) is being developed using open/publicly available data on all components of seismic risk, from catalogues and active faults to building data and vulnerability models. This model will be released by the European Facilities for Earthquake Hazard and Risk (EFEHR) Consortium under the following general principles for open, transparent and reproducible hazard and risk models:
  • Reproducibility. Reproducibility of an experiment—or of a complex model—is one basic and unavoidable principle of modern science (Popper 2002).

  • Transparency. Transparency ensures that all aspects of scientific methods and results are available for critique, checking, complement, or reuse.

  • FAIR principles (European Commission 2016). The data and models used or produced should be Findable, Accessible, Interoperable and Reusable.

  • Respect of the intellectual property and clear scientific ownership. A proper recognition (and citation) of data support or scientific contribution is indispensable.

An update to the 2013 European seismic hazard model (ESHM13: Woessner et al. 2015) together with a regional site amplification model (based on the methodology presented in Weatherill et al. 2020) will provide the probabilistic estimates of surface ground shaking for this risk model. This chapter summarises the current status of the exposure and vulnerability components of this seismic risk model, addresses where the key modelling challenges presently lie, and looks towards the future directions that are being explored to address those shortcomings and move towards improved European seismic risk and loss modelling under the general principles outlined above.

7.2 Exposure Modelling

7.2.1 Summary of European Exposure Model

A European exposure model describing the spatial distribution of residential, commercial and light industrial buildings in terms of building count, population, and replacement cost, and classified in terms of building classes, is being developed for 44 European countries (Crowley et al. 2020a).

These residential and non-residential exposure models have been derived based on the latest national population and dwelling censuses, socio-economic indicators (e.g. labour force, population and floor area per worker per economic sector), mapping schemes (to map the available data to building classes) developed together with local experts, as well as engineering judgment. All of the source data that has been collected, as well as the assumptions used in the development of each version of the model, are being openly released on a GitLab repository1 with a Creative Commons license. This repository will also be used to store the final exposure models for all European countries, which will be released towards the end of 2020.

The European exposure model contains a total of around 145 million buildings with a total replacement cost (of structural and non-structural elements and contents) of around 45–50 trillion EUR, of which 20% is attributed to industrial buildings, 20% to commercial buildings and 60% to residential buildings. The top 10 countries in terms of number and value of buildings are shown in Fig. 7.1.
Fig. 7.1

Top 10 European countries in terms of number of buildings (left) and replacement cost (right)

Around 70% of the total buildings in Europe are found in these top 10 countries, whereas about 80% of the value is concentrated in the top 10 countries by replacement cost. Poland, Turkey and Romania, which appear in the left-hand panel owing to their large number of buildings, are replaced by the Netherlands, Sweden and Switzerland in the right-hand panel because, despite having fewer buildings, these countries have a higher total replacement value due to their much higher construction costs. It has also been found that around 35% of the European population is exposed to moderate levels of seismic hazard (>0.1 g) (Crowley et al. 2020a).

Maps of the exposure models, and associated web services following Open Geospatial Consortium (OGC) standards, are being made available through a web platform, thus allowing the exposure data to be easily integrated within other web applications and platforms. Figure 7.2 presents one of these maps, which shows the distribution of total replacement cost on a hexagonal grid with a spacing of 0.30 × 0.34 decimal degrees (approximately 1000 km2 at the equator), using the methodology described in Silva et al. (2020).
Fig. 7.2

Screenshot of the gridded exposure model viewer showing the distribution of total replacement cost on a hexagonal grid
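The gridded-exposure idea can be sketched in a few lines: assets are binned into cells by their coordinates and their replacement costs summed per cell. A minimal Python sketch, using square cells of the same angular spacing for simplicity (ESRM20 itself uses a hexagonal grid, and the asset coordinates and costs below are hypothetical):

```python
from collections import defaultdict

def aggregate_to_grid(assets, dlon=0.30, dlat=0.34):
    """Sum asset replacement cost into grid cells. Square cells are
    used here for simplicity; ESRM20 itself uses a hexagonal grid."""
    cells = defaultdict(float)
    for lon, lat, cost in assets:
        key = (int(lon // dlon), int(lat // dlat))  # integer cell indices
        cells[key] += cost
    return dict(cells)

# hypothetical assets: (lon, lat, replacement cost in EUR)
assets = [(12.49, 41.89, 2.0e6), (12.51, 41.90, 1.5e6), (2.35, 48.85, 3.0e6)]
grid = aggregate_to_grid(assets)
```

The first two assets fall in the same cell and are summed, which is exactly the aggregation shown in the viewer, just on hexagons rather than rectangles.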

7.2.2 Challenges and Future Directions in Exposure Modelling

There are, however, a number of shortcomings in the approach used above to model the buildings at risk over large regions. Many assumptions are required to compensate for the lack of open/public data on buildings (e.g. the assumptions needed to convert dwellings to buildings, or the use of labour force statistics to spatially distribute commercial buildings), and often the model uncertainty is not explicitly estimated or documented, nor propagated through the risk/loss model. Some initial explorations of the uncertainty in the European exposure model have been undertaken, whereby the coefficient of variation in the replacement cost has been estimated to be of the order of 40–50%. Further sensitivity studies are however still needed, in particular related to the impact on the distribution of the building classes, which is often based on expert judgment.
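A coefficient of variation of this order can be explored with a simple Monte Carlo sketch of how per-asset cost uncertainty propagates to a portfolio total. The lognormal distribution and the uniform portfolio below are illustrative assumptions, not part of the exposure study:

```python
import math
import random
import statistics

def sample_total_cost(mean_costs, cv=0.45, n_samples=2000, seed=42):
    """Monte Carlo sketch of how a ~40-50% coefficient of variation on
    per-asset replacement cost propagates to the portfolio total.
    Costs are sampled from independent lognormal distributions (an
    assumption; the exposure study does not prescribe a distribution)."""
    rng = random.Random(seed)
    sigma = math.sqrt(math.log(1.0 + cv ** 2))   # lognormal shape from CV
    totals = []
    for _ in range(n_samples):
        total = 0.0
        for m in mean_costs:
            mu = math.log(m) - 0.5 * sigma ** 2  # preserves the mean m
            total += rng.lognormvariate(mu, sigma)
        totals.append(total)
    return statistics.mean(totals), statistics.stdev(totals)

# hypothetical portfolio of 100 assets with 1 MEUR mean replacement cost
mean, std = sample_total_cost([1e6] * 100)
```

Because the per-asset errors are assumed independent, the relative uncertainty of the total shrinks roughly with the square root of the portfolio size; correlated assumptions (e.g. a shared unit-cost error) would keep it much closer to the per-asset CV.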

Some countries in Europe (Italy, Portugal and Greece) have undertaken a building census in conjunction with the national population and dwelling census, and they have classified the buildings already into classes that correlate with the seismic performance of buildings, thus reducing the uncertainty in this part of the exposure model. The main attributes that are collected include the main construction material, total number of storeys, age and presence of soft storeys (‘pilotis’). Ideally such an effort would be carried out in more countries across Europe, and whilst there are ongoing efforts within some countries to lobby for such censuses to be carried out as input to the National Risk Assessments, required by the European Commission in support of the Sendai Framework for Disaster Risk Reduction (Veronika Sendova, personal communication), it is unlikely that the next round of censuses in 2021 will differ significantly from those undertaken in 2011. Given the significant manual work used to develop these models (which needs to be repeated when the new round of census data will be collected and made publicly available in each country across Europe), the resulting models are “static” and are unlikely to get regularly updated.

The resolution of the data varies significantly from country to country and between residential, industrial and commercial buildings. For most countries the distribution of buildings in the industrial exposure model is based on the 30 arcsec (approximately 1 km at the equator) grid of industrial built-up area developed by Sousa et al. (2017) and is thus available at a very high resolution. The residential and commercial models, on the other hand, depend on the resolution at which the census data for dwellings, buildings or labour force statistics is publicly available. Figure 7.3 shows the resolution (in terms of the highest administrative boundary level) for all countries, highlighting how much it varies and that the resolution of the commercial model is quite poor in most European countries.
Fig. 7.3

Highest administrative level resolution of the exposure models for residential (left) and commercial (right) buildings

As is well known, uncertainty in the location of assets introduces a bias in the estimated level of ground shaking and, consequently, the level of damage (see e.g. Bal et al. 2010). Moreover, this bias can be magnified by the differing site conditions at different locations. A study to investigate the impact of the spatial resolution of the exposure on the risk metrics being developed for the European Seismic Risk Model has been initiated (see Crowley et al. 2020b). The residential and commercial occupancies have been disaggregated to six resolutions: 30, 60, 120, 240, 480 and 960 arcsec. In this process, buildings are redistributed using remote sensing information at 38 × 38 m resolution and then aggregated to the different grid resolutions. More details on the disaggregation methodology can be found in Dabbeek and Silva (2020). In addition to the gridded exposure models, three additional workflows (wf) were investigated: (1) locations based on the centroid of the administrative unit and the closest site conditions; (2) locations based on the centroid of the administrative unit and average site conditions weighted by the density of built-up areas across the unit; (3) locations based on the maximum density of built-up areas and the average site conditions weighted by the density of built-up areas across the unit.
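The aggregation step from the fine grid to the coarser resolutions can be sketched as follows: cell values at 30 arcsec are summed into parent cells whose indices are obtained by integer division. The cell values below are hypothetical; the actual workflow starts from the remote-sensing-based disaggregation of Dabbeek and Silva (2020):

```python
def coarsen(grid, factor):
    """Aggregate a fine gridded exposure (dict keyed by integer cell
    indices at 30 arcsec) to a coarser resolution, e.g. factor=2 for
    60 arcsec or factor=8 for 240 arcsec."""
    coarse = {}
    for (i, j), value in grid.items():
        key = (i // factor, j // factor)   # index of the parent cell
        coarse[key] = coarse.get(key, 0.0) + value
    return coarse

# hypothetical 30 arcsec cell values (e.g. building counts)
fine = {(0, 0): 10.0, (0, 1): 5.0, (1, 0): 2.0, (3, 3): 1.0}
print(coarsen(fine, 2))   # the first three cells merge into cell (0, 0)
```

Totals are conserved at every resolution; what changes, and what drives the differences in Fig. 7.4, is where within the coarse cell the value is assumed to sit relative to the hazard and site model.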

The risk metrics have been calculated using the probabilistic event-based calculator of the OpenQuake-engine (Pagani et al. 2014; Silva et al. 2014) with stochastic event sets covering 100,000 years for all modelling cases. Figure 7.4 presents the change in the national average annual loss (AAL) between the different exposure modelling cases and the benchmark model (30 arcsec). For the gridded exposure, the results indicate relatively stable losses up to the 240 arcsec resolution, with a maximum difference of 3%. Beyond this, the results become inaccurate, reaching a maximum difference of 25% in the cases of Iceland and Turkey. Similar analyses were undertaken for the risk metrics at the sub-national level (aggregated to the first administrative level). These analyses illustrated that at the national level, the AAL is better estimated with the second workflow described above (wf2). The percentage change is likely to be proportional to the size of the administrative boundary, the population distribution and the attenuation of ground motions, and plots to demonstrate this will be produced in future versions of the study. These relationships could then be used to help identify the lowest resolution that could be used in regional/national exposure modelling in a given country to ensure a balance between computational efficiency and accuracy.
Fig. 7.4

Change in national average annual loss (AAL) due to changes in exposure resolution for nine European countries with low exposure resolution. Results are provided relative to the benchmark case (30 arc-second exposure)
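The comparison metric used in Fig. 7.4, the percentage change in AAL relative to the 30 arcsec benchmark, is straightforward to compute; the AAL values below are invented for illustration only:

```python
def aal_change(aal, benchmark_key="30"):
    """Percentage change in average annual loss (AAL) of each exposure
    resolution relative to the 30 arcsec benchmark."""
    ref = aal[benchmark_key]
    return {k: 100.0 * (v - ref) / ref
            for k, v in aal.items() if k != benchmark_key}

# hypothetical national AALs (MEUR) per grid resolution (arcsec)
aal = {"30": 100.0, "60": 101.0, "240": 103.0, "960": 125.0}
print(aal_change(aal))
```

With these invented numbers the coarsest grid overestimates the AAL by 25%, mimicking the worst cases reported for Iceland and Turkey.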

To address some of the limitations described above, the future of exposure modelling is likely to focus on producing dynamic high-resolution exposure models with the necessary tools and web services that will allow them to be automatically updated. Within the European Horizon 2020 RISE project, an effort led by GFZ Potsdam is being undertaken to develop a high-resolution Global Dynamic Exposure (GDE) model. The GDE aims to describe exposure at the level of every individual building on Earth, employing a fully open big-data approach that combines open geographic data such as OpenStreetMap,2 open remote-sensing data, machine learning, and other open data such as cadastral data services. The GDE provides a server infrastructure to automatically compute exposure indicators for ~375 million buildings at a global scale (a number growing by approximately 150,000 buildings daily as more buildings are mapped in OpenStreetMap). Some of these indicators are shown on the OpenBuildingMap3 and its 3D version.4 Currently, the high-resolution building data from the GDE is being combined with the building classifications from the European exposure models described above as a first step towards producing a high-resolution European exposure model that can be used for earthquake loss assessment under specific scenario events.

7.3 Vulnerability Modelling

7.3.1 Summary of European Vulnerability Model

Whilst vulnerability models can be developed directly from empirical loss data (e.g. Jaiswal et al. 2009), the resolution and quality of ground motion and loss data in public databases are often not sufficient for this purpose, and vulnerability models are thus commonly developed by combining fragility functions with consequence models, which define the probability of loss conditional on the level of damage.

Fragility models for the elements at risk within an exposure model provide the probability of reaching or exceeding a set of damage states, conditional on the level of ground shaking. Whilst these models can be developed using observed damage data, the large uncertainties in the ground shaking to which the buildings have been subjected often mean that the resulting functions are flatter and highly uncertain (e.g. Ioannou et al. 2014). Analytical modelling is thus preferred as hazard-consistent ground shaking at the site can be considered, the relative difference between building classes (some of which may not yet have experienced earthquake damage in past events) can be explicitly modelled, and data on the characteristics of specific buildings (when available) can be used to update the models. The latest developments, as well as limitations, in analytical vulnerability modelling have been covered in Silva et al. (2019).

A European vulnerability database, comprising capacity curves, fragility functions, damage-loss models and vulnerability functions has been compiled within the SERA project and is available on a GitLab repository (Romão et al. 2020). This database currently has 828 models from 63 separate studies obtained from the literature. Such a database is particularly useful for sanity checking new fragility models as it allows modellers to compare their models with those from the literature (see Crowley et al. 2020b).

In addition to collecting existing vulnerability models, a new set of models for the building classes in the European Seismic Risk Model is being developed. As part of this effort, the spatial and temporal evolution of design codes for reinforced concrete buildings across Europe has been studied (Crowley et al. 2021) and the basic principles of seismic design according to four main categories of design (pre-code—CDN, low—CDL, moderate—CDM and high—CDH) have been used to design prototype buildings, which have then been numerically modelled to obtain their lateral strength and deformation capacity. Buildings of design class CDN were typically designed to older codes (from before the 1960s) that used allowable stresses and very low material strength values and considered predominantly the gravity loads. Buildings of design class CDL were designed considering the seismic action by enforcing values of the design lateral force coefficient (defined as the lateral force applied as a fraction of the weight of the building). Structural design for these codes was typically based on material-specific standards that used allowable stress design or a stress-block approach. Seismic design including modern concepts of ultimate capacity and partial safety factors (limit state design) was the basis of the CDM category of codes. The seismic action was also accounted for in the design by enforcing values for the lateral force coefficient. Finally, the CDH class refers to modern seismic design principles that account for capacity design and local ductility measures, similar to those available in Eurocode 8 (CEN 2004).

Numerical models of the designed MDOF buildings are produced and pushover curves are obtained in the two orthogonal directions; these are then transformed into SDOF systems (or, for building classes that are not explicitly designed and numerically modelled, the SDOF systems are directly inferred following the approach of Martins and Silva 2020). The SDOF models are then subjected to a range of ground motion recordings through dynamic nonlinear analysis to model the relationship between ground shaking intensity and displacement response, using the censored cloud approach described in Crowley et al. (2017) and Martins and Silva (2020). Uncertainties in the characteristics of the buildings (geometrical and material), the design parameters, the quality of construction (and thus adherence to the code), the displacement thresholds to damage, and the record-to-record variability can all be accounted for in this procedure; they are modelled as aleatory variabilities represented in the final dispersion of the fragility functions, constructed from the cloud as illustrated in Fig. 7.5.
Fig. 7.5

Construction of a fragility function following the cloud analysis approach (Martins and Silva 2020)
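The essence of the cloud approach can be illustrated with a short sketch: regress ln(EDP) on ln(IM) over the cloud of analysis results, then derive lognormal fragility parameters for a given damage-state displacement threshold. This simplified version omits the censoring of collapse cases that the actual censored cloud approach applies, and all input values below are hypothetical:

```python
import math

def fit_cloud_fragility(ims, edps, threshold):
    """Minimal (uncensored) cloud analysis: ordinary least-squares of
    ln(EDP) on ln(IM), then lognormal fragility parameters for a damage
    state with the given EDP threshold. The ESRM20 work uses a censored
    cloud approach (Crowley et al. 2017); censoring is omitted here."""
    n = len(ims)
    x = [math.log(v) for v in ims]
    y = [math.log(v) for v in edps]
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    a = my - b * mx
    resid = [yi - (a + b * xi) for xi, yi in zip(x, y)]
    sigma = math.sqrt(sum(r * r for r in resid) / (n - 2))
    theta = math.exp((math.log(threshold) - a) / b)  # median IM capacity
    beta = sigma / b                                 # dispersion
    return theta, beta

def frag_prob(im, theta, beta):
    """P(DS >= ds | IM) from the lognormal fragility function."""
    z = (math.log(im) - math.log(theta)) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

ims = [0.1, 0.2, 0.4, 0.8]           # hypothetical intensity measures (g)
edps = [0.011, 0.019, 0.042, 0.078]  # hypothetical displacement responses
theta, beta = fit_cloud_fragility(ims, edps, threshold=0.04)
```

By construction the exceedance probability equals 0.5 at the median IM, and the slope of the curve is controlled by the dispersion, which in the full procedure also absorbs the building-to-building and record-to-record variabilities listed above.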

The fragility functions are then converted into vulnerability models using damage-loss models which provide loss ratios for each damage state (slight, moderate, extensive and complete). For losses due to the repair of damage, the loss ratios are inferred from a number of existing studies (e.g. Di Pasquale and Goretti 2001; FEMA 2004; Kappos et al. 2006; Bal et al. 2008). For loss of life, the probability of collapse given complete damage is first estimated by combining the proposals from FEMA (2004) with engineering judgment, and comparing these with observed damage data available in databases such as the Italian Department of Civil Protection’s Da.D.O. database (Dolce et al. 2019) and the Cambridge Earthquake Impact Database. Fatality ratios (i.e. the probability of loss of life given collapse for different building classes) are still being developed through the evaluation of fatality data from a number of past damaging earthquakes.
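The fragility-to-vulnerability conversion described above amounts to weighting the probability of being in each damage state at a given intensity level by the corresponding loss ratio. A minimal sketch, in which the fragility parameters and loss ratios are illustrative placeholders rather than the ESRM20 values:

```python
import math

def lognorm_cdf(im, theta, beta):
    """P(DS >= ds | IM) for a lognormal fragility with median theta
    and dispersion beta."""
    z = (math.log(im) - math.log(theta)) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def mean_loss_ratio(im, fragilities, loss_ratios):
    """Combine fragility functions (one (theta, beta) pair per damage
    state, ordered slight -> complete) with a damage-loss model: the
    probability of being in each state is the difference of successive
    exceedance probabilities."""
    exceed = [lognorm_cdf(im, t, b) for t, b in fragilities] + [0.0]
    return sum((exceed[i] - exceed[i + 1]) * lr
               for i, lr in enumerate(loss_ratios))

# illustrative medians (g), common dispersion, and loss ratios
fragilities = [(0.1, 0.6), (0.3, 0.6), (0.6, 0.6), (1.0, 0.6)]
loss_ratios = [0.05, 0.25, 0.60, 1.00]  # slight ... complete
mlr = mean_loss_ratio(0.3, fragilities, loss_ratios)
```

Evaluating `mean_loss_ratio` over a grid of intensity levels yields the vulnerability function for the class; the same bookkeeping with a collapse probability and fatality ratio in place of the repair loss ratios gives the loss-of-life model.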

7.3.2 Challenges and Future Directions in Vulnerability Modelling

Whilst the latest approaches for vulnerability modelling account for a wide range of uncertainties, modelling uncertainty is typically not considered, where the latter is defined herein as the uncertainty associated with the selected modelling approach, rather than the commonly considered ‘parameter uncertainty’, which is the uncertainty associated with the parameters of a particular modelling approach. As more experimental tests of components and full-scale buildings become available, there is scope to quantify the bias or lack of precision of the structural modelling methodology used in the development of analytical fragility functions (e.g. Bradley 2013). However, even when the selected modelling approach is tested/calibrated against some experimental tests, blind prediction exercises show that results from plausible models can still vary significantly (see e.g. Terzic et al. 2015). The buildings for which the fragility functions are being developed will not necessarily have the same characteristics as those against which the numerical modelling approach has been calibrated. Sensible variations of the model should thus still be undertaken when developing fragility functions for a given structural typology, or the impact should be applied ex post through engineering judgment, based on the normalised results of other similar studies into modelling uncertainty, of which more are needed. An example publication that proposes values for modelling uncertainty is FEMA P-58 (FEMA 2018). Such variations in the model might include, for reinforced concrete buildings, varying the assumptions regarding the rigidity of the beam-column joints, the bond between rebars and concrete, and the contact between infill panels and the frame; and, for masonry buildings, the interlock between orthogonal walls, the connections between slabs, roof and walls, or the assumptions on the equivalent frame discretisation.
The modelling of this epistemic uncertainty should thus become standard practice in future analytical fragility modelling, and might be based on a backbone approach with the model uncertainty represented through a logic tree (Crowley et al. 2017), an approach that is being increasingly used for ground motion modelling (e.g. Atkinson et al. 2014; Douglas 2018). Figure 7.6 shows an example of how the epistemic uncertainty in the fragility models (captured by producing a large number of fragility functions through variations in the modelling approach) can be transformed into a three-point distribution that is used for the logic tree. The advantage of including the model uncertainty as an epistemic uncertainty, rather than an aleatory variability, is that the correlation of this uncertainty across building classes that are based on similar modelling approaches can be more readily accounted for in the risk model.
Fig. 7.6

Illustrative example of the representation of the model uncertainty in fragility functions as a three-point distribution used for the logic tree
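One possible way to collapse the many model variations into three logic-tree branches is the extended Pearson-Tukey three-point approximation (5th/50th/95th percentiles with weights 0.185/0.630/0.185); other discretisations are equally defensible, and the fragility medians below are hypothetical:

```python
def three_point(samples, p_lo=0.05, p_hi=0.95):
    """Collapse a set of fragility medians (one per modelling
    variation) into three logic-tree branches using the extended
    Pearson-Tukey approximation: 5th/50th/95th percentiles with
    weights 0.185/0.630/0.185."""
    s = sorted(samples)

    def pct(p):
        # linearly interpolated percentile of the sorted sample
        idx = p * (len(s) - 1)
        lo = int(idx)
        frac = idx - lo
        if lo + 1 >= len(s):
            return s[lo]
        return s[lo] * (1.0 - frac) + s[lo + 1] * frac

    branches = [pct(p_lo), pct(0.5), pct(p_hi)]
    weights = [0.185, 0.630, 0.185]
    return list(zip(branches, weights))

# hypothetical fragility medians (g) from variations of the model
medians = [0.31, 0.35, 0.29, 0.40, 0.33, 0.37, 0.36, 0.30, 0.34, 0.38]
tree = three_point(medians)
```

Each branch then carries its own fragility function through the risk calculation, and because the branches are discrete the correlation of the model uncertainty across similarly modelled building classes can be imposed by sampling the same branch for all of them.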

In order to improve the transparency and reproducibility of fragility models and to render more explicit the uncertainties that have been modelled, it is recommended that, in addition to providing the parameters of the models through vulnerability databases such as the one described above, the underlying data (e.g. SDOF model parameters, selected records, damage thresholds) and the software used to develop the models should also be made openly available. The Global Earthquake Model (GEM) is currently developing open source Python scripts and tools (the ‘vulnerability modeller’s toolkit’) that follow the vulnerability methodology used by GEM in their Global Seismic Risk Model (Silva et al. 2020; Martins and Silva 2020). These tools will allow users to produce fragility models that are based on a common methodology and can be readily compared, and advanced users will be able to make modifications to the scripts that can be openly shared.

Another effort being undertaken to improve the reliability of future vulnerability modelling is the formalisation of a testing framework for risk models (Crowley et al. 2020b, c). Simple sanity checks, so-called ‘unit tests’, can be included in software for developing fragility functions (such as that described above) to ensure the median and dispersion values are within sensible ranges, and to compare with existing functions from the literature. However, it should be borne in mind when undertaking such comparisons that many of the models in the academic literature have not been calibrated or tested using past earthquake damage and loss data. Hence, although comparison with existing models is an important test, it is even more important to ensure that the proposed models are tested against empirical data. Useful, and openly available, data for this purpose includes the empirical vulnerability models developed by PAGER (Jaiswal et al. 2009; Jaiswal and Wald 2013), as well as fatality, economic loss and damage data from various databases including the Centre for Research on the Epidemiology of Disasters (CRED)’s EMDAT database (EMDAT 2019), the Italian Department of Civil Protection’s Da.D.O. database (Dolce et al. 2019), NOAA’s Significant Earthquake Database (NGDC/WDS), and the Cambridge Earthquake Impact Database. Despite the current availability of damage and loss data for the verification of seismic risk models, continued efforts to standardise and harmonise the collection of open and publicly available consequence data are still needed. Efforts to combine these data sources with the USGS ShakeMaps for all earthquakes in Europe above magnitude 4 since 1960 are currently being undertaken by the author to produce an open standardised data source that can be used for the testing of European risk models.
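The ‘unit tests’ mentioned above can be as simple as range and ordering checks on the fragility parameters before any comparison with the literature. A sketch of such checks; the acceptable ranges are illustrative placeholders, not agreed values:

```python
def check_fragility(theta, beta, theta_range=(0.05, 3.0),
                    beta_range=(0.2, 1.2)):
    """Flag fragility parameters outside sensible ranges. The bounds
    are illustrative placeholders; a real testing framework would set
    them per intensity measure and building class."""
    msgs = []
    if not theta_range[0] <= theta <= theta_range[1]:
        msgs.append(f"median {theta} g outside {theta_range}")
    if not beta_range[0] <= beta <= beta_range[1]:
        msgs.append(f"dispersion {beta} outside {beta_range}")
    return msgs

def check_ordering(medians):
    """Successive damage-state medians should strictly increase."""
    return [f"DS{i + 1} median {a} >= DS{i + 2} median {b}"
            for i, (a, b) in enumerate(zip(medians, medians[1:]))
            if a >= b]

print(check_fragility(0.45, 0.6))        # passes: empty list
print(check_ordering([0.1, 0.3, 0.25]))  # flags the non-increasing pair
```

Checks of this kind catch gross errors cheaply; the harder and more valuable tests against empirical damage and loss data then follow, as described above.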

Figure 7.7 shows two examples of tests that have been undertaken with the European vulnerability model (Crowley et al. 2020b). In the first example, a mean vulnerability function, calculated through an exposure-weighted combination of all the building classes in the country, has been produced and compared with the empirical models developed by PAGER, following conversion of the spectral ordinates to macroseismic intensity (with the associated uncertainty in the conversion shown by the mean and ±1 standard deviation vulnerability curves in Fig. 7.7). In the second example, the INGV ShakeMap for the 2009 L’Aquila earthquake has been used together with the exposure and vulnerability models to estimate the damage distribution, and this has been compared with the damage reported in the Da.D.O. database (using the method outlined in Silva and Horspool 2019).
Fig. 7.7

Example tests of the European vulnerability model: comparison with the PAGER vulnerability model (left); comparison of estimated and observed damage for the L’Aquila earthquake using damage data from the Da.D.O. database (right)
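The exposure-weighted combination used in the first test can be sketched as follows: each building class contributes to the national mean vulnerability curve in proportion to its share of the total replacement cost. The class names, curves and exposure values below are invented for illustration:

```python
def exposure_weighted_vulnerability(curves, exposure_values):
    """Exposure-weighted mean vulnerability curve across building
    classes: 'curves' maps class -> loss ratios on a common IM grid,
    'exposure_values' maps class -> total replacement cost."""
    total = sum(exposure_values.values())
    n = len(next(iter(curves.values())))
    mean = [0.0] * n
    for cls, loss_ratios in curves.items():
        w = exposure_values[cls] / total   # share of total exposed value
        for i in range(n):
            mean[i] += w * loss_ratios[i]
    return mean

# hypothetical classes, loss-ratio curves and exposed values (EUR)
curves = {"RC_CDL": [0.0, 0.1, 0.4], "MUR": [0.0, 0.3, 0.8]}
values = {"RC_CDL": 3.0e9, "MUR": 1.0e9}
mean_curve = exposure_weighted_vulnerability(curves, values)
```

The resulting single curve is what can then be compared, after conversion of the intensity measure, with an empirical country-level model such as PAGER’s.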

7.4 Concluding Remarks

This chapter has presented the latest status of the exposure and vulnerability components of the European Seismic Risk Model (ESRM20), which is under development and will be released in autumn 2020 by the risk services of the European Facilities for Earthquake Hazard and Risk (EFEHR) Consortium.5 These models follow the state of the practice in large-scale, regional exposure and vulnerability modelling. Some of the challenges in current practice, such as limited access to public data, manual updating, difficulties in reproducing current models, and lack of testing, have been discussed herein, and the future directions being taken to address these issues have been outlined. On the whole, it is believed that a move towards releasing all the underlying data sources of the components of risk models in an open and transparent manner, together with the software used to develop them, will ensure the continued improvement of European risk modelling.



  1. Atkinson GM, Bommer JJ, Abrahamson NA (2014) Alternative approaches to modeling epistemic uncertainty in ground motions in probabilistic seismic-hazard analysis. Seismol Res Lett 85(6):1141–1144
  2. Bal IE, Crowley H, Pinho R (2008) Detailed assessment of structural characteristics of Turkish RC building stock for loss assessment models. Soil Dyn Earthq Eng 28:914–932
  3. Bal IE, Bommer JJ, Stafford PJ, Crowley H, Pinho R (2010) The influence of geographical resolution of urban exposure data in an earthquake loss model for Istanbul. Earthq Spectra 26(3):619–634
  4. Bradley B (2013) A critical examination of seismic response uncertainty analysis in earthquake engineering. Earthq Eng Struct Dyn 42(11):1717–1729
  5. CEN (2004) Eurocode 8: design of structures for earthquake resistance. European Standard, European Committee for Standardisation, Brussels
  6. Crowley H, Polidoro B, Pinho R, Van EJ (2017) Framework for developing fragility and consequence models for local personal risk. Earthq Spectra 33(4):1325–1345
  7. Crowley H, Rodrigues D, Silva V, Despotaki V, Marins L, Romão X, Castro JM, Pereira N, Pomonis A, Lemoine A, Roullé A, Tourlière B, Weatherill G, Pitilakis K, Danciu L, Correira AA, Akkar S, Hancilar U, Covi P (2019) The European seismic risk model 2020 (ESRM20). In: Proceedings of the 2nd international conference on natural hazards and infrastructure, ICONHIC 2019, Chania, Crete
  8. Crowley H, Despotaki V, Rodrigues D, Silva V, Toma-Danila D, Riga E, Karatzetzou A, Fotopoulou S, Zugic Z, Sousa L, Ozcebe S, Gamba P (2020a) Exposure model for European seismic risk assessment. Earthq Spectra
  9. Crowley H, Dabbeek J, De Maio FV, Despotaki V, Rodrigues D, Faravelli M, Borzi B, Silva V, Martins L, Kalakonas P, Weatherill G, Riga E, Karatzetzou A, Pitilakis K, Anastasiadis A, Pitilakis D, Fotopoulou S, Michelini A, Faenza L (2020b) D26.8 Testing and verification of the European Seismic Risk Model (ESRM20). SERA Project Deliverable
  10. Crowley H, Despotaki V, Silva V, Dabbeek J, Romão X, Daniell J, Veliu E, Bilgin H, Adam C, Deyanova M, Ademović N, Atalic J, Riga E, Karatzetzou A, Bessason B, Sendova V, Toma-Danila D, Zugic Z, Akkar S, Hancilar U (2021) Model of seismic design lateral force levels for the existing European building stock. Bull Earthq Eng
  11. Crowley H, Silva V, Kalakonas P, Martins L, Weatherill G, Pitilakis K, Riga E, Borzi B, Faravelli M (2020c) Verification of the European seismic risk model (ESRM20). In: Proceedings of the 17th world conference on earthquake engineering, Sendai, Japan
  12. Dabbeek J, Silva V (2020) Modeling the residential building stock in the Middle East for multi-hazard risk assessment. Nat Hazards 100:781–810
  13. Di Pasquale G, Goretti A (2001) Vulnerabilità funzionale ed economica degli edifici residenziali colpiti dai recenti eventi sismici italiani [Functional and economic vulnerability of residential buildings affected by recent Italian seismic events]. In: Proceedings of the 10th national conference “L’ingegneria Sismica in Italia”, Potenza-Matera, Italy
  14. Dolce M, Speranza E, Giordano F, Borzi B, Bocchi F, Conte C, Di Meo A, Faravelli M, Pascale V (2019) Observed damage database of past Italian earthquakes: the Da.D.O. WebGIS. Bollettino di Geofisica Teorica ed Applicata 60(2):141–164
  15. Douglas J (2018) Capturing geographically-varying uncertainty in earthquake ground motion models or what we think we know may change. In: Pitilakis K (ed) Recent advances in earthquake engineering in Europe, Chapter 6, pp 153–181
  16. EMDAT (2019) International disasters database of the Centre for Research on the Epidemiology of Disasters
  17. European Commission (2016) Guidelines on FAIR data management in Horizon 2020, July 2016
  18. FEMA (2004) HAZUS-MH technical manual. Federal Emergency Management Agency, Washington, DC
  19. FEMA (2018) Seismic performance assessment of buildings. Volume 1 – Methodology. FEMA P-58-1, Second Edition, Federal Emergency Management Agency, Washington, DC
  20. Ioannou I, Douglas J, Rossetto T (2014) Assessing the impact of ground-motion variability and uncertainty on empirical fragility curves. Soil Dyn Earthq Eng
  21. Jaiswal K, Wald D (2013) Estimating economic losses from earthquakes using an empirical approach. Earthq Spectra 29(1):309–324
  22. Jaiswal K, Wald D, Hearne M (2009) Estimating casualties for large worldwide earthquakes using an empirical approach. US Geological Survey Open-File Report 1136
  23. Kappos A, Panagopoulos G, Panagiotopoulos C, Penelis G (2006) A hybrid method for the vulnerability assessment of R/C and URM buildings. Bull Earthq Eng 4(4):391–413
  24. Martins L, Silva V (2020) Development of a fragility and vulnerability model for global seismic risk analyses. Bull Earthq Eng
  25. National Geophysical Data Center/World Data Service (NGDC/WDS) Significant earthquake database. National Geophysical Data Center, NOAA. Accessed 22 Apr 2020
  26. Pagani M, Monelli D, Weatherill G, Danciu L, Crowley H, Silva V, Henshaw P, Butler L, Nastasi M, Panzeri L, Simionato M, Vigano D (2014) OpenQuake Engine: an open hazard (and risk) software for the global earthquake model. Seismol Res Lett 85(3):692–702
  27. Popper K (2002) The logic of scientific discovery. Routledge Classics, p 545
  28. Romão X, Pereira N, Castro JM, De Maio F, Crowley H, Silva V, Martins L (2020) European Building Vulnerability Data Repository (Version v1.1) [Data set]. Zenodo
  29. Silva V, Crowley H, Pagani M, Monelli D, Pinho R (2014) Development of the OpenQuake engine, the global earthquake model’s open-source software for seismic risk assessment. Nat Hazards
  30. Silva V, Akkar S, Baker J, Bazzurro P, Castro JM, Crowley H, Dolsek M, Galasso C, Lagomarsino S, Monteiro R, Perrone D, Pitilakis K, Vamvatsikos D (2019) Current challenges and future trends in analytical fragility and vulnerability modelling. Earthq Spectra 35(4):1927–1952
  31. Silva V, Amo-Oduro D, Calderon A, Costa C, Dabbeek J, Despotaki V, Martins L, Pagani M, Rao A, Simionato M, Viganò D, Yepes-Strada C, Acevedo A, Crowley H, Horspool N, Jaiswal K, Journeay M, Pittore M (2020) Development of a global seismic risk model. Earthq Spectra
  32. Silva V, Horspool N (2019) Combining USGS ShakeMaps and the OpenQuake-engine for damage and loss assessment. Earthq Eng Struct Dyn
  33. Sousa L, Silva V, Bazzurro P (2017) Using open-access data in the development of exposure data sets of industrial buildings for earthquake risk modeling. Earthq Spectra 33(1):63–84
  34. Terzic V, Schoettler MJ, Restrepo JI, Mahin SA (2015) Concrete column blind prediction contest 2010: outcomes and observations. PEER Report 2015/01, Pacific Earthquake Engineering Research Center, Berkeley, CA
  35. Weatherill G, Kotha SR, Cotton F (2020) Re-thinking site amplification in regional seismic risk assessment. Earthq Spectra
  36. Woessner J, Danciu L, Giardini D, Crowley H, Cotton F, Grunthal G, Valensise G, Arvidsson R, Basili R, Demircioglu M, Hiemer S, Meletti C, Musson R, Rovida A, Sesetyan K, Stucchi M (2015) The 2013 European seismic hazard model—key components and results. Bull Earthq Eng 13(12):3553–3596

Copyright information

© The Author(s) 2021

Open Access This chapter is licensed under the terms of the Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license and indicate if changes were made.

The images or other third party material in this chapter are included in the chapter's Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the chapter's Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder.

Authors and Affiliations

  1. European Centre for Training and Research in Earthquake Engineering (EUCENTRE), Pavia, Italy
