We are pleased to present this special volume composed of papers presented at the IUTAM Symposium on Integrated Computational Structure-Material Modeling of Deformation and Failure under Extreme Conditions. The meeting was held June 20–22, 2016, at the Royal Sonesta Harbor Court Hotel, Inner Harbor, Baltimore, MD. We would also like to call your attention to a complementary special issue of Computational Mechanics, which presents the second series of papers arising from this meeting.

This symposium addressed state-of-the-art topics and emerging issues in the area of Integrated Computational Structure-Material Modeling of Deformation and Failure under Extreme Conditions. It brought together experts in the complementary fields of Computational and Experimental Mechanics and Materials Science to discuss multidisciplinary approaches for integrating modeling, simulation, and experiments to predict non-homogeneous deformation and failure in heterogeneous materials, including metals, ceramics, and composites. The focus spanned different material classes and covered the range of spatial and temporal scales needed for physics-based modeling of deformation and failure. Effective methods of coupling multiple scales in regions of homogeneous and localized deformation leading to intense damage and failure were discussed. The use of probabilistic mechanics, which incorporates imaging data into modeling capabilities through uncertainty characterization of material structure, identification of uncertainty in material properties, and mapping of material-structure uncertainty to structural performance, was discussed as an essential ingredient of a robust modeling process. This conversation took place with the goal of developing a ten-year vision and plan for advancing the field and addressing the associated technological, economic, and social challenges. The symposium was organized into four thematic parts. The technical focus was on the understanding and prediction of extreme and rare events in material response, e.g., fatigue, failure, impact, and blast. It addressed research needs in the technically complementary areas of (1) physics-based multi-scale model development; (2) multi-scale data acquisition, characterization, and experiments at different scales; (3) probabilistic modeling and uncertainty quantification; and (4) structure-material integration and design.

The fourteen papers compiled in this volume of the International Journal of Fracture speak directly to the breadth of problems defined in the stated goals of the meeting. One of the most important themes addressed within these contributions is the acknowledgement of the statistical nature of the physics of deformation leading to damage in materials. This special issue represents work on fiber-based polymer composites, polycrystalline metals, ceramics, and concrete. Each material type is treated as a statistical composite that, by its very nature, produces heterogeneous deformation fields. This heterogeneity, in turn, determines both the point in the deformation history and the spatial position at which discrete damage events occur within the material. The contributions also represent the kind of strong collaboration among experiments, theory, and simulation needed to address this challenging and opportunity-rich area of material damage mechanics. The work clearly suggests that the traditional approaches of classical continuum mechanics must be advanced toward statistical and probabilistic theories that account for spatial variability of response. Classical internal state variable theories for engineering application are based upon spatially averaged quantities and therefore cannot adequately capture the nature of material damage. This work also illustrates the advancements needed in computational frameworks, both to enable statistically accurate theories and to represent new surface creation in a manner that observes conservation principles and does not compromise the integrity of computed history-dependent fields. Furthermore, this collection illustrates an exciting future for technological advancement and demonstrates the demand for it among government and industrial partners.

This Symposium also set aside significant time for discussion aimed at identifying trends, opportunities, and future needs along the lines of the technical themes. A summary of this discussion follows:

  • Two future trends are emerging within the experimental sciences community: (1) instrumentation capable of both high spatial resolution and rapid temporal resolution, and (2) the capability to obtain three-dimensional spatial information.

  • Many regimes of behavior discussed in this meeting are not currently accessible experimentally. It is critical to supplement experiments in these regimes through other means, such as advanced physical theory and first-principles techniques.

  • Connections between researchers working at different length scales are critically important to progress.

  • Many of the easy problems in materials behavior are now understood. The hard problems remain, and those are the problems of interest to this community.

  • Data science is an area of opportunity for this field since we must access and combine many different types of information to give a physical picture of damage and failure events.

  • The models we currently use are, in many cases, still phenomenological to varying degrees, and uncertainty exists both in that phenomenology and in the degree of physical meaning carried by the parameter sets. Statistical quantification of this uncertainty is generally not performed; sensitivity analyses are a related tool. Strengthening the physical basis of a model can lower its uncertainty, and systematic quantification of parameter uncertainty can also be used. Reducing the number of model parameters is a related goal. The right approach for real engineering problems generally depends on the problem and circumstance.

  • Sharing experimental and simulation data should happen more frequently than is currently practiced. Artificial intelligence techniques will likely be used more extensively in the future to learn from large data sets or to combine different data sets in a systematic way, and the next ten years should see rapid growth in this arena. Having a way to attribute and credit datasets properly is important (e.g., for tenure or performance decisions). Should we also be recording the experimental samples in some way?

  • The community should do a better job in how we develop codes and code bases, how we distribute codes, and how maintenance is done, and we should perhaps borrow more from other communities in how codes are written (to some extent this is already happening through some of the national initiatives). For example, the animation and graphics design communities use computing extensively.

  • Code robustness for general applications is a significant and far-reaching consideration; it is generally not reported in published articles, yet it has critical practical implications.

  • Porting or writing codes for new hardware architectures is a serious issue that requires a great deal of resources.

  • Training the next generation of students is an important responsibility of this community—regardless of whether we are in industry, government, or university.

  • Are we as a community thinking about what problems we really want to solve or are we just off developing our own techniques and methods? We could probably do a better job of collaborating to solve these remaining problems, which are the most challenging.

  • In industry, where survivability relies on making a successful structural component that is acceptable to a customer at an agreeable price, product specifications, quality standards, and design criteria are essential. These are always quantifiable and are necessary to the design engineer. How do we talk about these engineering quantities in the context of the mechanisms of behavior and challenging statistical responses?

  • From an industrial perspective within the aerospace industry, it is not at all clear that we have the right models yet, so we should be working together. What about mechanistic uncertainty in the context of model application?

  • We discuss lifecycle response of structural materials, but we should also keep in mind that the aging of materials is an important issue under extreme environmental conditions.

  • From an industry perspective, an envisioned future workforce that is larger, more diverse, and therefore more capable will need many of these advanced experimental and computational tools. Industry can then either use these advanced tools directly or understand them well enough to frame the right questions, so that the tools can actually be used in the design process.

  • Writing code for parallel applications should be a larger element of the education process for students.

  • In retrospect, the community has made significant progress over the past ten years and we should expect that to continue.

  • In industry today, we have to be faster, and we have to be as effective and economical as we can—but using the types of tools discussed in this meeting to guide us has been very helpful.

  • Bringing together new technology, as discussed here, will determine the rate of change within industry, government, and academia.

It is our hope that the richness of the interactions and the challenges of the topical areas discussed are well represented by the papers presented here and in the parallel special issue of Computational Mechanics. We would also like to thank all the authors who contributed their manuscripts to these two issues. Finally, on behalf of the participants, we express our appreciation to the Editor-in-Chief, Krishnaswamy Ravi-Chandar, for his willingness to host this collection of works.