Three-dimensional Analysis and Reconstruction of Additively Manufactured Materials in the Cloud-Based BisQue Infrastructure
A microstructure analytics and 3D reconstruction software package, DREAM.3D, was integrated as a module into a cloud-based platform, BisQue. Parallelization of DREAM.3D module executions and the ability to parameterize pipeline variables over a range of values have led to insights about the grain segmentation misorientation tolerance in TriBeam-collected 3D EBSD datasets of additively manufactured materials with complex anisotropic microstructures. Furthermore, a comparison in grain size measurements was made between standard 2D metallographic slices and 3D measures using BisQue’s parallelized DREAM.3D module executions. The direction of cloud-based data infrastructure and the prospects for impact in material science are also discussed.
Keywords: Tomography, 3D materials science, Microstructure analytics, Cloud infrastructure, Additive manufacturing, 3D metal printing
Recent external pressures from government grants mandating that data be publicly accessible, along with internal pressures within academic communities for data sharing and reproducibility, have led to the emergence of data-driven platforms within the scientific community [1, 2, 3, 4]. Requirements for these platforms include providing data provenance, data sharing, and hosting services for data that may otherwise disappear when a lab closes or hardware fails. Recently, data repositories have shifted from simply siloing data to including data analytics and predictive capabilities. Furthermore, some of these repositories are directly linked with high-performance computing in order to populate an ever-growing database, as with thermodynamic databases such as the Materials Project and the Open Quantum Materials Database (OQMD). More sophisticated predictive/interactive databases, such as the aforementioned thermodynamic ones, have been slower to develop within the 3D Materials Science community. This is likely due to the less standardized and more diverse data modalities, the complexity and diversity of microstructures, and the relative infancy of the community itself.
In almost all regards, using 3D data to answer scientific questions in Materials Science is challenging due to the demands of dataset acquisition, reconstruction, and analysis. Reconstruction of 3D datasets is a complex process with many degrees of freedom. The algorithms employed, and the tolerances selected by the user to align slices, segment features, and remove data artifacts (i.e., remove noise), ultimately determine the quantitative data extracted from the volume. Significant progress on the reconstruction and analysis of electron backscatter diffraction (EBSD) data has been made by the software package DREAM.3D, which creates a unified data structure for 3D data treatment and aggregates tools and algorithms collectively developed by the community. In DREAM.3D, this process takes the form of a pipeline, which typically consists of well over a dozen steps, each with its own set of input variables and outputs. DREAM.3D typically runs as a serial process on a workstation, with predefined input parameters. Due to the long computation times required to process 3D data, which grow with dataset size and complexity, it is impractical to explore the full parameter space by manually running reconstructions. Typically, only a few variables are adjusted by a user until a qualitatively acceptable reconstruction is attained. This process inherently imparts bias and typically provides little opportunity to justify specific reconstruction variables.
The bias generated during reconstruction of 3D datasets is subsequently carried through to data analysis, which, for microstructural data, is still an ill-defined process. Standardization of analysis involving new characterization techniques is generally a slow process. Automated orientation imaging analysis via EBSD first emerged in the early 1990s, but standards did not emerge for nearly two decades [9, 10]. These new standards covered basic quantitative analysis, including methods for grain size measurement, as well as guidelines governing the data quality, quantity, and resolution sufficient to make such measurements. Even in 2D EBSD analysis, such standards are rare and do not necessarily address additional subtleties arising from more complex, albeit commonly studied, microstructures, including twinned materials, nanocrystalline materials, and deformed materials [12, 13, 14, 15, 16, 17]. Standardization of characterization methods becomes critical as the complexity of the analyses increases. For example, high-resolution EBSD (HR-EBSD) and Heaviside Digital Image Correlation (HDIC) are extensions of conventional EBSD and digital image correlation (DIC) that require specialized post-processing to measure elastic and plastic strain at a resolution of 10−4 or smaller [19, 20] and 1.5 × 10−3, respectively. The variety of analytic approaches available to make accurate strain measurements, as well as the sensitivity of these measurements to instrumentation and sample preparation, motivates more rigorous standardization of the technique via sample reference standards [22, 23].
In 3D characterization techniques, such as X-ray-based diffraction contrast tomography (DCT), the problems of data reconstruction and analysis are also convoluted. Coherent volumes must be reconstructed from individual 2D images via back projection techniques. The diffraction spots from the 2D projections are generally assumed to originate from a single grain, limiting the characterization of materials with large internal strain gradients. Moreover, stereological conventions established in 2D do not necessarily capture real 3D microstructures, particularly at the extrema of property distributions [26, 27]. Given the wealth of additional data provided by 3D characterization, new tools aimed at exploring the parameter space offered by 3D reconstruction are required. In the following manuscript, we describe the use of a cloud-based platform, BisQue [28, 29], enhanced with the DREAM.3D microstructural analysis software package for the 3D reconstruction and analysis of additively manufactured microstructures characterized using TriBeam tomography.
The following sections describe the software infrastructure and materials investigated.
BisQue for Cloud-Based Data Storage and Analysis
BisQue DREAM.3D Integration
Materials and Data Collection
The 3D datasets were collected via TriBeam tomography from two additively manufactured alloys in their as-built state, meaning no additional heat treatments were performed on the samples after additive processing. TriBeam tomography utilizes a femtosecond laser for micromachining within a dual-beam scanning electron microscope (SEM) equipped with a focused ion beam (FIB). The ultrashort pulse length minimizes damage to the machined surface while enabling high material removal rates, roughly four to five orders of magnitude faster than FIB milling alone, resulting in an accessible volume on the order of a cubic millimeter [30, 34]. The FIB can also be used at a glancing angle to minimize surface roughness due to laser-induced periodic surface structures (LIPSS); however, this cleaning step is material dependent and is not always required to obtain high-quality 3D EBSD data.
LENS-Processed 304L Stainless Steel
EBM Inconel 718
Electron beam melting (EBM) is a powder-bed additive technique which utilizes an electron beam as a heat source. A fine layer of powder, on the order of 50–100 μm, is spread over a build plate, and the electron beam selectively melts the powder. The build plate is then lowered and the process is repeated to create complex geometries in 3D. Unlike laser melting techniques, electron beam melting requires a vacuum processing environment, and the electron beam heat source can also be used to achieve preheat temperatures in excess of 1000 ∘C. A sample of EBM Inconel 718 was fabricated with 50-μm-thick build layers and was characterized in the TriBeam system with 1.5-μm-thick slices as described previously, and EBSD data was collected to characterize grain texture and morphology. Roughly 25 GB of data was collected from a total of 204 slices over the course of three days, resulting in a volume of 952.5 × 501.0 × 306.0 μm. Raw EBSD patterns were not collected, but the Hough transforms of the patterns were saved for post-processing. Glancing-angle FIB was not required to resolve EBSD patterns, which were collected with an in-plane resolution of 1.5 μm to create square voxels. As with the 304L sample, a minimum CI of 0.1 was chosen for the initial mask. Grains were segmented using a misorientation tolerance of 2∘ with the same segmentation algorithm as explained in “Misorientation Tolerance Parameterization for Feature Identification.” A minimum size requirement of 125 voxels was applied to delete grains that were too small to be properly characterized. This corresponds to 5 × 5 × 5 points across a feature and an equivalent diameter of 9.3 μm, roughly five times smaller than the average grain size determined from 2D analysis performed prior to 3D characterization. Gaps remaining from these removed features were filled via iterative dilation of the surrounding grains. Indexed EBSD data was uploaded to BisQue for analysis after data collection.
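The 9.3-μm figure follows directly from the voxel count and resolution; a quick check using the standard sphere-equivalent-diameter formula (not code from the actual pipeline):

```python
import math

voxel_size = 1.5   # um; cubic voxels, as described above
min_voxels = 125   # 5 x 5 x 5 minimum feature size

# Volume of the smallest retained feature, then the diameter of a
# sphere of equal volume: d = 2 * (3V / (4*pi))^(1/3)
volume = min_voxels * voxel_size**3                           # 421.875 um^3
d_eq = 2.0 * (3.0 * volume / (4.0 * math.pi)) ** (1.0 / 3.0)
print(f"{d_eq:.1f} um")  # -> 9.3 um
```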
Misorientation Tolerance Parameterization for Feature Identification
In order to analyze 3D tomography data, individual features, or grains, must be segmented from the reconstructed volume. When working with 3D EBSD data, orientation information exists at every voxel in the volume. Straightforward segmentation algorithms such as connected component analysis (CCA) are used to identify grains from the 3D data using a misorientation tolerance between adjacent voxels. The misorientation tolerance is one of the most critical parameters for the segmentation and analysis of EBSD data. A mask is defined that distinguishes good data from noise in the dataset by thresholding on the electron backscatter pattern confidence index (CI). To perform grain segmentation, a seed is selected at voxel 0 within the mask volume and its orientation is compared to that of each of its nearest neighbors that are also within the mask (up to six neighbors in a rectilinear voxelized grid). If the misorientation between two adjacent voxels is below the user-defined misorientation tolerance, the neighboring voxel is added to the current grain. This process is repeated until no additional neighboring voxels can be added to the grain. A new seed is then selected at the smallest voxel index not previously grouped, and the process continues until all voxels in the mask have been assigned a grain ID.
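The segmentation loop above can be sketched as a flood fill. This is a simplified illustration, not DREAM.3D's implementation: a scalar field stands in for the full crystallographic orientation, and an absolute difference stands in for the true misorientation, which would require quaternion math with crystal symmetry.

```python
from collections import deque

import numpy as np

def segment_grains(orientation, mask, tolerance):
    """Misorientation-based connected-component segmentation (sketch).
    `orientation` is a 3D scalar field standing in for crystallographic
    orientation; `mask` marks "good" voxels; `tolerance` is the
    user-defined misorientation tolerance."""
    grain_ids = np.full(orientation.shape, -1, dtype=int)
    next_id = 0
    # Face-connected neighbors on a rectilinear voxel grid (up to six).
    offsets = [(1, 0, 0), (-1, 0, 0), (0, 1, 0),
               (0, -1, 0), (0, 0, 1), (0, 0, -1)]
    # Seeds are taken in voxel-index order, as described in the text.
    for seed in zip(*np.nonzero(mask)):
        if grain_ids[seed] != -1:
            continue                      # already grouped into a grain
        grain_ids[seed] = next_id
        queue = deque([seed])
        while queue:
            x, y, z = queue.popleft()
            for dx, dy, dz in offsets:
                n = (x + dx, y + dy, z + dz)
                if any(c < 0 or c >= s for c, s in zip(n, orientation.shape)):
                    continue              # outside the volume
                if not mask[n] or grain_ids[n] != -1:
                    continue              # noise, or already assigned
                # Accept the neighbor if its "misorientation" relative to
                # the current voxel is below the tolerance.
                if abs(orientation[n] - orientation[x, y, z]) < tolerance:
                    grain_ids[n] = next_id
                    queue.append(n)
        next_id += 1
    return grain_ids
```

Because each voxel is compared to its immediate neighbor rather than to the seed, gradual orientation gradients can chain into a single grain, which is exactly why the tolerance choice matters so much for additive microstructures.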
It is essential that an optimal misorientation tolerance value is used during dataset reconstruction due to the impact of the misorientation tolerance on the definition of grain structure. Two-dimensional analysis of EBSD data typically uses a misorientation tolerance of 5∘, which is generally sufficient for well-annealed or recrystallized microstructures of various morphologies. However, in deformed or highly textured materials, as are commonly observed in as-built additive structures [39, 40], such large misorientation tolerances can inadvertently merge grains that should otherwise be segmented separately. The additional voxel connectivity provided by 3D data further exacerbates this problem, and as there is generally no accepted misorientation tolerance that applies to all 3D microstructures, these values must be chosen intuitively and the resulting segmentation analyzed for efficacy. This is a time-consuming, laborious process, and the ultimate selection of a user-defined parameter is not necessarily verifiable as the optimal choice. The use of parameterization, enabled by the BisQue framework, allows for the generation of many 3D volumes to better assess these and other important reconstruction parameters.
Keeping all other pipeline parameters constant (minimum of two neighbors, minimum allowed defect size of 125 voxels), the misorientation tolerance was tested over a wide range of values, from 0.1 to 10∘ of misorientation in 0.1∘ increments, generating a total of one hundred 3D reconstructions of the steel sample described in “LENS-Processed 304L Stainless Steel.” This represents some seventy gigabytes of generated data. To assess the quality of the reconstruction, two high-level metrics were used: the fraction of “good” data and the total number of grains found during the reconstruction. “Good” data is defined by the confidence index of the EBSD data, and therefore comprises a constant number of voxels that does not change between reconstructions. During the reconstruction process, various cleanup steps are employed which retain varying amounts of this initial group of voxels. DREAM.3D filters such as “Minimum Size,” “Minimum Number of Neighbors,” and “Fill Bad Data” can affect whether individual voxels are retained or released from segmented features. At the end of a reconstruction, post-processing scripts are used to count the number of voxels belonging to segmented features, which will always be a subset of the initial group of “good” voxels.
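Because each reconstruction is independent, the sweep parallelizes trivially. The sketch below generates the 100 tolerance values and fans the pipeline executions out over a process pool; `run_reconstruction` and the JSON key are illustrative stand-ins, not the actual DREAM.3D pipeline schema or BisQue module API.

```python
import json
from concurrent.futures import ProcessPoolExecutor

def tolerance_values(start=0.1, stop=10.0, step=0.1):
    """The misorientation tolerances swept in this study:
    0.1 to 10.0 degrees in 0.1-degree increments (100 values)."""
    n = int(round((stop - start) / step)) + 1
    return [round(start + i * step, 1) for i in range(n)]

def run_reconstruction(tolerance):
    """Stand-in for one DREAM.3D pipeline execution. In BisQue, each call
    would patch `tolerance` into the segmentation filter of a pipeline
    description and launch it as an independent module execution."""
    pipeline = {"MisorientationTolerance": tolerance}  # illustrative key
    return json.dumps(pipeline)

if __name__ == "__main__":
    tolerances = tolerance_values()
    # Independent reconstructions map cleanly onto a worker pool.
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(run_reconstruction, tolerances))
    print(len(results))  # 100 reconstructions launched
```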
As can be seen in Fig. 5a, these two metrics tend to show opposite trends with increasing misorientation tolerance. As larger misorientation tolerances are used to segment the data, not only are more grains merged together, decreasing the overall number of grains found, but noisier data on the sample surfaces can also be connected to existing grains during segmentation. This trend holds except at the lowest misorientation tolerances, below 0.9∘. At these lower misorientation tolerances, the minimum grain size requirement of 64 voxels eliminates many of the grains found during the segmentation step, and for tolerances below 0.4∘, no continuous grains larger than 64 voxels are found, so grain-level metrics do not exist in these reconstructions.
Subvolume Creation and Analysis of Large 3D Volumes
Beyond investigation of individual pipeline parameters, the BisQue framework also enables coupled parameterization, wherein sets of variables are linked to each other. This allows for automated generation of random samples of equal-sized subvolumes of 3D datasets, among other applications. The characterization of subvolumes in 3D datasets has been previously applied to define microstructural and property volume elements (MVEs and PVEs) for various material properties, to inform component design [41, 42].
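The coupled-parameterization idea can be sketched minimally: the three origin coordinates of each sample are drawn together so that every subvolume has identical extent. Function and parameter names below are ours for illustration, not BisQue's API.

```python
import numpy as np

def sample_subvolumes(volume, size, count, seed=None):
    """Draw `count` randomly placed, equal-sized cubic subvolumes from a
    3D dataset. The three origin coordinates are coupled parameters,
    drawn together per sample so all subvolumes share the same extent."""
    rng = np.random.default_rng(seed)
    samples = []
    for _ in range(count):
        # Valid origins keep the subvolume fully inside the dataset.
        x, y, z = (rng.integers(0, s - size + 1) for s in volume.shape)
        samples.append(volume[x:x + size, y:y + size, z:z + size])
    return samples
```

Applied to a segmented grain-ID volume, each subvolume can then be run through the same analysis pipeline to estimate, for example, how grain-size statistics converge with sampled volume.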
Morphological Complexity of As-printed Additive Structures
Of the 467 slices generated, only 32 (6.85%) retained a single feature, and 76.7% of slices were found to contain five or more subfeatures, despite all of the data having been obtained from a single 3D grain. Because connectivity information is missing in 2D, the gradual changes in orientation that exist in the material are interpreted as higher-angle grain boundaries (Fig. 7), resulting in their separation during segmentation via CCA. Similarly, attempts to characterize the effective grain size of additive materials are inconsistent when working with 2D data. For example, the equivalent diameter of the large grain was found to be 320.0 μm as determined from the 3D reconstruction. Taking the largest subfeature found in a particular slice, the equivalent diameter of the grain can also be calculated from the 2D cross-sections. The 2D cross-sections under-predict the grain size as measured from the full volume of the dataset, with a median equivalent diameter of 189 μm. There is also a clear dependence on the location and orientation of the slice in the calculation of this value, with X- and Y-normal slices yielding median equivalent diameters of 156 and 155 μm, respectively, whereas the Z-normal slices have a median equivalent diameter of 267 μm. X- and Y-normal slices also show similar variability in the number of subfeatures found during 2D analysis, as well as the relative fraction of data contained in the largest subfeature. The distinctive trend of Z-normal slices, namely larger predicted equivalent diameters and proportionally larger subfeatures, indicates that these slices contain a larger number of small subfeatures, which can be seen in Fig. 7. These generated cross-sections reveal the inadequacy of 2D analysis to accurately assess grain size and shape, and emphasize the importance of 3D characterization in additively manufactured materials.
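The under-prediction has a simple geometric origin. As an idealized illustration (a sphere, not the actual grain shape), any section of a 320-μm sphere taken away from its equator yields a smaller 2D equivalent diameter:

```python
import math

def equiv_diameter_2d(area):
    """Diameter of a circle with the same area as a 2D cross-section."""
    return 2.0 * math.sqrt(area / math.pi)

def equiv_diameter_3d(volume):
    """Diameter of a sphere with the same volume as a 3D feature."""
    return 2.0 * (3.0 * volume / (4.0 * math.pi)) ** (1.0 / 3.0)

# Idealized example: a 320-um sphere sliced 100 um from its center.
d_true = 320.0
offset = 100.0
r_section = math.sqrt((d_true / 2.0) ** 2 - offset ** 2)
area = math.pi * r_section ** 2
print(round(equiv_diameter_2d(area), 1))  # ~249.8 um, well below 320 um
```

Real additive grains are far more convoluted than a sphere, so 2D sections additionally fragment one grain into several subfeatures, compounding the bias.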
The directional dependence of calculated properties also highlights the anisotropic properties that exist as a result of the microstructure in additively manufactured materials.
Anisotropic Defect Generation in Additive Processes
The extent of these deviations from a fully heterogeneous microstructure is plotted in Fig. 9. Local maxima in the curve correspond to local clusters of equiaxed grains. The widths of the associated peaks are consistently smaller than the build layer thickness of 50 μm, indicating that all of these clusters appear on individual build layers. Analysis of the entire volume shows that several clusters of equiaxed grains exist throughout the volume beyond the large cluster previously identified. In every case, clusters of equiaxed grains appear in the build layer above the lack of fusion defect. The change in thermal gradients arising from the lack of fusion results in the formation of equiaxed grains that interrupt the otherwise columnar grain morphology observed in the bulk of this sample. As successive build layers are deposited above the lack of fusion defects, columnar growth can resume, but equiaxed grains persist above the porosity. Secondary electron images taken during TriBeam tomography confirm that these clusters are spatially correlated to planar porosity defects occurring during part fabrication (Fig. 9). Many of these planar voids are < 5 μm thick, below the detection limit of lab-scale X-ray computed tomography and microtomography methods, and so would not be readily identified with nondestructive evaluation techniques, despite their impact on the local microstructure morphology and texture. Quantitative analysis of the EBSD data in BisQue reveals that these smaller lack of fusion defects are much more prevalent throughout the sample, beyond the more easily identifiable, large-scale defect previously characterized.
The ability of the BisQue platform to integrate materials science tools such as DREAM.3D can significantly advance the emerging 3D and 4D characterization communities. As the routines used to segment, reconstruct, clean, and analyze higher-dimensional data have not yet reached technical maturity, it is essential to develop tools that allow for simple and scalable exploration of the parameter space offered by these data. This approach is especially important for classes of materials that are not easily analyzed with conventional segmentation algorithms. Additively manufactured microstructures in particular result from compound melting and solidification cycles, as well as associated solid-state transformations in the thermal histories of these parts. The complex grain morphologies and long-range orientation gradients present in additive microstructures cannot be accurately assessed using conventional techniques. Unlike trial-and-error reconstruction and analysis of 3D data, which adopts necessarily simplified approaches that cannot guarantee the fidelity of an individual analysis, BisQue enables dataset-specific reconstruction parameters to be rigorously defined and studied. To that end, parallelization methods for assessing reconstruction fidelity offer a path forward for standardizing 3D materials characterization in a materials-agnostic form; there may be no “one-size-fits-all” misorientation tolerance for segmentation of 3D EBSD data, but there could be a “one-size-fits-all” approach to determining the optimal value, streamlining the process of 3D materials analysis and more accurately representing these rich datasets. Such parameter studies could be extended to entire 3D reconstruction pipelines in DREAM.3D, rather than focusing on individual reconstruction parameters.
Combined with other services in BisQue, such as a volume and table viewer compatible with common 3D data formats, much of the 3D reconstruction and analysis process is already possible entirely through a web browser. Development of 3D analysis tools such as DREAM.3D for BisQue could take the entire process off of individual workstations, simplifying 3D data analysis. The capabilities for raw data storage, reconstruction, and analysis of 3D microstructures discussed here are also useful because they can be combined with tools for predicting microstructure-sensitive properties. Such tools, which may include analytic bounds (e.g., Voigt/Reuss) or emerging data-driven models for microstructure–property relationships, can also be incorporated into BisQue, as discussed in detail elsewhere.
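As a minimal example of such analytic bounds, the Voigt and Reuss estimates bracket the effective stiffness of an aggregate from phase volume fractions alone. The two-phase values below are hypothetical, not taken from the datasets in this work:

```python
def voigt_bound(fractions, moduli):
    """Upper bound: volume-weighted arithmetic mean (iso-strain assumption)."""
    return sum(f * m for f, m in zip(fractions, moduli))

def reuss_bound(fractions, moduli):
    """Lower bound: volume-weighted harmonic mean (iso-stress assumption)."""
    return 1.0 / sum(f / m for f, m in zip(fractions, moduli))

# Hypothetical two-phase aggregate, phase stiffnesses in GPa:
f, E = [0.6, 0.4], [200.0, 100.0]
print(voigt_bound(f, E))             # 160.0
print(round(reuss_bound(f, E), 1))   # 142.9
```

The true effective property of any microstructure lies between the two bounds; microstructure-sensitive models narrow this gap using the 3D data itself.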
The scalable infrastructure of BisQue offers numerous advantages to the materials science community. Scale-up and replication of BisQue servers is achievable using well-known methods such as reverse proxies and replicated servers. Leveraging high-performance computing capabilities provides the potential to process very large datasets in a distributed manner. Accessed via a web browser, BisQue enables efficient and straightforward use of complex analysis algorithms with minimal software and hardware requirements on the user end. Underlying all of this is continual updating and tracking of metadata, whose structure and context can be formulated to cater to specific research community needs, maintaining high flexibility of the infrastructure. A core component of metadata is the retention of data provenance at every stage of the process. When a resource is created or modified, it is marked by the system, so a user can track the full history of a dataset from its creation, even when shared among several users. This fosters collaboration and allows multiple researchers to analyze and edit the same dataset without fear of modifying the original raw data. Due to the highly specialized equipment and facilities required to generate 3D materials data, 3D materials science is necessarily collaborative. The ability to share and distribute large datasets around the globe, as well as the associated analyses, is critical to the growth of the 3D materials community at large.
A distributed computing infrastructure can also serve the need to certify additively manufactured parts and characterize the build process, which requires powerful analytic methods. During operation, additive manufacturing machines record a myriad of variables simultaneously, ranging from the power and location of the heat source to chamber pressure and moisture content, all of which are correlated to observed microstructures and defect generation. The use of specialized in situ sensors, such as near-IR (infrared) imaging for defect detection, thermographic measurement of temperature profiles, or high-speed imaging to investigate fluid flow and powder dynamics, is essential to advancing our understanding of additive manufacturing techniques [53, 54, 55, 56]. At the same time, the emergence of these techniques increases the complexity of data associated with individual additively manufactured parts. Not all sensors record data at equivalent rates, and many do not use regular sampling intervals. Data visualization tools such as FALCON have been specifically designed to mitigate these issues and enable machine users to determine processing-property linkages. New high-throughput ex situ characterization of additively manufactured parts has also demonstrated the sensitivity of part performance to build parameters and machine-to-machine variability [58, 59], underscoring the need for a more comprehensive understanding of the build process. Of particular concern for additive material qualification are the so-called rare events at the extrema of property distributions, not easily seen without such high-throughput testing, which severely limit confidence in the application of this technology to critical components. Incorporation of such tools, in combination with advanced statistical approaches and emerging machine-learning algorithms from other communities, into a collaborative environment such as BisQue can help advance these algorithms and grow user communities.
Coupling of BisQue to additive machines could enable real-time experimental analysis and provide feedback to machine operators. Distributing analysis and collection of build data and part properties allows users to leverage advanced sensor capabilities, enabling additive manufacturing to become a self-correcting, adaptive process that can deliver “born qualified” parts .
As tools for BisQue develop, there is enormous opportunity to support growing research trends. Integration with GitHub repositories would enable version management of custom analytic tools, providing archival management of developing codebases. The use of dockerized environments for running such software in BisQue ensures that datasets analyzed using out-of-date code will still function properly, and old analyses will be preserved. Annotation of datasets can be utilized as a new approach for performing collaborative analysis and marking data for automated analysis. New analysis tools and custom scripts for common packages such as ImageJ and Jupyter notebooks can be continually updated and linked via BisQue’s data provenance construct. Automated analysis for specific data structures could be implemented, akin to the facial recognition performed by Google’s Vision API on images as they are uploaded to the Cloud. BisQue could also serve as a repository for published datasets, similar to databases such as Mendeley Data and journals such as Data in Brief, Scientific Data, and Data that publish scientific datasets, providing a Digital Object Identifier (DOI) and enabling their use by other researchers. Beyond sharing the data, however, BisQue can fully track the usage and analysis of the data all the way back to its source. Particularly for 3D datasets, which contain a wealth of information, BisQue can enable multiple research groups with diverse specialties to work on the same data, providing new insights and fostering new connections among researchers.
- Archival preservation of data provenance
- High-performance cluster computing integration (parallelized DREAM.3D module execution)
- Parameterization of DREAM.3D pipelines for reconstruction optimization
- Algorithms for 2D and 3D volume element sampling from large 3D datasets
The DREAM.3D pipeline parameterization combined with parallelized module executions has allowed for the exploration and optimization of dataset reconstructions. For example, new insights into the choice of grain segmentation misorientation tolerances for additively manufactured materials were discovered. Using algorithms for the virtual 2D and 3D volume element sampling from large 3D datasets, traditional 2D metallographic sectioning was found to yield grain size measurements that were roughly half the size of the equivalent 3D measurements in heterogeneously distributed microstructures such as those found in AM materials. These measurements were performed with a massively parallelized array of slicing/cropping pipelines to simulate the 2D vs 3D statistical grain measurements.
The authors would like to acknowledge the support of NSF EAGER Grant No. 1650972 and NSF SI2-SSI Grant No. 1664172. Andrew T. Polonsky and Tresa M. Pollock also acknowledge Ryan R. Dehoff and Michael M. Kirka for providing additive material and Oak Ridge National Laboratory for support under Award No. 400156470 and acknowledge George T. Gray III and Veronica Livescu of Los Alamos National Laboratory for providing additive material. The authors acknowledge Mike Jackson from BlueQuartz software for DREAM.3D discussions.
Compliance with Ethical Standards
Conflict of Interest
The authors declare that they have no conflict of interest.
- 1. US Department of Energy (2014) Public access plan. Technical report, US Department of Energy, Washington
- 2. National Science Foundation (2015) Today’s data, tomorrow’s discoveries. Technical report, National Science Foundation
- 3. Ward C (2015) National Materials Data Initiatives. In: UCSB workshop on collection and analysis of big data in 3D materials science, Santa Barbara
- 4. Gewin V (2016) An open mind on open data, vol 529
- 5. The Minerals, Metals & Materials Society (TMS) (2017) Building a materials data infrastructure: opening new pathways to discovery and innovation in science and engineering. Technical report, TMS, Pittsburgh, PA
- 11. ASTM E2627-13 (2013) Standard practice for determining average grain size using electron backscatter diffraction (EBSD) in fully recrystallized polycrystalline materials. ASTM International, West Conshohocken, PA
- 14. Geiss RH, Read DT (2007) Need for standardization of EBSD measurements for microstructural characterization of thin film structures. In: AIP conference proceedings, vol 931, pp 168–172. AIP
- 20. Britton TB, Holton I, Meaden G, Dingley DJ (2013) High angular resolution electron backscatter diffraction: measurement of strain in functional and structural materials. Microscopy and Analysis (May):1–5
- 29. Kvilekval K, Fedorov D, Gaur U, Goff S, Merchant N, Manjunath BS, Singh AK (2012) Bisque: advances in bioimage databases. IEEE Data Eng Bull 35(3):56–64
- 31.The HDF Group Hierarchical data format, version 5, 1997-NNNN. http://www.hdfgroup.org/HDF5/
- 32. Ahrens J, Geveci B, Law C (2005) ParaView: an end-user tool for large data visualization. In: Visualization handbook. Elsevier
- 35. Brockmann R, Candel-Ruiz A, Kaufmann S, Müllerschön O (2015) Strategies for high deposition rate additive manufacturing by laser metal deposition. In: International congress on applications of lasers & electro-optics, vol 680, pp 680–683. Laser Institute of America
- 42. Lenthe WC, Stinville J-C, Echlin MLP, Pollock TM (2016) Statistical assessment of fatigue-initiating microstructural features in a polycrystalline disk alloy. In: Superalloys 2016: proceedings of the 13th international symposium on superalloys, pp 569–577. Wiley
- 48. Kurzydłowski JK, Ralph B (1995) The quantitative description of the microstructure of materials. CRC Press
- 51.Latypov MI, Khan A, Lang CA, Kvilekval K, Polonsky AT, Echlin MP, Beyerlein IJ, Manjunath BS, Pollock TM (2019) BisQue for 3D materials science in the cloud: microstructure–property linkages. https://doi.org/10.1007/s40192-019-00128-5
- 53. Dinwiddie RB, Dehoff RR, Lloyd PD, Lowe LE, Ulrich JB (2013) Thermographic in-situ process monitoring of the electron-beam melting technology used in additive manufacturing. Number May 2013, page 87050K
- 56. Niyanth Niyanth S, Baba JS, Jordan BH, Dinwiddie RB, Dehoff RR (2018) Understanding part to part variability during directed energy deposition processes using in-situ and ex-situ process characterization. Technical report
- 58. Salzbrenner BC, Rodelas JM, Madison JD, Jared BH, Swiler LP, Shen Y-L, Boyce BL (2017) High-throughput stochastic tensile performance of additively manufactured stainless steel, vol 241
- 59. Madison JD, Underwood OD, Swiler LP, Boyce BL, Jared BH, Rodelas JM, Salzbrenner BC (2018) Corroborating tomographic defect metrics with mechanical response in an additively manufactured precipitation-hardened stainless steel. In: AIP conference proceedings, vol 1949, p 020009
- 61. Roach AR, Abdeljawad F, Argibay N, Allen K, Balch D, Beghini L, Bishop J, Boyce B, Brown J, Burchard R, Chandross M, Cook A, Dressler A, Forrest E, Ford K, Ivanoff T, Jared B, Kammler D, Koepke J, Kustas A, Lavin J, Leathe N, Lester B, Madison J, Mani S, Martinez MJ, Moser D, Murphy R, Rodgers T, Seidl T, Shaklee-Brown H, Stanford J, Stender M, Sugar J, Swiler LP, Taylor S, Trembacki B, Van Bloemen Waanders B, Whetton S, Wildey T, Wilson M (2018) Born qualified grand challenge LDRD final report. Technical report, September, Sandia National Laboratories
This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.