The Materials Super Highway: Integrating High-Throughput Experimentation into Mapping the Catalysis Materials Genome
The materials genome initiative (MGI) aims to accelerate the process of materials discovery and reduce the time to commercialization of advanced materials. Thus far, the MGI has resulted in significant progress in computational simulation, modeling, and prediction of catalytic materials. However, prodigious amounts of experimental data are necessary to inform and validate these computational models. High-throughput (HT) methodologies, in which hundreds of materials are rapidly synthesized, processed, and characterized for their figure of merit, represent the key experimental enabler of the theoretical aspects of the MGI. HT methodologies have been used since the early 1980s for identifying novel catalyst formulations and optimizing existing catalysts. Many sophisticated screening and data mining techniques have been developed, and since the mid-1990s this approach has become a widely accepted industrial practice. This article will provide a short history of major developments in HT and will discuss screening approaches combining rapid, qualitative primary screens via thin-film techniques with a series of quantitative screens using plug flow reactors. An illustrative example will be provided of one such study, in which novel fuel-flexible, sulfur-tolerant cracking catalysts were developed. We will then illustrate a path forward that leverages existing HT expertise to validate, provide empirical data for, and help guide future theoretical studies.
Keywords: Combinatorial chemistry · Methodology and phenomena · Nanotechnology · High throughput
As theory becomes more routinely applied to catalysis problems, prodigious amounts of experimental data are required to inform and validate these computational models, thus powering the MGI computational “engine.” Modeling studies rely on accurate depictions of materials properties, including crystal structure, composition, and surface coverage/arrangements under catalytic reaction conditions. The latter property is of paramount importance, as it is often the case that the surface of a catalyst under reaction conditions (and thus the very nature of its active sites) does not resemble the equilibrium surface at ambient conditions, let alone at 0 K. Moreover, the role of interfaces, in particular those between the catalyst and the support, has only recently begun to be modeled theoretically [13, 14]. When one considers, however, that these interfaces are likely as dynamic as the catalyst surface, it becomes clear that without experimental validation of the role these important parameters play in catalysis, and how they change during reactions, a speed bump in the “Materials Super Highway” is imminent.
Early adopters of high-throughput technologies in catalysis included start-up companies, such as Symyx, Avantium, and HTE; large chemical companies, such as DOW, BASF, and ExxonMobil; as well as a (small) number of academic labs. While these first studies were important demonstrations of the potential of high-throughput methods in catalysis, they were frequently plagued by irreproducible results and overly aggressive patenting of large swaths of the periodic table. Early studies often prioritized throughput in terms of synthesis and rapid reaction screening over ensuring that catalysts were created and tested in a manner consistent with the actual large-scale application. Smotkin et al. reported, for instance, that the blanket use of NaBH4 as a reductant during production of fuel cell catalyst arrays rendered such an approach susceptible to local maxima that are largely a function of synthesis method and are not reflective of the ultimate material figure of merit. Early studies often characterized samples deposited on planar substrates using scanning probe mass spectrometry, IR thermography, or optical imaging techniques, such as fluorescence [21, 22, 23]. Many of these techniques inevitably suffer from sample cross-talk, since there is no barrier preventing reactant and product spill-over between catalyst sites. Recent studies have sought to minimize cross-talk in planar samples by introducing physical barriers, such as capillaries, between individual catalysts. This approach, however, also removes a means of evaluating the catalyst–support interaction and the role of nanoparticle properties in determining activity and selectivity. IR thermography has the additional caveat of not being capable of product speciation, instead only identifying “active” catalysts by monitoring their heat signatures.
Modern HT catalysis studies are often multi-faceted, with multiple stages that first rapidly identify potential hits using qualitative measurements and subsequently home in on materials of interest in increasingly greater detail under industrially important conditions [25, 26, 27]. Each study begins with an analytical search of the relevant technical literature to identify important parameters for a particular catalytic reaction. An example of such a heuristic would be the importance of the ratio of zeolite pore size to feedstock and product molecular sizes in biofuel platform chemical upgrading. Once a suitable heuristic is identified, a rapid primary screen is undertaken that uses either robotic pipetting systems or thin-film depositions to create arrays of hundreds to thousands of samples on a single chip. These arrays are then screened via a qualitative primary screen, e.g., using optical techniques or thermography, to identify initial leads such as a coke-inhibiting additive. This is quite similar to the approaches described above, with the same caveats; however, the goal is to quickly identify potential winning/losing regions and then move the most promising regions of the parameter space directly to bench-scale reactor studies. At the second stage of the screen, milligram quantities of the catalyst are produced via a host of traditional methods that can be parallelized and automated, including wet impregnation, reverse micelles, parallel hydrothermal synthesis, and impregnation onto suitable structured supports. Quantitative screening is generally undertaken via parallel reactor studies, where cross-talk between catalysts is prevented and effluent streams are analyzed separately, often via serial GC/MS or imaging FTIR [30, 31, 32, 33].
Final screening of select catalysts can be done in parallel pilot-scale reactors, generally single-bed reactors (although commercial multi-bed reactors are available), with catalysts prepared in kilogram quantities and run under true industrial conditions.
In addition to the catalytic performance of the synthesized materials, a great deal of effort is devoted to obtaining fundamental understanding of the structure and surface composition of promising new catalysts. This information is just as vital as catalytic performance for guiding theoretical MGI studies, as the baseline necessary for DFT studies is the bulk crystal structure and possibly the surface configuration. To this end, a number of truly HT structural characterization techniques are routinely applied, including scanning synchrotron X-ray diffraction (bulk structure) [34, 35, 36], X-ray absorption spectroscopy (local coordination), and X-ray photoelectron spectroscopy (surface composition and element coordination). A variety of compositional tools, such as atomic absorption spectroscopy, inductively coupled plasma analysis, and SEM/EDS, are readily parallelized and are utilized regularly in HT studies [39, 40]. Conversely, measurements that provide detailed information about the microstructure of nanoparticles, such as high-resolution TEM, are inherently slow and time consuming, and are used only for ultimate material validation. This is despite the fact that a truly integrated theoretical–experimental workflow may benefit from the early insights theorists and experimentalists can provide to each other as hits and misses are identified and shared.
2 Case Study for JP-8
We recently applied HT techniques to develop catalysts for on-site production of LPG from diesel and JP-8 to power portable energy sources. Portable energy sources are vital to a number of applications where grid power is difficult to obtain, such as cell towers in remote locations and military operations at forward operating bases. The recent advent of solid oxide fuel cells capable of operating on liquefied petroleum gas (LPG) provides an avenue towards better overall energy efficiency than the traditional diesel fuel generators currently in use. However, LPG supply lines do not exist in rural locations, and in disaster response situations it would work against humanitarian aims to divert critical transportation resources from carrying foodstuffs and medical supplies to transporting LPG. This need motivated our group to develop a fuel-flexible catalyst that can convert diesel, gasoline, or JP-8, readily available fuels in such locations with established supply lines, to LPG on-site.
Catalytic cracking is the most promising and economical process to generate LPG from diesel or JP-8. However, during direct catalytic cracking of heavy hydrocarbons at high conversion, catalyst deactivation via coke formation is inevitable. Thus, one must maintain the rate of cracking while promoting air-only burn-off of the coke via additives. Additionally, although modern diesel fuels have been de-sulfurized substantially (15 ppm) to meet EPA requirements, diesel in other parts of the world may contain much higher levels of sulfur, and JP-8 can contain as much as 3,000 ppm of sulfur. Sulfur deactivation of catalysts is well known in the field, and often is an irreversible process; thus, catalysts that can “select out” sulfur-containing molecules, and thereby prevent deactivation, are required.
During this study there were several opportunities where timely intervention by either theory or data science could have further reduced the time required to identify optimal catalyst compositions, and possibly further increased the overall figure of merit, as shown in Fig. 3. Theoretical understanding of the role of the transition and rare earth metals in the zeolitic framework, in particular their role in mitigating coking, could have provided guidance towards better alternative additives. Illumination of the role of zeolite structure and Si/Al ratio in cracking of long-chain hydrocarbons might have provided additional rational design criteria for zeolite selection. Likewise, a structured database illustrating secondary catalytic properties of importance to the project, such as coking resistance of transition metals, coking rates of zeolites, and their use in direct cracking of even simple hydrocarbon mixtures, would have been of tremendous use. In fact, many of these secondary characteristics of interest for the catalyst eventually selected have been investigated previously, but the results are scattered throughout the technical literature in myriad papers and extended abstracts, buried in patents, or, worse, unreported due to perceived failure. A coalesced and searchable version of this technical literature would have greatly accelerated hypothesis creation and materials development.
3 Catalyzing the Next Stage in the Materials Super Highway
To date, approaches in the field of HT catalysis that synthesize and screen catalysts using scalable techniques have been used by only a limited portion of the community. Many synthesis techniques are capable of producing sufficient quantities of catalyst for a bench-scale reactor but would be commercially prohibitive, owing to excessive solvent or reductant usage or the use of expensive precursors. Also, to meet the commercialization aspirations of the MGI, catalysts should not be screened for an abstract figure of merit under semi-realistic conditions. Instead, it would be beneficial to embrace the full complexity of catalytic systems, with synthesis techniques, inexpensive and commercially viable supports, and reaction conditions (pressures, temperatures, gas hourly space velocities) comparable to those used at industrial scale. Likewise, if the purpose of catalyst development is to create catalysts that can be used to manufacture useful chemicals at industrial scales, then the use of surrogate feedstocks could lead to unexpected roadblocks at the development stage arising from exposure to more complicated feedstocks. Even fundamental HT studies could benefit from running under more realistic conditions, although this would necessitate more complex data analytics to extract mechanistic understanding.
Future HT-MGI studies could be substantially empowered if an emphasis were placed on understanding the linkages between theoretical identification of a new catalyst and its actual experimental synthesis. This will require the creation of multi-scale modeling tools that can be coupled with in situ experimental tools for monitoring catalyst composition, size, morphology, and elemental coordination to create new catalyst design rules. It is an almost trivial matter to build a statistical design based on design of experiments and use it to identify a condition that will synthesize the desired materials; however, turning that result into transferable knowledge that can be generalized to new systems is a great challenge yet to be addressed.
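To make the distinction concrete, the following sketch shows how a simple two-level full-factorial design can point to a "best" synthesis condition from a handful of runs. All factor names and response values here are hypothetical placeholders; the fitted main effects are exactly the kind of system-specific result that, on its own, does not transfer to a new catalyst family.

```python
import itertools

import numpy as np

# Hypothetical two-level full-factorial design over three synthesis
# parameters (coded -1/+1). Factor names are illustrative only.
factors = ["calc_temp", "metal_loading", "reduction_time"]
design = np.array(list(itertools.product([-1.0, 1.0], repeat=len(factors))))

# Made-up figure of merit (e.g., conversion) measured for each run.
response = np.array([42.0, 55.0, 48.0, 61.0, 40.0, 58.0, 47.0, 66.0])

# Fit a main-effects linear model y = b0 + sum(bi * xi) by least squares.
X = np.column_stack([np.ones(len(design)), design])
coef, *_ = np.linalg.lstsq(X, response, rcond=None)

# The sign of each fitted effect suggests which level of each factor
# to choose for the best predicted condition.
best = np.sign(coef[1:])
for name, effect, level in zip(factors, coef[1:], best):
    print(f"{name}: effect {effect:+.2f}, set to {level:+.0f}")
```

Because the design matrix is orthogonal, the fitted coefficients are just the average contrasts between high and low levels, which is what makes such designs cheap to run and easy to interpret for a single system.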
Even once a material can be produced after theoretical identification, if its catalytic properties and structure are not well understood then the theory–experiment cycle is not closed. The current state of the art in HT experimental catalysis is largely results driven (this is particularly the case for parallel reactor studies) and thus provides limited insight into why one material or material synthesis procedure produces markedly different activity and selectivity. The current standard in the HT community is to perform detailed structural investigations of the catalyst ex situ, or at substantially reduced pressure/temperature/concentration, to try to extract information regarding how the catalyst “reacts” to its environment. It is well understood in the field, however, that catalyst active sites can be generated or changed under reaction conditions [45, 46].
Merely producing a database is a meaningless enterprise, however, if there do not exist sufficient resources for extracting new knowledge from it. Moreover, performing hundreds of in situ studies on catalysts that are not active for a particular reaction could potentially clog the database with meaningless data. A two-pronged attack would help to maximize the amount of information provided by each data point. First, new methods of analyzing structural, spectral, and catalytic data that utilize recent advances in machine learning would be of great use for establishing correlations between composition, processing, and catalytic activity in large open databases. The functional materials community currently leads the HT field in this respect, with freely available software such as CombiView that can apply a number of cluster analysis tools as well as supervised machine learning algorithms [55, 56, 57]. Such approaches can be adopted and modified by the HT catalysis community to provide a means of linking structure, processing, and functionality. Second, these techniques can benefit not only end-of-experiment analysis but can also be applied in real time to facilitate on-the-fly design of experiments. As XANES, XRD, and catalytic activity data are being taken, they can be analyzed and clustered into regions of interest, permitting an optimal distribution of data density that prioritizes not “winning catalysts” but those materials and conditions that reside on the edge of being effective.
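As a minimal sketch of the clustering idea described above, the following groups a synthetic "library" of diffraction-like patterns into two phase regions with a bare-bones k-means loop. The patterns are simulated placeholders, not real XRD data, and tools such as CombiView wrap far more sophisticated versions of this analysis.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for a composition-spread diffraction library: two
# "phases", each a Gaussian peak at a different angle, plus noise.
angles = np.linspace(20.0, 80.0, 200)

def pattern(center):
    return np.exp(-((angles - center) ** 2) / 8.0)

library = np.vstack(
    [pattern(35.0) + 0.02 * rng.standard_normal(200) for _ in range(10)]
    + [pattern(60.0) + 0.02 * rng.standard_normal(200) for _ in range(10)]
)

# Minimal k-means (k=2): seed centers with the first and last pattern,
# then alternate assignment and center-update steps.
k = 2
centers = library[[0, -1]]
for _ in range(20):
    # Assign each pattern to its nearest cluster center.
    dists = np.linalg.norm(library[:, None, :] - centers[None, :, :], axis=2)
    labels = dists.argmin(axis=1)
    # Update each center as the mean pattern of its cluster.
    centers = np.vstack([library[labels == j].mean(axis=0) for j in range(k)])

print(labels)  # the first ten samples share one label, the last ten the other
```

In a real-time workflow, the same assignment step could run as each new pattern is acquired, flagging samples that fall between clusters as the boundary compositions most worth measuring next.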
It would also be worthwhile to establish an accepted metric for estimating the economics of experimental studies. Very few academic experimentalists consider cost-benefit analysis in the process of designing a new set of experiments. While we do not suggest this criterion be used as the basis of funding for academic research, it is a skill set that could guide future industrial studies by providing a ballpark return on investment for new research projects. On the other hand, systems-level engineering perspectives could benefit funding agencies by providing the argument against technologies that, though technically feasible, would not provide sufficient impact to merit large programs. Such tools could be used to provide guidance for the exploration of new catalytic systems by helping to determine whether an exploitative study of existing catalytic materials might be more cost-effective than an explorative study into exotic catalysts and synthesis techniques.
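A ballpark return-on-investment comparison of the kind suggested here can be sketched in a few lines. Every cost, probability, and payoff below is a hypothetical placeholder, chosen only to show the arithmetic of weighing an exploitative study against an explorative one.

```python
# Illustrative back-of-the-envelope comparison of an exploitative study
# (optimizing a known catalyst family) versus an explorative one (exotic
# new materials). All numbers are made-up placeholders.

def expected_roi(cost, success_probability, payoff_if_success):
    """Expected return on investment: (expected payoff - cost) / cost."""
    return (success_probability * payoff_if_success - cost) / cost

# Exploitative: cheap, likely to succeed, modest payoff.
exploit = expected_roi(cost=100_000, success_probability=0.6,
                       payoff_if_success=400_000)
# Explorative: expensive, unlikely to succeed, large payoff.
explore = expected_roi(cost=500_000, success_probability=0.05,
                       payoff_if_success=5_000_000)

print(f"exploitative ROI: {exploit:.2f}")
print(f"explorative ROI:  {explore:.2f}")
```

Even this toy calculation makes the trade-off explicit: the explorative study only becomes attractive if its success probability or payoff estimate can be justified, which is exactly the kind of argument a funding agency or program manager could interrogate.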
To effect a large-scale change in the manner by which new materials are conceived will require the creation of high-throughput centers spanning a large number of fields, including catalysis (Fig. 8). Leveraging the existing network of practitioners of high-throughput experiments can create an “HTE materials superhighway” and, as a result, maximize scientific impact and return on public investment. The network can be mobilized to address the urgent, high-impact, and materials-constrained technologies mentioned earlier. Moreover, although each field has its own particular figure of merit, many of the characterization techniques required in the different fields are common, e.g., diffraction and high-resolution TEM for bulk structural information. There are also similar gaps in the ability to create standards to acquire, process, archive, and effectively probe truly large databases containing data and meta-data generated by disparate groups. These mutual goals make it possible to create centralized facilities with expertise in designing, standardizing, and formatting combinatorial datasets, which can then be used to inform experimentalists and theorists across multiple fields of research. Brick-and-mortar physical centers located at synchrotron light sources are natural fits for generalized structural investigations, while virtual centers specializing in catalyst synthesis and parallel-bed catalytic reactors could be established in academic and industrial labs with pre-existing expertise. Centralizing data collection will help, de facto, in the creation of data acquisition, processing, and formatting standards, with multiplicative increases in the generalizability of each new data set. With these tools in hand, a new era of accelerated catalyst discovery, optimization, and commercialization would be enabled, solving a large number of pressing industrial and environmental issues while providing the data that will drive a revolution in knowledge-based catalyst design.
Hattrick-Simpers would like to acknowledge the support by the US National Science Foundation under Grant DMR 1439054.
- 1. Holdren JP (2011) Materials genome initiative for global competitiveness. Office of Science and Technology Policy and National Science and Technology Council, Washington, DC
- 17. Hagemeyer A, Volpe AF (2014) Modern applications of high throughput R&D in heterogeneous catalysis. Bentham Science
- 32. Snively CM, Lauterbach J (2002) Spectroscopy 17:26
- 44. Lauterbach J, Glascock M, Bedenbaugh J, Chien CY, Jangam A, Salim S, Kim S, Tilburg R (2013) US patent appl 20130041198
- 52. Persson K http://www.materialsproject.org. Accessed 18 Nov 2014
- 54. Hattrick-Simpers JR, Green ML, Takeuchi I, Barron SC, Joshi AM, Chiang T, Davydov A, Mehta A (2014) Fulfilling the promise of the materials genome initiative via high-throughput experimentation. MRS 2014