ACS Sustainable Chemistry & Engineering August 2022
"Evaluation of Machine Learning Models on Electrochemical CO2 Reduction Using Human Curated Datasets"
Featuring Kevin C. Leonard, graduate student Brianna R. Farris, and their collaborators Tevin Niang-Trost and Michael S. Branicky
Abstract
Machine learning holds the potential to be a powerful tool to aid in designing catalytic and sustainable chemical systems. However, it is important for experimental researchers to understand the capabilities of different machine learning models when trained on experimental data. In this work, we trained three different machine learning algorithms (decision tree, random forest, and multilayer perceptron) on a hand-curated dataset of 127 reaction conditions for electrocatalytic CO2 reduction on heterogeneous catalysts in aqueous electrolytes. The inputs to the machine learning models were the experimental conditions, and we posed four separate prediction tasks to each algorithm: (1) whether the number of proton-coupled electron transfer events was greater than two, (2) whether carbon–carbon coupling occurred, (3) whether ethylene was the major product, and (4) prediction of the major product. We observed that, with a dataset of this size, all three machine learning models could achieve accuracies between 0.7 and 0.8 on the three binary classification problems (1, 2, and 3). Moreover, the shallow learning decision tree and random forest models performed as well as or better than the deep learning multilayer perceptron models. In the multiclass classification problem (i.e., predicting the major product), the accuracy of all models decreased, with the random forest model producing the highest accuracy of 0.6. Analysis of the models showed that machine learning can independently arrive at conclusions that are well known in the literature (e.g., that Cu is an important catalyst for producing products with high carbon content) and can discern more complicated patterns with respect to feature importance.
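To make the kind of workflow described above concrete, the sketch below trains and compares a decision tree, a random forest, and a multilayer perceptron on a small tabular binary classification problem using scikit-learn. It is a minimal illustration only: the random placeholder features and labels, the feature count, and the model settings are assumptions standing in for the paper's hand-curated dataset of 127 reaction conditions, not the authors' actual code or data.

```python
# Minimal sketch of a train/compare workflow for three classifiers
# (hypothetical data standing in for 127 curated reaction conditions).
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score

# Placeholder inputs: each row is one set of experimental conditions
# (e.g., encoded catalyst, electrolyte, applied potential); each label is a
# binary outcome such as "carbon-carbon coupling occurred".
rng = np.random.default_rng(0)
X = rng.random((127, 8))          # 8 hypothetical input features
y = rng.integers(0, 2, size=127)  # hypothetical binary labels

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0, stratify=y
)

models = {
    "decision tree": DecisionTreeClassifier(random_state=0),
    "random forest": RandomForestClassifier(n_estimators=100, random_state=0),
    "multilayer perceptron": MLPClassifier(
        hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0
    ),
}

# Fit each model and report held-out accuracy, mirroring the comparison
# of binary classification accuracies described in the abstract.
for name, model in models.items():
    model.fit(X_train, y_train)
    acc = accuracy_score(y_test, model.predict(X_test))
    print(f"{name}: accuracy = {acc:.2f}")

# Tree-based models expose feature importances, the kind of analysis used
# to check whether the model recovers known trends (e.g., catalyst identity).
print(models["random forest"].feature_importances_)
```

The same pattern extends to the multiclass case by supplying integer-coded product labels instead of binary outcomes; scikit-learn's classifiers and `accuracy_score` handle multiclass targets without further changes.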
Citation
ACS Sustainable Chem. Eng. 2022, 10, 33, 10934–10944