A new electrochemical series could give scientists a better way to understand and predict oxidation states in materials. The team that created it with the help of machine learning says it could accelerate research in areas such as batteries and materials discovery.
The oxidation states of ions are essential for inferring structure–property relationships in materials. Their values depend both on the positions of nearby species and on the electronic chemical potential the ions experience. ‘Different ions gain or lose electrons – or change oxidation state – at different electronic chemical potentials,’ explains Tim Mueller, who led the study at Toyota Research Institute in the US. He adds that there’s an established list of the potentials at which this happens, known as the electrochemical series.
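The familiar series is, in essence, an ordered list of redox couples and the potentials at which they interconvert. As a rough illustration, using textbook standard reduction potentials rather than anything from this study, it can be represented as a small lookup table sorted by potential:

```python
# Standard reduction potentials vs the standard hydrogen electrode, in volts
# (textbook values, not results from the new machine-learned series).
aqueous_series = {
    "Li+/Li": -3.04,
    "Zn2+/Zn": -0.76,
    "Cu2+/Cu": +0.34,
    "Fe3+/Fe2+": +0.77,
    "Ag+/Ag": +0.80,
}

# Sorting by potential reproduces the familiar ordering of the series: the
# more negative the potential, the more readily the reduced form gives up electrons.
for couple, potential in sorted(aqueous_series.items(), key=lambda kv: kv[1]):
    print(f"{couple:10s} {potential:+.2f} V")
```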
But this ‘standard’ series is based on ions in aqueous solution, so it has several drawbacks when used in other systems. Although the approach is a valuable tool in electrochemistry, its effectiveness in materials chemistry is limited because of the relatively small set of reactions on which it was built. ‘Most importantly,’ notes Mueller, ‘an ion in a solid-state material is in a very different environment than an ion surrounded by water.’ He explains that constructing an electrochemical series for ions in materials is experimentally challenging, which is why this hasn’t been done before. ‘We realised that we could instead construct one using machine learning, by analysing the patterns in an existing database of materials.’
The team trained the model using a dataset of over 50,000 entries from the Inorganic Crystal Structure Database. The approach is based on the principle that all ions in a single material experience the same electronic chemical potential. To start with, they randomly guessed the electronic chemical potential values at which an ion would transition from one oxidation state to another. ‘We call these values the boundaries between oxidation states,’ says Mueller. The exact positions of such boundaries depend on the local atomic environment, which varies from material to material, so to account for this variability the researchers made the boundaries fuzzy.
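One way to picture fuzzy boundaries is as smooth, sigmoid-shaped probabilities for each oxidation state as a function of the electronic chemical potential, with every ion in a given material sharing a single value of that potential. The sketch below is our own illustration of that idea; the function names, the sigmoid form and the width parameter are assumptions, not details taken from the paper.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def state_probabilities(mu, boundaries, width=0.3):
    """Probabilities of len(boundaries) + 1 ordered oxidation states at
    electronic chemical potential mu, with 'fuzzy' (sigmoidal) boundaries.
    `boundaries` must be sorted in ascending order."""
    # Cumulative probability that the ion sits in the i-th state or below.
    cumulative = [sigmoid((b - mu) / width) for b in boundaries] + [1.0]
    return [cumulative[0]] + [cumulative[i] - cumulative[i - 1]
                              for i in range(1, len(cumulative))]

def material_likelihood(mu, observed_states, boundaries_per_ion, width=0.3):
    """Likelihood of the observed oxidation states in one material, assuming
    every ion in it experiences the same electronic chemical potential mu."""
    likelihood = 1.0
    for state, boundaries in zip(observed_states, boundaries_per_ion):
        likelihood *= state_probabilities(mu, boundaries, width)[state]
    return likelihood
```

A training algorithm can then adjust the boundary positions so that, taken over the whole database, these per-material likelihoods are as high as possible, which is the iterative procedure Mueller describes next.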
‘The training algorithm iteratively shifted the boundary positions to find the ones that maximised the likelihood of observing our data set,’ explains Mueller. To test the accuracy of the approach, he and his colleagues trained models on subsets of the training data. The models were then evaluated based on how well they predicted oxidation states for compositions on which they weren’t trained. ‘We found that our method significantly outperformed other state-of-the-art models,’ says Mueller, emphasising that the tool can be used to predict oxidation states based on only a material’s chemical composition.
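To make that iterative shifting concrete, here is a deliberately simplified, self-contained toy in the same spirit: a single fuzzy boundary is fitted to synthetic observations by repeatedly nudging it uphill in log-likelihood. The true boundary, the width, the synthetic data and the hill-climbing loop are all our own assumptions; in the real method the chemical potential of each material is not observed directly and many boundaries are optimised at once.

```python
import math
import random

random.seed(0)
TRUE_BOUNDARY, WIDTH = 1.5, 0.3

# Synthetic observations: (chemical potential, observed state), where the
# state flips from 0 to 1 around the hidden true boundary.
data = []
for _ in range(200):
    mu = random.uniform(-2.0, 5.0)
    p_high = 1.0 / (1.0 + math.exp(-(mu - TRUE_BOUNDARY) / WIDTH))
    data.append((mu, 1 if random.random() < p_high else 0))

def log_likelihood(boundary):
    """Log-likelihood of the synthetic data for a candidate boundary position."""
    total = 0.0
    for mu, state in data:
        p_high = 1.0 / (1.0 + math.exp(-(mu - boundary) / WIDTH))
        total += math.log(p_high if state == 1 else 1.0 - p_high)
    return total

# Iteratively shift the boundary uphill in likelihood, shrinking the step
# size whenever neither direction improves the fit.
boundary, step = 0.0, 1.0
while step > 1e-4:
    for trial in (boundary - step, boundary + step):
        if log_likelihood(trial) > log_likelihood(boundary):
            boundary = trial
            break
    else:
        step /= 2.0

print(f"recovered boundary ~ {boundary:.2f} (true value {TRUE_BOUNDARY})")
```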
The new series doesn’t provide potential values directly; instead, they are placed on the familiar voltage scale by a linear fit to the existing aqueous series. It also includes uncertainties in the potentials to account for the different environments in which an ion may be found in a solid-state material. ‘The authors’ trick of aligning the scale to the reduction potentials of ions in solution is neat, especially in the context of growing interest in the electrochemistry of solids,’ comments Aron Walsh from Imperial College London in the UK, who wasn’t involved in the research.
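In practice, the learned boundaries sit on an arbitrary internal scale, and a straight-line fit to known aqueous potentials pins that scale to volts. The snippet below sketches the alignment step with made-up numbers; the internal positions, their pairing with aqueous couples and the helper to_volts are placeholders, not values reported in the paper.

```python
# Placeholder numbers: positions of a few learned boundaries on the model's
# internal scale, paired with the aqueous standard reduction potentials (in
# volts) of matching redox couples.
internal = [-1.8, -0.4, 0.1, 0.9]
aqueous = [-0.76, 0.34, 0.77, 0.80]

# Ordinary least-squares straight line mapping the internal scale to volts.
n = len(internal)
mean_x = sum(internal) / n
mean_y = sum(aqueous) / n
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(internal, aqueous)) \
        / sum((x - mean_x) ** 2 for x in internal)
intercept = mean_y - slope * mean_x

def to_volts(internal_value):
    """Convert a learned boundary position into an approximate potential in volts."""
    return slope * internal_value + intercept

print(f"E / V ~ {slope:.2f} * x + {intercept:.2f}; e.g. to_volts(0.0) = {to_volts(0.0):+.2f} V")
```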
Walsh is impressed that the model correctly predicts the mixed-valence nature of compounds such as BaBiO3, which contains a combination of bismuth(III) and bismuth(V) rather than uniform bismuth(IV). He says that, although the dataset excludes some types of materials, such as molecular crystals, hybrid frameworks and glasses, the model has many applications for inorganic crystals.
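To see why that prediction is a meaningful test, note that charge balance alone cannot settle it: for a doubled formula unit, Ba2Bi2O6, both a uniform bismuth(IV) assignment and the bismuth(III)/bismuth(V) split are charge neutral, so a model working from composition has to do more than count charges. A quick check of our own, not taken from the paper:

```python
from itertools import combinations_with_replacement

BA, O = +2, -2            # fixed oxidation states assumed for barium and oxygen
bismuth_states = [3, 4, 5]

# Enumerate oxidation-state assignments for the two Bi sites in Ba2Bi2O6 and
# keep the charge-neutral ones; both (4, 4) and (3, 5) pass.
for bi_pair in combinations_with_replacement(bismuth_states, 2):
    total_charge = 2 * BA + sum(bi_pair) + 6 * O
    if total_charge == 0:
        print(f"Bi oxidation states {bi_pair}: charge neutral")
```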
Mueller points out that the tool covers common oxidation states for 84 elements, which makes it applicable to a broad range of materials, but he recognises that there’s room for improvement. ‘The main limitations of this approach right now are that we don’t cover all known oxidation states for all elements and the accuracy of our model could be improved. Both of these could be at least partially addressed if more data were available to train the model.’
References
T Mueller et al, Proc. Natl. Acad. Sci. USA, 2024, 121, e2320134121 (DOI: 10.1073/pnas.2320134121)