A clever modification to a standard mass spectrometry method has enabled researchers to develop a generalised, quantitative technique for analysing high-throughput experiments. The accelerated approach evaluated samples 150 times faster than conventional methods and could be a key step towards incorporating big data and machine learning into organic chemistry.

High-throughput screening has become a staple of drug discovery and reaction development, facilitating the exploration of huge areas of chemical space through efficient blocks of experiments. However, while automation has streamlined the experimental stage, the analysis of these screening results lags further and further behind.

‘The problem is that every new molecule we make has a different signature in an instrument,’ explains Tim Cernak, an organic chemist at the University of Michigan. As a result, most methods analyse samples individually and, while two minutes is a short wait for a single result, over hundreds or thousands of experiments that analysis time adds up to a huge bottleneck. Accelerated analysis therefore requires a much more generalised approach, one that evaluates common features across the whole panel of experiments. ‘But even for methods that are really high-throughput, there’s not a good way to quantify exactly how much you have made,’ Cernak adds.

Now, Daniel Blair and his team at St Jude Children’s Research Hospital have identified this missing common feature, building a generalised and quantitative mass spectrometry method around the fragmentation fingerprint of synthetic starting materials. ‘You always have a starting material and you always have a product, and certain aspects of those starting materials are incorporated into the product,’ explains Blair. ‘If you take the starting materials and analyse them for how they fragment, that directly informs the analysis of all the products that result from that particular starting material.’

Crucially, the team evaluate the relative outcome of each reaction, taking the ratio of starting material and product fragments to quantify the output of every set of conditions. ‘It’s a very clever math trick,’ says Cernak. ‘Fragmentation happens at a very fast and well-defined rate inside the mass spectrometer. They look for the product mass and the fragment mass simultaneously and, because the signal that we observe in the mass spec is constant, we can do quantification at a rapid pace.’
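As a rough illustration of that ratio idea (a minimal sketch with invented intensity values and a simple metric, not the published calibration used by Blair and co-workers), one could score each well by the fraction of signal coming from a product-specific fragment versus a starting-material-specific fragment:

```python
# Illustrative sketch only: hypothetical intensities and a simple ratio metric,
# not the actual quantification scheme reported by the authors.

def relative_conversion(product_fragment_intensity: float,
                        starting_material_fragment_intensity: float) -> float:
    """Fraction of total fragment signal attributable to the product (0 to 1)."""
    total = product_fragment_intensity + starting_material_fragment_intensity
    return product_fragment_intensity / total if total > 0 else 0.0

# Hypothetical wells: condition label -> (product fragment, starting material fragment)
wells = {
    "catalyst A / base 1": (8.2e5, 1.9e5),
    "catalyst B / base 1": (2.1e5, 9.4e5),
    "catalyst A / base 2": (5.6e5, 4.8e5),
}

# Rank the conditions by relative conversion, best first
ranked = sorted(wells.items(), key=lambda kv: relative_conversion(*kv[1]), reverse=True)
for conditions, (prod, sm) in ranked:
    print(f"{conditions}: relative conversion ~ {relative_conversion(prod, sm):.0%}")
```

Because every well in a screen is scored on the same relative scale, the best-performing conditions can be ranked without a separate calibration run for each new product.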

To demonstrate the generality of this method, which employs acoustic droplet ejection mass spectrometry, Blair’s team performed a screen of 384 chemical reactions, evaluating conditions for six different synthetic transformations on a single substrate. With an average analysis time of 1.2 seconds per sample, they identified the best-performing reagent combinations for each of the six transformations in under eight minutes, roughly equivalent to the time required to analyse two LC–MS samples.
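As a back-of-the-envelope check on that throughput (using the roughly two-minute conventional run time mentioned earlier as an approximation, not a figure from this study), the sketch below compares the two regimes:

```python
# Timing comparison using the figures quoted in the article.
samples = 384
fast_seconds_per_sample = 1.2    # reported average analysis time per sample
lcms_seconds_per_sample = 120    # ~2 minutes per conventional run, as mentioned above

print(f"Accelerated screen: {samples * fast_seconds_per_sample / 60:.1f} minutes")  # ~7.7
print(f"One-by-one LC-MS:   {samples * lcms_seconds_per_sample / 3600:.1f} hours")  # ~12.8
```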

‘This solves a big challenge for the pharmaceutical industry and I’m excited for how this can be used for novel reaction development,’ says Cernak. ‘I’m hopeful that this tool can give us another order of magnitude of data to get towards that future state where we can really predict the outcome of chemical reactions with machine learning.’

For Blair, the next stage will be to see how the method performs across wider chemical space, and he’s optimistic about the impact the technique will have on the chemical community. ‘This whole new way of thinking blows open the doors to making lots and lots of molecules very quickly and very efficiently,’ he says. ‘It’s showing that we can start to scout complex chemistry on complex scaffolds with very little material where it would ordinarily be high risk.’