Whether it’s robots, automation or software hacks, Nessa Carson finds ways for everyone to improve how they work in the lab
A modern chemistry laboratory looks very different to the historic dawn of the field, when huge vessels of mysterious liquids would be left to settle for days, or stirred at length by a diligent assistant. Lab automation is now all around us in ways so ubiquitous as to be unremarkable – the NMR autosampler carousel, automated flash chromatography and the magnetic stirrer that saves the assistant from boredom and repetitive strain injury. As well as physical lab equipment, we habitually automate software tasks. The NMR carousel would not be much use without autoshim functionality, and most of us would struggle to compute a Fourier transform by hand.
This basic type of lab automation is now commonplace – but where is the cutting edge, and how can labs that have never considered automation exploit its benefits? Historically, automation chemists have learned from biologists and chemical biologists, who raced ahead with large liquid-handling robots and peptide synthesisers. However, chemistry needs more varied environments than biology: we need to consider using highly reactive chemicals, employing oxygen-free atmospheres, and stirring and sampling heterogeneous mixtures.
Top-of-the-range pharma and materials robots that perform these tasks can be very expensive, but semi-automated chemistry with some or all of these features is much more readily within reach. And the great news is that much automation is built with user-friendliness in mind. Many robots don’t require coding skills to run, being controlled by simple graphical interfaces or the creation of recipe-like programs using drag-and-drop software or familiar Excel worksheets. Interoperable control and results files that can be integrated into broader systems are increasingly important for high-end customers, so these simple structures are likely to stay.
Expand your parameter space
It makes sense to use automation to run your chemistry more quickly, but simply doing the same work faster is of limited benefit. Instead, imagine the new possibilities you could work on with robots at your disposal. Automation and miniaturisation allow chemists to do certain types of chemistry much more effectively, although it takes up-front planning and a mindset switch to design higher-throughput sets of experiments rather than progressing linearly through one or a few reactions at a time. With appropriate equipment, there is little difference in setup time and effort between running 24 sets of conditions and running 48, so the ‘free’ extras can be used to explore around your reactions for little extra cost.
Chemists optimise reactions for many reasons: getting a yield that’s just good enough for medicinal chemistry, for example, or finding general conditions for a set of substrates. Running reactions one or a few at a time in an optimisation carries the risk of exploiting successes too early and becoming unknowingly stuck in a local maximum of reaction desirability. Perhaps this is why the organic chemistry literature is at present pervaded by a paradigm known as OFAT – ‘one factor at a time’ optimisation. This involves optimising a single variable such as the base, followed by perhaps a solvent screen, and lastly finding the best ligand – unfortunately, providing only the certainty that this is the best ligand for that highly specific combination of base and solvent. Chemists with robots – or just multichannel pipettes – to hand consider parameter space when running parallel plates of vials for optimisation.
Parameter space can be pictured as the multidimensional combinations of all experimental variables being considered. For the example above, this would be every possible combination of all the bases, solvents and ligands in the experiment. Somewhere in this domain lies a maximum of desirability. There are many statistical paradigms such as design of experiments (DoE) or closed-loop optimisation to help one find this, but equally, don’t be ashamed to run simple yet rigorous combinations of variables. Full-factorial experiments with every possible combination of every level of your chosen variables are habitual in expert automation labs too, due to the simplicity of their setup, analysis and understanding. In this way, they actually increase the throughput of understanding, giving readily comprehensible results, and the statistical redundancy reassuringly decreases fears of unseen error.
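For illustration, enumerating a full-factorial design takes only a few lines of Python – the reagent lists here are placeholders for your own:

```python
from itertools import product

# Illustrative variable levels - substitute your own reagents
bases = ["K2CO3", "Cs2CO3", "KOtBu"]
solvents = ["toluene", "dioxane", "DMAc"]
ligands = ["XPhos", "SPhos", "dppf", "BINAP"]

# Full-factorial design: every combination of every level
design = list(product(bases, solvents, ligands))
print(f"{len(design)} reactions")  # 3 x 3 x 4 = 36

for i, (base, solvent, ligand) in enumerate(design, start=1):
    print(f"vial {i:02d}: {base}, {solvent}, {ligand}")
```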
Similarly, chemical ‘libraries’ of any size to explore substrate scope are best thought of in terms of how they cover chemical space – a space covering all dimensions the experimentalist thinks are of interest. These could be calculated descriptors such as lipophilicity, or more simply for methodology, just categorical identity of the R groups. Running substrate scope with these thoughts in mind leads to more structured experiments, giving more insightful conclusions.
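As a minimal sketch of the descriptor approach – assuming the open-source RDKit toolkit is installed, with an illustrative three-compound library – calculated properties give each substrate coordinates in chemical space:

```python
from rdkit import Chem
from rdkit.Chem import Descriptors

# Illustrative substrate library as SMILES strings
substrates = {
    "anisole": "COc1ccccc1",
    "4-bromotoluene": "Cc1ccc(Br)cc1",
    "pyridine": "c1ccncc1",
}

# Lipophilicity (logP) and molecular weight as simple descriptors
for name, smiles in substrates.items():
    mol = Chem.MolFromSmiles(smiles)
    print(name, round(Descriptors.MolLogP(mol), 2), round(Descriptors.MolWt(mol), 1))
```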
Practical for both of these paradigms, aluminium wellplates that hold 2–8mL vials for heating and cooling will cost you just a few hundred pounds, though be aware that 96-well plates need either shakers or, preferably, specialised stirrers to keep mixing homogeneous across the plate. Multichannel pipettes are handy for dispensing the same amount of solvent or liquid reagent into different vials in parallel, while repeater pipettes perform the same task sequentially; both are also commonplace in specialist automation labs. To splash a little more cash but gain flexibility and utility, the next step up is likely a robotic liquid dispenser. Such robots can cost anywhere from a few thousand pounds to a six-figure sum, depending on their capabilities. Just as with the variety of liquid-dispensing tools used by hand in the lab, there are several mechanisms, such as gravimetric, volumetric and piezoelectric dispensing – however, the main thing to check at the lower-budget end of the scale is that your robot of interest is actually compatible with the solvents or reagents, and the volumes, you hope to dispense. The liquid handlers ubiquitous in biology are usually only made to deal with fairly benign solvents such as DMSO, ethanol and aqueous buffers.
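Many dispensing robots accept a spreadsheet-style worklist as input; the exact column format varies by instrument, but a hypothetical sketch of mapping conditions onto standard 96-well coordinates might look like this:

```python
import csv
from itertools import product

# Standard 96-well coordinates: A1..A12, B1..B12, ... H12
wells = [f"{row}{col}" for row, col in product("ABCDEFGH", range(1, 13))]

# Hypothetical design: each tuple is (base, solvent) for one vial
design = [("K2CO3", "toluene"), ("K2CO3", "dioxane"), ("Cs2CO3", "toluene")]

# Column names here are illustrative - check your instrument's expected format
with open("worklist.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["well", "base", "solvent", "volume_uL"])
    for well, (base, solvent) in zip(wells, design):
        writer.writerow([well, base, solvent, 500])
```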
Rather than run many reactions, you may wish to look at a single reaction in greater detail. Automated sensors and feedback loops can be incorporated into both flow and batch setups to retrieve in situ time-course data on anything from infrared spectra to colour to pH, for kinetics and mechanistic understanding. Some of these can be obtained very cheaply as individual components, though the less expensive they are, the more likely they are to require coding knowledge and some trial-and-error engineering to implement.
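At its simplest, such a time-course logger is a timed polling loop; in this sketch, read_sensor is a placeholder standing in for whatever driver call your probe actually provides:

```python
import csv
import time

def read_sensor():
    # Placeholder: replace with your instrument's driver call
    # (pH meter, IR probe, camera colour channel...)
    return 7.0

# Log one reading per minute for two hours to a CSV file
with open("timecourse.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["elapsed_s", "reading"])
    start = time.time()
    for _ in range(120):
        writer.writerow([round(time.time() - start, 1), read_sensor()])
        f.flush()  # keep the file current in case the run crashes
        time.sleep(60)
```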
Benefits for all
Publication benefits from automation, too. The lack of reproducibility of published procedures is often decried. Robots have the handy habit of logging what they actually did rather than what they should have done. Even simply connecting balances to computers effortlessly records dispensed masses accurately and precisely, where the experimentalist might otherwise assume ‘about 20mg’ is good enough – a boon when troubleshooting later. Of course, robots – and reagent sources – are diverse enough that small-scale reactions will not suddenly be fully reproducible to exact percentage values, but controlling process parameters on a small scale helps us understand and regulate the reaction, just as in manufacturing.
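Many laboratory balances can stream readings over a serial connection, so logging them takes only a few lines with the pyserial library – though the port name, settings and reading format below are assumptions to check against your balance’s manual:

```python
import csv
import datetime
import serial  # from the pyserial package

# Port and settings are assumptions - consult your balance's manual
with serial.Serial("COM3", baudrate=9600, timeout=2) as balance, \
        open("masses.csv", "a", newline="") as log:
    writer = csv.writer(log)
    # One line per reading; the exact string format varies by manufacturer
    raw = balance.readline().decode("ascii", errors="replace").strip()
    writer.writerow([datetime.datetime.now().isoformat(timespec="seconds"), raw])
```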
There is an extensive problem of full datasets being dumped as barely readable tables in image format in supplementary information files, sometimes even with details absent. Literature datasets generally have limited machine-readability, and therefore diminished reusability. This major frustration for an experimentalist trying to decipher the work is only felt more strongly by chemoinformaticians working to decrypt and understand the literature as a whole. Since robots require logical input and provide logical output, their software allows result tables to be generated in a more accessible and reusable format with less effort. This concept can be extended to writing procedures for patents and papers, and exporting analytical data from software, to make life far easier for both consumers and writers of the literature.
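The target format need not be complicated: a tidy table with one row per reaction, written here with Python’s built-in csv module and hypothetical results, is trivially machine-readable in a way a table screenshot never is:

```python
import csv

# Hypothetical results: one dictionary (row) per reaction
results = [
    {"vial": "A1", "base": "K2CO3", "solvent": "toluene", "yield_pct": 62},
    {"vial": "A2", "base": "Cs2CO3", "solvent": "toluene", "yield_pct": 81},
]

with open("results.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=list(results[0].keys()))
    writer.writeheader()
    writer.writerows(results)
```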
In another nod to the ‘better, not just faster’ paradigm of lab automation, since exporting data is now so easy, we can export more of what matters. Typically, the yield of a synthetic organic reaction is the most important output variable, so literature and even industrial project reports tend to focus on it. However, any optimisation chemist will tell you it’s unnecessarily frustrating to try to optimise a reaction (to meet new specifications, incorporate a new substrate, convert it to flow, and so on) when only yield information is known. Now, since exporting all our data at once is so simple, we can rapidly explore chemical space or parameter space, and visualise fuller profiles for every reaction we’ve performed, all at once. It is a way to truly explore and understand our chemistry.
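As a sketch with made-up numbers, a few lines of pandas and matplotlib can put conversion, yield and an impurity level on one plot rather than reducing each reaction to a single figure:

```python
import matplotlib.pyplot as plt
import pandas as pd

# Hypothetical full reaction profiles: more than just yield per vial
df = pd.DataFrame({
    "yield_pct":      [62, 81, 45, 90],
    "conversion_pct": [70, 95, 88, 97],
    "impurity_pct":   [3, 6, 30, 2],
})

# Conversion against yield; point size tracks the impurity level
df.plot.scatter(x="conversion_pct", y="yield_pct", s=df["impurity_pct"] * 10)
plt.show()
```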
With software automation, we can manipulate this large result dataset so that we are faced with something more compelling than a huge table of numbers. In the past, this meant hiring expert coders to create black-box programs in which chemists could point and click and hope to accomplish all the tasks they expected. Nowadays, open-source code is popular, and anybody can learn enough of the basics to perform powerful operations very quickly. No-code and low-code software automation options such as Microsoft Power Automate and KNIME Analytics Platform have a lower learning requirement and can perform diverse and powerful tasks, letting us complete in seconds tasks that are otherwise so repetitive and time-consuming as to be unsuitable for even the most patient of lab staff.
Python coding has now become the lingua franca of lab automation, diversifying the possible tasks even further, with many free ‘libraries’ premade specifically for interacting with written chemical structures and laboratory equipment. Scripted workflows cost only the time to write them, plus any additional time on user training – though ideally they should be fairly self-explanatory. Again, not just faster but better: even when performing the same task as a chemist, an optimised script will obviate transcription errors, and can offer greater functionality and rapid scalability. We can also be creative with our inputs to make chemistry easier: an example is using speech-to-text to match spoken commands against a predetermined list of allowed tasks, without requiring exact knowledge of the list (sketched below). Just as exporting the full reaction profile rather than the yield alone was almost free in terms of effort, interacting with code now takes only seconds, and we can use this to get the maximum value out of our chemistry.
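The matching half of that idea needs nothing beyond the standard library: Python’s difflib can pick the closest allowed task from an imperfect transcription, whichever speech-to-text engine produced it. The task list and the ‘heard’ string here are illustrative:

```python
import difflib

# Predetermined list of tasks the robot is allowed to perform
allowed_tasks = ["dispense solvent", "start stirring", "heat to setpoint", "sample aliquot"]

# Imagine this string arrived from any speech-to-text engine
heard = "start the stirrer"

# Closest allowed task, if any is similar enough to trust
match = difflib.get_close_matches(heard, allowed_tasks, n=1, cutoff=0.5)
print(match)  # ['start stirring']
```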
Lab hardware and software automation has been around for decades, and many aspects are now business as usual. The newer dawn of cheap components and accessible data manipulation is bringing the power of cutting-edge automation to more chemists – if only we are willing to change our mindset about how we do chemistry.
Nessa Carson is a digital chemist based in Macclesfield, UK