Thomas Hartung discusses the next generation of toxicity testing and the regulatory science of the future
Regulatory toxicology is used to control about $10 trillion (£6.2 trillion) worth of worldwide trade in chemicals, drugs, pesticides, food and consumer products. However, regulators and the regulated community – as well as the general public – are increasingly uncomfortable with the toolbox used: only one in eight commercially used substances has been tested for any toxicological effect at all; the available tests are not suited to novel products; the precautionary approach taken hampers the use of many likely beneficial substances; and animal use is viewed ever more critically by the public.
While Europe has been pursuing change under the label of alternative methods and the 3Rs (reduce, replace, refine) for more than two decades, it took a 2007 National Research Council report on the subject to prompt serious efforts to change the US system. The 3Rs approach had suggested either optimising animal tests (fewer animals, less pain and distress) or replacing traditional tests with in vitro or in silico models. However, there are too many cell types, too many interactions and too many compromises – in culture conditions and in the data available for modelling – to allow a redesign of the current system based on these first-generation alternatives. Thus, there is a need to accelerate the renovation of regulatory science.
The answer emerging in Europe is to combine various information sources in integrated (or intelligent) testing strategies (ITS). Despite the enormous impact this could have on EU programmes such as Reach and the testing ban for cosmetic ingredients, efforts to advance it are astonishingly limited. Combining and validating different tests within an ITS remains a challenge, but promising projects such as the EU’s ReProTect have shown the feasibility of the approach. Although imperfect, it might soon be better than the current compromise of using other species to assess human safety.
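To make the idea concrete, the sketch below shows, in Python, one hypothetical way a tiered ITS might weigh different evidence streams and only call for further testing when the combined picture is inconclusive. The scores, weights and thresholds are invented for illustration and are not taken from ReProTect or any regulatory scheme.

```python
from typing import Optional

def its_verdict(qsar_score: float,
                read_across_score: Optional[float] = None,
                in_vitro_score: Optional[float] = None) -> str:
    """Combine whatever evidence is available (0 = safe, 1 = hazardous)."""
    # Each evidence stream gets a (score, weight); more direct evidence
    # carries more weight. All weights are arbitrary placeholders.
    evidence = [(qsar_score, 1.0)]                    # in silico (QSAR) prediction
    if read_across_score is not None:
        evidence.append((read_across_score, 1.5))     # read-across from analogues
    if in_vitro_score is not None:
        evidence.append((in_vitro_score, 2.0))        # in vitro assay result

    combined = sum(s * w for s, w in evidence) / sum(w for _, w in evidence)

    if combined >= 0.7:
        return "likely hazardous"
    if combined <= 0.3:
        return "likely safe"
    return "inconclusive - trigger the next testing tier"

print(its_verdict(0.9, read_across_score=0.7))  # -> likely hazardous
print(its_verdict(0.4))                         # -> inconclusive - trigger the next testing tier
```

The point is not the arithmetic but the structure: cheap in silico evidence is always used, more expensive tests are requested only when needed, and every verdict remains traceable to the evidence behind it.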
The answer emerging in the US is to adopt a new, molecular description of toxic action based on pathways of toxicity (PoT) – the biomolecular pathways that lead to a cell being harmed. The hypothesis is that there is a limited number of such pathways, permitting the design of simple testing batteries. The Environmental Protection Agency (EPA) is spearheading the evaluation of PoT-based approaches with its ToxCast programme, which uses high-throughput assays developed mainly for the pharmaceutical industry. ToxCast has already proven its utility by informing the choice of dispersant for the Gulf of Mexico oil spill in 2010.
This approach represents a paradigm shift in toxicity testing – identifying a substance’s effect on harmful cellular pathways, rather than interpreting effects on cell lines or entire organisms. Modern technologies, especially those that determine gene expression and metabolite profiles in cells (transcriptomics and metabolomics), allow wholesale characterisation of the disturbances produced by a toxicant. These signatures are increasingly used for hazard identification, but are blurred by biological noise and the unique features of the given cell system. By deducing the underlying PoT, it should be possible to annotate them to cell types, species, hazards or classes of toxicants. Activities led by the Johns Hopkins Center for Alternatives to Animal Testing aim to develop a public database that will map the entirety of these pathways for human biology – the human toxome. This will not yet be a new regulatory toxicology, but a point of reference upon which it can be built (just as the human genome does not represent a therapy, but a point of reference for the life sciences). With the expectation that we face only a few hundred PoT, a substantial number should be identified within a decade of work. The Organisation for Economic Co-operation and Development is already embracing this concept under the label ‘adverse outcome pathways’.
There is a third approach emerging in the US, prompted by the need to develop and evaluate countermeasures for biological and chemical warfare and terrorism. Although both ITS and PoT are viable options in this field, the last few months have seen a remarkable $200 million go toward exploring organ models on microfluidic chips. There is a lot to be said about the promise and limitations of this concept, but it will certainly bring a large number of developers into the arena of predictive in vitro testing. It will be of utmost importance that the lessons learned over the last two decades about alternative testing methods and their validation guide these developments.
The toxicology of the future will be based on multiple information sources, not stand-alone animal tests. This will include strengthened exposure assessments, integrated use of assays defined at the molecular level (PoT) and computer modelling for in vitro to in vivo extrapolation. Toxicology will express its results not as a black-and-white verdict, but as a probability of harm with an uncertainty measure.
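As a purely illustrative sketch of what such a probabilistic verdict could look like – the data, threshold and bootstrap approach here are assumptions, not drawn from any regulatory guidance – replicate readouts of a PoT marker can be resampled to report a probability of pathway activation together with an uncertainty interval:

```python
import random

random.seed(1)

# Hypothetical replicate readouts: fold-change of a PoT marker in a cell assay
readouts = [1.8, 2.4, 2.1, 1.6, 2.9, 2.2, 1.9, 2.5]
threshold = 2.0  # nominal activation threshold (invented for illustration)

def prob_exceeds(sample, limit):
    """Fraction of replicates above the threshold."""
    return sum(x > limit for x in sample) / len(sample)

# Bootstrap the replicates to see how stable that fraction is
estimates = []
for _ in range(5000):
    resample = [random.choice(readouts) for _ in readouts]
    estimates.append(prob_exceeds(resample, threshold))

ranked = sorted(estimates)
point = prob_exceeds(readouts, threshold)
low, high = ranked[int(0.025 * len(ranked))], ranked[int(0.975 * len(ranked))]

print(f"probability of pathway activation: {point:.2f} "
      f"(95% interval {low:.2f}-{high:.2f})")
```

A real system would combine many such markers with exposure estimates and extrapolation models, but the output format – a probability with an uncertainty measure rather than a yes/no label – is the point.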
With global coordination, and with the support of the corporate sector, agencies, the scientific community, NGOs and the general public, efforts to improve toxicity testing may come to fruition closer to 88 months than 88 years. But we will only know once regulatory toxicology 2.0 has emerged.
Thomas Hartung is professor of toxicology, pharmacology, molecular microbiology and immunology at Johns Hopkins Bloomberg School of Public Health, Baltimore, US, and the University of Konstanz, Germany
References
- Toxicity Testing in the 21st Century: A Vision and a Strategy, National Research Council, 2007