Exploring the transformation from traditional toxicology to precision risk assessment powered by genomic technologies
For decades, protecting people from harmful chemicals in our environment has relied on methods that would be familiar to a 1970s toxicologist: exposing laboratory animals to high doses of chemicals and observing the consequences. While this approach has served public health well, it's time-consuming, expensive, and leaves critical questions unanswered.
How do these chemicals actually affect human cells? Why are some people more vulnerable than others? What about the thousands of chemicals we encounter in tiny amounts every day?
We're now witnessing a dramatic transformation in chemical safety assessment, where cutting-edge technologies are revealing the inner workings of toxicity at the molecular level.
This isn't just an incremental improvement—it's a complete overhaul of how we understand the relationship between environmental chemicals and human health [5].
For decades, regulators worldwide have followed a systematic process for evaluating chemical dangers. The United States Environmental Protection Agency (EPA) breaks this down into four key steps [1]:

1. **Hazard identification:** determining whether a chemical has the potential to cause harm under specific circumstances
2. **Dose-response assessment:** establishing the numerical relationship between exposure amount and effect severity
3. **Exposure assessment:** examining how, when, and to what levels people encounter the chemical
4. **Risk characterization:** combining all information to estimate the probability of harm occurring
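The four steps above can be sketched, very loosely, as a toy pipeline. Everything in this sketch is hypothetical: the names, the numbers, and the simple Hill-type dose-response curve are illustrative stand-ins, not an EPA model.

```python
from dataclasses import dataclass

# Toy sketch of the four-step risk assessment process. All names, values,
# and the Hill-type curve below are hypothetical, not regulatory models.

@dataclass
class ChemicalProfile:
    name: str
    hazardous: bool        # step 1: hazard identification
    ec50: float            # step 2: dose producing a half-maximal effect (mg/kg/day)
    daily_exposure: float  # step 3: estimated human exposure (mg/kg/day)

def dose_response(dose: float, ec50: float, hill: float = 1.0) -> float:
    """Step 2: Hill-type curve mapping a dose to an effect fraction in [0, 1]."""
    return dose**hill / (ec50**hill + dose**hill)

def characterize_risk(chem: ChemicalProfile) -> float:
    """Step 4: combine hazard, dose-response, and exposure into one estimate."""
    if not chem.hazardous:
        return 0.0
    return dose_response(chem.daily_exposure, chem.ec50)

solvent = ChemicalProfile("solvent-X", hazardous=True, ec50=10.0, daily_exposure=2.0)
print(f"{characterize_risk(solvent):.3f}")  # effect fraction at the estimated exposure
```

The point of the sketch is the structure, not the math: each of the four steps contributes one piece of information, and only the final characterization step combines them.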
Traditional risk assessment struggles with several modern challenges. It's notoriously slow—a single comprehensive assessment can take years and cost millions of dollars. This creates a critical bottleneck when we need to evaluate thousands of existing chemicals plus new materials like nanomaterials and biopolymers entering the marketplace [5].
Traditional methods fall short on four fronts: time consumption, cost, consideration of human variability, and throughput capacity.
"Children may be more vulnerable to environmental exposures than adults because their bodily systems are developing; they eat more, drink more, and breathe more in proportion to their body size" [1]. Traditional methods have limited ability to identify such susceptible populations or explain why they're more vulnerable.
The transformation began with the sequencing of the human genome, which provided the first comprehensive blueprint of human biology. Scientists realized that instead of waiting to see obvious signs of sickness in laboratory animals, they could detect early warning signals at the molecular level—changes in gene activity, protein production, and metabolic processes [3].
This shift in perspective launched the field of toxicogenomics—the study of how chemicals affect our genes and cellular processes. Researchers discovered they could identify unique molecular "fingerprints" that reveal not just whether a chemical is toxic, but how it causes harm [3].
The integration of genomic science into risk assessment is unfolding in three overlapping phases [3]:
| Phase | Focus | Key Activities | Status |
|---|---|---|---|
| Augmentation | Strengthening existing methods | Using genomic data to support traditional risk conclusions | Current practice |
| Integration | Combining old and new approaches | Developing quantitative predictors that blend molecular and animal data | Emerging now |
| Reorientation | Transforming the foundation | Creating new models based on human disease pathways | Future direction |
This evolution represents a fundamental shift from observing what happens to understanding why it happens—and using that knowledge to predict risks more accurately and quickly.
In 2012, a consortium of federal and state agencies launched the NexGen (Next Generation Risk Assessment) program with a bold mission: to systematically incorporate molecular and systems biology into risk assessment practice [5]. This collaborative effort brought together the EPA, National Institute of Environmental Health Sciences, Centers for Disease Control and Prevention, Food and Drug Administration, and others to pool knowledge, data, and analyses.
The core experiment was both simple and revolutionary: develop prototype risk assessments that compare results from traditional, data-rich methods with insights gained from new types of molecular data. By running these parallel tracks, researchers could validate new approaches, improve traditional methods, and determine the value of different types of scientific information [5].
The NexGen program developed a structured approach to integrate new data types [5]:

1. Engaging diverse stakeholders to define the specific risk questions to be addressed
2. Creating systems to mine and integrate diverse data sources—from molecular biology databases to traditional toxicology studies
3. Implementing a flexible framework that matches assessment methods to the specific risk context
4. Building and testing model assessments to demonstrate proof of concept
The tiered assessment approach is particularly innovative, creating a continuum of methods from rapid screening to comprehensive evaluation [5]:
| Tier | Assessment Method | Cost & Time | Scientific Certainty | Best Use Cases |
|---|---|---|---|---|
| Tier 1 | High-throughput screening, computational models | Low | Moderate | Priority setting, chemical design |
| Tier 2 | Targeted testing, limited new data | Medium | Medium-high | Most existing chemicals |
| Tier 3 | Comprehensive traditional and novel data | High | High | High-priority or high-uncertainty chemicals |
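The tier table can be read as a routing rule: match the assessment context to the cheapest tier that gives enough certainty. The selection logic below is a hypothetical sketch; the context labels and rules are illustrative, not the NexGen program's actual decision criteria.

```python
# Hypothetical tier-selection sketch. The purpose/uncertainty labels and the
# routing rules are illustrative, not NexGen's actual decision criteria.

def select_tier(purpose: str, uncertainty: str) -> int:
    """Map an assessment context to a tier (1 = rapid screening, 3 = comprehensive)."""
    if purpose == "priority_setting":
        return 1  # Tier 1: high-throughput screening and computational models
    if purpose == "high_priority" or uncertainty == "high":
        return 3  # Tier 3: comprehensive traditional and novel data
    return 2      # Tier 2: targeted testing covers most existing chemicals

print(select_tier("existing_chemical", "low"))  # most chemicals land in Tier 2
```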
The NexGen program has demonstrated that new approach methodologies can provide deeper insights into how chemicals cause harm, particularly at low exposure levels that have traditionally been difficult to study [5]. The molecular data has proven especially valuable for identifying susceptible subpopulations and understanding the biological pathways that lead to adverse effects.
Perhaps most significantly, these methods are beginning to address long-standing questions about the "mode of action"—the precise biological steps through which a chemical produces toxic effects. This knowledge doesn't just help identify hazardous chemicals; it provides crucial information for designing safer alternatives.
The revolution in risk assessment is powered by an array of cutting-edge technologies that would seem more at home in a science fiction novel than a traditional toxicology lab.
These "New Approach Methodologies" (NAMs) include [2][5]:

- **High-throughput screening:** rapidly tests thousands of chemicals using automated systems to identify potentially hazardous chemicals from large inventories.
- **Transcriptomics:** measures changes in gene expression after chemical exposure to reveal mechanisms of toxicity and molecular warning signs.
- **Proteomics:** analyzes changes in protein patterns in cells or tissues to identify protein biomarkers associated with toxic responses.
- **Metabolomics:** measures changes in small molecules involved in cellular processes to detect early indicators of metabolic disruption.
- **Bioinformatics:** uses computational tools to analyze complex biological data and integrate information from multiple sources to predict toxicity.
- **Adverse outcome pathways (AOPs):** map sequences of events from molecular initiation to adverse effects, providing a framework for using mechanistic data in risk assessment.
These tools don't just work in isolation—they're increasingly used in Integrated Approaches to Testing and Assessment (IATA) that combine multiple methods to build a comprehensive picture of chemical safety [2].
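The adverse outcome pathway idea is easiest to see as an ordered chain of events, from a molecular initiating event through intermediate key events to an adverse outcome. The generic pathway below is purely illustrative, not a curated AOP.

```python
# Illustrative adverse outcome pathway (AOP): a generic ordered chain from a
# molecular initiating event to an adverse outcome. Not a curated AOP entry.

aop = [
    ("molecular initiating event", "chemical binds a cellular receptor"),
    ("key event", "altered gene expression"),
    ("key event", "cellular dysfunction"),
    ("key event", "organ-level damage"),
    ("adverse outcome", "disease at the organism level"),
]

def describe(pathway):
    """Render the pathway as a readable chain of events."""
    return " -> ".join(event for _, event in pathway)

print(describe(aop))
```

Representing the pathway as ordered data rather than prose is what lets mechanistic measurements (e.g., a transcriptomic signal for "altered gene expression") anchor to a specific step in the chain.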
The transformation of risk assessment is happening worldwide. Europe's REACH legislation has generated substantial new data on chemicals in commerce [5]. The Tox21 consortium—a collaborative effort of U.S. federal agencies—is screening approximately 10,000 chemicals using more than 100 automated assays [5]. Meanwhile, the European Partnership for the Assessment of Risks from Chemicals is developing innovative methods for chemical safety assessment [2].
Despite the exciting progress, full implementation of these new approaches is likely to take 10-20 years [5]. The transition requires not just scientific advances but also new policies, procedures, and training for the next generation of risk assessors.
The ultimate goal is a comprehensive system where we can rapidly screen new chemicals, thoroughly evaluate those of greatest concern, and design inherently safer products and processes—all while reducing our reliance on animal testing [5].
The genomic revolution in risk assessment represents more than just technical progress—it promises a fundamental shift in how we protect public health from environmental chemicals.
By understanding not just whether chemicals are toxic but how they cause harm, we can make smarter decisions about which chemicals to use, how to use them safely, and how to design better alternatives.
This new approach doesn't discard decades of toxicology knowledge but builds upon it, adding deeper layers of understanding that come from viewing toxicity through the lens of human biology. As these methods continue to evolve and improve, they offer the promise of faster, cheaper, and more accurate protection of public health—ensuring that we can keep pace with the rapidly expanding chemical landscape of the 21st century.
The journey from observing sick laboratory animals to understanding molecular pathways of toxicity has been long, but the destination—a world where we can confidently identify chemical hazards before they cause harm—is finally within sight.