Big data beats animal testing for finding toxic chemicals

Scientists may be able to better predict the toxicity of new chemicals through data analysis than with standard tests on animals, according to a new study.

The researchers say they developed a large database of known chemicals and then used it to map the toxic properties of different chemical structures. They then showed that they could predict the toxic properties of a new compound from its structural similarity to known chemicals, more accurately than a standard animal test could.

The most advanced toxicity-prediction tool the team developed reproduced the consensus results of animal tests with about 87 percent accuracy on average, across nine common tests that together account for 57 percent of the world’s animal toxicology testing. By contrast, any one of those animal tests, when repeated, had only an 81 percent chance, on average, of returning the same toxicity result.

“These results are a real eye-opener,” says principal investigator Thomas Hartung, professor of environmental health and engineering at the Bloomberg School of Public Health at Johns Hopkins University. “They suggest that we can replace many animal tests with computer-based prediction and get more reliable results.”

The computerized approach could also be applied to many more chemicals than animal testing, he says. Due to costs and ethical challenges, only a small fraction of the roughly 100,000 chemicals in consumer products has been comprehensively tested.

Animals such as mice, rabbits, guinea pigs, and dogs undergo millions of chemical toxicity tests each year in labs around the world. Although this animal testing is usually required by law to protect consumers, there is opposition to it on moral grounds. It is also unpopular with product manufacturers because of its high cost and the uncertainty of its results.

“A new pesticide, for example, might require 30 separate animal tests, costing the sponsoring company about $20 million,” says Hartung, who also directs the university’s Center for Alternatives to Animal Testing.

The most common alternative to animal testing is a process called read-across, in which researchers predict a new compound’s toxicity from the known properties of a few chemicals with a similar structure. Read-across is much less expensive than animal testing, but it still requires expert evaluation and somewhat subjective analysis for every chemical of interest.
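
To make the read-across idea concrete, here is a minimal sketch in Python. It uses the open-source RDKit library for molecular fingerprints and Tanimoto similarity; the chemicals, toxicity labels, and similarity-weighted vote are illustrative assumptions, not the study’s data or method.

```python
# Minimal read-across sketch (illustrative; not the study's tool).
# Requires RDKit: pip install rdkit
from rdkit import Chem, DataStructs
from rdkit.Chem import AllChem

# Hypothetical reference data: SMILES strings with known labels
# (1 = irritant, 0 = non-irritant). These labels are made up.
KNOWN = {
    "CCO": 0,        # ethanol
    "CC(=O)O": 1,    # acetic acid
    "c1ccccc1O": 1,  # phenol
    "CCCCCC": 0,     # hexane
}

def fingerprint(smiles):
    """Morgan (circular) fingerprint as a 2048-bit vector."""
    mol = Chem.MolFromSmiles(smiles)
    return AllChem.GetMorganFingerprintAsBitVect(mol, 2, nBits=2048)

def read_across(query_smiles, k=3):
    """Predict a label from the k most structurally similar known chemicals."""
    query_fp = fingerprint(query_smiles)
    scored = sorted(
        ((DataStructs.TanimotoSimilarity(query_fp, fingerprint(s)), label)
         for s, label in KNOWN.items()),
        reverse=True,
    )
    neighbors = scored[:k]
    # Similarity-weighted vote among the nearest neighbors.
    total = sum(sim for sim, _ in neighbors) or 1.0
    return sum(sim * label for sim, label in neighbors) / total

# n-propanol is structurally close to ethanol, so the score skews non-toxic
# on this toy data (closer to 1.0 = more likely toxic).
print(read_across("CCCO"))
```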

As a step towards optimizing and automating the read-across process, Hartung and colleagues assembled the world’s largest machine-readable toxicological database two years ago. It contains information on the structures and properties of 10,000 chemical compounds, based in part on 800,000 separate toxicology tests.

“There is enormous redundancy in this database,” Hartung says. “We found that often the same chemical has been tested dozens of times in the same way, such as putting it into rabbits’ eyes to check if it’s irritating.” This waste of animals, however, gave the researchers information they needed to develop a benchmark for a better approach.

The researchers have now enlarged the database and used machine-learning algorithms, with computing muscle provided by a cloud server system, to read the data and generate a “map” of known chemical structures and their associated toxic properties. They developed related software to determine precisely where any compound of interest belongs on the map, and whether—based on the properties of compounds “nearby”—it is likely to have toxic effects such as skin irritation or DNA damage.
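
As a rough illustration of how such a map can classify a new compound, the sketch below uses a k-nearest-neighbor classifier from scikit-learn over placeholder fingerprint data. The features, labels, neighbor count, and distance metric are assumptions for demonstration, not the study’s actual model.

```python
# Rough sketch of the "map" idea: k-nearest-neighbor prediction over
# chemical fingerprints (placeholder data; not the study's actual model).
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)

# Stand-in for a fingerprint matrix: one 2048-bit row per known chemical,
# plus a 0/1 toxicity label per chemical for one endpoint (e.g. skin irritation).
X_known = rng.integers(0, 2, size=(1000, 2048)).astype(bool)
y_known = rng.integers(0, 2, size=1000)

# Jaccard distance on bit vectors is the complement of Tanimoto similarity,
# so "nearby" on this map means "structurally similar."
model = KNeighborsClassifier(n_neighbors=10, metric="jaccard")
model.fit(X_known, y_known)

# A new compound is placed on the map and classified from its neighbors.
x_new = rng.integers(0, 2, size=(1, 2048)).astype(bool)
print(model.predict_proba(x_new))  # [P(non-toxic), P(toxic)]
```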

“Our automated approach clearly outperformed the animal test, in a very solid assessment using data on thousands of different chemicals and tests,” Hartung says. “So it’s big news for toxicology.”

Underwriters Laboratories, a company that specializes in developing public safety standards and testing products against them, cosponsored the research and is making the read-across software tool commercially available.

The US Food and Drug Administration and the Environmental Protection Agency have begun formal evaluations of the new method to test whether read-across can substitute for a significant proportion of the animal tests currently used to evaluate the safety of chemicals in foods, drugs, and other consumer products.

The researchers are also starting to use it to help some large corporations, including major technology companies, determine if they have potentially toxic chemicals in their products.

“One day, perhaps, chemists will use such tools to predict toxicity even before synthesizing a chemical so that they can focus on making only non-toxic compounds,” Hartung says.

The research appears in the journal Toxicological Sciences.

The National Institute of Environmental Health Sciences and the European Commission funded the work.

Source: Johns Hopkins University