We’ve just released our first animated video, “The Future of Toxicology” – an overview of the transformation that is occurring in toxicology. Check it out!
You’ll find the Human Toxicology Project Consortium at the Society of Toxicology’s annual meeting in New Orleans next week – in the ToxExpo center and at poster sessions, workshops, and seminars.
The draft program is as follows:
12:30 PM—Box Lunch (for pre-registered participants) and Welcome by Thomas Hartung, Johns Hopkins University
1:00 PM—Invited Speakers (10-minute presentations, each followed by 5 minutes of discussion)
ToxCast Update: Russell Thomas, US Environmental Protection Agency
EDSP21 Update: David Dix, US Environmental Protection Agency
Tox21 Update: Richard Paules, US National Toxicology Program
Hamner TT21C Update: Melvin Andersen, Hamner Institutes
NICEATM Update: Warren Casey, NICEATM
SEURAT/EU Tox-Risk Update: Michael Schwarz, University of Tuebingen
CAAT’s Read-Across Initiative and Human Toxome-Related Activity Update: Thomas Hartung, Johns Hopkins
Human Toxicology Project Consortium Update: Catherine Willett, HTPC
Evidence-Based Toxicology Update: Martin Stephens, Johns Hopkins
3:15 PM—Open Microphone for Additional Presentations and Discussion
Corporate members and partners of HTPC will be presenting at SOT next week, as well. Scientists from each of the member corporations are coauthors on the following posters:
A new infographic produced by the Human Toxicology Project Consortium shows in three sections how the future of toxicity testing promises a steady reduction in testing costs, increases in human relevance and confidence in safety assessments, and the eventual elimination of animal tests.
The first section provides a snapshot comparison of the current and future costs, efficiency, and efficacy of toxicity testing, while the middle section uses pesticide testing as a specific example, contrasting the current approach, a near-future approach, and an optimal approach that, given the necessary focus and resources, is envisioned as achievable within the decade.
The near-future and optimal approaches rely increasingly on our understanding of biology, using it to build a predictive systems-biology platform composed of an interrelated network of biological pathways. This platform is used to design and interpret tests that characterize chemical activity far more efficiently and effectively, and those characterizations can in turn be used to predict the safe use of chemicals.
Finally, the results of this progression are captured in the summary graphic at the end – decreasing costs, animal use, and time, while human relevance and our confidence in safety decisions continue to improve.
As explained on our Project page, the Human Toxicology Project Consortium works on three areas critical for the successful, international implementation of a pathways-based approach to chemical safety testing: advancing the science, communicating the purpose and goals of pathway-based toxicology, and lobbying for funding and policy changes that will support pathway-based approaches in the US and around the world.
To advance our communication and education efforts, HTPC member organizations worked together to create this infographic, to quickly and effectively illustrate the differences between traditional animal-based toxicity testing and pathway-based testing in terms of predictive power, cost, and testing capacity.
Details on the numbers used in this comparison are available here (PDF).
At a recent Capitol Hill science briefing organized by the American Chemical Society (ACS) and the American Chemistry Council (ACC), Human Toxicology Project Consortium coordinator Kate Willett joined toxicologists from industry and the EPA to discuss how reforms to the Toxic Substances Control Act (TSCA) can capitalize on scientific advances in non-animal test methods. Participants explained how technologies such as high-throughput screening, organs-on-chips, and computational modeling will improve the relevance and efficiency of safety assessments and produce crucial information more quickly. The Royal Society of Chemistry’s Chemistry World covered the briefing.
The experts gathered at the briefing agreed that tremendous advances had been made since the 1970s in understanding how chemicals can interact with biological systems – at the molecular, cellular and organ level. For example, high throughput screening now enables thousands of chemicals to be evaluated in a matter of hours or days….
Kate Willett, a toxicologist at the Humane Society of the US, noted that the critical goal of [TSCA] is to protect human health and the environment. This means a system is needed that can quickly identify potential problems and address them in the most time- and cost-effective way possible.
Willett stressed that any new TSCA reform measure must allow for “the continuing evolution of this science.” Therefore, she said, the final updated law should require that all alternative approaches be used before moving to animal testing. “Reducing reliance on animal testing allows more chemicals to be more thoroughly assessed in the most efficient way possible – a win for environmental protection and the industry, and also for the animals that are used in this testing.”
The House and Senate have both passed TSCA reform bills (H.R. 2576 and S. 697) and now must reconcile differences between the two versions.
Last year, HTPC coordinator Dr. Kate Willett co-authored a publication on using QSARs to reduce animal tests.
Now, for one month only, publisher Taylor & Francis is making that article (along with others published in its computer science journals in 2014) freely available to read when you access it through Twitter! Here’s the link to use to reach the article: https://twitter.com/htpconsortium/status/631191189094400000
From Drug Discovery & Development, Algorithm Helps Scientists Decipher How Drugs Work Inside the Body:
Researchers at Columbia University Medical Center (CUMC) have developed a computer algorithm that is helping scientists see how drugs produce pharmacological effects inside the body. The study, published in the journal Cell, could help researchers create drugs that are more efficient and less prone to side effects, suggest ways to regulate a drug’s activity, and identify novel therapeutic uses for new and existing compounds.
…The method involves creating a computational model of the network of protein interactions that occur in a diseased cell. Experiments are then performed to track gene expression changes in diseased cells as they are exposed to a drug of interest. The DeMAND algorithm combines data from the model with data from the experiments to identify the complement of proteins most affected by the drug. …
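To make the general idea more concrete, here is a minimal, hypothetical sketch of this kind of network-based analysis – not the actual DeMAND implementation, and the toy network, expression values, and scoring function are invented for illustration only. It scores each protein by how strongly the gene expression of its local interaction neighborhood shifts after drug treatment:

```python
# Hypothetical sketch of network-based drug mechanism analysis
# (illustrative only; not the DeMAND algorithm itself).
from typing import Dict, List, Tuple

# Toy protein-interaction network: protein -> interacting partners.
network: Dict[str, List[str]] = {
    "TP53":  ["MDM2", "ATM"],
    "MDM2":  ["TP53"],
    "ATM":   ["TP53", "CHEK2"],
    "CHEK2": ["ATM"],
}

# Toy expression changes (log2 fold change, drug-treated vs. untreated cells).
expression_change: Dict[str, float] = {
    "TP53": 1.8, "MDM2": -0.9, "ATM": 0.4, "CHEK2": 0.1,
}

def dysregulation_score(protein: str) -> float:
    """Average absolute expression change over a protein and its neighbors."""
    members = [protein] + network.get(protein, [])
    return sum(abs(expression_change.get(m, 0.0)) for m in members) / len(members)

# Rank proteins by how perturbed their local network appears after treatment.
ranked: List[Tuple[str, float]] = sorted(
    ((p, dysregulation_score(p)) for p in network),
    key=lambda item: item[1],
    reverse=True,
)

for protein, score in ranked:
    print(f"{protein}: {score:.2f}")
```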
The study’s senior author notes that the process “could accelerate the drug discovery process and reduce the cost of drug development by unraveling how new compounds work in the body.” Read more here.
(Reprinted from the April AltTox Digest; used with permission.)
At the start of this year’s SOT satellite meeting, “Updates on Activities Related to 21st Century Toxicology and Evidence-based Toxicology” (co-sponsored by the Center for Alternatives to Animal Testing [CAAT], the Human Toxicology Project Consortium [HTPC] and the Evidence-based Toxicology Collaboration [EBTC]), co-moderator Thomas Hartung noted that the annual gathering began in 2009 with 12 people in attendance. This year, at least 80 people attended – an impressive crowd for a meeting that takes place in the final hours of the week-long Society of Toxicology convention. The annual meeting features updates on US and EU programs and projects dedicated to advancing the toxicity-testing paradigm outlined in the NRC’s 2007 report, Toxicity Testing in the 21st Century: A Vision and a Strategy.
Richard Paules (US National Toxicology Program) started the presentations with a report on progress in the interagency Tox21 program. The program has moved into Phase III, during which they will be increasing the use of computer models for in vitro to in vivo extrapolation, adding new cell lines, expanding the pathway coverage and human relevance of assays, and developing a high-throughput (HT) transcriptomics platform.
Rusty Thomas (US Environmental Protection Agency) then described a number of initiatives underway in the ToxCast program, including research to develop the metabolic competence of existing assays, developing new assays for priority targets such as the thyroid, and exploring the use of organotypic cell cultures. ToxCast is also expanding its read-across program (and recently hired AltTox Editorial Board member Grace Patlewicz to spearhead that effort).
David Dix (US Environmental Protection Agency) gave an overview of the progress in the EPA’s Endocrine Disruptor Screening Program (EDSP), noting that improved technologies are greatly accelerating the project. The agency is concentrating on building user confidence in its screening battery and expanding the use of computational modeling. (Read an introduction to the EDSP in this two-part In the Spotlight article.)
Melvin Andersen (Hamner Institutes), a co-author of the NRC’s 2007 report, noted that considerable technical and scientific progress has been made in the eight years since publication of the NRC’s recommendations. Several key pathways have been well described, and others are under construction. The new challenge is to determine how to communicate this progress to the public and build their confidence in these methods.
Mark Cronin (Liverpool John Moores University) then provided an overview and update on the six components of the EU’s SEURAT-1 program. (Read more about the SEURAT-1 program in this New Perspective article.) A number of useful tools are coming out of this project, but the key outputs are proof-of-concept case studies. Level 1 studies are designed to demonstrate methods for consolidating existing knowledge to describe key adverse outcome pathways (AOPs). Level 2 studies demonstrate the integration of in vitro and in silico tools to generate predictive models. Level 3 studies, to be finalized later this year, will demonstrate how these models and knowledge bases can be used in quantitative and read-across-based risk assessment and decision-making.
Thomas Hartung reported on the activities of CAAT. Among many initiatives, CAAT is developing a read-across program that aims to facilitate 2018 REACH registrations. CAAT has also been coordinating a series of workshops in Europe and the US to advance “green toxicology” – using in silico tools to design safer chemicals. Filling in for scheduled presenter Marty Stephens, Hartung also described the work of the EBTC, which has been developing and promoting the methods and uses of systematic review in toxicology. Hartung noted that evidence-based toxicology stands to advance twenty-first century toxicology in several ways, including providing a means of assessing the quality of legacy data and new assays, providing guidance on integrating data sources, and ultimately using this information to facilitate validation procedures.
Catherine Willett updated the group on HTPC activities. Willett explained that the HTPC focuses its efforts on three areas: contributing to the advance of relevant science by sponsoring workshops and seminars, lobbying in the US and EU to increase funding for key government initiatives, and developing communication strategies to encourage regulatory and public acceptance of the NRC’s testing strategy. In the last year, the group has especially concentrated on this third area, developing an informative graphic and a series of videos that will be posted on the group’s website later this year. She noted that the HTPC also co-sponsored a seminar at SOT this year – “AOPs 201,” which covered the development and use of AOPs for regulatory purposes. Videos from the seminar will be shared on the group’s website.
The meeting closed with its traditional “open mic” segment – inviting short presentations or discussion questions from those in attendance. The update meeting will convene again at the end of next year’s SOT convention in New Orleans.
(Registered participants of the meeting will be able to access presentation slides on the EBTC website.)