There have been several compelling driving forces for approaching toxicological risk assessments from the TTC perspective.
The first were regulatory requirements for public safety, such as the Delaney Clause. The Delaney Clause is a 1958 amendment
to the Food, Drug, and Cosmetic Act of 1938 that states the following:
The Secretary of the Food and Drug Administration shall not approve for use in food any chemical additive found to induce
cancer in man, or, after tests, found to induce cancer in animals.
This requirement ultimately led to the Rulis proposal of the FDA Center for Food Safety and Applied Nutrition's (CFSAN) Threshold
of Regulation (TOR) approach. This approach set an upper concentration limit for a substance such that levels below
that limit raised no concern that the substance might cause cancer at more than a statistically minimal (i.e., one in 10⁶) rate (8). Although proposed in 1986, a series of legal challenges prevented the codification of the TOR until 1995 (9).
The risk of inducing cancer in man or animals is not zero unless the impurity believed to induce cancer is also at zero concentration.
The development of the TOR policy effectively resolved the issue that concentrations of impurities cannot be proven to be
zero; impurity concentrations can only be shown to be less than the detection limit. According to data collected by the National
Cancer Institute from 2002 to 2004, the lifetime risk of developing any form of cancer in the US is approximately
one in three. Given this statistic, a risk of less than one in a million additional cancer cases for impurities below the
TOR was as close to zero as the Delaney Clause could have intended. For example, an American's current probability of getting
cancer is 1 in 3, or 0.333333. Adding a 1 in 10⁶ additional risk would increase the probability of an individual getting cancer to 0.333334, clearly an immeasurable increase.
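The arithmetic above can be checked with a short sketch. The simple additive estimate used in the text and a slightly more precise combination (probability of at least one of the two independent events) give the same six-decimal result; the figures are illustrative only.

```python
# Illustrative arithmetic: baseline lifetime cancer risk (~1 in 3) plus an
# additional one-in-a-million risk from an impurity at the TOR level.
baseline = 1 / 3      # ~0.333333
additional = 1e-6     # 1 in 10^6

# Simple additive estimate, as in the text:
additive = baseline + additional

# Probability of at least one of two independent events:
combined = 1 - (1 - baseline) * (1 - additional)

print(f"baseline:  {baseline:.6f}")   # 0.333333
print(f"additive:  {additive:.6f}")   # 0.333334
print(f"combined:  {combined:.6f}")   # 0.333334
print(f"increase:  {combined - baseline:.2e}")
```

Either way, the increase is on the order of 10⁻⁷ to 10⁻⁶, far below anything measurable in population statistics.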
A second driving force for approaching toxicological risk assessments from the TTC perspective has been the increasing sensitivity
of analytical methods used to detect and measure impurities, as well as ever more powerful techniques to obtain structural
information on unknown compounds. While routine analytical methods in the 1950s measured most impurities in fractions
of a percent, by the end of the century many analytical methods could measure impurities in the parts-per-billion range,
and much lower in certain cases. The commercial development of mass spectrometers (MS) of numerous types, but especially those
attached as detectors to gas chromatography (GC–MS) and high-performance liquid chromatography (HPLC–MS) instruments, makes
possible the identification, or at least partial or tentative identification, of many of these trace impurities. Once such trace-level
impurities can be detected and identified, it becomes feasible to assess the risk they might pose. However, the effort
and cost of performing risk assessments increase dramatically as the list of impurities
grows, even if the concentrations of the additionally detected impurities are extremely low.
The final driving force for approaching toxicological risk assessments from the TTC perspective has been recent concerns surrounding
both the financial cost and ethics of animal testing (10). The European Union's Registration, Evaluation, Authorisation and
Restriction of Chemicals (REACH) program has been estimated to cost €1–2 billion (USD $1.56–3.13 billion) and would require
more than a million animals if testing were done using current best practices (11). Despite a large effort to further develop
in vitro tests to minimize the number of in vivo animal tests, to date only animal testing data can be reasonably extrapolated to humans. But a TTC approach to risk assessment
may make some animal testing unnecessary. Some have proposed combining the TTC approach with intelligent testing strategies
(ITS), which are premised on the idea that significant benefits result from considering the methods used for hazard assessment
in a holistic manner, rather than examining each method separately (12).
The most reliable data on human toxicological response are unquestionably from human epidemiology studies of historical chemical
exposures, particularly when the dose can be reliably estimated. However, such data are only rarely available. Currently,
animal testing is the next-most-reliable indicator of human toxicological response, and using SARs to predict toxicity, as
in the TTC approach, is currently the least reliable of the three. As more structures and
toxicological data are entered into toxicology databases and as the algorithms using SARs improve, TTC will offer greater
value. Furthermore, while in vitro and cell-based testing can serve as the "canary in the coal mine," their ability to predict a safe human dose is currently extremely limited.
REGULATORY GUIDANCE IN PHARMACEUTICAL APPLICATIONS
General guidance from FDA on impurities in pharmaceuticals can primarily be found in ICH guidelines Q3A, Q3B, and Q3C (13–15).
The guidance in these documents focuses primarily on impurities caused by the synthesis of the drug, degradation of the drug,
or residual solvents in the drug from the manufacturing process. These guidance documents do not directly address impurities
from in-process leachables, but merely refer to "extraneous contamination that should not be present" that should be controlled
by current good manufacturing practices (cGMP). General guidance on equipment and materials used in manufacturing pharmaceuticals
can be found in 21 CFR 211.65, which states the following:
Equipment shall be constructed so that surfaces that contact components, in-process materials, or drug products shall not
be reactive, additive, or absorptive so as to alter the safety, identity, strength, quality, or purity of the drug product
beyond the official or other established requirements. (16)
Perhaps the most specific FDA guidance in the area of leachables pertains to the final container closure (17). Focus on container
closure is natural because the exposure time can be extensive—months to years—and there are no further purification steps
to lessen any concerns about leachables. Table II is drawn from the FDA guidance for final container-closure systems and
clearly delineates the importance of the route of administration of the drug.
Table II: Safety guidance for drug containers from FDA Guidance for Industry: Container Closure Systems for Packaging Human
Drugs and Biologics (17).
The guidance on upstream, in-process leachables is appropriately less detailed because the risk is lower. A biopharmaceutical
process extractables team recommended that the relative risk of various product-contact materials be evaluated with a risk-evaluation
worksheet so that the highest priority is given to materials known to pose the highest potential risk. Among the variables
in the worksheet are proximity to the API; the extraction capability of the solution relative to the material and its potential
extractables; time, temperature, and area or volume of contact; and the cytotoxicity of extractables from the materials in tests
such as USP <87> (18).
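A worksheet of this kind is straightforward to mechanize. The sketch below is a hypothetical illustration only: the field names follow the variables listed in the text, but the 1–3 scoring scale, the additive scoring rule, and the example materials are assumptions, not the published worksheet.

```python
from dataclasses import dataclass

@dataclass
class MaterialRisk:
    """Hypothetical risk-evaluation worksheet entry for one product-contact material."""
    material: str
    proximity_to_api: int        # 1 = far upstream, 3 = direct contact with API
    extraction_capability: int   # solvent/material pairing: 1 = weak, 3 = aggressive
    contact_conditions: int      # combined time/temperature/area-or-volume severity, 1-3
    cytotoxicity: int            # e.g., severity of a USP <87> result, 1-3

    def score(self) -> int:
        # Assumed additive rule: higher total -> higher priority for evaluation
        return (self.proximity_to_api + self.extraction_capability
                + self.contact_conditions + self.cytotoxicity)

materials = [
    MaterialRisk("sterile filter", 3, 2, 1, 1),
    MaterialRisk("media storage bag", 1, 1, 3, 1),
]

# Evaluate the highest-scoring (highest-risk) materials first
for m in sorted(materials, key=MaterialRisk.score, reverse=True):
    print(m.material, m.score())
```

The point of such a worksheet is the ranking, not the absolute scores: it directs limited extractables-testing resources toward the materials most likely to matter.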
One of the common difficulties in the use of polymeric materials in a regulated environment such as pharmaceutical manufacturing
is that the commercial lifetime of any polymeric material, or one of its components, is likely to be shorter than the commercial
lifetime of a successful pharmaceutical drug. Most polymers are commodities subject to intense cost pressures over time, including
newer manufacturing processes and lower-cost manufacturing sites. In the European Union, the Polymerforum Group was formed
to foster better communication and strategies between polymer and pharmaceutical manufacturers around the issue (19).
The literature contains an illustrative example of a comprehensive analytical leachables study conducted after a film used
as container closure was changed, although the risk-assessment portion of the study that presumably justified the change of
materials was not included (20). The importance of change controls and supply-chain management when using commodity products
such as plastics was recently emphasized (21). A comprehensive review of safety considerations related to leachables when
using polymeric materials in pharmaceutical applications was recently published (22).