Monday, December 13, 2010

Bisphenol A Found on Money

A recent report published by the non-profit advocacy groups Washington Toxics Coalition and Safer Chemicals Healthy Families stated that 21 out of 22 dollar bills tested contained small concentrations of bisphenol A (BPA). In addition, receipts collected across the country were tested, and BPA was found on 50% (11 out of 22) of them. According to the report, normal handling of the receipts transferred BPA from the paper to the skin. The authors use the new data on BPA to illustrate the need for TSCA revisions.

By itself, a report like this sounds alarming, but it becomes less worrisome when certain factors are considered. While there have been many scientific articles suggesting a possible association between BPA and adverse health effects, there are also many others that do not find any significant health consequences to the public, especially at low concentrations. In addition, regulatory agencies in the United States and Europe have not deemed BPA to be a health threat under conditions of intended use.

Another important consideration is that this report was published by the non-profit groups themselves and not in a scientific journal, which means it did not undergo an independent peer-review process. In particular, some information in the report is puzzling. For example, the statement that one receipt contained 2.2% BPA by weight is difficult to believe given the molecular weight of BPA. The report also indicated that people are being exposed by handling the receipts and currency, but presumably only very small amounts of BPA are absorbed through the skin. Any absorbed dose would then be converted to a biologically inactive metabolite and eliminated from the body, so no adverse health impacts would be expected. Finally, the CDC has conducted biomonitoring surveys of the United States population and found that consumer exposure to BPA (which would include all potential exposure routes, including receipts and currency) is low.

While it is interesting to learn that small amounts of BPA can be found on thermal paper products and dollar bills, the new information should not change the current regulatory view that BPA is not a public health threat, given the biological insignificance of the concentrations. However, the authors’ call for revisions to TSCA, however loosely related it may be to the substance of their report, is still a necessary step for modernizing chemical regulation, given the increased need to understand exposure and risk as they relate to chemicals in consumer products.

Monday, December 6, 2010

Chemical Regulation Overhaul May Hit Roadblock

According to a recent article published by Chemical & Engineering News, the shift in control of the House of Representatives from Democrats to Republicans will greatly impact proposed chemical regulations and reform. Along with clean air and climate change regulations, TSCA reform is not expected to be a high priority in the upcoming year. Rep. Joseph L. Barton (R-Texas), who is making a bid to head the House Energy and Commerce Committee, was quoted as saying that TSCA is “working well” in its current form. The Society of Chemical Manufacturers and Affiliates (SOCMA) and the American Chemistry Council responded by stating that they will continue to meet with legislators in the hope of convincing them to introduce revised TSCA legislation in 2011.

The claim that TSCA is “working well” has been disputed by both non-profit groups and the chemical industry. The numerous hearings held during 2009 support the view that revisions to the 34-year-old TSCA should be a priority and should not be delayed. As chemical regulations in the European Union and Canada continue to change and modernize with advances in science and technology, the United States runs the risk of falling behind in scope and relevance. In addition, individual states, such as California, are developing their own regulatory initiatives, which could result in a patchwork of regulations across the United States and cause enormous compliance difficulties for industries with nationwide production and distribution.

Monday, November 29, 2010

Many of the Chemicals in Fragrance Products Are Not Listed on Labels

A recent article published by DiscoveryNews reported that “toxic or hazardous chemicals” have been identified in commonly used scented products, including dishwashing detergents, deodorants, baby shampoos, fabric softeners, and cleansing products. Many of the products tested were labeled as “organic” or “natural” and the majority of the chemicals detected were not listed on the product labels.

The debate over the toxicity of unlisted chemicals in scented products is not new. A number of non-profit groups have rebuked companies, often in the cosmetics industry, for not disclosing the entirety of their ingredients, claiming that their products negatively impact public health. Fragrance producers and the International Fragrance Association have responded that their products are safe and that the fragrance concentrations are within acceptable limits.

Presently, fragrance formulations are protected as trade secrets or Confidential Business Information, but this may soon change, as one of the goals of the proposed TSCA revision is to increase transparency. Under the proposal, claims of Confidential Business Information will be denied if the chemical is already on the publicly available TSCA Chemical Substances Inventory or is submitted under TSCA Section 8(e) as part of a health and safety study. Full chemical disclosure on product labels is also being pressed by many advocacy campaigns. While not all industries will be affected by the TSCA revisions (e.g., cosmetics are regulated by the FDA), many companies will find themselves burdened with providing in-depth toxicity data on their chemicals and developing more detailed labels.

It is true that some people are sensitive to certain smells, not necessarily only artificial fragrances, and may develop headaches or other mild, temporary effects. Generally, this occurs when exposed to a high concentration of a scented product and not with normal application. However, the fact still remains that unlisted chemicals in scented products are an issue in the media and in legislation. As suggested by the proposed TSCA revisions, the chemical regulation landscape is changing. Whether it is Confidential Business Information or product labels, companies need to (1) have a full appreciation of the potential health impacts of the chemicals in their products, (2) stay informed on the changes to these regulations, and (3) learn how to comply with the regulations relevant to their industry.

Monday, November 22, 2010

EPA’s Backlog Stalls Regulation and Enforcement of 255 Chemicals

The United States EPA’s Integrated Risk Information System (IRIS) has often been criticized for a lack of transparency in its evaluation process, susceptibility to influence from other governmental agencies, and unexplained delays in producing assessments. The latest condemnation comes from a report published by the Center for Progressive Reform, a nonprofit organization, which chastises EPA for falling behind on statutory mandates covering 255 chemicals. As an example of the incompleteness of IRIS, the report notes that EPA has not developed inhalation reference concentrations (RfCs) for approximately 77 known hazardous air pollutants.

This is not the first time that problems with IRIS have been reported. In 2008, the Government Accountability Office (GAO) published a report detailing that, despite EPA’s efforts, the backlog of chemical assessments had not diminished. In fact, the GAO implied that the Office of Management and Budget’s requirement of interagency review would only further hamper the assessment process and limit IRIS’s credibility. The GAO’s recommendations included clearly defining and documenting the IRIS process to minimize the need for revision, setting defined time limits for interagency review, and conducting assessments with available studies, suspending an assessment to await completion of new scientific studies only in exceptional circumstances.

In addition, a review of the IRISTrack website demonstrates the validity of this criticism. For example, the acrylonitrile and benzo[a]pyrene assessments began in January and December 1998, respectively, and the finalized assessments are still pending. By contrast, the assessment for chromium VI was initiated in November 2008 and is scheduled to be finalized in the third quarter of FY11. While the quick turnaround of the chromium VI assessment is unusual for IRIS, the majority of the ongoing assessments are scheduled to be finalized sometime during FY11, which may indicate that the IRIS process is improving with the recent revisions.

In 2009, EPA Administrator Lisa Jackson announced reforms to the IRIS process including a streamlined review process so that assessments are available within two years of the start date. The reform involved reducing the timeframes for assessment and making the written comments from other governmental agencies public. These revisions allowed the EPA to regain a stronger control over the IRIS process, while still providing transparency and integrity.

While the recent reform of the IRIS process is a step in the right direction, long delays in producing assessments will persist because EPA is hindered by a lack of toxicity data for many of the chemicals yet to be assessed. However, this may soon change, with the upcoming TSCA revisions placing the burden on industry to conduct studies that fill data gaps on their products and chemicals. Whether EPA is prepared to handle the influx of data on numerous chemicals from various industries once the TSCA revisions are finalized remains an open question. When EPA begins receiving the industry data, there will be an initial struggle to review and manage the incoming information until the agency adapts, which will likely cause further delays in the IRIS assessment process. How long this period of adjustment lasts will depend on how well the IRIS program is managed.

Tuesday, November 16, 2010

Gas Production Technique Under Scrutiny

Controversy Over Whether Drilling Fluids are Contaminating Groundwater


About 90 percent of natural gas wells now employ hydraulic fracturing, a technique in which sand and fluids are pumped into wells to open seams within rock formations so that they release more methane. This practice, commonly called “fracking,” has allowed for the production of gas from formations one to two miles deep, and is credited with more efficient development of gas reservoirs with fewer wells. It is also considered critical to economical production from shale formations, such as the Marcellus Shale region in the Eastern U.S., which tend to feature low-porosity rock.


In fracking operations, wells are completed to the desired depth and the bottom portions of the well casing are punctured with explosive charges, releasing pressurized water and sand into the resulting fractures in the rock. The water contains an estimated one percent drilling fluids, primarily lubricants and surfactants. The precise formulations of these fluids are proprietary, but constituent chemicals may include benzene, ZetaFlow® (whose composition is not publicly known), and 2-butoxyethanol (2-BE).


It is these drilling fluids that have aroused complaints from neighboring landowners and environmental advocates, who claim that they are polluting groundwater near gas wells. These parties cite odors, coloring and sediments observed in tap water, as well as isolated cases of illness. They also call for the identity of the fracking fluids to be revealed, without which linkages to water contaminants will be difficult to establish.


A 2004 EPA study on coalbed methane production concluded that it was unlikely to impact groundwater because water tables reside hundreds or thousands of feet above the fractured portions of wells. Some advocates of further study and regulation acknowledge this, but theorize that imperfections in well structures may allow fracking fluids to migrate up the well shaft and from there to the surface and the groundwater beneath it.


Based in part on EPA’s work, Congress in 2005 exempted fracking fluids from the Safe Drinking Water Act, which requires the disclosure of chemicals potentially released into groundwater. Legislation proposed by Rep. Diana DeGette (D-CO) and Sen. Charles Schumer (D-NY) would revoke that exemption and thereby shed light on the compounds being used. These bills are supported by the Natural Resources Defense Council (NRDC) and other public interest groups.


EPA has announced another review of the issue, and has asked nine leading drillers to voluntarily provide the agency with the formulations of their fluids. A coalition of state water regulators is also promoting voluntary disclosure of fluid components, which it plans to post on a website for the benefit of researchers.

Monday, October 25, 2010

California "Safer Consumer Product Alternatives" Regulation Continues Trend Towards Greater Chemical Scrutiny


California is known for setting trends in this country related to music and entertainment. But it also serves as a bellwether for environmental regulation. This pattern may be continuing with the Safer Consumer Product Alternatives (SCPA) regulation, a measure under development by the California Department of Toxic Substances Control (DTSC) as part of the state’s Green Chemistry Initiative.

The SCPA, which does not require approval by the legislature, will direct state regulators to establish a priority list of chemicals of concern, and a corresponding list of products containing those chemicals. Businesses selling into the state—including manufacturers, distributors, retailers and licensees—will need to:
  • Provide regulators with the name and contact info for all participants in the product’s supply chain;
  • Conduct a life-cycle analysis of the product’s human health and environmental impacts, considering raw materials sourcing, manufacturing, transportation and disposal/recycling;
  • Prepare a similar analysis of substitute chemicals, along with a proposal to redesign the product to reduce concentrations of chemicals of concern in favor of safer alternatives.
The process of establishing the priority chemicals and priority products lists will consume the next three years, and compliance obligations under the regulation will commence in December 2013. A parallel proposal by Cal/EPA’s Office of Environmental Health Hazard Assessment (OEHHA) would require that data developed under these assessments be made available to consumers under the state’s Proposition 65 law. Currently, only cancer and developmental/reproductive effects must be reported under Prop 65.

Taken together with activity under CPSIA, TSCA and REACH, the SCPA embodies a demand by decision-makers for a more comprehensive suite of data on the chemical content of products, exposures, and the potential for human and environmental harm as a condition of market access. The hazard traits that must be evaluated under the draft regulation are quite extensive, and include neurotoxicity, endocrine disruption, epigenetic toxicity, ototoxicity and phototoxicity.

Readers may wish to submit comments to the agency before the expiration of the comment period on November 1, 2010.

Thursday, October 14, 2010

REACH Right-to-Know Provisions: More Thoughts

Retailers and their upstream suppliers should not be surprised to see a flood of such inquiries going forward. To help facilitate that result, at least one NGO has posted a sample letter for use by supporters.

The time and paperwork entailed in responding to these information requests will prompt many companies to view these inquiries as "a death by a thousand cuts." Some will simply abandon the market while others may reformulate to non-SVHC ingredients - which some would argue is the ultimate purpose of right-to-know legislation.

Regardless, chemical management by hazard in the absence of an appreciation of exposure and risk appears to be the modus operandi for regulators and NGOs for the foreseeable future.

Wednesday, October 13, 2010

Retailers May be Violating REACH Right-to-Know Provisions

One of the lesser-known provisions of the sprawling REACH law is the right of consumers to demand information on the chemical content of products they purchase. The Ecologist reports today that two major European retail chains are failing to meet these obligations.

Titles IV and V of the law require that sellers of products containing chemicals listed as substances of very high concern (SVHC) notify their customers of this fact (customers could be downstream producers utilizing a chemical in their own products, as well as retail consumers). Generally, the threshold for notification is the presence of an SVHC at a concentration of 0.1 percent by weight. The reporting obligation can be triggered by the addition of a substance to the SVHC candidate list, which currently contains several dozen chemicals but is expected to grow substantially in coming years.
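The notification threshold lends itself to a simple screening check. The sketch below is illustrative only: the substance names and masses are hypothetical, and a real compliance determination must work from the current SVHC candidate list and the applicable guidance. It flags any listed substance present above 0.1 percent by weight:

```python
# Illustrative SVHC screening against the 0.1% w/w notification threshold.
# Substance names and masses are hypothetical, not real product data.

SVHC_THRESHOLD = 0.001  # 0.1 percent by weight, expressed as a mass fraction

def svhc_over_threshold(component_masses_g, svhc_names):
    """Return the SVHC substances whose mass fraction exceeds the threshold.

    component_masses_g: dict mapping substance name -> mass in grams
    svhc_names: set of substance names on the SVHC candidate list
    """
    total = sum(component_masses_g.values())
    return {
        name: mass / total
        for name, mass in component_masses_g.items()
        if name in svhc_names and mass / total > SVHC_THRESHOLD
    }

# Hypothetical article: 500 g total, of which 1 g (0.2% w/w) is a listed substance
article = {"polymer_base": 499.0, "listed_plasticizer": 1.0}
print(svhc_over_threshold(article, {"listed_plasticizer"}))
```

In practice the hard part is not this arithmetic but obtaining reliable composition data from the supply chain, which is precisely the gap the right-to-know provisions expose.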

Reporting duties can also be triggered by a customer request for such information. In this case, at least the name of the chemical and information allowing the safe use of the product must be provided to the customer within 45 days. In a recent investigation by the European Environmental Bureau (EEB), that NGO sent 158 information requests to 60 retailers and vendors in the EU. EEB contends that 50 percent of the requests were not answered at all, while 75 percent received legally insufficient responses. EEB goes on to recommend that companies selling into Europe establish electronic chemicals management systems, utilize third-party testing, and deliver certification manifests through the supply chain.

Monday, October 4, 2010

Compliance Challenges for Chemical Companies

An excellent article by James A. Kosch, partner at LeClairRyan, on the need for the chemical industry to adapt to a changing regulatory climate.

Friday, September 24, 2010

TSCA Legislation: A Short Phrase With Huge Implications

Section 6 of the House TSCA modernization bill (H.R. 5820) requires U.S. EPA to develop a priority list of 300 chemicals and to make safety determinations regarding them. The bill also directs the agency to develop guidance for making such safety determinations and that “in developing such guidance, the Administrator shall rely upon the recommendations of the National Academy of Sciences report entitled ‘Science and Decisions.’"

This routine-sounding phrase is of potentially enormous significance to the way that risk assessment and chemical regulation is conducted in this country. The referenced document is a 2009 National Research Council report that made controversial recommendations on how dose-response curves should be formulated. The NRC committee that prepared the report wrote that:


Non-cancer effects do not necessarily have a threshold, or low-dose nonlinearity…. Scientific and risk-management considerations both support unification of cancer and non-cancer dose-response assessment approaches. (Summary, p. 8)

The idea of “linearity” (that an extrapolation of data points along the dose-response curve should pass through the origin) makes sense from a statistical standpoint. Indeed, statisticians were well represented on the committee.

However, linearity of non-cancer effects departs from decades of accepted understanding of biological mechanisms and how they shape dose-response. Through the processes of detoxification and cell repair, organisms respond to lower levels of toxic exposure without adverse health effects.

This phenomenon gives rise to the concept of a threshold of effect. Naturally occurring levels of numerous toxicants provide evidence of the threshold effect at work. Without the ability to respond to these exposures with detoxification and repair, human life would have long ago been extinguished on this planet.

Determining such thresholds in laboratory animals, where the threshold is termed a “no observed adverse effect level” (NOAEL), and translating it to a comparable human dose level, the “reference dose” (RfD), has become a frequent product of the work of regulatory agencies and non-governmental organizations across the globe.


It allows them to establish and enforce protective yet workable exposure limits for chemicals in food, consumer products, occupational settings, and environmental media such as remediated soil.
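The arithmetic behind this translation is simple division: the animal NOAEL is reduced by uncertainty factors, conventionally a factor of 10 for animal-to-human extrapolation and another 10 for variability among humans, though agencies apply additional factors case by case. A minimal sketch, using hypothetical numbers:

```python
def reference_dose(noael_mg_kg_day, uncertainty_factors):
    """Convert an animal NOAEL to a reference dose (RfD) by dividing
    by the product of the applicable uncertainty factors."""
    product = 1
    for uf in uncertainty_factors:
        product *= uf
    return noael_mg_kg_day / product

# Hypothetical NOAEL of 50 mg/kg-day, reduced by the common tenfold factors
# for interspecies extrapolation and intraspecies variability:
rfd = reference_dose(50.0, [10, 10])
print(rfd)  # 0.5 mg/kg-day
```

The resulting RfD is the exposure level below which adverse effects are not expected, which is what makes the threshold concept administrable.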

Discounting the existence of a threshold (what the committee’s quote above calls “low-dose nonlinearity”) would introduce both theoretical confusion into the field of risk assessment and practical difficulties into the work of regulators. Linearity implies that each incremental increase in dose above zero can be expected to produce health effects. If moving from zero to one part per trillion of a particular chemical carries a statistically calculated, theoretical risk, does that obligate a regulatory agency to address that level of exposure?

This question is especially pertinent for statutory frameworks that lack other standard-setting criteria, such as technical feasibility or a balancing of costs and benefits. The National Ambient Air Quality Standards (NAAQS) provisions of the Clean Air Act, for example, require EPA to set allowable levels for particulate matter, ozone and other pollutants at a level sufficient to “protect human health and the environment, with an adequate margin of safety.” In such a context, the question arises whether the legally required level is “zero,” and if so, how does EPA—and American society—get to that level?
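The practical difference between the two views can be made concrete in a few lines. Under a threshold model, estimated risk is zero below the threshold dose; under a linear no-threshold model, any dose above zero carries some calculated risk. The slope and threshold values below are hypothetical, chosen only for illustration:

```python
def linear_risk(dose, slope):
    """Linear no-threshold model: risk proportional to dose, zero only at zero dose."""
    return slope * dose

def threshold_risk(dose, threshold, slope):
    """Threshold model: no estimated risk below the threshold dose."""
    if dose <= threshold:
        return 0.0
    return slope * (dose - threshold)

# Hypothetical values: slope of 1e-3 risk per mg/kg-day, threshold of 1.0 mg/kg-day
tiny_dose = 1e-6  # a vanishingly small exposure
print(linear_risk(tiny_dose, 1e-3))          # nonzero, however small
print(threshold_risk(tiny_dose, 1.0, 1e-3))  # 0.0
```

It is this structural feature of the linear model, a nonzero calculated risk at every dose, that generates the regulatory question posed above.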

Thursday, September 9, 2010

Autism “Debate” Reveals Media’s Misplaced Quest for Balance

A recent article from CNN.com suggests that suspected links between childhood immunizations and autism have finally been dispelled. Twelve years ago, the publication of a single journal article advanced the hypothesis that the combination measles-mumps-rubella (MMR) vaccine was linked to autism. Subsequent research, including over a dozen epidemiological studies, showed no link between vaccines and autism. Further, the journal that published the original article retracted it, and its author was stripped of his medical license as a result of ethical questions pertaining to his publication. Unfortunately, the media appears to have reached this consensus over a decade too late.

Furthermore, the CNN piece leaves readers with the impression that a debate in the scientific community is still ongoing. Undoubtedly this is the result of the reporter's desire to apply the practice of "balanced journalism" (e.g., quoting one expert on each side of a story line) to scientific controversies, even when the weight of evidence tilts emphatically in one direction.

In this case, parents across the globe were unnecessarily alarmed, immunizations were curtailed, and outbreaks of measles and other diseases resulted. In fact, to this day there are parents who are dangerously neglecting their responsibilities to protect their children's health by following the advice of a discredited study and uninformed celebrities. One would hope that in the future, the media would rely on more rigorous application of scientific principles, particularly the weight-of-evidence in the scientific community, rather than lessons learned during Journalism 101.

Tuesday, August 31, 2010

Bed Bugs and Pesticide Regulation

A piece from today's Washington Post highlights the phenomena of countervailing risks and unintended consequences so common to chemicals management. DDT is long-acting and effective against a broad spectrum of insects. It has relatively low human toxicity, but high environmental persistence, manifested most vividly in the fragile eggshells of species such as bald eagles and brown pelicans that led to its ban in the U.S. The ban was subsequently extended worldwide, which some contend contributed to the death of millions from malaria and other insect-borne diseases. This argument is advanced by Robert S. Desowitz in The Malaria Capers.

Monday, August 23, 2010

TSCA 2.0: Are Comparisons to Pesticide Regulation Instructive?

There is a widely shared criticism of how new chemicals are regulated under TSCA. TSCA Section 5 requires a Premanufacture Notice (PMN) for both new chemicals and proposed new uses of chemicals already on the market. The PMN need not be accompanied by comprehensive health data on the compound, merely the information already in the possession of the applicant. This leaves industry without a strong incentive to compile a robust data set, and the resulting information gaps impair EPA’s ability to make informed judgments about the safety of a given chemical.

Both the House and Senate TSCA bills would substantially toughen the PMN and review process, requiring applicants to provide a "minimum data set" on the chemical and its potential health and environmental impacts. Manufacturers and processors would for the first time have the responsibility of identifying or generating the research needed to complete this data set, or risk losing access to the marketplace.

Proponents of the bill argue that stringent data collection and review of new chemical products have been features of federal pesticide regulation for decades, and that this has not unduly burdened the crop protection industry or the agriculture sector. This analogy was made by Rep. Frank Pallone (D-NJ) at the July 29 subcommittee hearing on H.R. 5820.

The pre-market screening of pesticides under the Federal Insecticide, Fungicide, and Rodenticide Act (FIFRA) and related statutes is undoubtedly stringent. Registrants must be prepared to provide studies on carcinogenic, neurological, developmental and reproductive effects, as well as information on fate and transport, persistence, and metabolic byproducts. Another essential element of the petition is an assessment of exposure to humans and other non-target organisms.

In light of these requirements, the agency itself acknowledges that “depending on the class of pesticide and the priority assigned to it, the review process can take several years.” Despite this, U.S. regulation of pesticides is generally viewed as a success, a point often made by those drawing comparisons to TSCA modernization.

But is this comparison apt? Pesticides present a relatively contained range of exposure scenarios, including exposed agricultural workers, contaminated groundwater, and pesticide residues on crops.

In contrast, the chemicals that would be governed by a revised TSCA present multiple routes of exposure, and the interaction of these various exposures must be modeled and aggregated to make meaningful decisions about chemical use. Further, a reworked TSCA is virtually certain to require modeling of exposures by vulnerable populations such as children and the elderly, something that is not required for pesticides.

The net result could be a registration process with a complexity that resembles REACH rather than FIFRA, one that will delay market access for some chemicals by years, and discourage others from being commercialized at all. Such an outcome might be tolerable for some mundane products and uses. But for components of lifesaving technologies such as medical devices and time-to-market critical products like smartphones, is our society prepared to accept such a delay?

Friday, August 13, 2010

New TSCA Inventory Update Reporting Rule Proposed

Agency Continues Pattern of Deploying Existing Authority

On August 11, 2010, EPA proposed changes to its TSCA Inventory Update Reporting (IUR) rule that would increase the frequency of, and tighten the standards for, reporting by IUR-covered facilities. First promulgated in 1986, the IUR rule is designed to provide the agency with data on the volume of chemicals produced, imported or processed, along with basic information on how those chemicals are used. The goal is to inform agency prioritization and safety determinations for chemical substances and (consistent with confidentiality restrictions) to release relevant data to the public.

The IUR rule was revised in 2003 and again in 2005 to expand the range of reportable chemicals and plant sites, broaden the types of data reported, and lower the production volume thresholds that trigger reporting. The pending changes continue that trajectory, increasing the frequency of reporting, requiring electronic reporting, and decreasing the reporting threshold for facilities processing and using chemicals from 300,000 pounds to 25,000 pounds. The criteria for asserting confidential business information (CBI) would be further tightened, and more specific information would be required on downstream commercial and consumer uses of the chemicals produced.

This proposal fits squarely within a pattern of more deliberate assertion by the agency of its TSCA authority in recent years. The agency has expedited the collection and review of data for chemicals under voluntary programs such as the Voluntary Children’s Chemical Evaluation Program (VCCEP), the Chemical Assessment and Management Program (ChAMP), and the High Production Volume (HPV) Challenge Program. More recently, a series of “action plans” utilizing TSCA Section 6 and Section 5(b)(4) authority have directed scrutiny onto chemicals suspected of properties such as persistence, bioaccumulation and endocrine disruption.

Some have speculated that the fall elections and other legislative priorities will hinder, if not prevent, TSCA modernization for several years. Nevertheless, what is clear is that EPA will continue to push for more comprehensive, precise and transparent chemicals management, with a renewed insistence on industry responsibility for these data. Furthermore, administrative agencies and legislatures around the globe mirror these objectives. Some may view this as yet another regulatory burden; however, we believe it creates a unique opportunity for the most proactive industries and companies to gain a competitive advantage. Efforts to address EPA’s administrative demands can be structured to give companies a head start in compiling the understanding of their chemical use, potential health effects, exposures, and product life cycle that a revised TSCA and its overseas counterparts will ultimately require. In the process, they will distinguish their firms as responsible and forward-looking in the eyes of regulators, customers, and consumer advocates.

Thursday, August 5, 2010

Aggregate and Cumulative Exposures

An understanding of the various routes of exposure for chemicals in products, as well as their interaction with similar chemicals, is an important element of REACH and will likewise be central to a reformed TSCA. In combination with data on health effects, this information will allow authorities to make the required safety determinations. Reliable modeling of exposure scenarios will also equip companies to persuasively defend their products against unfounded accusations in the media, on the internet and in the courts.

********

An influential 1994 National Academy of Sciences report, Science and Judgment in Risk Assessment, was one of the first voices citing aggregate and cumulative chemical exposures as critical to understanding real-world risk. Aggregate and cumulative exposures are related concepts concerning the potential impact of a given chemical through multiple routes of exposure, as well as the possibility that multiple chemicals might interact to produce additive or synergistic effects. For example, a person might be exposed to mercury from a smelting plant, but also in the fish he consumes. The same person might be exposed to different substances that share mercury’s neurotoxic effects, for example in pesticides he uses in the garden.
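Risk assessors often operationalize these concepts in two steps: sum a chemical's dose across routes (aggregate exposure), then sum hazard quotients (dose divided by the reference dose) across chemicals sharing a common effect (cumulative exposure, expressed as a hazard index). The sketch below is a simplified illustration of that screening arithmetic; all doses and reference doses are hypothetical:

```python
def aggregate_dose(route_doses):
    """Aggregate exposure: total dose of one chemical across all routes."""
    return sum(route_doses.values())

def hazard_index(doses_and_rfds):
    """Cumulative screening: sum of hazard quotients (dose / RfD) for
    chemicals sharing a common toxic effect. An index above 1 flags concern."""
    return sum(dose / rfd for dose, rfd in doses_and_rfds)

# Hypothetical mercury exposure (mg/kg-day) summed across two routes:
mercury = aggregate_dose({"air_near_smelter": 0.0001, "fish_consumption": 0.0003})

# Combine with a hypothetical neurotoxic pesticide exposure of 0.0005 mg/kg-day,
# pairing each dose with a hypothetical RfD for the shared endpoint:
hi = hazard_index([(mercury, 0.0003), (0.0005, 0.002)])
print(hi)  # sum of the two hazard quotients
```

The same bookkeeping, scaled across many chemicals, uses, and populations, is what makes aggregate and cumulative assessment so demanding in practice.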

The Lautenberg bill directs EPA to “consider” the work of the Academy in this area. H.R. 5820 goes further, directing EPA to incorporate aggregate exposures in its determination of “reasonable certainty of no harm” (RCNH) for a particular chemical. At the July 29 hearing, there was some disagreement among witnesses and subcommittee members whether EPA or the chemical manufacturer or processor bore the ultimate responsibility for examining and making judgments on these complex issues.

Several lawmakers, including Rep. Diana DeGette (D-CO) noted that a recent modification to the bill agreed to by its drafters would make the company responsible only for providing information on exposure related to the chemical’s “intended use.” The agency would then amalgamate company submissions on various uses and exposures and incorporate these into its safety determination. However, other sections of the bill make clear that the agency’s failure to make timely determinations can have the effect of keeping a chemical out of the marketplace. Consequently, if EPA is thwarted in its efforts by the complexity of the analysis, by resource constraints or other factors, producers, formulators and commercial users may well have to undertake this effort themselves in order to keep the substance in use.

Regardless of what TSCA eventually requires, we believe there are compelling reasons for companies involved in the production and use of chemicals to take a proactive approach to exposure modeling and life-cycle analysis. Exposure data are a necessary complement to data on human and ecological toxicity in the realm of product stewardship and defense. Given the ready access to information from a variety of media sources and the ease with which this information can be distorted or misunderstood, a company’s ability to substantiate its exposure, hazard and disposal findings will prove critical in defending a product, company or industry. The groundswell of toxic tort cases, along with state initiatives to restrict or eliminate chemicals based on inaccurate data, further supports the wisdom of building a baseline understanding of exposure. We believe there’s no need to wait for legislation, news coverage or litigation to begin assembling one's product defense arsenal.

Tuesday, August 3, 2010

Regulating Hazard or Risk - The Debate Continues

Another observation from the July 29th hearing was the extent to which debate over chemicals management is circumscribed by the concept of hazard as opposed to risk. The witness testimony, as well as the question/answer exchanges, stressed the need to find “safer chemicals” to replace “toxic, persistent, and/or bioaccumulative chemicals” currently on the market.

However, as any first-year toxicology student can attest, “the dose makes the poison.” A chemical demonstrating toxic properties in the laboratory can nonetheless be used with minimal risk provided that exposures are controlled. For example, most of us readily accept the presence of lead and other toxic metals in auto batteries because the sealed battery housing, coupled with deposit and recycling programs, minimizes the potential for human and environmental exposure.

It would be unfortunate if legislators missed a chance to significantly improve how we manufacture, distribute, use, and dispose of chemicals by continuing to focus on toxicity (in the abstract) as opposed to risk (in real-world context). This is one of a number of areas where a more informed embrace of scientific principles could benefit the final legislative product.

Monday, August 2, 2010

TSCA: Innovation Through Stronger Regulation?

The House Subcommittee on Commerce, Trade and Consumer Protection held a hearing on July 29, 2010 on H.R. 5820, recently-introduced legislation to revamp the Toxic Substances Control Act (TSCA). One recurring theme at the hearing was the contention that fuller information about chemicals and their potential health effects would encourage the development and commercialization of safer alternatives. It was suggested that this in turn would advance the innovation and competitiveness of U.S. chemical producers. In recent years, similar arguments have been made about the innovation and job-creation potential of carbon regulation and other environmental initiatives.

Let's not reflexively reject the notion that public policies can promote economic growth and competitiveness. Consider the Interstate Highway System. Is the nation’s economy healthier and its standard of living higher by virtue of the massive federal program to construct our network of highways, bridges, and roads? Economists across the political spectrum say “yes,” citing increased safety, workforce mobility, just-in-time delivery of industrial inputs, and the resulting growth of cities in the Sun Belt.

Turning our focus to chemicals policy, one can envision a regulatory framework that might help bring newer, safer products into the marketplace. It would require, on one hand, a fairly burdensome regimen of testing and reporting for existing chemicals (particularly those with indicia of hazard), and greater transparency into chemical formulations and other information now deemed proprietary. This would have to be coupled with more lenient treatment of new chemical formulations, including a comparatively expeditious path to market for those that are likely free of health and environmental concerns.

But is this type of framework embodied in H.R. 5820? Some proponents argue that it is. Dr. Richard Denison of the Environmental Defense Fund said at the hearing that the bill would allow “safer” chemicals to enter the market with less burdensome safety determinations. Stated that way, the argument appears internally inconsistent, because it is safety determinations that are chiefly relied upon to distinguish between good and bad chemical actors. Denison’s written testimony more explicitly defines what he means by “safer”: H.R. 5820 would allow new chemicals to enter the market without safety determinations if they are intrinsically low hazard, are safer for particular uses than chemicals already on the market, or serve critical uses.

What types of chemicals could be considered “intrinsically low hazard”? This category would include substances whose biological action is acknowledged to be benign and compounds with extremely low toxicity. However, most of the promising chemicals that have not yet been commercialized would fall outside of these relatively narrow classes. For these, structure-activity relationship (SAR) analysis would be needed to establish “intrinsically low hazard.” This type of analysis comes with its own complexities, and may not represent a significant decrease in burden as compared with traditional safety determinations. As such, it is not clear that chemicals in this category would enjoy a meaningfully expedited route to market, or that the bill would promote innovation to the degree its proponents suggest.

Tuesday, July 27, 2010

TSCA: Pre-Manufacture Notification and Review


TSCA reform promises to significantly alter the chemical marketplace. Contemplated changes will have a profound impact on all industries that manufacture, distribute, modify or utilize chemicals, articles or mixtures. Recognizing the magnitude of this potential impact, Science News & Views intends to provide substantive monitoring and analysis as the legislative process proceeds. Today, we will examine the current and proposed system of pre-manufacture notification and review of chemicals, a controversial aspect of the existing statute.

*****


Section 5 of existing TSCA prohibits the manufacture, processing, or import of a “new chemical substance” or “significant new use” of an existing substance unless a pre-manufacture notification (PMN) is submitted to EPA at least 90 days in advance.


Section 5 does not currently require a submitter to conduct testing before submitting a PMN, but merely to provide any information on health or environmental effects that is already in its possession. During the 90-day review period, EPA is to utilize this and other information to determine whether the chemical “may present an unreasonable risk of injury to health or the environment,” and if so, to request more data, prohibit or limit manufacture, or halt the review process. In practice, the dearth of data accompanying submissions impairs the agency’s ability to make informed judgments about the safety of a given compound. Reform advocates cite this as a key weakness of the existing statute.


The proposed bills would toughen the Section 5 PMN and review process. Both the House and Senate would subject new chemicals and uses under Section 5 (as well as existing chemicals under Section 6) to the requirement of a “minimum data set.” This data set would consist of the chemical’s identity, physical characteristics, toxicological properties, hazard, exposure and use, along with other information that EPA establishes by rule. Indications are that the agency will require data on both traditional endpoints such as carcinogenesis and mutagenicity, and also emerging concerns such as bioaccumulation, environmental persistence and endocrine disruption.


The current presumption that a chemical is appropriate for the marketplace in the absence of an “unreasonable risk” or “insufficient data” finding will be inverted. Instead, a six-month to one-year review of the application will be triggered unless EPA affirmatively finds that the chemical is “reasonably anticipated” to meet safety standards.


Manufacturers and processors will have the responsibility of identifying or generating research needed to complete the data set for a new chemical or new chemical use, effectively shifting the “burden of proof” to them. They would also be responsible for updating data submissions to reflect new information. Both bills would give EPA the power to compel testing by administrative order and to specify appropriate methodologies. The House bill would allow the agency to assess fines for non-compliance.


The PMN and minimum data set provisions will likely be the subject of debate and modification, particularly the timetables for implementation. Recent changes to chemical regulation in Europe help illustrate this point. The REACH law imposed similarly aggressive benchmarks for generation and submission of test data. This approach has proved to be impractical, time-consuming and expensive to implement.


Are we on a similar path with a new TSCA?


Clearly, early engagement with legislators is critical to crafting smart, effective chemical regulation that protects human health and the environment in a cost-effective, rational manner.

Friday, July 23, 2010

TSCA Reform--Moving Forward

In a move which may signal growing momentum for chemical regulation reform, leaders of the House Energy & Commerce Committee introduced TSCA modernization legislation (H.R. 5820) and scheduled a subcommittee hearing for July 29, 2010. We will continue to provide monitoring and analysis of this bill and its Senate counterpart as they move through the legislative process.

Thursday, July 22, 2010

Regulating Wood Dust - Is There A Better Way?

California OSHA is in the process of updating permissible exposure limits (PELs) for a number of airborne substances found in the state’s workplaces. Currently under review is wood dust, a byproduct of wood milling, sanding and routing in businesses ranging from small custom cabinetry shops to the largest sawmills and flooring producers. The history of wood dust regulation provides an interesting window into the interaction between researchers, federal and state safety agencies, courts and non-governmental standard setting bodies.

Wood dust first came to the attention of health researchers in connection with nasal adenocarcinoma. In the 1960s, English scientists observed a much higher than expected incidence of tumors in High Wycombe, an area northwest of London that was then a center of the furniture and cabinet industry. Adenocarcinoma cases were several hundred times more common among this group than among comparable industrial workers. When U.S. researchers attempted to replicate these findings, however, they found much lower relative risks associated with occupational wood exposure. In fact, pooled analysis of 220,000 woodworkers studied in North Carolina and Virginia identified only three cases, about what would be expected among the general public.

In light of this ambiguous evidence, U.S. OSHA proposed a fairly reasonable 5 mg/m3 standard for total dust as a time-weighted average (TWA) in 1989. This level, which was supported by key woodworking industry organizations, essentially required older plants to adopt the dust-control technologies, such as cyclones and baghouses, already being implemented by newer facilities. Unfortunately, OSHA bundled the wood dust standard with proposed PELs for hundreds of unrelated airborne substances. In 1992, a federal court found that not all of these standards were supported by good science and invalidated all of them.
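For readers unfamiliar with how a TWA standard works in practice, the sketch below shows the basic arithmetic: concentration samples are weighted by their sampling durations and averaged over the 8-hour shift, then compared against the PEL. The measurement values are hypothetical, chosen only to illustrate the calculation.

```python
# Hypothetical personal-sampling results for one 8-hour shift:
# (concentration in mg/m3, duration in hours)
samples = [
    (6.0, 2.0),  # morning sanding
    (3.0, 4.0),  # routing
    (1.0, 2.0),  # cleanup
]

# 8-hour time-weighted average: duration-weighted mean over the full shift
twa = sum(conc * hours for conc, hours in samples) / 8.0

PEL = 5.0  # mg/m3 total dust, the level in OSHA's 1989 proposal
status = "within" if twa <= PEL else "exceeds"
print(f"TWA = {twa:.2f} mg/m3 ({status} PEL)")
```

Note that a short excursion above the PEL (the 6.0 mg/m3 sanding period here) can still yield a compliant shift, since the standard is an average over the full workday.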

Technically, that left wood dust regulated as a “nuisance dust” at 15 mg/m3. However, state occupational agencies (including Cal OSHA) adopted the 5 mg/m3 standard for total dust, and federal OSHA retained the power to cite dusty workplaces under its General Duty Clause. Recent workplace monitoring shows that the vast majority of work stations in woodworking facilities meet the 5 mg/m3 level. For example, Glindmeyer (2008) found inhalable dust samples ranging from 0.82 to 2.51 mg/m3 across a group of ten plants, which included furniture, kitchen cabinet, and flooring facilities as well as a sawmill. Geometric standard deviations ranged from 2.1 to 2.8.

About 15 years ago, researchers began focusing on non-cancer effects of wood dust. The concern was that long-term occupational exposure could compromise pulmonary function. This placed renewed focus on the smallest “respirable” dust particles most likely to reach the lungs. Generally, these measure 2.5 microns or less, the same size as the PM2.5 particulate matter that has been the subject of ambient air regulation over the last decade.

A concern about the respirable fraction of dust and its impact on pulmonary health has informed analysis and standard setting in recent years. The American Conference of Governmental Industrial Hygienists (ACGIH) has set a recommended threshold limit value (TLV) of 1 mg/m3 for inhalable dust of most wood species. Cal OSHA is evaluating a mandatory PEL of 1 mg/m3 total dust. The agency’s health effects advisory committee wrote earlier this year that “the history of lung disease findings at higher wood dust exposure levels, with recent lower wood dust dose studies showing no or little adverse effect, and the … carcinogenicity of wood dust vis. a vis. sino-nasal cancer [supports] lowering the PEL to 1 mg/m3.” A committee charged with evaluating the technical and economic feasibility of this standard meets this October 6 in Oakland.

Deliberations over wood dust spotlight a complex interaction between researchers, state and federal regulators, industry, and the courts. This is probably not a decision-making apparatus that any of us would design from scratch. However, it has arguably responded effectively to emerging research and technological innovation. Is there a better, more time- and cost-effective approach?

We welcome your comments.