After West disaster,
By JON McCLURE, DANIEL LATHROP and MATT JACOB
Updated: 25 August 2013 11:57 AM
Even the best national data on chemical accidents is wrong nine times out of 10.
A Dallas Morning News analysis of more than 750,000 federal records found pervasive inaccuracies and holes in data on chemical accidents, such as the one in West that killed 15 people and injured more than 300.
In fact, no one at any level of government knows how often serious chemical accidents occur each year in the United States. And there is no plan in place for federal agencies to gather more accurate information.
As a result, the kind of data sharing ordered by President Barack Obama in response to West is unlikely to improve the government’s ability to answer even the most basic questions about chemical safety.
“We can track Gross National Product to the second and third decimal, but there is no reliable way of tracking even simple things like how many [chemical] accidents happen,” said Sam Mannan, a nationally recognized expert on chemical safety who recently testified before a congressional hearing on West.
“This is just scandalous.”
After the West explosion in April, The News asked a simple question: How often do serious or potentially serious industrial chemical accidents occur in Texas and nationwide? After scouring the four federal databases with the most comprehensive information available on chemical safety, The News concluded that there was no way to know.
For a recent four-year period, the paper managed to confirm at least 24 industrial chemical accidents in Texas that resulted in deaths, injuries or evacuations. But the poor data quality guarantees that number does not account for all accidents. Nor was it possible to make a meaningful comparison with other states that would lend important context to the safety picture in Texas.
Large data systems have inherent problems with accuracy — an issue that experts caution will only worsen in an era when huge amounts of electronic data are being collected. Even so, government investigators and researchers have been warning for at least 25 years about the problems with chemical accident data. The News found report after report that said chemical accident data were insufficient to spot even basic accident patterns and suggest solutions.
“The data are insufficient to drive analysis that would help prevent future accidents,” said Paul Orum, a chemical safety consultant to environmental groups, who has frequently testified before Congress.
What’s needed, experts say, is an overhaul of the data-collection process or the expansion of an existing pilot program that has labored under years of inadequate funding.
Best of the bad
Only one agency collects nationally comprehensive information specifically on chemical accidents. The U.S. Coast Guard’s National Response Center receives reports of chemical spills and other accidents from companies, emergency responders and the general public.
But the NRC data is no more than a call log, like a 911 hotline for environmental emergencies, and first reports often turn out to be wrong. Following up those initial reports to update the data and record what actually happened is not part of the center’s mission, spokesman Andrew Kennedy said.
Such bad data can lead to bad conclusions. The New York Times, for example, cited the response center’s data in a June 1 editorial about the chemical accident dangers revealed by the West explosion. The editorial said that 1,270 people had died as a result of chemical spills and accidents around the country in 2012. But that figure included 907 deaths that didn’t involve chemicals and 137 that never happened at all; they were recorded as part of training exercises, The News found.
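The Times’ miscount illustrates a general pitfall: totaling a raw call log without first filtering out drills and unrelated reports. A minimal sketch in Python shows the difference; the field names and rows here are invented for illustration, as the actual NRC export is structured differently:

```python
# Hypothetical call-log rows; the real NRC data uses different fields
# and would need cleaning before any aggregation.
reports = [
    {"type": "INCIDENT", "involves_chemical": True,  "fatalities": 3},
    {"type": "DRILL",    "involves_chemical": True,  "fatalities": 137},
    {"type": "INCIDENT", "involves_chemical": False, "fatalities": 907},
    {"type": "INCIDENT", "involves_chemical": True,  "fatalities": 0},
]

# A naive sum over every row, the kind of total that produces inflated figures.
naive_total = sum(r["fatalities"] for r in reports)

# Exclude training exercises and non-chemical incidents before aggregating.
real_total = sum(
    r["fatalities"]
    for r in reports
    if r["type"] != "DRILL" and r["involves_chemical"]
)

print(naive_total, real_total)  # 1047 vs. 3
```

On real data the filtering conditions would be far messier, since drills and unrelated reports are not always cleanly flagged, which is part of why the center’s log is so hard to use for statistics.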
Even when chemical accidents are correctly identified in the data, estimates of injuries and deaths can be way off.
According to the data, only one person died when oxygen tanks exploded on a bus carrying hurricane evacuees near Wilmer, just south of Dallas, in 2005. In fact, 23 elderly residents of an assisted-living facility died that day when a fire in the bus’s wheel well ignited the tanks.
Despite these problems, researchers say the National Response Center data is the best single source they have for studying chemical safety issues.
“It’s comprehensive, but it’s useless data,” said Mannan, who heads the Mary Kay O’Connor Process Safety Center at Texas A&M and worked 12 years in private industry as a chemical engineer. “Only 10 percent accuracy. Nowhere near reliable to where you could make statistically valid conclusions.”
To get reliable numbers for chemical accidents at facilities in Texas, The News tried to verify records in the NRC database against three other national datasets. What followed was a three-month data-mining effort to triangulate the NRC data with databases that should cover many of the same accidents.
In addition to data from the National Response Center, The News examined chemical accident information recorded by:
The National Fire Incident Reporting System, which tracks the activities of participating fire departments that often respond to chemical accidents.
The Occupational Safety and Health Administration, which tracks accidents and injuries at workplaces, including chemical facilities.
The Chemical Safety Board, which maintains a list of major chemical accidents, culled from media reports.
To test the degree to which the four databases overlapped, The News developed a set of matching criteria. If a serious accident was recorded in any data set, the paper attempted to match the records in other data sets according to date and location of the accident, number of fatalities or injuries, or whether there was an evacuation.
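The matching approach described above can be sketched in a few lines of Python. The record fields, date tolerance, and sample rows below are assumptions for illustration; the real NRC, OSHA, and NFIRS exports use different schemas and would require substantial cleaning before any such comparison:

```python
from datetime import date

# Hypothetical, simplified records. Real agency exports differ in
# structure and would need normalization (geocoding, date parsing) first.
nrc = [
    {"date": date(2009, 7, 30), "city": "BRYAN", "state": "TX",
     "injuries": 0, "fatalities": 0, "evacuated": True},
]
osha = [
    {"date": date(2009, 7, 30), "city": "Bryan", "state": "TX",
     "injuries": 0, "fatalities": 0, "evacuated": True},
    {"date": date(2010, 1, 5), "city": "Houston", "state": "TX",
     "injuries": 2, "fatalities": 0, "evacuated": False},
]

def records_match(a, b, day_tolerance=1):
    """Match two records on location, date (within a tolerance),
    and at least one shared outcome: fatalities, injuries, or evacuation."""
    same_place = (a["city"].lower() == b["city"].lower()
                  and a["state"] == b["state"])
    close_date = abs((a["date"] - b["date"]).days) <= day_tolerance
    same_outcome = (a["fatalities"] == b["fatalities"]
                    or a["injuries"] == b["injuries"]
                    or (a["evacuated"] and b["evacuated"]))
    return same_place and close_date and same_outcome

# Count NRC reports confirmed by at least one record in another dataset.
confirmed = [r for r in nrc if any(records_match(r, o) for o in osha)]
print(len(confirmed), "of", len(nrc), "NRC reports confirmed")
```

Even a sketch like this makes the core difficulty visible: every criterion (place names, dates, casualty counts) can disagree across databases describing the same accident, which is why so few of the paper’s strict matches succeeded.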
The results were disappointing.
During the 2008-11 study period, for example, there were 158 calls to the NRC related to potentially serious chemical accidents at facilities in Texas. But only 12 percent of those could be confirmed in any of the other databases.
The News used the same criteria for California, a state similar to Texas in the size of its population and its chemical industry. California’s tougher state-level environmental and workplace safety regulations presumably would produce better reporting and, therefore, more matches. But the data for California was just as bad.
A total of 174 reports of serious chemical accidents were made in California, with only 10 percent confirmed in the other data sets.
The News concluded that there was no systematic way to identify serious accidents among the hundreds of thousands of records in the four datasets. The only way forward was to loosen the matching criteria and read through more than 500 individual accident narratives to identify serious chemical accidents. Doing that, The News was able to confirm at least 24 serious or potentially serious chemical accidents in Texas between 2008 and 2011.
On rough average, that’s one every two months — a lot more than make headlines.
Because of limitations in the data, it was also difficult to determine the potential for any one chemical accident to have caused serious harm if not for first responders, or even plain luck.
A 2009 fire at El Dorado Chemical Co. in Bryan is a good example. Like the West fertilizer plant whose stores exploded during a fire April 17, the Bryan facility housed large amounts of ammonium nitrate.
Thousands of Bryan residents were evacuated from their homes when the El Dorado fire erupted. The blaze burned out before setting off a major explosion, but the potential for disaster was eerily similar to that in West.
In the NRC database, though, the El Dorado event is simply reported as a “large chemical fire” that prompted an evacuation.
Had Bryan been recognized as a near miss, lessons could have been learned that might have prevented the explosion in West four years later. Instead, the best source of national data on chemical safety ranks Bryan alongside more commonplace fires with no distinction.
Government reports citing serious problems with chemical safety data go back to at least the 1980s. That’s when an accidental release of methyl isocyanate from a Union Carbide plant in Bhopal, India, killed more than 5,000 people.
The 1984 disaster spurred the Environmental Protection Agency to attempt to create a chemical accident database. The effort began in 1985. But researchers quickly found that many serious chemical accidents never came to the attention of any federal agency. By 1989, funding for the project had ended.
Former EPA Administrator Bill Ruckelshaus, who led that initiative, said he was looking for hard data to help policymakers ensure that the chemical industry had proper safeguards in place. Then as now, he said, “We should be carefully assessing accidents or instances in which toxic materials have been released and find out … why that happened and take steps necessary to reduce the chances of it happening in the future.”
The News identified at least a dozen subsequent reports from investigators and researchers citing a lack of good data. Moreover, reports repeatedly said that government agencies had the authority to collect the needed data but didn’t.
“Here we are 29, 30 years after Bhopal and there is no, absolutely no, way of making any credible statement or answering a simple question: Are we getting better or are we getting worse?” Mannan said.
Recent criticism has focused on the U.S. Chemical Safety Board, an agency launched in 1998 with a mission to investigate chemical accidents that caused or risked fatalities, serious injuries and major property damage.
According to a series of reports by the Government Accountability Office and various inspectors general, the board was supposed to have created a database to help gauge chemical safety trends. But it hasn’t yet done so.
Board officials dispute that such an effort is mandatory under statute and say that, even if it were, the board lacks the resources to carry it out.
“There’s a lot in the federal government that’s authorized in statute but Congress hasn’t put any money or focus on,” said Daniel Horowitz, the board’s managing director.
The board primarily focuses on conducting investigations of individual accidents at the behest of Congress, such as its probes of the fire and explosion in West and of the Deepwater Horizon oil spill in the Gulf of Mexico. It requested an increase in its 2014 budget, from $10.5 million to $11.4 million, but none of that request is earmarked for improving data collection and analysis.
“We agree that the current system is very fragmented and incomplete, and it’s a problem,” Horowitz said.
But it’s “somewhat atypical” to have an independent investigative agency collect data, he said. The EPA is better-suited to that task because it already collects accident data as part of its Risk Management Program for extremely hazardous materials, Horowitz said.
That EPA program oversees only certain designated chemicals. Not on its oversight list is ammonium nitrate, the chemical that devastated West.
Plugging the gap
At least three agencies could step up to fix the problems with chemical safety data.
The Chemical Safety Board has the legal authority to create a comprehensive system. The National Response Center already maintains the largest database of preliminary accident reports. But it’s unclear what additional resources either would need to improve data collection and whether Congress would appropriate the funds.
The best solution may be a pilot chemical accident data collection effort under the U.S. Department of Health and Human Services.
The National Toxic Substances Incident Program gathers data from multiple state and local agencies, in combination with federal resources such as the NRC.
“That’s as good a data surveillance program as you will get currently,” Mannan said.
But Congress has appropriated only enough funding for 10 states to participate on a cost-sharing basis. A previous accident surveillance program, similar but less comprehensive, had funding for 14 states, one of which was Texas. After the congressional budget cuts, Texas declined to share the cost and dropped out of the program.
“We’re trying to get states to buy into it,” said Maureen Orr, an epidemiologist with the National Toxic Substances Incident Program. “Unfortunately, we don’t have the funding to do every state.”
Nor does the program presently receive any funding from either the EPA or the Chemical Safety Board, though both agencies benefit from data-sharing agreements with the program.
“We’re always trying to increase partnerships,” Orr said.
Combining the duplicative efforts of several agencies, each with its own mandate to capture part of the chemical safety picture, could mean an overall cost reduction, experts said.
“It’s extremely important to know what your problems are,” Orr said. “It would allow them [states] to target their limited resources.”
At least one chemical industry group has been collecting its own accident data since 1994.
Over 100 member companies of the American Chemistry Council participate in the program, which gathers accident and environmental impact data according to a uniform set of criteria. The data helps show a return on the group’s investment in safety, which the council says amounts to tens of millions of dollars each year.
“The key concern was that external stakeholders said it’s hard to determine if you’re making real progress,” said Dan Roczniak, senior director of the program. “This was a way to measure hard data that people on the outside wanted to see.”
Council spokesman Scott Jensen said small operations often benefit most from the data collected. Without the resources to pay for ongoing safety or research programs, they can learn best practices from the experience of much larger industry players.
Despite the council’s efforts, its membership represents only a segment of the chemical-manufacturing industry, which according to the U.S. Census Bureau includes more than 22,000 companies. Moreover, while the council publicly releases company summary data on severe chemical accidents, the data is neither detailed nor comprehensive enough to inform regulations.
In fact, many chemical companies strongly oppose publicly releasing their accident data, Mannan said.
“They say, ‘We have enough problems dealing with lawsuits,’” he said. “I mean, I have very good relationships with these people, but this data issue, I just can’t get them onboard.”
In any case, experts say, relying on companies to self-report accident data is not a very good idea.
The human effort, time and cost required to ensure that data is accurate, uniform and useful for study often runs counter to a company’s financial interests.
“The motivation is not always there,” said Andy Podgurski, a professor of electrical engineering at Case Western Reserve University and co-author of a study on “Big Bad Data” published in the Journal of Law, Medicine & Ethics.
“Why would a company or organization do data checking? They might do it because there is a liability risk. They might do it because they feel it’s the right thing to do,” Podgurski said in an interview. “But even in an honest organization, deadline financial pressures will mitigate those efforts.”