Environmental Health News


#865 -- Why Do We Poison Our Children?, 27-Jul-2006


Rachel's Democracy & Health News #865

"Environment, health, jobs and justice--Who gets to decide?"

Thursday, July 27, 2006
www.rachel.org

Featured stories in this issue...

Why Do We Poison Our Children?
  Like many of our readers, we have worked for decades to provide
  good information to citizens and to decision-makers, on the assumption
  that more facts, packaged effectively, would give rise to better
  decisions. But is this assumption valid? Can information by itself,
  without a robust grass-roots movement behind it, alter the fundamental
  behavior of the people who govern America?
Insurmountable Risks
  We are told we must build more nuclear power plants to halt global
  warming. It may sound persuasive at first -- until the Institute for
  Energy and Environmental Research (IEER) examines the numbers and
  reveals that nuclear power almost certainly cannot expand rapidly
  enough to make a real difference. And if it did, it would create
  intractable problems of waste disposal and the spread of A-bombs.
White House Is Undermining All Environment and Health Regulations
  While we are all paying attention to the catastrophe unfolding in
  the Middle East, the White House is working quietly to weaken all
  environmental and health regulations at home. Their strategy is
  science-based: require every regulation to be accompanied by a high-
  quality risk assessment for which the necessary data are almost never
  available. No risk assessment, no regulation. It is a brilliant
  strategy that will keep government both bloated and impotent.
Congress Is Ready to Prevent States from Regulating Pesticides
  Congress is about to do another huge favor for the chemical
  industry -- pre-empting the right of states and local governments to
  regulate pesticides. With the Congress beholden to Big Money and
  forever on the take, states need to retain independent power to
  protect the health of their citizens.


From: Rachel's Democracy & Health News #865, Jul. 27, 2006


By Peter Montague

As every reader of Rachel's News knows, there is abundant evidence
that our children are being subtly poisoned by chemicals. Sources
include a daily cocktail of pesticides, hormone-disrupters such as
phthalates and BPA leaching out of plastic products, benzene in soft
drinks, and so on. Every year or two, some new threat to children's
health is discovered, with no end in sight.

Is this a problem that can be solved merely by providing better
information to decision-makers? Or is it possibly a messaging problem
that can be solved by merely packaging our information in slicker,
more persuasive ways?

These approaches assume that the permanent (unelected) government
simply doesn't know that children are being poisoned or what it's
costing in suffering and in dollars. According to this view, if we
just provide compelling facts they'll come to their senses and change
their behavior. History suggests that this is not the case.

Let's look at the well-documented example of toxic lead.

In 1992, Rachel's News #294 laid out the history of toxic lead
exposures of children, including what was known about childhood
poisoning starting in 1892. By 1920 it was clear that U.S. children
were being poisoned (Europe and Australia was beginning to ban lead in
paint by that time). By 1950, it was well-documented that really large
numbers of children were being poisoned, and rather severely. Rachel's
also documented the provisional (elected) government's response,
which was a Great Wringing of Hands.


In 2000, Rachel's News ran a 3-part series, filling in more historical
details about the poisoning of children in the U.S. -- the series was
called "Dumbing Down the Children."

Part 1: Rachel's News #687: http://www.rachel.org/bulletin.cfm?Is

Part 2: Rachel's News #688:  http://www.rachel.org/bulletin.cfm?Is

Part 3: Rachel's News #689: http://www.rachel.org/bulletin.cfm?Is

Since then at least two major studies have shown that there would be
very substantial multi-billion dollar savings to the national economy
if we reduced lead exposures below current levels. More Great Wringing
of Hands.



In 2004, Rachel's reported new estimates that removing lead from U.S.
housing stock would cost $16 billion but would result in an immediate
benefit of $43 billion, with very substantial multi-billion-dollar
profits to the national economy EVERY YEAR thereafter. We also showed
that, at the present rate of lead removal, U.S. housing stock will
remain contaminated for the next 120 years.

It's pretty clear that the permanent government -- and perhaps many in
the provisional government as well -- believe toxic lead in children
is desirable -- desirable enough to forego tens of billions of dollars
in savings each year. Put another way, the nation's leaders are
willing to accept costs of tens of billions of dollars each year for
the benefit of keeping hundreds of thousands of children (particularly
poor children and children of color) behind the eight-ball.


All the Rachel's News stories have been based on readily-available
information from the open literature. Much of the information comes
directly from the provisional government itself, and from the
newspaper of record, the New York Times. No secrets here.

I think this goes to the heart of the case for an information-and-
messaging-only strategy, doesn't it?

If the permanent government will change its ways when confronted with
the facts, then we just need to gather more facts and package them
more persuasively.

But if history is any guide, the permanent government is NOT moved by
mere facts or mere multi-billion-dollar savings offered by pollution
prevention. For some reason (which each of us can decide for himself
or herself), the permanent government calculates that someone or
something important is better-off when large numbers of children are
poisoned each year, even at considerable cost to GDP.

If this is the case, then campaigns built around "more information"
and "more effective messaging" -- without intentionally building the
infrastructure to support and sustain a grass-roots movement for
change -- are likely to have quite limited success, are they not?



From: Science for Democratic Action, Aug. 1, 2006


Can Nuclear Power Solve the Global Warming Problem?

By Brice Smith[1]

Climate change is by far the most serious vulnerability associated
with the world's current energy system. While there are significant
uncertainties, the possible outcomes of global warming are so varied
and potentially so severe in their ecological and human impacts that
immediate precautionary action is called for.

Compared to fossil fuels, nuclear power emits far lower levels of
greenhouse gases even when mining, enrichment, and fuel fabrication
are taken into consideration.[2] As a result, some have come to
believe that nuclear power should play a role in reducing greenhouse
gas emissions.

The most important practical consideration, rarely addressed in the
debate, is this: how many nuclear power plants will it take to
significantly impact future carbon dioxide emissions from fossil fuel
power plants? We have considered in detail two representative
scenarios for the future expansion of nuclear power. The assumed
worldwide growth rate of electricity is the same for both, 2.1 percent
per year, comparable to values assumed in most conventional studies of
the electricity sector.

Nuclear growth scenarios

The first scenario was taken from a 2003 study from the Massachusetts
Institute of Technology.[3] In this report, the authors envisioned a
"global growth scenario" with a base case of 1,000 gigawatts (GW) of
nuclear capacity installed around the world by 2050. Since all of the
reactors in operation today would be shut down by mid-century, this
would represent a net increase of roughly a factor of three over
today's effective capacity. To give a sense of scale, this proposal
would require one new reactor to come online somewhere in the world
every 15 days on average between 2010 and 2050.
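The pace implied by this scenario is easy to check. The following sketch reproduces the 15-day figure; the assumption that a typical reactor supplies roughly 1 GW is mine, though it is consistent with the scenario's capacity figures:

```python
# Construction pace implied by the MIT "global growth scenario":
# 1,000 GW of nuclear capacity installed worldwide by 2050.
target_capacity_gw = 1000
gw_per_reactor = 1                 # assumed: a typical large reactor is ~1 GW
build_window_years = 2050 - 2010   # the construction window cited above

reactors_needed = target_capacity_gw / gw_per_reactor
days_between_startups = build_window_years * 365.25 / reactors_needed
print(round(days_between_startups, 1))  # -> 14.6, i.e. one reactor every ~15 days
```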

Despite the increase in nuclear power envisioned under the global
growth scenario, the proportion of electricity supplied by nuclear
power plants would increase only slightly, from about 16 percent to
about 20 percent. As a result, fossil fuel-fired generation would also
grow and the emissions of carbon dioxide, the most important
greenhouse gas, from the electricity sector would continue to rise.

In order to consider a more serious effort to limit carbon emissions
through the use of nuclear power, we developed the "steady-state
growth scenario." Using the same electricity demand growth assumed in
the MIT report, we calculated the number of nuclear reactors that
would be required to simply maintain global carbon dioxide emissions
at their year 2000 levels.

Considering a range of assumptions about the future contribution of
renewables and natural gas fired plants, we found that between 1,900
and 3,300 GW of nuclear capacity would be required to hold emissions
constant. For simplicity we used 2,500 GW as the alternative case
study. This scenario is roughly equivalent to assuming that nuclear
plays about the same role in the global electricity sector in the year
2050 as coal does today in the United States.

In order to significantly reduce carbon dioxide emissions, nuclear
power plant construction would have to be more rapid than one a week.
We have not considered such scenarios, since the dangers of using
nuclear energy to address greenhouse gas emissions are amply clear in
the two scenarios discussed here.

Evaluating the scenarios

Given that both time and resources are limited, a choice must be made
as to which sources of electricity should be pursued aggressively and
which should not. The best mix of alternatives will vary according to
local, regional, and country-wide resources and needs. In making a
choice, the following should serve to help guide the selection:

1. The options must be capable of making a significant contribution to
a reduction in greenhouse gas emissions, with a preference given to
those that achieve more rapid reductions;

2. The options should be economically competitive to facilitate their
rapid entry into the market; and,

3. The options should minimize other environmental and security
impacts and should be compatible with a longer term vision for
creating an equitable and sustainable global energy system.

It is within this context that the future of nuclear power must be
evaluated.


The largest vulnerability associated with a large expansion of nuclear
power is likely to be its connection to the potential proliferation of
nuclear weapons. In order to fuel the global or steady-state growth
scenarios, the world's uranium enrichment capacity would have to
increase by approximately two and a half to six times.[4] Just one
percent of the enrichment capacity required by the global growth
scenario would be enough to supply the highly-enriched uranium for
nearly 210 nuclear weapons every year. Reprocessing the spent fuel
would add significantly to these security risks (see below).

Proposals
to reduce the risks of nuclear weapons proliferation are unlikely to
be successful in a world where the five acknowledged nuclear weapons
states seek to retain their arsenals indefinitely. The
institutionalization of a system in which some states are allowed to
possess nuclear weapons while dictating intrusive inspections and
restricting what activities other states may pursue is not likely to
be sustainable. As summarized by Mohamed ElBaradei, director general
of the International Atomic Energy Agency: "We must abandon the
unworkable notion that it is morally reprehensible for some countries
to pursue weapons of mass destruction yet morally acceptable for
others to rely on them for security -- indeed to continue to refine
their capacities and postulate plans for their use."[5]
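The enrichment arithmetic behind the one-percent claim can be sketched as follows. The per-reactor enrichment demand comes from footnote [4]; the roughly 5,250 separative work units (SWU) needed per 25-kg weapon quantity of highly enriched uranium is an assumed ballpark figure, not a number from this article:

```python
# Rough check: weapons obtainable from 1% of the enrichment capacity
# required by the global growth scenario (1,000 GW of nuclear capacity).
fleet_gw = 1000
mtswu_per_gw_year = 110          # footnote [4]: ~110 MTSWU per 1,000 MW reactor
total_mtswu = fleet_gw * mtswu_per_gw_year   # ~110,000 MTSWU/year worldwide

diverted_mtswu = 0.01 * total_mtswu          # one percent of capacity
mtswu_per_weapon = 5.25   # assumed: ~5,250 SWU per ~25 kg HEU weapon quantity
weapons_per_year = diverted_mtswu / mtswu_per_weapon
print(round(weapons_per_year))   # -> 210, matching "nearly 210 ... every year"
```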

Without a concrete, verifiable program to irreversibly eliminate the
tens of thousands of existing nuclear weapons, no nonproliferation
strategy, however strong, is likely to succeed.


The potential for a catastrophic reactor accident or well coordinated
terrorist attack to release a large amount of radiation is another
unique danger of nuclear power. Such a release could have extremely
severe consequences for human health and the environment. The so-
called CRAC-2 study conducted by Sandia National Laboratories
estimated that a worst case accident at an existing nuclear plant in
the United States could, for some sites, result in tens of thousands
of prompt and long-term deaths and cause hundreds of billions of
dollars in damages.[6] Even if a reactor's secondary containment was
not breached, a serious accident would still cost a great deal. As
summarized by Peter Bradford, a former commissioner of the U.S.
Nuclear Regulatory Commission (NRC):

"The abiding lesson that Three Mile Island taught Wall Street was that
a group of N.R.C.-licensed reactor operators, as good as any others,
could turn a $2 billion asset into a $1 billion cleanup job in about
90 minutes."[7]

Despite the importance of reactor safety, the probabilistic risk
assessments used to estimate the likelihood of accidents have numerous
methodological weaknesses that limit their usefulness. First, the
questions of completeness and how to incorporate design defects are
particularly difficult to handle. Second, concerns arise due to the
fact that nuclear power demands an extremely high level of competence
at all times from the regulators and managers all the way through to
the operators and maintenance crews. Finally, the increased use of
computers and digital systems creates important safety tradeoffs, with
improvements possible during normal operation, but with the potential
for unexpected problems to arise during accidents. In light of the
uncertainties inherent in risk assessments, William Ruckelshaus, the
head of the U.S. Environmental Protection Agency under both Presidents
Nixon and Reagan, cautioned that:

"We should remember that risk assessment data can be like the captured
spy: if you torture it long enough, it will tell you anything you want
to know."[8]

In the nearly 3,000 reactor-years of experience at power plants in the
United States, there has been one partial core meltdown and a number
of near misses and close calls. From this, the probability of such an
accident occurring is estimated to be between 1 in 8,440 and 1 in 630
per year.[9] Using the median accident probability of 1 in 1,800 per
year, and retaining the assumption from the MIT report that future
plants will be ten times safer than those in operation today, we find
that the probability of at least one accident occurring somewhere in
the world by 2050 would be greater than 75 percent for the global
growth scenario, and over 90 percent for the steady-state growth
scenario.

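Those probabilities can be reproduced approximately from the figures above. The sketch below assumes the world fleet ramps linearly from today's roughly 440 reactors to the 2050 fleet size; that ramp is a simplification supplied here, not a detail given in the article:

```python
# Probability of at least one serious accident worldwide by 2050,
# using the median historical rate (1 in 1,800 per reactor-year) and
# the MIT assumption that future plants are ten times safer.
p_per_reactor_year = (1 / 1800) / 10

def cumulative_reactor_years(fleet_2050, fleet_today=440, years=40):
    # assumed: fleet size ramps linearly between today and 2050
    return 0.5 * (fleet_today + fleet_2050) * years

def p_at_least_one(reactor_years, p):
    return 1 - (1 - p) ** reactor_years

p_global = p_at_least_one(cumulative_reactor_years(1000), p_per_reactor_year)
p_steady = p_at_least_one(cumulative_reactor_years(2500), p_per_reactor_year)
print(f"{p_global:.0%} {p_steady:.0%}")  # roughly 80% and 96%
```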
The possibility that public opinion could turn sharply against the
widespread use of nuclear power following an accident is a significant
vulnerability. If nuclear power were in the process of being expanded,
public pressure following an accident would leave open few options. On
the other hand, if long-term plans to phase out nuclear power were
already being carried out, there would be far more options available
and those options could be accelerated with less disruption to the
overall economy.

Spent Fuel

There is also the difficulty of managing radioactive waste. The
existence of weapons-usable plutonium in the waste complicates the
problem. While the management of low-level waste will continue to pose
a challenge, by far the largest concern is how to handle spent nuclear
fuel.

Complicating this task are the long half-lives of some of the
radionuclides present in the waste (for example: plutonium-239, half-
life 24,000 years; technetium-99, half-life 212,000 years; and
iodine-129, half-life 15.7 million years).

Through 2050, the global growth scenario would lead to nearly a
doubling of the average rate at which spent fuel is generated, with
proportionally larger increases under the steady-state growth
scenario. Assuming a constant rate of growth, a repository with the
capacity of Yucca Mountain (70,000 metric tons) would have to come
online somewhere in the world every five and a half years in order to
handle the waste that would be generated under the global growth
scenario. For the steady-state growth scenario, a new repository would
be needed every three years on average.

The characterization and siting of repositories rapidly enough to
handle this waste would be a very serious challenge. Yucca Mountain
has been studied for more than two decades, and it has been the sole
focus of the U.S. Department of Energy (DOE) repository program since
1987. Despite this effort, and nearly $9 billion in expenditures, no
license application has yet been filed. In fact, in February
2006, Secretary of Energy Samuel Bodman admitted that the DOE can no
longer make an official estimate for when Yucca Mountain might open
due to ongoing difficulties faced by the project.

Internationally, no country plans to have a repository in operation
before 2020, at the earliest, and all repository programs have
encountered problems during development. Even if the capacity per
repository is increased, deep geologic disposal will remain a major
vulnerability of a much-expanded nuclear power system.

Alternatives to repository disposal are unlikely to overcome the
challenges posed by the amount of waste that would be generated under
the global or steady-state growth scenarios. Proposals to reprocess
the spent fuel would not only not solve the waste problem, but would
greatly increase the dangers. Reprocessing schemes are expensive and
create a number of serious environmental risks while still generating
large volumes of waste destined for repository disposal. In addition,
reprocessing results in the separation of weapons-useable plutonium,
adding significantly to the risks of proliferation. While future
reprocessing technologies like UREX+ or pyroprocessing could have some
nonproliferation benefits, they would still pose a significant risk if
deployed on a large scale. Under the global growth scenario, the
authors of the MIT study estimate that more than 155 metric tons of
separated plutonium would be required annually to supply the required
MOX (mixed-oxide) fuel. Just one percent of this commercial plutonium
would be sufficient to produce more than 190 nuclear weapons every
year.

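The plutonium arithmetic works out as follows; the 8 kg figure used here is the IAEA's "significant quantity" for plutonium, an assumption supplied for illustration rather than a number from this article:

```python
# Check: weapons obtainable from 1% of the separated plutonium that the
# global growth scenario would require annually for MOX fuel.
separated_pu_tonnes = 155                         # MIT estimate cited above
diverted_kg = 0.01 * separated_pu_tonnes * 1000   # one percent, in kilograms
kg_per_weapon = 8      # assumed: IAEA "significant quantity" for plutonium
weapons = diverted_kg / kg_per_weapon
print(round(weapons))  # -> 194, consistent with "more than 190"
```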
The authors of the MIT study acknowledge the high cost and negative
impacts of reprocessing and, as such, advocate against its use.
Instead they propose interim storage and expanded research on deep
borehole disposal. It is possible that deep boreholes might prove to
be an alternative in countries with smaller amounts of waste. However,
committing to a large increase in the rate of waste generation based
only on the potential plausibility of a future waste management option
would be to repeat the central error of nuclear power's past. The
concept for mined geologic repositories dates back to at least 1957.
However, turning this idea into a reality has proven quite difficult,
and not one spent fuel rod has yet been permanently disposed of
anywhere in the world.


Nuclear power is likely to be an expensive source of electricity, with
projected costs in the range of six to seven cents per kilowatt-hour
(kWh) for new reactors. Tables 1 and 2 show data from the MIT study
and a study conducted at the University of Chicago.[10] Table 1
shows estimates used for the projected capital costs, construction
lead times and interest rate for natural gas, coal and nuclear power
in the United States. Table 2 shows estimates of the cost per
kilowatt-hour.

While a number of potential cost reductions have been considered by
nuclear power proponents in the United States, it is unlikely that
plants not heavily subsidized by the federal government would be able
to achieve these. This is particularly true given that the cost
improvements would have to be maintained under the very demanding
timetables set by the global or steady-state growth scenarios.

Promising Alternatives

A number of energy alternatives that are economically competitive with
new nuclear power are available in the near to medium term.[11] The
choice between these alternatives will hinge primarily on the rapidity
with which they can be brought online and on their relative
environmental and security impacts.

Of the available near-term options for reducing greenhouse gas
emissions, the two most promising ones in the United States and other
areas of the Global North are increasing efficiency and expanding the
use of wind power at favorable sites. At approximately four to six
cents per kWh, wind power at favorable sites in the United States is
already competitive with natural gas or new nuclear power. With the
proper priorities on upgrading the transmission and distribution
infrastructure and changing the way the electricity sector is
regulated, wind power could expand rapidly in the United States. In
fact, without any major changes to the existing grid, wind power could
expand to 15 to 20 percent of U.S. electricity supply, as compared to
less than one-half of one percent in 2003, without negatively
impacting overall stability or reliability.

Improvements in energy efficiency could continue to be made in the
medium term as well. For example, as the current building stock turns
over, older buildings could be replaced by more efficient designs. In
addition, the utilization of wind power, thin-film solar cells,
advanced hydropower at existing dams, and some types of sustainable
biomass could allow renewables to make up an increasingly significant
proportion of the electricity supply over the medium term. This
expansion of renewables could be facilitated through the development
of a robust mix of technologies, the development of strengthened
regional grids to help stabilize the contribution of wind and solar
power through geographic distribution, the use of pumped hydropower
systems to store excess electricity during times of low demand, and
the tighter integration of large scale wind farms with natural gas
fired capacity.[12]

While it would require a significant effort to implement new
efficiency programs and to develop the necessary infrastructure to
expand wind power, these efforts must be compared to the difficulties
that would be encountered in restarting a nuclear power industry that
hasn't had a new order placed in the United States in more than 25
years and hasn't opened a single new plant in the last ten years. In
addition, the current fossil fuel based energy system is very
expensive to maintain. For example, the International Energy Agency
estimates that the amount of investment in oil and gas between 2001
and 2030 will total nearly $6.1 trillion, with 72 percent of that
going towards new exploration and development efforts.

Transition technologies

Energy efficiency and renewable energy programs have few negative
environmental or security impacts compared to our present energy
system and, in fact, have many advantages. As a result, these options
should be pursued to the maximum extent possible. However, in order to
stabilize the climate, it appears likely that some energy sources with
more significant tradeoffs will also be needed as transition
technologies.

The two most important transition strategies are increased reliance on
the import of liquefied natural gas (LNG) and the development of
integrated coal gasification plants (IGCC-integrated gasification
combined cycle) with sequestration of the carbon dioxide emissions in
geologic formations.

Compared to pulverized coal plants, combined cycle natural gas plants
emit about 55 percent less CO2 for the same amount of generation. If
efficiency improvements and an expanded liquefaction and
regasification infrastructure can stabilize the long-term price of
natural gas at the cost of imported LNG, then the use of combined
cycle natural gas plants is likely to remain an economically viable
choice for replacing highly inefficient coal fired plants.

The use of coal gasification technologies would greatly reduce the
emissions of mercury, particulates, and sulfur and nitrogen oxides
from the burning of coal. However, for coal gasification to be
considered as a potentially viable transition technology, it must be
accompanied by carbon sequestration, the injection and storage of CO2
into geologic formations. Experience in the United States with carbon
dioxide injection as part of enhanced oil recovery has been gained
since at least 1972. In addition, the feasibility of sequestering
carbon dioxide has been demonstrated at both the Sleipner gas fields
in the North Sea and the In Salah natural gas fields in Algeria. While
the costs of such strategies are more uncertain than those of other
mitigation options, estimates for the cost of electricity from power
plants with carbon sequestration still fall within the range of six to
seven cents per kWh.

Some of the most troubling aspects of coal, such as mountain top
removal mining, would be mitigated by the reduction in demand due to
increased efficiency and the rapid expansion of alternative energy
sources. In addition, it appears likely that coal gasification and
carbon sequestration would be better suited to the Western United
States given the greater access to oil and gas fields which have
already been explored and which offer the potential for added economic
benefits from enhanced oil and gas recovery. On the other hand, the
Eastern United States would appear better suited for an expanded use
of LNG during the transition given the existing regasification
capacity, the well developed distribution system, and the shorter
transportation routes from the Caribbean, Venezuela, and Western
Africa.

The continued use of fossil fuels during the transition period will
have many serious drawbacks. However, these must be weighed against
the potentially catastrophic damage that could result from global
warming and against the unique dangers that accompany the use of
nuclear power. To trade one uncertain but potentially catastrophic
health, environmental and security threat for another is not a
sensible basis for an energy policy.

No energy system is free of negative impacts. The challenge is to
choose the least bad mix of options in the near to medium term while
achieving significant global reductions in CO2 emissions, and to move
long term toward the development of a sustainable and equitable global
energy system.


Just as the claim by Atomic Energy Commission Chairman Lewis Strauss
that nuclear power would one day be "too cheap to meter" was known to
be a myth well before ground was broken on the first civilian reactor
in the United States, and just as the link between the nuclear fuel
cycle and the potential to manufacture nuclear weapons was widely
acknowledged before President Eisenhower first voiced his vision for
the "Atoms-for-Peace" program, a careful examination today reveals
that the expense and vulnerabilities associated with nuclear power
would make it a risky and unsustainable option for reducing greenhouse
gas emissions. As the authors of the MIT report themselves conclude:

"The potential impact on the public from safety or waste management
failure and the link to nuclear explosives technology are unique to
nuclear energy among energy supply options. These characteristics and
the fact that nuclear is more costly, make it impossible today to make
a credible case for the immediate expanded use of nuclear power."[13]

Nuclear power is a uniquely dangerous source of electricity that would
create a number of serious risks if employed on a large scale. It is
very unlikely that the problems with nuclear power could be
successfully overcome given the large number of reactors required for
even modestly affecting carbon dioxide emissions. It has now been more
than 50 years since the birth of the civilian nuclear industry and
more than 25 years since the last reactor order was placed in the
United States.

It is time to move on from considering the nuclear option and to begin
focusing on developing more rapid, robust and sustainable options for
addressing the most pressing environmental concern of our day. The
alternatives are available if the public and their decision makers
have the will to make them a reality. If not, our children and
grandchildren will have to live with the consequences.


[1] This article is based on Insurmountable Risks: The Dangers of
Using Nuclear Power to Combat Global Climate Change by Brice Smith
(IEER Press, 2006). Full references can be found in the book, which is
available for purchase at www.EggheadBooks.com.

[2] See Paul J. Meier, "Life-Cycle Assessment of Electricity
Generation Systems and Applications for Climate Change Policy
Analysis", Ph.D. Dissertation, University of Wisconsin-Madison, August
2002, online at http://fti.neep.wisc.edu/pdf/fdm1181.pdf; and, Uwe
R. Fritsche, Comparison of Greenhouse-Gas Emissions and Abatement Cost
of Nuclear and Alternative Energy Options from a Life-Cycle
Perspective, Updated Version (Oko-Institut, Darmstadt, January 2006).

[3] John Deutch and Ernest J. Moniz (co-chairs) et al., The Future of
Nuclear Power, An Interdisciplinary MIT Study, 2003, online at http:

[4] A typical 1000 megawatt (MW) light-water reactor requires
approximately 100 to 120 MTSWU per year in enrichment services to
provide its fuel. For simplicity in this calculation we have assumed
110 MTSWU per year would be required for future reactors. (MTSWU
stands for metric ton separative work unit, a complex unit that
essentially represents the amount of effort required to achieve a
given level of enrichment.)

[5] Mohamed El Baradei, "Saving Ourselves from Self-Destruction", New
York Times, February 12, 2004.

[6] Jim Riccio, Risky Business: The Probability and Consequences of a
Nuclear Accident, A Study for Greenpeace USA, 2001.

[7] Matthew Wald, "Interest in Building Reactors, but Industry Is
Still Cautious," New York Times, May 2, 2005.

[8] William D. Ruckelshaus, "Risk in a Free Society", Risk Analysis,
Vol. 4 No. 3, 157-162 (1984), pp. 157-158.

[9] The cited range represents our estimate for the 5-95 percent
confidence interval for the average accident rate (i.e. there is a 5
percent chance that the actual accident rate is greater than 1 in 633
per year and a 5 percent chance that it is less than 1 in 8,440 per
year).

[10] The Economic Future of Nuclear Power, A Study Conducted at The
University of Chicago, August 2004.

[11] The fact that the costs of all of the alternatives tend to
cluster around six to seven cents per kWh was
originally noted by Dr. Arjun Makhijani.

[12] Dr. Arjun Makhijani has long advocated for changes to the U.S.
energy system. For a discussion of the IEER recommendations put forth
by Dr. Makhijani for how to best facilitate the expansion of energy
efficiency programs and the development of renewable energy resources,
including actions at the state and local level, see: pp. 181-195 of
Arjun Makhijani and Scott Saleska, The Nuclear Power Deception (Apex
Press, New York, 1999); pp. 48-57 of Arjun Makhijani, Securing the
Energy Future of the United States: Oil, Nuclear, and Electricity
Vulnerabilities and a post-September 11, 2001 Roadmap for Action,
November 2001; and, pp. 7-10 of Arjun Makhijani, Peter Bickel, Aiyou
Chen, and Brice Smith, Cash Crop on the Wind Farm: A New Mexico Case
Study of the Cost, Price, and Value of Wind-Generated Electricity,
Prepared for presentation at the North American Energy Summit Western
Governors' Association, Albuquerque, New Mexico, April 15-16, 2004.
All are available at www.ieer.org.

[13] Deutch and Moniz, op. cit., p. 22 (emphasis added).

Return to Table of Contents


From: Nature, Jul. 20, 2006


By Colin Macilwain

'Bush declares war on environment,' read an unflattering CNN headline
in March 2001. It was early in the administration of President George
W. Bush, and the White House had just upwardly revised how much
arsenic it would allow in drinking water. Never mind that the
administration had stepped into a trap laid by former president Bill
Clinton, who had sharply lowered the acceptable level of arsenic just
before leaving office.
The direct approach to changing the standards didn't play well, and it
would be the last time the administration openly sought to relax a
pollution rule.

After that, the usually direct administration took a discreet approach
to reforming the way in which the US federal agencies regulate
everything from drinking-water standards and air quality to airport
security and work safety. The approach is so discreet that few outside
Washington DC even realize it exists. Its headquarters are at the
Office of Information and Regulatory Affairs, at the White House
Office of Management and Budget (OMB) -- the low-key powerhouse that
oversees the $2.7-trillion annual spending of the federal government.

Until this February, the effort's chief engineer was John Graham, a
soft-spoken, bespectacled former director of the Center for Risk
Analysis at Harvard. When Graham quit the government to run the Pardee
RAND science-policy graduate school in Santa Monica, California, he
left behind a proposal that could radically change how health, safety
and environmental rules are drawn up. Its real effect would be to
relax them, critics charge.

The OMB Bulletin on Risk Assessment landed quietly on agency
officials' desks in January. The 26-page document outlines proposed
changes to the way the government conducts risk assessments -- the
scientific studies that quantify the risks to human health, safety and
the environment caused by various activities. The bulletin is merely a
proposal at this stage and, unusually, the OMB has sent it to the
National Academies for review. An academies panel chaired by John
Ahearne, an engineer and former head of the Nuclear Regulatory
Commission now at Duke University in Durham, North Carolina, will
report on it in November. The OMB should issue its final bulletin soon
afterwards. And all federal agencies will be expected to adhere to it.

"The quality of risk assessment in the federal government is uneven,"
explains Graham. "What we're trying to attain is a standard across the
government."

But already the knives are out for the proposal. Environmental and
consumer groups charge that its real aim is to weaken regulators, such
as the Environmental Protection Agency (EPA) and the Food and Drug
Administration. "If this is adopted in anything close to its current
form," says Rena Steinzor, a regulatory-affairs specialist at the
University of Maryland in Baltimore, "it will have a devastating
impact on public health and safety regulations."

Congress is also getting involved. On 5 May, leading Democrats in the
House of Representatives fired off a letter to the president of the
National Academy of Sciences, Ralph Cicerone. In it, Tennessee's Bart
Gordon, Michigan's John Dingell and other senior Democrats ask the
National Academies to spell out the limitations of its own review. In
effect, they warn it to refrain from filing a scientific endorsement
of what they regard as a fundamentally political proposal.

"It appears impossible to provide a comprehensive answer to the
questions without reaching beyond the scope of a scientific review,"
says the letter. It goes on to charge that the "OMB's proposed
bulletin is in conflict with the approach taken in existing law" --
which, the authors say, already instructs individual agencies on how
to assess particular risks.

Both supporters and critics agree that the document is the crowning
achievement of Graham's long march to regulatory reform. It is the
fourth important element that he put in place from his powerful
position at the OMB. Three broad-ranging edicts on how agencies
should, and should not, go about the production of regulations have
already been issued: they covered 'information quality', cost-benefit
analyses and scientific peer review. The last drew a sharp and
uninvited riposte from the National Academies when it was released in
2003. The academies said it was far too prescriptive in telling
scientists how to review the work of their peers; the document was
subsequently watered down to accommodate the concerns.

The main scope of January's risk-assessment bulletin is twofold: it
offers guidance on how government agencies should go about conducting
such assessments, and broadens the set of circumstances in which they
need to be done.

Scientific risk assessments are murky affairs at the best of times.
The US environmental movement, in its 1970s heyday, strongly resisted
the very concept; greens prefer the rival paradigm of the
'precautionary principle'. This principle holds that a regulator
responsible for, say, clean water, should respond to uncertainty about
the toxic effects of a given chemical by setting a limit that it holds
to be safe, in advance of more precise information. The approach is
anathema to regulated industries, but it has been embraced, at least
in theory, by political leaders in the European Union. By contrast, in
the United States, where water and air quality are more tightly
regulated, risk assessment has been incorporated into many
environmental laws. With the bulletin, it would be officially
incorporated into all federal-agency responses to risk.

Triple bill

Graham has cited three main examples in support of the need for the
bulletin. One takes issue with a 2002 EPA estimate of the 'safe' dose
in drinking water of perchlorate -- an ingredient in solid rocket
fuel, often dumped on public land by the Department of Defense. A
subsequent, 2005 National Academies study identified a much higher
safe dose of the chemical than the EPA had done -- leading the
agency's critics to say that it should take more care before issuing
such dose estimates.

The second case also concerns the EPA, and an estimate that it made of
deaths caused by cardiopulmonary disease as a result of diesel fuel
emissions from off-road vehicles. The EPA made its estimate, Graham
argues, even as it requested millions of dollars to research the
topic. It turned out that the agency's estimate was merely a central
value between risks that might be much higher or much lower, depending
on the outcome of the research. "When an agency produces a risk
assessment with false precision," says Graham, "it engenders negative
reactions from stakeholders and from the scientific community."

His third example addresses a rather different area. The Department of
Agriculture decided to shut the Canadian border to cattle trading
after mad cow disease was found in Canadian cattle in May 2003. Graham
says that when he raised the issue of a risk assessment with
department officials, as they were considering re-opening the border,
he got "a long silence and a blank stare, as if risk assessment was a
term from a foreign language".

Only last week, a National Academies panel took sharp issue with yet
another EPA risk assessment -- a 2003 estimate of the cancer risk
posed by chemicals called dioxins. The panel, chaired by David Eaton
of the University of Washington in Seattle, said that the agency had
failed to quantify the risks in its assessment, or to justify the main
assumptions behind it.

Graham's bulletin would require government decisions to be subjected
to formal risk assessment; in emergencies, this could be completed
after the decision was implemented. Deciding what constitutes an
emergency can be difficult, however, and critics claim the regime
would stifle government action. "They won't be able to react to things
that involve a rapid decision," says Jennifer Sass of the Natural
Resources Defense Council, an environmental group based in New York.
For instance, the EPA moved swiftly to monitor mould growth in New
Orleans after Hurricane Katrina -- a response that might not be
possible under the new rules, she claims.

The measures enacted by the OMB, critics say, will provide enough
sawdust to clog up the wheels of government regulation for years to
come. "The aim is to bog the process down, in the name of
transparency," says Robert Shull, head of regulatory policy at the
pressure group OMB Watch, based in Washington DC.

Graham, a former economist, says its aim is quite the opposite. "There
will be some cases where risk assessments will become more expensive"
as a result of the bulletin, he concedes. "But their improved quality
will mean less controversy and less delay -- so we can see
opportunities for saving time and money."

Fans in commerce

The measure's strongest supporters are the US Chamber of Commerce and
industry groups such as the American Chemistry Council, headquartered
in Arlington, Virginia. "It crystallizes 15 to
20 years of research on how best to do risk assessment, on the basis
of what has been said by the National Academies and others," says the
chemistry council's senior toxicologist, Richard Becker. His group has
endorsed the proposal and suggests that it be given more teeth by
making it subject to judicial review, in which an aggrieved party can
challenge the government's implementation of it.

The National Academies is accustomed to being in the middle of this
kind of fight, but it is rarely asked to review a policy proposal
before it goes into force. Sceptics smell a trap: the academies'
unsolicited intervention greatly diluted the previous White House
proposal on standardizing peer review of scientific evidence. By
asking for its advice first, the OMB can hope to implement a final
rule -- which it will write itself -- that will be difficult for
future administrations to overturn.

The only time the unflappable Graham looks ruffled is when asked if
his sending of the proposed bulletin to the academies for endorsement
might be regarded as a ploy to muffle political criticism. "Anyone who
believes that the National Academies can be used in that way doesn't
understand the process," he sniffs. "Perhaps the OMB should be given
credit for allowing its work to be criticized by the scientific
community. There was no legal requirement for it to do so."

Whatever the National Academies says in November, a rule is likely to
be implemented that will embed risk assessment more deeply in the
decision-making process. This will be John Graham's permanent legacy,
subtly moving the regulatory goalposts in industry's favour, without
catching the public's eye. "It all sounds very nice and sensible,"
says Shull. "Politically, it is much more viable than, say, trying to
weaken the drinking-water standards."

Return to Table of Contents


From: Minneapolis Star Tribune, Jul. 21, 2006


Republicans prepare another gift to the chemical industry.

Next time you hear congressional Republicans proclaiming reverence for
states' rights, minimal federal authority, government closest to the
people and all that gas, think about last week's pesticides farce.

On a party-line vote, GOP members of the House Energy and Commerce
committee approved a White House-backed measure that would essentially
eliminate a state's right to ban pesticides without approval from the
U.S. Environmental Protection Agency.

This is not a trivial issue, nor a theoretical one. EPA inaction has
prompted several states to outlaw or sharply restrict pesticides and
other industrial chemicals on their own. Some Minnesota legislators
think this state should phase out the herbicide atrazine, strongly
implicated in frog deformities but, after a series of industry-
friendly reviews, still blessed by EPA.

Nor is this our one-party national government's first effort to
preempt state authority in areas of public health and environmental
protection. The assaults on California's auto-emissions rules come to
mind, and there is persistent talk of a challenge to aggressive
mercury-reduction plans adopted by Minnesota and other states. A count
by House Democrats last month cited 27 new laws limiting states'
rights to set their own policies on such matters as pollution control,
power-plant siting and food safety.

The unifying objective is to give industries the standardized and
milder regulatory environment they prefer. But the official
justification, of course, is always something loftier. In the case of
the pesticide legislation, it's bringing the United States into the
community of nations that have ratified the Stockholm Convention.

This treaty was negotiated in Bill Clinton's administration and signed
by President Bush in 2001. It isn't particularly burdensome to
American business, as yet; the "dirty dozen" substances it outlaws are
already banned or sharply restricted in this country. But the Senate's
Republican majority has been in no hurry to seal the deal -- until
now.

Since the treaty took effect in 2004, some of the other 127
signatories have proposed to ban additional chemicals. Now the U.S.
companies that make, sell or use them are keen to have their interests
protected at the negotiating table.

Rep. Paul Gillmor, an Ohio Republican, was happy to offer a series of
amendments needed to bring U.S. laws into line with the treaty -- and
also, while he was at it, to throw in some irrelevant gifts to the
chemical industry. Besides preemption of state autonomy, there's a
requirement that EPA undertake a cost-benefit analysis in assessing
new restrictions, instead of simply looking at harm to public health.
Other provisions would lengthen a review process already glacial,
neatly allowing the United States to influence future revisions of the
Stockholm pact while postponing compliance.

The politicization of the EPA is an old, sad story, told compellingly
by former EPA chiefs of both parties. This latest move continues that
shameful history, and offers a crystal-clear illustration of why
states must retain independent power to protect their citizens'
health.

Return to Table of Contents


  Rachel's Democracy & Health News (formerly Rachel's Environment &
  Health News) highlights the connections between issues that are
  often considered separately or not at all.

  The natural world is deteriorating and human health is declining  
  because those who make the important decisions aren't the ones who
  bear the brunt. Our purpose is to connect the dots between human
  health, the destruction of nature, the decline of community, the
  rise of economic insecurity and inequalities, growing stress among
  workers and families, and the crippling legacies of patriarchy,
  intolerance, and racial injustice that allow us to be divided and
  therefore ruled by the few.  

  In a democracy, there are no more fundamental questions than, "Who
  gets to decide?" And, "How do the few control the many, and what
  might be done about it?"

  As you come across stories that might help people connect the dots,
  please Email them to us at dhn@rachel.org.
  Rachel's Democracy & Health News is published as often as
  necessary to provide readers with up-to-date coverage of the
  subject.

  Peter Montague - peter@rachel.org
  Tim Montague   -   tim@rachel.org

  To start your own free Email subscription to Rachel's Democracy
  & Health News send a blank Email to: join-rachel@gselist.org.

  In response, you will receive an Email asking you to confirm that
  you want to subscribe.

Environmental Research Foundation
P.O. Box 160, New Brunswick, N.J. 08903
