
From too cheap to meter to too costly to matter

By Michael Mariotte and Aja Binette

Nuclear Information and Resource Service

From the industry’s proclamation in the 1950s that nuclear power would be “too cheap to meter” to the reality of a major utility going bankrupt in the 1980s, nuclear power’s economics have been a matter of speculation since the technology’s inception.

At the beginning of the building boom in the 1960s, reactors were estimated to cost $560/kw for plants whose construction began in 1966. However, the actual cost for those early reactors averaged $1170/kw, or 209% of the projected cost. (Costs of building a nuclear reactor are often framed as the amount of money spent per kilowatt of generating capacity supplied to the grid.)

Construction costs skyrocketed in the late 1970s and 1980s, partly due to high interest costs for borrowed funds and partly due to the Three Mile Island accident of 1979, which ushered in a wave of sometimes expensive safety modifications and requirements for new reactors.

By the 1980s, costs for a single new reactor in the U.S. averaged above $4 billion, with some reactors, like Nine Mile Point-2 and Seabrook, reaching above $6 billion—more than $4,000/kw.

Despite that history, as recently as 2006 the industry’s lobbying arm, the Nuclear Energy Institute (NEI), argued on its website that new reactors could be built for as little as $2,000/kw with costs eventually going down to $1,500/kw, or about $2.5-3.5 billion per reactor. At those costs, nuclear power could be economically competitive with other energy sources.

But before the first concrete for a new reactor could be poured, those estimates became obsolete. As utilities seriously pondered new reactor construction in preparation for license applications to the Nuclear Regulatory Commission, they realized costs would be much higher than the NEI had promised. The first reactor applications, filed in 2007, projected construction costs in the $3.5-5 billion range, or about $2,500-3,500/kw, making their economic competitiveness suspect.

Real-world experience and more recent, more objective projections cast serious doubt on even those estimates. Areva, a French firm considered the most experienced and accomplished reactor manufacturer in the world, is currently building a new EPR (European Pressurized Reactor) in Finland. Construction began in April 2005 under a fixed-price contract of 3 billion euros (about $4.7 billion). Because of a variety of problems, including a faulty foundation and failure to properly manage vendors, as of November 2007 the project is two years behind schedule and 50% over budget, meaning the cost has already risen to 4.5 billion euros, or nearly $7 billion: well over $4,000/kw for a 1600 MW reactor.

Meanwhile, a major Wall Street firm, Moody’s Investors Service, released a report in October 2007 projecting costs for completed U.S. reactors to range even higher: from $5,000 to $6,000/kw. For a 1600 MW EPR, the design eight U.S. utilities are said to be interested in, that works out to nearly $8 billion to $10 billion for a single reactor.
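The per-kilowatt figures quoted throughout this section convert to total reactor costs by simple multiplication. A minimal sketch of that arithmetic (the 1600 MW EPR capacity and the dollar figures come from the text above):

```python
# A reactor's total construction cost is its per-kilowatt cost
# times its capacity in kilowatts.

def total_cost_billion(cost_per_kw, capacity_mw):
    """Total construction cost in billions of dollars."""
    return cost_per_kw * capacity_mw * 1000 / 1e9

epr_mw = 1600  # capacity of the EPR design discussed above

# NEI's 2006 claims vs. Moody's October 2007 projection
for label, per_kw in [("NEI low", 1500), ("NEI high", 2000),
                      ("Moody's low", 5000), ("Moody's high", 6000)]:
    print(f"{label}: ${total_cost_billion(per_kw, epr_mw):.1f} billion")
```

At Moody’s figures this yields $8.0 to $9.6 billion for a single EPR, matching the range cited above.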

At those costs, no one would seriously consider building a new reactor with their own money. Indeed, the utilities have made clear they don’t intend to, and Wall Street bankers have said they will not lend money to utilities for new reactors without a guarantee of repayment. Wall Street has a long memory: it remembers the default on more than $2 billion in bonds by the Washington Public Power Supply System in the early 1980s, the bankruptcy of Public Service of New Hampshire, and the near-wipeout of the Long Island Lighting Co., which spent more than $4 billion building the Shoreham reactor only to see it shut down without ever generating a single watt of electricity, amid understandable concerns that Long Island could not be evacuated in the event of an accident.

Instead, unlike the first generation of reactors, which was built with private funds, the nuclear industry is now turning to the taxpayer: it is seeking $50 billion or more in taxpayer-backed loan guarantees to build a new generation of atomic reactors. This would place all of the risk of new reactor construction on taxpayers, while the rewards of successful construction projects would go to the utilities. What is clear is that if the nuclear industry does not get taxpayer backing, new reactors won’t be built.

As the Associated Press reported on November 29, 2007: “Constellation Energy Group Inc. will not break ground on a new nuclear plant in Maryland next year unless a federal loan-guarantee program is in place, an executive from the power company said Wednesday. ‘If the loan-guarantee program is able to materialize in early 2008 so we are able to secure loan guarantees for construction of a new plant at the end of 2008,’ the company’s board could move forward, said Michael Wallace, president of Constellation Energy Generation Group. ‘If it doesn’t, we won’t.’”

At this writing, the loan guarantee issue has not been settled by Congress. Nuclear advocates are promoting the proposed program; a coalition of taxpayer, consumer and environmental organizations is working to ensure that taxpayer money is not used on such risky ventures.

Other Subsidies

The federal government has taken a number of other steps over the years to try to make nuclear power economically viable. One of the first, enacted by Congress in 1957, was the Price-Anderson Act. It was designed to ease the fears of insurers, who refused to cover such a dangerous technology, one capable of catastrophic damage. If nuclear utilities had to purchase insurance covering the full potential damages of a reactor accident, there would be no reactors: such insurance was simply unavailable, and had it been, the premiums would have been prohibitive. A 1982 study by Sandia National Laboratories, for example, predicted that an accident at the Indian Point facility, about 35 miles from New York City, could cause upwards of $300 billion in damages, while an accident at one of the Salem reactors in New Jersey could kill more than 100,000 people.

Twenty-five years later, those figures would only be higher. The 1986 accident at the Chornobyl reactor in Ukraine, for example, has itself been estimated to have caused more than $300 billion in damages, and that accident took place in a relatively rural area of a country much poorer than the U.S.: at the time, average Ukrainian wages were about $100 per month.

Since 1957, the Price-Anderson Act has been extended seven times; the Energy Policy Act of 2005 extended it until 2025. The Act requires utilities, as a group, to purchase $200 million in private insurance. Once those funds ran out, each reactor owner would be assessed about $15 million per reactor per year for up to seven years, creating a second tier of roughly $10 billion across the country’s roughly 104 operating reactors. Taxpayers would have to step in to pay damages above that level.
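The liability tiers described above amount to straightforward arithmetic. A sketch, assuming a fleet of about 104 operating U.S. reactors (the fleet size as of 2007; the Act’s exact assessment caps have varied by amendment, so treat these as the article’s round numbers):

```python
# Sketch of the Price-Anderson liability tiers described above.

PRIVATE_INSURANCE = 200e6   # pooled private insurance, dollars
ANNUAL_ASSESSMENT = 15e6    # retrospective premium per reactor per year
YEARS = 7                   # maximum assessment period
REACTORS = 104              # assumed size of the U.S. fleet (2007)

second_tier = ANNUAL_ASSESSMENT * YEARS * REACTORS
total_industry_liability = PRIVATE_INSURANCE + second_tier

print(f"Second tier: ${second_tier / 1e9:.1f} billion")            # ~$10.9 billion
print(f"Industry total: ${total_industry_liability / 1e9:.1f} billion")
```

Even the full industry pool of roughly $11 billion is dwarfed by the Sandia study’s $300 billion damage estimate; everything beyond it would fall to taxpayers.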

The nuclear industry has also benefited from federal largesse in research and development of nuclear technology. Nuclear power (fission and fusion) has received about 60% of all energy research and development funding over the past 50 years; renewable energy technologies, on the other hand, have received less than 15% of federal R&D funds (most of the remainder has gone to fossil fuels like coal and oil).

Other Nuclear Costs

Besides reactor construction, there are additional and significant costs associated with nuclear power.

Operating and maintenance costs: Here, nuclear power currently is competitive with other energy sources, as long as uranium remains relatively inexpensive and a utility can avoid the large capital costs of major reactor repairs. Some reactors, however, have closed rather than face such costs, including Trojan in Oregon, Zion in Illinois, Maine Yankee and Connecticut Yankee. As reactors age, the likelihood increases that more will close early rather than pay for major component replacements. Meanwhile, the price of uranium has skyrocketed over the past three years (partly due to the flooding of a major uranium mine in Canada), and some researchers maintain that the world’s easily-accessible supplies of uranium are rapidly diminishing, raising the prospect that nuclear power’s fuel will become significantly more expensive within the next two or three decades. That would undercut even nuclear’s advantage in everyday operational costs (which typically don’t include construction costs).

Nuclear Fuel Chain: Rarely added to the costs of operating a nuclear reactor are the costs of the front end of the nuclear fuel chain: uranium mining, milling, processing, enrichment, and fuel fabrication. While it could be argued that these costs are reflected in the price utilities pay for uranium, the reality is that contamination has occurred at each stage of the fuel chain, and the associated clean-up and decommissioning costs are unknown and potentially a huge liability, one more likely to be borne by taxpayers than by the nuclear industry.

Decommissioning: While utilities are required to set up funds to finance the expected decommissioning costs of nuclear reactors, the reality is that such costs currently are unknown and the technology to actually take apart large commercial reactors does not yet exist. The Shoreham reactor, which never generated electricity or operated at full power, cost $400 million to decommission. Large, highly radioactive reactors could end up costing as much to decommission as they cost to build in the first place.

Radioactive Waste: High-level radioactive waste currently is slated to go to Yucca Mountain, Nevada for permanent storage. But that program is highly controversial, the site has been demonstrated to be scientifically unsuitable for waste storage, and the odds of Yucca Mountain ever opening are probably less than 50-50. Even if it did open eventually, costs for the Yucca Mountain program and related radioactive waste transportation are estimated at more than $60 billion. If a new search for a permanent waste site is required, costs could rise even higher. Ratepayers of nuclear utilities pay one-tenth of a cent per kilowatt-hour into a Nuclear Waste Fund, which is supposed to cover the costs of radioactive waste disposal, but the Fund is unlikely to raise enough money to cover even half of the projected disposal costs. These costs are rarely reflected in the nuclear industry’s stated costs for nuclear-generated electricity.
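To get a feel for what the waste fee raises, here is a rough sketch; the 1000 MW capacity and 90% capacity factor are illustrative assumptions, not figures from the text:

```python
# Annual Nuclear Waste Fund contribution for one hypothetical reactor,
# at the statutory fee of one tenth of a cent per kilowatt-hour.

FEE_PER_KWH = 0.001       # dollars (0.1 cents)
capacity_mw = 1000        # assumed reactor capacity
capacity_factor = 0.90    # assumed fraction of the year at full output
hours_per_year = 8760

kwh_per_year = capacity_mw * 1000 * hours_per_year * capacity_factor
annual_fee = kwh_per_year * FEE_PER_KWH
print(f"Annual contribution: ${annual_fee / 1e6:.1f} million")  # ~$7.9 million
```

At roughly $8 million per reactor per year, a fleet of about 100 reactors would contribute on the order of $800 million annually, which even over several decades of operation falls well short of the $60 billion-plus program cost cited above.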

Nuclear power’s poor economics were the major factor in the industry’s demise during the 20th century. Without having solved its economic problems (nor, for that matter, its safety, waste, proliferation or security problems), nuclear power’s future hinges on whether the industry can tap taxpayer wallets and require everyday Americans to assume the financial risks that neither Wall Street nor the utilities themselves are willing to accept.

Michael Mariotte and Aja Binette

Nuclear Information and Resource Service

6930 Carroll Avenue, #340

Takoma Park, MD 20912

301.270.6477; nirsnet@nirs.org; www.nirs.org

2008
