Hacker News

> Except the construction wasn't at its peak in 1982. Construction was at its peak during the early 1970s, and lurched to a halt after 3 mile island and nuclear panic took hold.

https://www.worldnuclearreport.org/reactors.html#tab=iso;reg...

>So $1.7 billion per GW. This is an exceptionally good price for a system of generation that is geographically independent, is non-intermittent, and is energy dense (and so does not have to involve long transmission lines moving electricity from solar fields and wind farms to cities).

You're really stretching here. That's a single pilot plant, in an industry with a massive negative learning rate, built without necessary safety features; it is the all-time outlier. I had to go out of my way to find it, and it is not the same metric you're judging renewables on. You've picked the single ripest possible cherry. It was also the first turnkey plant, so it being cheap directly contradicts your hypothesis.

> So solar panels' cost has to include all the military and communications satellites that pioneered solar panel tech? Most renewable systems also use electronic computers to some degree. This technology was originally pioneered for military encryption and firing computers. You could apply this kind of broken logic to anything. Military and civilian reactor designs are vastly different: the latter are usually mobile, use highly enriched uranium, and are relatively small.

If the PV on Jim Doe's roof were required to power the satellite, and the government had sold the polysilicon, sent experts to Jinko to help design the manufacturing facility, and provided the funding, then yeah.

> Except the construction wasn't at its peak in 1982. Construction was at its peak during the early 1970s, and lurched to a halt after 3 mile island and nuclear panic took hold.

https://www.worldnuclearreport.org/reactors.html#tab=iso;reg...

You appear to be struggling with the difference between start and finish. The largest capacity of plants ever finished in the US was in '82. The Arkansas plant I picked as an example was the last one finished before TMI, and it was wholly consistent with $6/W (or higher, including the cost of finance) and a negative learning rate since Palisades.

> Do you not see this big cluster of cheap plants built before 3 mile island and then plants got a lot more expensive afterward? I'll draw this in MS paint to make it easier for you: https://i.imgur.com/VD34Zhi.jpeg

I've pointed out a primary source which contradicts the numbers that graph is based on and posited a causal mechanism for the disparity. Refute the primary source, demonstrate that my understanding of their use of the term 'nominal dollars' is wrong, or find another primary source (or the primary source the paper uses).



> You're really stretching here. That's a single pilot plant in an industry with a massive negative learning rate without necessary safety features which is the all-time outlier. I had to go out of my way to find it, and it is not the same metric as you're judging renewables on.

If I were cherry picking I could pick even cheaper plants. Zion 1 and 2 were built for less, as were Oconee 1 and 2.

> You appear to be struggling with the difference between start and finish. The largest capacity of plants ever finished in the US was '82.

Most of which were delayed after the three mile island incident, and correspondingly experienced greater costs. Sure, if you want to get pedantic, the number of plants under construction at any one time peaked just after three mile island. But that's because so many plants were delayed, and this led to higher costs.

> I've pointed out a primary source which contradicts the numbers that graph is based on and posited a causal mechanism for the disparity. Refute the primary source, demonstrate that my understanding of their use of the term 'nominal dollars' is wrong, or find another primary source (or the primary source the paper uses).

I'm looking over the OSTI report and calculating the inflation-adjusted numbers line by line. They match the costs listed in my source. It doesn't look like there's anything to refute: both of our sources show that nuclear plants built during the nuclear boom were some of the cheapest forms of decarbonized energy there are.

I don't have anything to refute, because your source agrees with my point. Your own source's data reinforces the claim that nuclear plants built during the nuclear boom (started after 1965 and completed before three mile island) were often delivered at between 1 and 2 billion dollars (2010-adjusted) per GW of capacity, and some for even less than 1 billion.


Zion 1 and 2 were $276 million each at 58% CF, or between $3 and $4.2 per net watt. Better than the last plant to open before TMI, which supports a negative learning rate.
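The per-net-watt arithmetic here can be sketched out. The nameplate capacity (~1,040 MW per Zion unit) and the early-1970s-to-2022 CPI multiplier (~6.7x) are my assumptions, not figures from this thread:

```python
# Hypothetical sketch of the $/net-watt arithmetic for one Zion unit.
# Assumed (not from the thread): ~1,040 MW nameplate, ~6.7x CPI
# multiplier from early-1970s dollars to 2022 dollars.
overnight_cost = 276e6            # dollars per unit, as built
cpi_multiplier = 6.7              # assumed inflation factor
nameplate_watts = 1.04e9          # assumed nameplate capacity
capacity_factor = 0.58            # pre-TMI lifetime CF cited above

cost_2022 = overnight_cost * cpi_multiplier
per_nameplate_watt = cost_2022 / nameplate_watts      # ~$1.78/W
per_net_watt = per_nameplate_watt / capacity_factor   # ~$3.07/W
print(f"${per_net_watt:.2f} per net watt")
```

Varying the assumed inflation multiplier and nameplate capacity moves the result across the $3 to $4.2 range quoted above.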

And again: this doesn't include safety retrofits, and it doesn't include O&M, which is higher than that of new renewables.

Even after retrofit, it was destroyed due to a design and management failure in 1998.

All of those early plants were more expensive than you are saying, and they had state-controlled funding. They were inefficient, and they were unsafe when they opened.

Additionally, they all had abysmal capacity factors in the 70s and 80s, around the 50-60% range, so using lifetime CF is incredibly biased towards making them look good.

The cost of retrofits, which were almost entirely unrelated to TMI, was about 40c/watt https://www.sciencedirect.com/science/article/abs/pii/036054... or about 80c per net watt, just to meet 1980s standards.

Include all the failed reactors, stop looking at just the lowest-cost ones, include the cost of the free loans, and you're back up around $6/W.


If we're counting capacity factor, then the cost of solar and wind increases by ~4x, since they have capacity factors of ~25%, a lot less than nuclear's typical ~90% capacity factor [1]. Oconee's capacity factor is 81% over its life and 97% in a typical year. It's actually the opposite: focusing on lifetime capacity factor makes most nuclear plants look worse than in a typical year.
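The capacity-factor adjustment both sides are invoking is just division; a minimal sketch, using the rough figures from this thread rather than any authoritative numbers:

```python
def cost_per_net_watt(cost_per_nameplate_watt: float, cf: float) -> float:
    """Effective cost per watt of *average delivered* power."""
    return cost_per_nameplate_watt / cf

# At ~25% CF, solar/wind effective cost is ~4x the nameplate cost;
# at ~90% CF, nuclear's is only ~1.1x.
solar_factor = cost_per_net_watt(1.0, 0.25)    # 4.0
nuclear_factor = cost_per_net_watt(1.0, 0.90)  # ~1.11
print(solar_factor, round(nuclear_factor, 2))
```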

For all their supposed lack of safety, nuclear power plants, including these early and supposedly unsafe designs, are safer than most renewables [2]. There's an immense double standard between renewable safety (nobody seems to care about the tens of thousands of people killed by dams) and nuclear power.

1. https://www.energy.gov/ne/articles/what-generation-capacity#....

2. https://www.statista.com/statistics/494425/death-rate-worldw...


Lifetime capacity factors up to TMI are fair for a proposal to build what was built before TMI. Including reliability improvements deployed over cumulative decades of downtime, at costs of billions per reactor, isn't comparing the thing that was purchased before TMI.

Of course renewables should be capacity weighted. No one is saying they shouldn't. Capacity-weighted new solar in Germany is about $3.80/W, and new onshore wind is about $3/W. These are both dropping 10-20% YoY. New 4-hour battery is around $2/W. The up-front cost is about the same, but the operating costs of an NPP exceed what many wind and solar projects are able to bid. Even if we assume the unrealistically short construction times of the 70s for a new Gen III+ reactor, the extra 6 years of operation will have the solar park half paid off by the time it opens.

Those early designs were safe enough to mostly keep operating thanks to exorbitantly expensive upgrades. This is an engineering feat, and a testament to the care and excellence of the US NRC, but it came at a cost which you are trying to pretend does not need paying. Gen III+ reactors are far more complex and so cost more, on top of the additional costs incurred by not operating in the unique environment of the 60s.


> Capacity weighted new solar in germany is about $3.80/W or new onshore wind is about $3/W.

This alone is more expensive than nuclear power built during the nuclear boom.

> New 4 hour battery is around $2/W.

So 12 hours of battery, which is a minimum estimate of what we'll need, is $6/W. Also, this price is rising: https://www.utilitydive.com/news/battery-prices-to-rise-for-...

Combined, these sources make for $9-10 per watt. Furthermore, they have lifespans far shorter than nuclear power's, meaning they'll have to be replaced more frequently. By comparison, your own source found that nuclear was built for $2-3 per watt during the nuclear boom. Again: your own sources contradict you.
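The $9-10/W total follows from the figures quoted upthread; a sketch, assuming the $2/W battery figure is for a 4-hour unit and that cost scales linearly with hours (the 12-hour requirement is this commenter's asserted minimum, not an established figure):

```python
# Storage cost assumed to scale linearly with hours of capacity.
battery_cost_4h = 2.0                          # $/W, figure from upthread
hours_needed = 12                              # asserted minimum
storage = battery_cost_4h * hours_needed / 4   # $6/W

solar_total = 3.80 + storage   # German solar figure + storage
wind_total = 3.00 + storage    # German onshore wind + storage
print(wind_total, solar_total) # spans roughly $9 to $9.8 per watt
```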


You're cherry picking the data I cherry picked to help you, again. P919 says the average cost was $589/kW in 1983 dollars, with $120/kW of non-TMI retrofits, and costs rose with time rather than going down. In the hypothetical where this is in some way related to something that could happen now, this is $3600/kW vs a renewable blend in Germany at $3400/kW. If we take the last few from each manufacturer that opened before 1979 and don't add retrofit costs, it's about the same. Your argument about positive learning rates doesn't fit the data even slightly.

Your absolute best argument, if I shuffle the goalposts all the way along for you and ignore the guaranteed money, the abandoned plants, the shutdowns that occurred under a decade after opening (all of which were paid for on the public dime), the military and government involvement, and the lack of liability, is that undoing 37 years of safety and efficiency improvements, and reproducing reactor designs with similar capacity factors to wind and a much higher correlated forced-outage rate than a renewable blend sans storage, will let you come in at only 7% over the cost, and only 4-6 years later?

Then even after all that, operating it for two decades will cost more than the total cost of the renewable system.

All this in a country with mediocre wind and worse solar resource than Alberta, Canada. This is your argument?


The "TMI retrofits" you keep referring to (yet never back up with a source) are likely not necessary: 3 mile island's secondary containment worked and prevented any significant radiation release.

The estimates you're giving for renewables exclude the cost of storage, or use fanciful figures of 4 hours' worth of storage, and also exclude the costs of transmission and load shifting.

> In the hypothetical where this is in some way related to somethint that could happen now this is $3600/kW vs a renewable blend in germany of $3400/kW. If we take the last few from each manufacturer that opened before 1979 and don't add retrofit costs it's about the same.

No, it doesn't. It comes out to $1600/kW. The average capacity factor of nuclear power is over 90%, not the 50% you claimed earlier. And again, your "renewable blend" omits the cost of storage, which will be immense, if we're even able to build storage at the required scale at all.


The retrofits are reliability and safety upgrades excluding those that happened as a result of TMI. See p920 in the Phung paper I linked. This adds about $120/kW, or $200/kW net, in 1983 dollars.

You don't get to use the price excluding 40 years of reliability and safety upgrades since TMI, in one of the strictest nuclear regulatory regimes, with tens of billions of tax money spent on the public share of enforcement, and the performance including those upgrades. A Ford Pinto isn't a 2022 Lamborghini.

The prices are pre-TMI costs. The 58% is the pre-TMI lifetime capacity factor. If you want to use 92%, then find and source the cost of retrofits and interruptions between 1979 and 2022, as well as the cost of replacing all the plants that closed early and the cost of abandoned plants.

At a 58% capacity factor with ~20% forced-outage rates, you are going to have many long, correlated outages. The renewable blend isn't as reliable as the modern fleet, but the 1979 fleet needs more storage, more backup, and more transmission to distribute the overprovision to where it is needed.

If you don't want to prevent more TMI incidents by adding all the stuff that happened after, you're also going to have to throw in a billion every 20 years or so to pay for cleanup and replacing the lost generation capacity.

Also keep in mind that on the list of reactor prices from 1968 to 1979, the prices went up the entire time. Your learning rate is negative even in the nuclear boom. This alone is enough to disprove your assertion that the cost difference is due to reduced construction.


> $200/kW net in 1983 dollars.

So, it adds less than 200 million dollars per GW of capacity. This is not large. It's also misleading to portray retrofitting plants as an expense that new nuclear builds would require: it's a lot harder and more expensive to do a retrofit than to incorporate these changes in the initial construction.

And again, three mile island was contained. Secondary containment worked. If these retrofits are expensive, they were evidently not necessary.

> At 58% capacity factor with ~20% forced outage rates you are going to have many, long, correlated outages.

Good thing nuclear power averages a capacity factor of over 90%! By comparison, what's the capacity factor for solar and wind? Around 25%, depending on geography.

> Also keep in mind that on the list of reactor prices from 1968 to 1979, the prices went up the entire time. Your learning rate is negative even in the Nuclear boom.

And again, you skew the timelines to fit your narrative. The nuclear boom started in 1965, and saw large price decreases relative to plants starting construction before then. Those are the price drops brought about by the nuclear boom. Reactors continued to be cheap until three mile island occurred, and reactors that had started construction in the mid 70s had to deal with a whole ton of nuclear obstructionism. That's why prices increased among reactors that started construction in the mid 1970s and later. Restrict the query to reactors completed before three mile island and there's no increase in cost. The chart I posted handily put those in a different color, but this is apparently not obvious enough for you.


> So, it adds less than 200 million dollars per GW of capacity. This is not large. It's also misleading to portray retrofitting plants as an expense that new nuclear builds would require: it's a lot harder and more expensive to do a retrofit than to incorporate these changes in the initial construction.

$200 in 1983 dollars is about $600 in 2022. So why are you claiming that retrofits are the reason the plants finished in the 80s and 90s were more expensive? A much cheaper version of a quarter of an insignificant amount can't double the price. There must be a different explanation, like a negative learning rate.
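The 1983-to-2022 inflation step, as I'm computing it (the ~3x CPI multiplier is my approximation, not a figure from either source):

```python
# Rough CPI adjustment of the retrofit figure from the Phung paper.
retrofit_1983 = 200.0        # $/kW net, per the figure cited above
cpi_1983_to_2022 = 3.0       # assumed multiplier, roughly the CPI ratio
retrofit_2022 = retrofit_1983 * cpi_1983_to_2022
print(retrofit_2022)         # ~$600/kW in 2022 dollars
```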

> And again, three mile island was contained. Secondary containment worked. If these retrofits are expensive, they were evidently not necessary.

It still wiped out the plant and cost billions to clean up. And these are all changes related to the countless minor incidents, like fires and pipe bursts, *before* TMI. The changes due to TMI before 1985 were not counted and were smaller. 40% of these modifications were initiated by the utilities themselves to improve reliability and weren't even due to regulation. This was the cost of bringing the early, extremely simple reactors to the same safety and reliability standards as TMI.

> Good thing nuclear power averages a capacity factor of over 90%! By comparison what's the capacity factor for solar and wind? 25% depending on geography.

Not with the machines you are saying to build. If you build a machine with a third of the material, a quarter of the regulation, a sixth of the labour, many fewer redundancies, and less QA, you get one that performs like it. More regulation, more expensive designs, and another 40 years of upgrades improved the reliability.

The nuclear industry learnt by doing, and what it learnt is that if you half-ass things, your reactor catches fire or starts leaking and needs repairs, like Browns Ferry, Rancho Seco, or any of the N4 reactors in France, which were built and operated at similarly slapdash rates.

> Restrict the query to reactors completed before three mile island and there's no increase in cost. The chart I posted handily put those in a different color, but this is apparently not obvious enough for you.

In the primary source from the DOE, the prices increase. Same with the Phung paper. The retrofit costs I included are precisely and only the ones unrelated to TMI. Excluding a small fraction of FOAK plants, the plants that opened from 1969 to 1979 went up in price the entire time. Every year they got bigger and more complex, and added more redundancies, because that was what was needed to keep them running more than half the time.

Every year the very first commercial plants were running, more problems were found that needed monkey patching and required adding complexity to subsequent and in-progress designs. This necessarily can't have started before the first plant of each manufacturer was running.


The price per gigawatt dropped precipitously for plants with construction starting in 1965, and remained flat until three mile island. They got bigger and more complex, but produced more power, and the cost per watt was the same. Only after three mile island did costs start to balloon. Costs were almost entirely under $2 billion per GW until three mile island. Your sources do not contradict this; I have checked.


This paper

https://www.sciencedirect.com/science/article/pii/S030142151...

Which you keep citing parts of, or citing directly, shows costs increasing 23% year on year after the opening of the first large commercial reactor. That increase then slowed after TMI, according to that paper. The only cost decreases were demo reactors and small turnkey reactors, most of which were shut down not long after. The countries where costs did not escalate all had far worse reliability than the US program after the 80s. You get what you pay for.

In 2022 dollars, the overnight cost for reactors coming online just prior to TMI is over $3000/kW, for capacity factors that didn't exceed 58% until many more upgrades and repairs had been made over the course of years or decades.

Here is a simple model, in the Arctic, of a 100 MW renewable mix with a higher capacity factor than those plants, at an all-in cost lower than the overnight cost in your fantasy scenario. It uses a capacity factor about 2% lower than the median for new wind, and a solar panel angle 20 degrees off optimal for the latitude.

https://model.energy/?results=fae43ac72e2df9c13526b6989d63c0...

Selling the power that system makes, at just the O&M cost of the cheapest US reactors today, for 20 years would pay for another one.

Renewables in reality are vastly superior to your fantasy nuclear reactor.


You're mixing up construction start dates and end dates. It's the color that distinguishes which plants were completed before and after Three Mile Island. The red is before. The brown is after.

https://ars.els-cdn.com/content/image/1-s2.0-S03014215163001...

Costs exploded after three mile island; your idea that cost increases slowed afterward is the complete opposite of what happened. Can you really not see how the brown data points shoot up?

> for capacity factors that didn't exceed 58% until many more upgrades and repairs had been made over the course of years or decades.

Are you going to keep cherry picking one plant? Nuclear's average capacity factor is over 90%: https://www.energy.gov/ne/articles/what-generation-capacity

I looked into your claim that earlier plants had lower capacity factors. This is true for demonstration plants in the 50s and early 60s, but not for the 800+ MW production facilities built during the nuclear boom.


> You're mixing up construction start dates and end dates. It's the color that distinguishes which plants were completed before and after Three Mile Island. The red is before. The brown is after.

I am, and always have been, talking only about the bright red dots. I'm indulging your fantasy and demonstrating that its result is the opposite of what you claim. Yet again: the first large commercial reactor that didn't shut down after a handful of years was San Onofre, opening in 1968. This is the start point.

Every year in which nuclear reactors were being operated commercially, costs increased due to the discovery of the ways in which it's really hard. Reactors which were under construction after 1968 and completed before 1979 got more expensive every year, at a rate of 23%, as per your own source. Your red cluster is an increasing function of time. The growth rate (as in the ratio of costs from year to year) actually slowed after TMI. This is the conclusion of the paper this image is from.

The top end of the line represents reactors started just before TMI: the reactors you claim are the cheapest of all time. When adjusted to 2022 dollars, they cost over $3000/kW.

92% is the capacity factor for US plants after decades of operation, including the costs incurred by TMI and Chernobyl and Fukushima. It also includes survivorship bias, as the reactors which were destroyed, or were too unreliable and shut down early, are excluded. The world average EAF according to the IAEA is 79%. You want a cut-rate nuclear program, you get the performance of a cut-rate nuclear program. France and South Korea are barely better than the early US program. Japan was much worse. The only outlier with a large sample where reports even approach reality is China, and the prices China reports for every major project are a tiny fraction of what anyone else reports.

The Phung paper has a list of plants and capacity factors up until 1985. The average is 58%, and many are far below. Every reactor that wasn't cancelled or closed, and didn't have an accident which destroyed an integral part of it, has had substantial upgrades and repairs since then.

If we were being honest, then of the 16 reactors finished between 1976 and 1979, we'd include in the accounting the price of the 3 that failed or were closed decades early, as well as their cleanup and decommissioning costs, but I'm letting you have that one, along with all the other unaccounted subsidies.



