Ben, Sam, and Alex examine why nuclear energy shifted from a cheap postwar dream to a costly and misunderstood industry.
They highlight the success of the French model and challenge common fears about radiation and meltdowns.
Moving past today's regulatory hurdles, they argue, is essential for building a future powered by clean and reliable energy.
Key takeaways
- Nuclear power was originally intended to be the cheapest energy source because its fuel costs are nearly negligible, representing only 2% of a plant's operating expenses.
- Modern nuclear construction is approximately 30 times more expensive and far slower than in the 1950s, when plants could be built in just four years.
- Modern radiation safety standards are based on a model of cumulative damage that may be less accurate than guidelines from the 1930s.
- Incidents like Three Mile Island show how perceived secrecy and lack of transparency can damage an industry's reputation more than the physical impact of the accident itself.
- Public support for nuclear power tends to decline because projects become slow and expensive rather than as a direct reaction to nuclear disasters.
- The ALARA principle effectively caps nuclear profits because operators are often required to spend any excess revenue on additional safety mitigations.
- Regulators sometimes move safety baselines not because the science has changed, but simply because they believe the industry can afford the new standard.
- Nuclear energy remains one of the safest forms of power, as even major accidents like Fukushima resulted in no direct deaths from radiation.
- Modern containment structures are designed to keep radioactive material trapped even if the core melts, making the external health risk comparable to a medical CT scan.
- France secured public support for nuclear energy by ensuring that tax revenues from power plants remained within the local community rather than going to the central government.
- The centralized political structure of the French Fifth Republic allowed for rapid nuclear expansion by removing parliamentary oversight and public hearings from the licensing process.
- Improving the management and maintenance of existing nuclear plants can yield performance gains equivalent to building entirely new reactors.
- The effectiveness of a state organization is determined by its management and incentives rather than its ownership.
- Hinkley Point C became extremely expensive due to a highly complex reactor design and UK regulatory demands, such as a backup analog control system that took 12 years to design.
- Nuclear power does not need a technological breakthrough to be cheap. It requires a policy breakthrough to return to the regulatory environment of fifty years ago.
- Solar power becomes increasingly expensive as it supplies more than 60 percent of the grid because of the high costs of long-term storage and transmission.
- Big tech companies are becoming the primary advocates for nuclear power because their massive energy needs concentrate the financial benefits of cheap electricity.
- Small modular reactors could solve the nuclear cost problem by moving construction from expensive custom sites to efficient factory assembly lines.
- Evidence from historical nuclear events and studies suggests that the health risks of radiation are often significantly exaggerated and subject to threshold effects.
- Nuclear meltdowns should be viewed as an acceptable business risk provided that proper safety and response protocols are established.
The stagnation and potential revival of nuclear power
Sizewell B is the only pressurized water nuclear reactor in Britain. The turbine hall is a massive space that resembles a cathedral of energy. It houses two enormous turbines that look like jet engines encased in steel. These machines spin at 3,000 rpm to generate power at a frequency of 50 Hz. Although this facility is a marvel of engineering, it was finished 30 years ago. Britain has not completed another nuclear reactor since its construction.
When you go into the turbine hall, it's like a cathedral to energy. It has two gigantic turbines which are like giant jet engines encased in steel. You can go up and touch the steel and you can feel the turbine spinning at 3,000 rpm.
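As a sanity check, the two figures in that description are consistent: assuming a standard two-pole synchronous generator (the pole count is not stated in the episode), grid frequency is simply rotational speed divided by 60:

$$f = \frac{3000\ \text{rpm}}{60\ \text{s/min}} = 50\ \text{Hz}$$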
The halt in British nuclear development mirrors a global trend. Many nations built numerous reactors in the years following World War II. However, construction largely stopped during the 1980s and 1990s. There is now a renewed interest in nuclear power driven by the energy needs of data centers and the push for decarbonization. Efforts are underway to find ways to make nuclear energy cheaper and faster to build so the industry can enter a new period of growth.
The original promise of cheap nuclear power
Nuclear power is fundamentally similar to fossil fuels but significantly more energy dense. This density changes the economics of power production. While coal requires massive operations for mining, transport, and waste disposal, nuclear fuel costs are almost trivial. In fact, the fuel makes up only about 2% of the cost of running a nuclear power plant. Because nuclear plants essentially boil water to turn a turbine just like coal plants do, the removal of high fuel costs meant nuclear power should have cost half as much as coal.
You should have stuff that's just half the cost of a coal power plant. That's why the American utilities loved it in the 1950s when they started building them. We're going to be able to have electricity that's like half the price.
In the 1950s and 1960s, this promise was being met. Power plants were built in about four years at a cost of roughly $500 per kilowatt in modern money. However, modern projects have seen a massive divergence from this efficiency. A recent plant in Georgia cost $16,000 per kilowatt and took 27 years to plug in. This makes modern nuclear power roughly 30 times more expensive and significantly slower to build than it was seventy years ago. The original excitement for nuclear was not about it being green or carbon-free, but simply because it was the cheapest way to produce energy.
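The 30-fold figure follows directly from the two per-kilowatt costs quoted:

$$\frac{\$16{,}000/\text{kW}}{\$500/\text{kW}} = 32 \approx 30\times$$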
Why the nuclear energy narrative changed
The nuclear industry's initial enthusiasm began to fade in the late 1960s. Several factors caused this shift. Costs started rising because the industry overpromised and underdelivered: companies moved toward increasingly complex reactor designs while assuming that costs would scale smoothly with size and complexity. To win contracts in competitive markets, companies underbid. When they went over budget and behind schedule, public and bureaucratic trust eroded.
The Dungeness B project in the UK is a prime example of these failures. A consortium on the verge of bankruptcy won the bid. They promised to build a new type of reactor in just four years. The project became a disaster. The company went bust and much of the engineering work had to be redone. It finally opened in the 1980s, decades late. It only operated about 12 percent of the time.
They claimed they could build this thing in four years; no one had built one at full scale before. The company goes bust a few years in. Huge amounts of the engineering work end up having to be redone. Nuclear inspectors refused to sign off on the reactor. It finally, I think, having started in the mid-60s, gets switched on in the 1980s, and I think is running about 12% of the time.
The regulatory and social landscape also changed. The environmental movement grew quickly. Legal decisions in the US forced the industry to perform proper environmental assessments. The industry often dismissed concerns about nuclear waste. This dismissive attitude led to a significant public backlash.
Safety regulations became more conservative due to shifting scientific theories. The industry adopted the linear no-threshold hypothesis. This model suggests that radiation damage builds up in the body over time and assumes there is no safe level of exposure. Interestingly, some experts believe that radiation guidelines from 1934 were actually more accurate than the models used today.
If you look at our first international set of radiation guidelines from 1934, it was actually probably closer to the truth than the model we use today. It's a rare case of us going backwards.
The origins of anti-nuclear sentiment in the environmental movement
Environmentalist concerns in the 1960s were fueled by a mix of political and safety issues. Much of the early pushback came from anti-war and anti-imperialist groups who saw nuclear power as inseparable from nuclear weaponry. In some early designs, reactors were actually used to produce plutonium for bombs. This overlap between civilian energy and military interests made it difficult for the public to trust the technology.
The reactors were being used to produce plutonium in some cases. By the 1960s, everyone had worked out other routes to weapons material, so no one was really using them for that. But the original wave of nuclear reactors built in Britain, anyway, were dual-purpose.
Public skepticism was also driven by the fact that the same organization, the Atomic Energy Commission, regulated both power plants and bomb tests. Several high-profile failures during the 1950s justified this fear. Reports surfaced of mysterious sheep deaths in Colorado and radioactive strontium appearing in milk supplies. The most significant disaster was the Castle Bravo test in 1954.
The wind speed changed unexpectedly, and the bomb generated much more fallout than expected. The US essentially rendered an entire island perpetually uninhabitable. Ash rained down on Rongelap, and children went out to play with it thinking it was snow, and then got quite ill.
The fallout from that test even reached a Japanese fishing vessel. When the contaminated fish was sold in Japan, it triggered a massive national panic. These recurring safety failures and the direct link to the military meant that early environmentalists viewed nuclear power as a genuine threat to public health.
The collapse of public trust in nuclear energy
Public anxiety regarding nuclear energy and the fear of meltdowns became significantly heightened during the 1970s. This shift was driven by a series of smaller accidents and growing concerns over nuclear waste. In the United Kingdom, early activism by Greenpeace exposed ships dumping waste at sea, which caused a massive public outcry and led to threats of boycotts by transport workers. The industry often viewed these concerns as fake issues, believing that waste could simply be reprocessed and reused in future reactors.
The tension between the industry and the public was exacerbated by a cultural gap. Many people working in the early nuclear sector had backgrounds in wartime nuclear programs where public opinion was never a factor. Alex notes that these experts expected the public to simply trust the enlightened engineer or technocrat. That expectation of blind trust eventually collapsed, because the industry did not realize that the era of the unquestioned expert was ending.
The idea that the public put their trust in the enlightened engineer or enlightened expert or enlightened technocrat just sort of dies a little bit of a death. But the technocrats don't really realize that it's happening to them.
High-profile incidents like the 1979 Three Mile Island accident in Pennsylvania further fueled the anti-nuclear movement. Although scientific evidence suggests the accident did not actually harm anyone, the initial secrecy and cover-ups by officials damaged long-term credibility. Similarly, the Chernobyl disaster in 1986 solidified fears about radiation spreading across borders, leaving neighboring countries feeling exposed to accidents occurring entirely outside their control.
The relationship between cost and nuclear power support
While many assume that public support for nuclear power declined due to high profile disasters, polling suggests a different timeline. The drop in popularity typically follows the realization that nuclear projects have become slow and expensive. When the public perceives nuclear energy as an inefficient power source, support begins to fade. Surprisingly, this shift in opinion often occurs well after the initial safety concerns arise.
I do not actually think that the disasters cause a massive fall off in popular opinion. The decline comes after nuclear power becomes slow and expensive and people decide it is a bad source of power.
The economic challenges facing the nuclear industry are not solely the result of modern safety regulations. A significant portion of the rising costs occurred before the implementation of ALARA standards. This indicates that the financial hurdles were already building within the industry before the regulatory landscape shifted.
The impact of the ALARA philosophy on nuclear costs
The nuclear industry follows a risk management philosophy known as ALARA, which stands for as low as reasonably achievable. In the United Kingdom, it is often called ALARP. This principle suggests that risks should be reduced until the cost of further reduction becomes disproportionate to the benefit. In practice, this often means that if a project has any profit left over, the operator is obligated to spend it on additional safety mitigations. This creates a situation where profits above the bare minimum are effectively illegal.
It essentially means that you get punished for good behavior. So any innovation that anyone else makes in safety suddenly becomes the new floor. Well, they had an extra backup system. Why don't you have an extra backup system?
This approach led to a massive increase in costs during the 1970s oil crises. As energy prices rose, potential profits for nuclear plants also increased. Regulators then required more mitigations because the industry could technically afford them. Once these regulations are established, they are rarely rolled back. Even today, regulators might lower safety objectives simply because they believe the industry can now achieve them easily, rather than because of new scientific evidence. This shifts the baseline constantly and makes it difficult to maintain stable costs.
Risk mitigation and the safety of nuclear power
Nuclear regulation tries to account for many different risks. While rules for background radiation exist, the levels are often so low they are irrelevant. Focus often shifts to extremely rare events, like a plane hitting a containment dome. These events have a likelihood of one in a million or even ten million reactor operating years. This level of risk modeling is so sensitive to small changes in data that the results often feel like nonsense. Because of this, nuclear companies usually follow regulator demands without challenging the underlying cost-benefit analysis.
One in a million or ten million reactor operating years — that's the kind of level of risk that we're trying to mitigate. But in practice, when you're trying to evaluate that level of risk, it's so sensitive to your inputs that it becomes a little bit of a nonsense. Which is why nuclear companies often don't actually try and challenge the cost-benefit analysis, because it all feels a little bit fake.
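To illustrate why these estimates feel so unstable, here is a toy version of the calculation in Python. Every number below is invented for illustration and comes from no real safety case; the point is only that when several rough guesses are multiplied together, modest uncertainty in each one swings the headline figure by orders of magnitude.

```python
# Toy probabilistic risk model: P(release) per reactor operating year
# as a product of independently estimated factors. All numbers are
# invented for illustration -- not from any real safety assessment.

def release_frequency(strike_rate, hit_fraction, breach_given_hit, release_given_breach):
    """Expected radiological releases per reactor operating year."""
    return strike_rate * hit_fraction * breach_given_hit * release_given_breach

# A "best estimate" set of inputs.
base = release_frequency(
    strike_rate=1e-4,          # large aircraft crashes near the site, per year
    hit_fraction=0.1,          # fraction of crashes that hit the containment
    breach_given_hit=0.05,     # probability the dome is breached
    release_given_breach=0.5,  # probability a breach releases material
)
print(f"base estimate: 1 in {1 / base:,.0f} reactor-years")

# Each input is itself uncertain. Vary every factor by just 3x in either
# direction and the bounds span several orders of magnitude.
low = release_frequency(1e-4 / 3, 0.1 / 3, 0.05 / 3, 0.5 / 3)
high = release_frequency(1e-4 * 3, 0.1 * 3, 0.05 * 3, min(0.5 * 3, 1.0))
print(f"range: 1 in {1 / high:,.0f} to 1 in {1 / low:,.0f} reactor-years")
```

With each factor allowed to vary by just 3x, the "one in four million reactor-years" best estimate spans from roughly one in 74,000 to one in 324 million, which is why contesting any individual input rarely seems worth the effort.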
Nuclear power is exceptionally safe compared to almost all other forms of energy. While solar power is also very safe, nuclear remains an extreme case of low health impact. Even when considering major accidents like Fukushima, the actual radiological effects on people were negligible. Alex argues that the standard ways we measure the health impacts of radiation might be incorrect. However, even using those standard measures, the safety of nuclear power stands out. Major disasters like Chernobyl were the result of disastrous management that has not been replicated in the West.
Compared to basically all of the other ones, nuclear power is an extreme. Even if you assume we're going to get a Fukushima every 40 years, according to the UN assessment, it didn't harm anyone through radiological effects.
The mechanics and misconceptions of nuclear meltdowns
Nuclear reactors typically use metal fuel rods and moderators to maintain a steady reaction. A meltdown occurs when the cooling system fails. Whether the coolant is light water, heavy water, or gas, its job is to keep temperatures stable. If the coolant escapes or cannot circulate, the fuel rods reach their melting point. Once the metal melts, it becomes extremely difficult to control as it burns through surrounding structures.
A meltdown is basically just the core, which is made of metal, melting. It gets really hot because it's a nuclear reaction that's melting it. You can't cool it. Like, it gets past a point where there's nothing you can do.
In the past, reactors like those in the UK in 1957 lacked modern containment domes. This led to disasters where radioactive iodine escaped into the atmosphere and contaminated the food supply, specifically milk. Chernobyl is another example where extreme testing pushed parameters to dangerous levels. Modern Western reactors are equipped with heavy containment domes designed to trap melting material and prevent its escape.
Many people mistakenly equate a nuclear meltdown with a nuclear bomb explosion. The actual danger is the uncontrolled release of radioactive material into the environment, not a massive blast. Standing at the edge of a modern nuclear site during a full meltdown would likely result in a radiation dose of about 10 millisieverts. This is roughly equivalent to receiving one full body CT scan, which generally has no measurable effect on long-term health.
Actually, a meltdown happening without something else really bad happening probably won't cause any harm in good Western modern reactors. But if you were doing it in the first ones we had, which didn't even have a containment dome, then that could lead to the UK's worst nuclear disaster.
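For context on the 10 millisievert figure, using commonly cited reference values (roughly 10 mSv for a full-body CT scan and about 2.5 mSv per year of natural background radiation):

$$\frac{10\ \text{mSv}}{2.5\ \text{mSv/yr}} = 4\ \text{years of ordinary background exposure}$$

In other words, a worst-case bystander dose at a modern plant is on the order of what everyone receives from their surroundings every few years.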
The decline of nuclear power in the 1980s
Following the Three Mile Island accident, the nuclear industry faced a major turning point. In the United States, utility companies had ordered many reactors under the assumption that oil and gas prices would stay high forever. However, as fossil fuel imports stabilized and nuclear projects took years longer than planned, the enthusiasm cooled. Around 120 orders were canceled as the reality of high costs set in. While some plants were eventually finished, the momentum of the 1970s disappeared.
In the United Kingdom, Margaret Thatcher entered office with a strong pro-nuclear stance. She wanted to move away from expensive, domestic reactor designs in favor of the pressurized water reactor used internationally. To build public support for this shift, the government launched a massive public inquiry for the Sizewell B plant. They hoped the strength of their technical case would humiliate environmentalists and other critics. Instead, the process became a disaster for the industry. The technocrats were unprepared and lacked necessary safety documentation. The cost data they provided was so unreliable that the chair of the inquiry admitted he did not believe it.
The government are like, great, this is going to be our opportunity to just defeat all of our foes and all of the environmentalists and all of the haters, we are going to humiliate them with the sheer strength of our case. They allow the inquiry to be defined along these really, really broad terms. And so the technocrats... don't prepare for the inquiry very well.
The inquiry spiraled into a year-long debate that covered everything from safety to the ethics of mining uranium in Namibia during the apartheid era. Faith in the industry was further damaged when the UK tried to privatize its energy sector in the late 1980s. Investors discovered that nuclear power was being cross-subsidized by revenues from coal and oil. When the true costs were revealed, the government had to pull nuclear power from the privatization deal at the last minute. This embarrassment effectively shelved the UK's plans for new reactors, as the public and investors realized nuclear energy was far more expensive than previously claimed.
How France built a massive nuclear program through centralization and local incentives
Following the 1973 oil crisis, France was left badly exposed because it imported almost all of its energy. Unlike the US or the UK, which had domestic oil resources and focused on heavy regulation, France prioritized energy security. It abandoned inefficient domestic designs and moved to light water reactors. During the 1970s and 1980s, the country built over 50 large reactors, each double the size of typical units at the time.
Essentially all decisions about nuclear regulation and licensing were taken away from Parliament. The public essentially had no say. You don't get the kind of hearings and inquiries you get in the UK in the 70s and 80s.
The centralized political system of the French Fifth Republic was a major advantage. It allowed nuclear licensing to happen in relative secrecy. Until 1979, there were no codified safety rules or requirements for environmental impact assessments. This allowed the program to continue at a fast pace even when disasters like Three Mile Island and Chernobyl slowed down nuclear progress in other countries.
Local support was maintained through a unique tax structure. In the UK, tax revenue from power plants goes to the central government. In France, these taxes stay with the local commune or town council. This makes small towns extremely wealthy. One example is Chooz, a small town with only a thousand people. Because of the local power plant, residents have not paid property taxes for 40 years and receive free satellite television from the local authorities.
If you make it so that the people who would be most likely to support those moves have an incentive not to, then it doesn't get off the ground. Environmentalism is amazing until you have a really strong incentive to believe otherwise.
These local benefits effectively neutralized the environmental movement. People were less likely to oppose a massive concrete power plant when it funded their playgrounds and infrastructure. Even politicians like Mitterrand, who originally opposed nuclear power, had to moderate their views because they could not find a large enough constituency of voters who were against it.
State versus private models in nuclear energy
There are two primary models for building nuclear energy infrastructure. Historical successes in South Korea, France, and Britain were state-led. Meanwhile, the United States and Japan relied on private utilities. Some believe the state-led approach is superior because it provides scale and certainty. A government can ensure a steady pipeline of projects and act as a reliable customer. This stability helps avoid the dysfunctional competition where companies underbid and then struggle to deliver.
The argument for a state-led approach to this is that it gives you certainty, it gives you scale, you know that the customer will still be there.
However, the state model is not a silver bullet. In Britain, many failures occurred while the industry was still nationalized. State-led projects today, like the Sizewell C plant or large transport projects, still face massive cost overruns. Government involvement does not automatically make construction cheaper or more efficient. Sometimes one part of the government simply ties the hands of another part through complex regulations.
Efficiency also suffered under nationalization. During the 1970s and 1980s, British plants had lower output than their American peers. Because these plants were not individual profit centers, managers had little incentive to optimize maintenance. When the management was eventually professionalized in the 1990s, the performance gains were significant. These improvements were equal to adding two entire new reactors to the grid without any new construction.
The importance of incentives in state management
The success of state-run entities depends largely on management quality and the design of incentives. There is a wide gap between a well run organization and a poorly run one. When the structure of incentives is broken, even large scale projects can fail. This was evident in the development of Hinkley Point C and Sizewell C, where the incentive structures were handled poorly.
You can have well run state organizations and you can have badly run ones. I think it is about incentives. With both Hinkley Point C and Sizewell C, we got the incentives sort of catastrophically badly wrong.
The challenges of building Hinkley Point C
Britain decided to return to nuclear power in the late 2000s to meet climate change targets. The government identified existing sites like Hinkley Point because they already had grid connections. EDF proposed using the European Pressurized Reactor, or EPR. This design was created after the Chernobyl disaster to meet extremely strict safety standards, featuring massive amounts of redundancy and complexity. Many nuclear engineers regard the design as an aberration because it is so convoluted.
The EPR was essentially a product of Chernobyl. They teamed up with the Germans to design the safest reactor ever. It has insane amounts of redundancy built into it, but it is an incredibly complicated, convoluted design.
The UK regulatory process made the project even more difficult. Regulators demanded many changes to the original French design. These changes required significantly more concrete and steel. They even required a backup analog control system. Designing that single system took 12 years. The financial structure of the deal gave the builders little incentive to push back against these costly requirements. As a result, Hinkley Point C is set to be one of the most expensive power stations ever built.
While costs have exploded in the UK, USA, and France, other nations have found ways to keep them low. South Korea builds nuclear plants at about one-sixth of the cost of the US. China also reports much lower costs and successfully built an EPR for a fraction of the British price. This suggests that the high cost of nuclear power in the West is a self-fulfilling prophecy caused by specific planning and regulatory choices rather than the technology itself.
The limitations of solar and the necessity of nuclear power
Many people now view nuclear power as slow and costly. Solar power seems much more promising because prices are dropping fast. It is already very effective in Africa and the American Southwest. However, relying entirely on solar has major drawbacks. Cost reductions are not guaranteed to continue forever. If we assume they will and they stop, we will be in trouble.
You can never be certain about anything like learning curves. Cost reductions are not predictive of future cost reductions. It would be a really bad idea to just assume that they will continue. If they don't, you're in trouble.
Geography also plays a huge role. Countries like Britain or the Nordic nations do not get enough consistent sunlight to rely on solar. Nuclear reactors also have a unique advantage in producing heat. They generate heat directly like fossil fuels do. Solar is less efficient at this because it must convert electricity back into heat.
Alex notes that nuclear does not actually need a technological breakthrough. It only needs a policy breakthrough. If we can fix the regulatory side, nuclear can become as affordable as it was fifty years ago. Policy constraints are often treated as fixed, but changing them could reduce energy costs without needing a new invention.
Over-reliance on solar creates risks for the power grid. In places like California, solar capacity factors drop below 10 percent during the winter. This requires massive battery storage or expensive high voltage cables. Without gas plants to act as backup, a grid relying mostly on solar becomes very expensive once it tries to supply more than 60 percent of total energy.
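The winter arithmetic is straightforward: required nameplate capacity scales with the inverse of the capacity factor, so at the 10 percent winter figure cited for California,

$$P_{\text{nameplate}} = \frac{P_{\text{average}}}{\text{CF}} = \frac{1\ \text{GW}}{0.10} = 10\ \text{GW}$$

of panels are needed per gigawatt of average winter demand, before accounting for storage and transmission losses.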
The economics of energy storage and nuclear power
Current estimates for a national battery storage system reach trillions of pounds under today's prices. However, battery costs are falling significantly. While they are not dropping as fast as solar prices, the downward trend suggests a future where large-scale storage is more realistic. Critics often view months of battery backup as a strawman argument, but the technology is a moving target. We might hit a wall with lithium-ion batteries, or we might invent entirely new methods, such as using heated dirt to store energy over a year.
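The trillion-pound scale is easy to reproduce with a back-of-envelope sketch. Both inputs below are rough assumptions for illustration (average UK electricity demand on the order of 35 GW, installed battery storage around £200 per kWh), not figures from the episode:

```python
# Back-of-envelope cost of a month of national battery storage.
# Both inputs are rough, illustrative assumptions.

avg_demand_gw = 35          # assumed UK average electricity demand, GW
hours_per_month = 30 * 24   # hours of backup to cover
cost_per_kwh_gbp = 200      # assumed installed battery cost, GBP per kWh

storage_needed_kwh = avg_demand_gw * 1e6 * hours_per_month  # GW -> kW, then kWh
total_cost_gbp = storage_needed_kwh * cost_per_kwh_gbp

print(f"storage needed: {storage_needed_kwh / 1e9:.1f} TWh")
print(f"estimated cost: GBP {total_cost_gbp / 1e12:.1f} trillion")
```

Even halving the battery price leaves a multi-trillion-pound bill for a month of backup, which is why month-scale lithium-ion storage reads as a strawman and why cheaper long-duration ideas keep being proposed.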
If we made it so that we could build nuclear power as quickly and as affordably as South Korea, then there wouldn't really be this either-or question anyway. China builds loads of nuclear power plants and also puts loads of solar in. There isn't actually an opportunity cost.
The choice between nuclear and solar is not necessarily a zero-sum game. Countries like China demonstrate that it is possible to invest heavily in both. The real challenge lies in systemic issues like permitting and grid connections. Fixing the general energy market would likely benefit both nuclear and renewable energy at the same time. If we resolve the hurdles for one, we often resolve them for the other.
The role of corporate demand and SMRs in nuclear revival
The cost of nuclear power used to be spread so thin that it was hard to build a movement in its favor. Benefits were dispersed across millions of people who might only save a few pennies on their bills. This dynamic is changing because of AI data centers. Companies like Meta are signing massive deals for gigawatts of power. This concentrates the benefits of cheap energy into the hands of a few powerful players. These companies can become advocates for change. They can identify and move the blockers that have stalled nuclear development for decades.
If all of the five biggest companies in the US are advocating for cheaper nuclear power, that might be enough to get it going.
In the UK, there are efforts to streamline approval processes and stop people from using small legal claims to sabotage new projects. Regulators are being encouraged to stop demanding excessive safety features for tiny radiation risks. Across Europe, countries like Sweden and Poland are looking to revive their nuclear programs. Coordination across the continent could prevent the waste of building multiple parallel supply chains for the same technology.
Small modular reactors (SMRs) are a key part of this future. While they might be less efficient than massive traditional plants in theory, they offer practical advantages. They can be mass-produced in factories and might benefit from a more streamlined regulatory process. SMRs could also power large ships for a century without needing to refuel. They could provide the intense heat required for industrial processes. It is important to distinguish between actual technology and designs that only exist on paper. Some advanced designs could use hotter coolants to provide high temperatures for manufacturing.
Strategies for clean energy and nuclear innovation
A sensible approach to climate change involves mitigating everything that is cheap to fix first. This means electrifying as much as possible and building a vast supply of clean energy. Once clean energy is abundant, it may be easier to extract carbon from the atmosphere than to overhaul the most expensive industrial processes, such as chemical cracking plants.
Small Modular Reactors (SMRs) could be a breakthrough due to the benefits of mass production. However, many current projects do not actually fit the definition of the technology. A true SMR should be small enough to fit into shipping containers. Some models, like the Rolls Royce reactor, are larger than 400 megawatts and are not modular at all. Despite these naming issues, nuclear innovation remains a vital field to pursue.
It is foolish when people say you should not work on a problem because there is another more important one. There are a lot of problems, and we should not all work on the most important problem. We still need someone to make our shoes. That is not the most important problem in the world, but it would suck if we did not have shoes.
Progress requires people to work on many different levels of problems simultaneously. Even if solving nuclear costs is not the single most important task in the world, it is still a problem worth solving. The arrival of strong stakeholders in the nuclear industry is a positive sign, but it is not a reason for complacency.
The exaggerated fear of radiation
The fear of radiation is often exaggerated and out of line with the evidence. Even in the worst nuclear disasters like Chernobyl, the actual death toll is much lower than many people believe. Reports suggest that around 200 people died in total from that event. While about 28 first responders died quickly, they were exposed to radiation levels equivalent to thousands of full body CT scans within just a few hours.
Chernobyl killed about 200 people total. There were about 28 first responders who died. Those guys had doses of thousands of CT scans and they died within hours or days. No one else has any proven harm except for higher cancer rates decades later.
Most people never experience anything close to those extreme levels. A full body CT scan is typically the most radiation a person will encounter in their life. When researchers try to find harmful effects from very tiny amounts of radiation, the statistical methods often create something out of nothing. Looking at extreme cases provides a clearer picture of the actual risks compared to the intense public fear.
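Taking the commonly cited value of roughly 10 mSv per full-body CT scan, "thousands of CT scans" puts the first responders' doses in the tens of sieverts:

$$2{,}000 \times 10\ \text{mSv} = 20\ \text{Sv}$$

That is well above the few-sievert range where acute radiation sickness becomes fatal, while a single scan sits about three orders of magnitude below it.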
The overstatement of radiation harms
Historical data from Hiroshima and Nagasaki reveals that radiation effects are less widespread than commonly believed. Survivors beyond 1200 meters from the blast center showed no detectable health effects throughout their lives. There were no waves of heritable deformities or delayed health issues in the general population. Even though these individuals received the equivalent of 100 CT scans in one second, the long-term impact was not detectable in most cases.
The evidence is that the harms of radiation are massively overstated. In fact, even a meltdown usually doesn't cause significant health harms. We should work to prevent meltdowns, but we should also live in a world where a certain number of meltdowns is an acceptable cost of doing business as long as we have reasonable protocols to deal with them.
Thresholds for harm appear much higher than public perception suggests. In the 1920s, women who painted watch dials with radium only developed noticeable cancer effects if their daily exposure reached the level of ten CT scans. Those exposed to five CT scans a day showed no noticeable increase in cancer. In another extreme case, terminal patients injected with plutonium did not die earlier than expected. One misdiagnosed patient lived to be 84 years old despite the injections. This suggests that while safety protocols are necessary, the fear surrounding radiation is often disproportionate to the actual risk.
