Tag Archives: fission

Disease, atom bombs, and R-naught

A key indicator of the speed and likelihood of a major disease outbreak is the number of people that each infected person is likely to infect. This infection number is called R-naught, or Ro; it is shown in the table below for several major plague diseases.

R-naught – communicability for several contagious diseases, CDC.

Of the diseases shown, measles is the most communicable, with an Ro of 12 to 18. In an unvaccinated population, one measles-infected person will infect 12 to 18 others: his/her whole family and/or most of his/her friends. After two weeks or so of incubation, each of the newly infected will infect another 12 to 18. Traveling this way, measles wiped out swaths of the American Indian population in just a few months. It was one of the major plagues that made America white.

While measles is virtually gone today, Ebola, SARS, HIV, and leprosy remain. They are far less communicable, and far less deadly, but there is no vaccine. Because they have a low Ro, these diseases move only slowly through a population, with outbreaks that can last for years or decades.

To estimate the total number of people infected, you can use R-naught and the incubation-transmission time as follows:

Ni = Ro^(w/wt)

where Ni is the total number of people infected at any time after the initial outbreak, w is the number of weeks since the outbreak began, and wt is the average infection to transmission time in weeks.

For measles, wt is approximately 2 weeks. In the days before the vaccine, Ro was about 15, as in the table above, and

Ni = 15^(w/2).

In 2 weeks, there will be 15 measles-infected people; in 4 weeks there will be 15^2, or 225; and in 6 generations, or 12 weeks, you'd expect to have 15^6, about 11.39 million. This is a real plague. The spread of measles would slow somewhat after a few weeks, as the infected increasingly run into folks who are already infected or already immune. But even when the measles slowed, it still infected quite a lot faster than HIV, leprosy, or SARS (SARS is caused by a coronavirus, not influenza). Leprosy is particularly slow, having a low R-naught and an infection-transmission time of about 20 years (10 years without symptoms!).
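For those who like to check the arithmetic, here it is in a few lines of Python, using the rounded values above (Ro = 15, a 2-week transmission time) and ignoring the slowdown from immunity:

```python
# Naive exponential spread: Ni = Ro**(w/wt), ignoring immunity and overlap.
Ro = 15   # measles, pre-vaccine (rounded from the 12-18 range in the table)
wt = 2    # incubation-to-transmission time, in weeks

def infected(w):
    """Total number infected w weeks after the outbreak begins."""
    return Ro ** (w / wt)

for w in (2, 4, 12):
    print(w, "weeks:", f"{infected(w):,.0f}")   # 15; 225; 11,390,625
```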

In America, more or less everyone is vaccinated for measles. Measles vaccine works, even if the benefits are oversold, mainly by reducing the effective value of Ro. The measles vaccine is claimed to be 93% effective, suggesting that only 7% of the people an infected person meets are not immune. If the original value of Ro is 15, as above, the effect of immunization is to reduce the effective value of Ro in the US today to 15 x 0.07 = 1.05. We can still have measles outbreaks, but only small-scale, slow-moving ones going through pockets of the less-immunized. The average measles-infected person will infect only one other person, if that. The expectation is that an outbreak will be caught by the CDC before it can do much harm.
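Here is that vaccination arithmetic as a two-line Python sketch (the 93% figure is the claimed efficacy from above):

```python
# Effect of a vaccine on Ro: only the non-immune fraction can be infected.
Ro_natural = 15
efficacy = 0.93                  # claimed measles vaccine efficacy

Ro_effective = Ro_natural * (1 - efficacy)
print(round(Ro_effective, 2))    # 1.05 -- barely self-sustaining
```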

Short of a vaccine, the best we can do to stop droplet-spread diseases like SARS, leprosy, or Ebola is the face mask. Masks are worn in Hong Kong and Singapore, but have yet to become acceptable in the USA. They are a low-tech way to reduce Ro to a value below 1.0; if R-naught is below 1.0, the disease dies out on its own. With HIV, the spread was stopped mainly by condoms: the same low-tech solution, applied to a sexually transmitted disease.
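To see why an Ro below 1.0 kills an outbreak, here's a short sketch with an assumed Ro of 0.8 and a hypothetical 100-person starting cluster; each generation is smaller than the last:

```python
# When Ro drops below 1, each generation of new infections is smaller than
# the one before, and the outbreak fizzles out on its own.
Ro = 0.8          # assumed sub-critical Ro, e.g. with widespread mask use
cases = 100.0     # hypothetical initial cluster

for generation in range(1, 6):
    cases *= Ro
    print(generation, round(cases, 1))   # 80.0, 64.0, 51.2, 41.0, 32.8
```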

Progress of an atom bomb going off. Image from VCE Physics, https://sites.google.com/site/coyleysvcephysics/home/unit-2/optional-studies/26-how-do-fusion-and-fission-compare-as-viable-nuclear-energy-power-sources/fission-and-fusion---lesson-2/chain-reactions-with-dominoes

As it happens, the explosion of an atom bomb follows the same path as the spread of disease. One neutron appears from somewhere and splits a uranium or plutonium atom. Each split atom gives off two or three more neutrons, so you might think that R-naught = 2.5, approximately. For a bomb, Ro is found to be a bit lower, because only fast-released neutrons matter and because some neutrons are lost. For a well-designed bomb, it's fair to say that Ro is about 2.

The progress of a bomb going off will follow the same math as above:

Nn = Ro^(t/nt)

where Nn is the total number of neutrons at any time, t is the time in nanoseconds since the first neutron hit, and nt is the transmission time in nanoseconds: the time between when a neutron is given off and when it is absorbed.

Assuming an average neutron speed of 13 million m/s, and an average travel distance for neutrons of about 0.1 m, the time between interactions comes out to about 8 billionths of a second — 8 ns. From this, we find the number of neutrons is:

Nn = 2^(t/8), where t is time measured in nanoseconds (billionths of a second). Since 1 kg of uranium contains about 2 x 10^24 atoms, a well-designed A-bomb containing 1 kg should take about 81 generations (2^81 ≈ 2 x 10^24). If each generation is 8 ns, as above, the explosion should take about 0.65 microseconds to consume 100% of the fuel. The fission of each uranium atom releases about 210 MeV, suggesting that this 1 kg bomb could release 16 billion kcal, as much explosive energy as 16 kilotons of TNT, about the explosive power of the Nagasaki bomb. (There are about 38 x 10^-24 kcal/eV.)
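The whole estimate fits in a short Python sketch; with the round numbers above it comes out near 81 generations, about 0.65 microseconds, and roughly 16 kilotons:

```python
import math

# Chain-reaction timeline for a 1 kg uranium bomb, using the round numbers
# in the text: 2e24 atoms, Ro = 2, 8 ns per generation, 210 MeV per fission.
Ro = 2
atoms = 2e24              # atoms in roughly 1 kg of uranium
gen_time_ns = 8           # neutron flight time between fissions

generations = math.log(atoms, Ro)            # g such that Ro**g = atoms
burn_time_us = generations * gen_time_ns / 1000

mev_per_fission = 210
kcal_per_ev = 38e-24
yield_kcal = atoms * mev_per_fission * 1e6 * kcal_per_ev
yield_ktons = yield_kcal / 1e9               # 1 kiloton of TNT ~ 1e9 kcal

print(f"{generations:.0f} generations, {burn_time_us:.2f} microseconds")
print(f"yield ~ {yield_ktons:.0f} kilotons of TNT")
```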

As with disease, this calculation is a bit misleading about the ease of designing a working atomic bomb. Ro starts to drop after a significant fraction of the atoms are split. The atoms begin to move away from each other, and some of the atoms become immune: once split, the daughter nuclei continue to absorb neutrons without giving off either neutrons or energy. The net result is that an increasing fraction of neutrons is lost to space, and the explosion dies off long before the full power is released.

Computers are very helpful in the analysis of bombs and plagues, as are smart people. The Manhattan Project scientists got it right on the first try. They had only rudimentary computers but lots of smart people. Even so, they seem to have gotten an efficiency of about 15%. The North Koreans, with better computers and fewer smart people, took 5 tries to reach this level of competence (analyzed here). They are now in the process of developing germ warfare: directed plagues. As a warning to them, just as it's very hard to get things right with A-bombs, it's very hard to get it right with disease; people might start wearing masks, or drinking bottled water, or the CDC could develop a vaccine. The danger, if you get it wrong, is the same as with atom bombs: the US will not take this sort of attack lying down.

Robert Buxbaum, January 18, 2019. One of my favorite authors, Isaac Asimov, died of AIDS, a slow-moving plague that he contracted from a blood transfusion. I benefitted vastly from Isaac Asimov's science and science fiction, and he wrote on virtually every topic. My aim is essays that are sort-of like his, but more mathematical.

Nuclear fusion

I got my PhD at Princeton University 33 years ago (1981) working on the engineering of nuclear fusion reactors, and I thought I'd use this blog to rethink through the issues. I find I'm still of the opinion that developing fusion is important, as it seems the best long-range power option. Civilization will still need significant electric power 300 to 3000 years from now, it seems, when most other fuel sources are gone. Fusion is also one of the few options for long-range space exploration, needed if we ever decide to send colonies to Alpha Centauri or Saturn. I thought fusion would be ready by now, but it is not, and commercial use seems unlikely for the next ten years at least: an indication of the difficulties involved, and a certain lack of urgency.

Oil, gas, and uranium didn't run out like we'd predicted in the mid 70s. Instead, population growth slowed, new supplies were found, and better methods were developed to recover and use them. Shale oil and fracking unlocked hydrocarbons we thought were unusable, and nuclear fission reactors got better: safer and more efficient. At the same time, the more we studied, the clearer it became that fusion's technical problems are much harder to tame than uranium fission's.

Uranium fission was/is frighteningly simple, far simpler than even the most basic fusion reactor. The first nuclear fission reactor (1942) involved nothing more than uranium pellets in a pile of carbon bricks stacked in a converted squash court at the University of Chicago. No outside effort was needed to split the large, unstable uranium atoms into smaller, more stable ones. The pile ran at low power, so heat removal was simple, and control was maintained by people raising and lowering cadmium control rods while standing on the pile.

A fusion reactor requires high temperature or energy to make anything happen. Fusion energy is produced by combining small heavy-hydrogen atoms into helium, a bigger, more stable atom; see the figure below. To run this reaction you need to operate at the equivalent of about 500,000,000 degrees C, and containing it requires (typically) a magnetic bottle, something far more complex than a pile of graphite bricks. The reward was smaller too: "only" about 1/13th as much energy per event as fission. We knew the magnetic bottles were going to be tricky, e.g. there was no obvious heat transfer and control method, but fusion seemed important enough, and the problems seemed manageable enough, that fusion power seemed worth pursuing, with just enough difficulties to make it a challenge.

Basic fusion reaction: deuterium + tritium react to give helium, a neutron and energy.

The plan at Princeton, and most everywhere, was to use a TOKAMAK, a doughnut-shaped reactor like the one shown below, but roughly twice as big; TOKAMAK was a Russian acronym. The doughnut served as one side of an enormous transformer. Hydrogen fuel was ionized into a plasma (a neutral soup of protons and electrons) and heated to 300,000,000°C by a current in the TOKAMAK, generated by varying the current in the other side of the transformer. Plasma containment was provided by enormous magnets on the top and bottom, and by ring-shaped magnets arranged around the torus.

As development went on, we found we kept needing bigger and bigger doughnuts and stronger and stronger magnets in an effort to balance heat loss with fusion heating. The number density of hydrogen atoms per volume, n, is proportional to the magnetic strength. This is important because the fusion heat rate per volume is proportional to n-squared, n^2, while heat loss is proportional to n divided by the residence time, something we called tau, τ. The main heat loss was from the hot plasma going to the reactor surface. Because of this, the heat balance ratio, heat in divided by heat out, was seen to be important, and it was more-or-less proportional to nτ. As the target temperatures increased, we found we needed larger and larger values of nτ to make a positive heat balance, and this translated to ever-larger reactors and ever-stronger magnetic fields. But even here there was a limit, about 1 billion Kelvin, a thermodynamic temperature where the fusion reaction went backward and no energy was produced. The Princeton design was huge, with super-strong magnets, and was operated at 300 million°C, near the top of the reaction-rate curve. If the temperature went much above or below this, the fire would go out. There was no room for error, and relatively little energy output per volume compared to fission.
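The nτ scaling behind this can be seen in a toy Python calculation; k_heat and k_loss below are arbitrary placeholder constants, not real plasma data:

```python
# Why the figure of merit is n*tau: fusion heating per volume goes as n**2,
# while conduction loss goes as n/tau, so their ratio scales as n*tau.
# k_heat and k_loss are arbitrary placeholders, not real plasma constants.
k_heat, k_loss = 1.0, 1.0

def gain(n, tau):
    """Ratio of fusion heating to conduction loss: (k_heat/k_loss) * n * tau."""
    return (k_heat * n**2) / (k_loss * n / tau)

# Doubling either the density or the confinement time doubles the gain:
print(gain(2, 1) / gain(1, 1))   # 2.0
print(gain(1, 2) / gain(1, 1))   # 2.0
```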

Fusion reaction options and reaction rates.

The most likely reaction involved deuterium and tritium, referred to as D and T. This is the reaction of the two heavy isotopes of hydrogen shown in the figure above, the same reaction used in hydrogen bombs, a point we rarely made to the public. For each reaction, D + T -> He + n, you get 17.6 million electron volts (17.6 MeV). This is 17.6 million times the energy an electron gains moving across one volt, but only about 1/13 the energy of a fission event. By comparison, the energy of water-forming, H2 + 1/2 O2 -> H2O, is the equivalent of two electrons moving over 1.2 volts, or 2.4 electron volts (eV), some 7 million times less than fusion.
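Checking the per-event comparison in a couple of lines (the chemistry figure assumes ~1.2 V per electron, as above); it comes out around 7.3 million:

```python
# Energy per event: D-T fusion vs. forming one water molecule.
fusion_ev = 17.6e6        # 17.6 MeV per D + T -> He + n reaction
water_ev = 2 * 1.2        # two electrons moving over ~1.2 V

print(round(fusion_ev / water_ev / 1e6, 1), "million times more energy per event")
```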

The Princeton design involved reacting 40 gm/hr of heavy hydrogen to produce 8 mol/hr of helium and 4000 MW of heat. The heat was to be converted to electricity at 38% efficiency using a topping cycle, a modern (relatively untried) design. Of the roughly 1500 MW of electricity that was supposed to be produced, all but about 400 MW was to be delivered to the power grid, if everything worked right. Sorry to say, the value of the electricity did not rise anywhere near as fast as the cost of the reactor and turbines. Another problem: 1100 MW was more than could be easily absorbed by any electrical grid. The output was high and steady, and could not be easily adjusted to match fluctuating customer demand. By contrast, a coal plant's or fuel cell's output could be easily adjusted (and a nuclear plant's with a little more difficulty).
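A quick sanity check of these design numbers in Python, using standard values for Avogadro's number and the electron charge; the heat rate comes out just shy of the quoted 4000 MW:

```python
# Check of the Princeton design numbers: 40 g/hr of D-T fuel -> ~4000 MW heat.
AVOGADRO = 6.022e23       # particles per mol
EV_TO_J = 1.602e-19       # joules per electron volt

grams_per_hr = 40
grams_per_reaction_mol = 5            # one D (2 g/mol) + one T (3 g/mol)
mol_per_hr = grams_per_hr / grams_per_reaction_mol    # 8 mol/hr of helium

j_per_mol = 17.6e6 * EV_TO_J * AVOGADRO   # energy per mol of D-T reactions
power_mw = mol_per_hr * j_per_mol / 3600 / 1e6

print(round(mol_per_hr), "mol/hr of helium")
print(round(power_mw), "MW of heat,", round(power_mw * 0.38), "MW electric")
```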

Because of the need for heat balance, it turned out that at least 9% of the hydrogen had to be burnt per pass through the reactor. The heat lost per mol by conduction to the wall was, to good approximation, the heat capacity of a mol of hydrogen ions, 82 J/°C mol, times the ion temperature, 300 million °C, divided by the containment time, τ. The Princeton design was supposed to have a containment time of about 4 seconds, giving a conduction loss of about 6.2 GW per mol. This must be matched by the molar heat of reaction that stays in the plasma. The heat of reaction is 17.6 MeV times Faraday's constant, 96,485 C/mol, divided by 4 seconds, about 430 GW per mol reacted; of this, only 1/5, about 86 GW/mol, remains in the plasma, while the other 4/5 of the energy of reaction leaves with the neutron. Balancing 6.2 against 86 says about 7% of the hydrogen must react per pass; radiation losses push the number to about 9%. Burn much more or less of the hydrogen and you had problems. The only other solution was to increase τ beyond 4 seconds, but this meant ever-bigger reactors.
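The conduction side of that heat balance fits in a short Python check (Faraday's constant converts eV per particle into J per mol); it gives a minimum burn fraction of about 7%, before radiation losses:

```python
# Heat-balance estimate from the paragraph above.
FARADAY = 96485            # C/mol: converts eV per particle to J per mol

cp = 82                    # J/(mol*K), heat capacity of the plasma ions (as quoted)
T = 300e6                  # ion temperature; degrees C ~ K at this scale
tau = 4                    # containment time, seconds

loss_w_per_mol = cp * T / tau                  # conduction loss per mol of plasma
fusion_w_per_mol = 17.6e6 * FARADAY / tau      # total, per mol reacted
retained_w_per_mol = fusion_w_per_mol / 5      # 4/5 leaves with the neutron

burn_fraction = loss_w_per_mol / retained_w_per_mol
print(f"loss {loss_w_per_mol/1e9:.2f} GW/mol, retained {retained_w_per_mol/1e9:.0f} GW/mol")
print(f"minimum burn fraction ~ {burn_fraction:.0%}")
```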

There was also a material handling issue: to get enough hydrogen fuel into the center of the reactor, quite a lot of radioactive gas had to be handled and extracted from the plasma chamber. The fuel was to be frozen into tiny spheres of near-solid hydrogen and injected into the reactor at supersonic velocity. Any slower and the spheres would evaporate before reaching the center. As 40 grams per hour was only 9% of the feed, it became clear that we had to be ready to produce and inject about 1 pound per hour of tiny spheres. These "snowballs-in-hell" had to be small so they didn't dampen the fire. The vacuum system had to be big enough to handle the pound per hour or so of unburned hydrogen and ash, keeping the pressure near total vacuum. You then had to purify the hydrogen from the helium ash and remake the little spheres to feed back to the reactor. There were no easy engineering problems here, but I found them enjoyable. With a colleague, I came up with a cute, efficient high-vacuum pump and recycling system, and published it here.

Yet another engineering challenge concerned finding a material for the first wall, the inner wall of the doughnut facing the plasma. Of the 4000 MW of heat produced, all the conduction and radiation heat, about 1000 MW, is deposited in the first wall and has to be conducted away. Conducting this heat means that the wall must carry an enormous coolant flow and withstand an enormous amount of thermal stress. One possible approach was to use a liquid wall, but I've recently come up with a rather nicer solid-wall solution (I think) and have filed a patent; more on that later, perhaps after/if the patent is accepted. Another engineering challenge was making T, tritium, for the D-T reaction. Tritium is not found in nature; it has to be made from the neutron created in the reaction and from lithium in a breeder blanket, Li + n -> He + T. I examined all possible options for extracting this tritium from the lithium at low concentrations as part of my PhD thesis, and eventually found a nice solution. The education I got in the process is used in my REB Research hydrogen engineering business.

Man inside the fusion reactor doughnut at ITER. He'd better leave before the 8,000,000°C plasma turns on.

Because of its complexity and all these engineering challenges, fusion power never reached the maturity of fission power; and then Three Mile Island happened and ruined the enthusiasm for all things nuclear. There were some claims that fusion would be safer than fission, but given the complexity and the improvements in fission, I am not convinced that fusion would ever be even as safe. And the long-term need keeps moving out: we keep finding more uranium, and we've developed breeder reactors and a thorium cycle, technologies that make it very unlikely we will run out of fission material any time soon.

The main near-term advantage I see for fusion over fission is that there are fewer radioactive products, see comparison. A secondary advantage is neutrons. Fusion reactors make excess neutrons that can be used to make tritium or other unusual elements. A need for one of these could favor the development of fusion power. And finally, there's the long-term need: space exploration, or basic power when we run out of coal, uranium, and thorium. Fine advantages, but unlikely to be important for a hundred years.

Robert E. Buxbaum, March 1, 2014. Here’s a post on land use, on the aesthetics of engineering design, and on the health risks of nuclear power. The sun’s nuclear fusion reactor is unstable too — one possible source of the chaotic behavior of the climate. Here’s a control joke.