If you’re around my age, and a fan of John Cusack or Paul Newman, you may have caught the 1989 film Fat Man and Little Boy, which was loosely based on the events of the Manhattan Project and the race to construct the first atomic bombs.
Those who have seen the film probably recall a standout scene, where a group of scientists in a lab witness an unexpected and lethal nuclear accident, an eerie blue glow filling the room. This is the scene where John Cusack’s character gets killed, even though he manages a few more minutes of screen time. From that scene on, he is already doomed.
If you have not seen the film, or haven’t seen it in a while, here is a refresher:
With the blue glow and cinematic orchestral stab at the moment the screwdriver slips, the scene may seem like science fiction. Believe it or not, however, the scene is actually a fictionalized account of an event that really took place, under similar circumstances, at the Los Alamos laboratories on May 21, 1946.
The scientist in question was a 35-year-old Canadian named Louis Slotin, and he really was the fellow hit with a blue glow when an experiment with a nuclear bomb core went very, very wrong. He died nine days after the accident.
In our day-to-day lives, the energy we are most familiar with tends to come in two forms. Probably the most basic energy we encounter is thermal energy, which can come from a variety of chemical reactions. Get a fire going in a fireplace and that’s thermal energy, with the carbon in the wood rapidly forming chemical bonds with the oxygen in the surrounding air.
The other energy we encounter is electrical energy. It’s the flowing invisible juice that’s allowing you to read these words right now.
We know that these energetic sources have benefits and dangers. A fire can keep you warm, but it can also burn you. Electricity powers all the neat gadgets of our lives, but it can also shock you or kill you.
Nuclear energy is exotic. It is rare that anyone would directly encounter a significant source of energy generated by nuclear reactions in their day-to-day lives. That’s both because of the rarity of the sources and conditions that cause such reactions, and because of the lethality of the energy produced.
Both chemical and electrical interactions deal with the electrons available on the outer edges of atoms, whereas nuclear reactions deal with the splitting apart of the tightly held mass of protons and neutrons that sits at the nucleus of every atom. A nuclear reaction is more potent, by orders of magnitude, than any chemical or electrical reaction in terms of the total release of energy.
Technically speaking, a fire in your fireplace is a kind of chain reaction. The massive amount of heat is released not because of one interaction of chemicals, but because that interaction is iterated in a self-sustaining way over time. The same principle applies to the release of nuclear energy: one event is negligible, but a series of events cascading over time can lead to a massive release of energy.
To a physicist, any system that uses mechanics or materials to create this sort of sustained release of power from nuclear destabilization is called a reactor. So the big power plant with the oddly shaped steam towers is a reactor. But to a physicist, a bomb that blows up a city is also a reactor, and so was that weird assortment of metal bricks and spheres sitting on the tabletop in that movie.
The arrangement of mechanics and materials that leads to a sudden release of nuclear-generated power is referred to by physicists as an assembly, and the threshold that determines the difference between an arrangement that doesn’t release any power, and an arrangement that is releasing power, is called criticality.
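For the quantitatively curious, here is a toy sketch of that threshold. It is a deliberately simplified, generation-by-generation model, and the multiplication factors in it are made-up illustrative numbers, not anything from the historical record. The idea: each neutron produces, on average, k neutrons in the next generation, and everything hinges on whether k is below, at, or above 1.

```python
# Toy model of criticality: each generation, the neutron population
# is multiplied by k, the effective multiplication factor.
# The k values below are illustrative assumptions, nothing more.

def neutron_population(k, generations, n0=1000):
    """Neutron count after some number of generations, starting from n0."""
    n = n0
    for _ in range(generations):
        n *= k
    return n

for k in (0.95, 1.00, 1.05):  # sub-critical, critical, super-critical
    print(f"k = {k:.2f}: {neutron_population(k, 100):>12,.0f} neutrons after 100 generations")
```

Run it and the threshold behavior jumps out: at k = 0.95 the population withers from a thousand neutrons to a handful, at exactly 1.00 it holds steady, and at 1.05 it balloons past a hundred thousand. Criticality is that knife’s edge at 1.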
So, in the movie scene above, as with Louis Slotin in real life, the tabletop arrangement was a critical assembly, and the events that killed Louis Slotin can be understood as that assembly moving from a sub-critical state to a state referred to as prompt critical.
‘Prompt’ in this instance refers to the source of the neutrons driving the reaction: as each atomic nucleus splits, it generates a shower of neutrons that immediately split the next generation of atoms, which in turn release more neutrons. The result is a very rapid, escalating burst of energy.*
The Slotin tragedy, along with the death of another scientist, Harry Daghlian, a few months before, helped to launch the industrial practices we now call criticality safety. These are the risk management steps that physicists and plant operators take to keep nuclear materials governable and survivable.
And the cardinal rule of criticality safety is this: Assume all unmonitored assemblies are critical.
In other words, if you are approaching any sort of reactor, whether in a nuclear power plant, or a tabletop experiment, or a bomb, you need to assume it is in a configuration that is going to kill you—and quickly—until proven otherwise.
‘Monitoring’ involves throwing a Geiger counter or a neutron counter into the equation. In the clip above, those are the boxes making the clicking noises. Unless you are directly measuring the output of an assembly, it is a black box, and you have to assume it is in a state that will be lethal.
What I like about this assumption is that it is not only applicable in nuclear criticality labs and power plants. If we generalize the principle, it yields a very good rule for one’s professional life.
As a small business owner, I am able to generate a lot of data about the “assembly” that is in front of me. I can get a read pretty quickly on my bank balances, my outstanding invoices and receivables, and the general demeanor of my clients. That data flow means that I can monitor this assembly, and the likelihood that it will blow up in my face and harm me or those that I love is reduced. Not eliminated, understand, but reduced. The monitoring makes a difference.
In contrast, through the years I have also been an employee at companies and institutions of various sizes. One consistent experience across these jobs is that my access to pertinent information is greatly reduced. I can still monitor some things: my paycheck, and the amounts going to taxes and benefits, for example. But there are other factors that are not only out of my control, but also beyond my ability to measure. These include key decisions, the allocation of resources, and the fluctuations of income and expense in other parts of the organization, far from me.
You will forgive my nerdiness if I tell you that I tend to think about employment arrangements in a similar manner to how a physicist is trained to think about a critical assembly. If possible, I want as much data as I can get, so I can monitor the state of the situation. But if I cannot get access to that data, I very rightly treat that arrangement like something that, at any moment, might blow up in my face.
I hope this does not sound simply cynical; I intend instead for it to sound rational. Unmonitored systems must be considered dangerous, simply because the lack of timely data, mixed with the possibility of a rapid shift in power, can lead to that horrible blue flash. Nobody wants that.
I wish employment situations were arranged so that those closest to the assembly could have the most accurate and timely information about the state of things, but that is rarely the case. So in many employment situations, I stay back. If you cannot monitor the system directly, then any physicist will tell you that your best bet for safety in a system of unknown power is time, distance, and shielding.
That is, minimize your exposure, stay as far from the system as you can, and have as much defense in place as possible.
So, like everyone reading this, I will keep working to get as much information as I can about the situations we are in—systems of employment, government, and public health.
As any physicist would tell you, you should keep a close eye on them, or keep your distance.
* This is in contrast to delayed criticality, where the neutrons causing the reaction are not the ones generated by the immediately previous generation of reactions, but rather arrive later, from the decay of radioactive fission products in the mix. Delayed criticality is what makes the difference between a spike in power taking minutes versus milliseconds. Nuclear power plants, for example, depend on delayed criticality in order to be governable. If nuclear plants were prompt critical, no reactor operator would have reflexes fast enough to actually run them.
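To put rough numbers on that “minutes versus milliseconds” claim, here is a back-of-the-envelope sketch using the simplest point-kinetics approximation, in which power grows roughly as exp(t × (k − 1) / generation time). The generation times and the k value below are textbook ballpark assumptions for a thermal reactor, chosen purely for illustration:

```python
import math

# Back-of-the-envelope doubling times in the simplest point-kinetics
# approximation: power ~ exp(t * (k - 1) / generation_time), so the
# doubling time is generation_time * ln(2) / (k - 1).
# The figures below are ballpark assumptions, not measurements:
#   prompt-neutron generation time (thermal reactor): ~1e-4 s
#   effective generation time once delayed neutrons dominate: ~0.1 s

def doubling_time(k, generation_time):
    """Time for reactor power to double at multiplication factor k."""
    return generation_time * math.log(2) / (k - 1)

k = 1.001  # ever so slightly super-critical
print(f"Prompt neutrons only:  power doubles every {doubling_time(k, 1e-4):.3f} s")
print(f"With delayed neutrons: power doubles every {doubling_time(k, 0.1):.0f} s")
```

Same tiny excess of reactivity, but in the prompt case power doubles in well under a tenth of a second, while in the delayed case it takes about a minute: a timescale a human operator (or a control rod) can actually respond to, which is the whole point of keeping a plant well below prompt criticality.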
If you like this writing, please consider supporting it by becoming a subscriber. Also feel free to leave a comment, and to tell your friends about Walking the Wire. Thank you!