Debunked: Scientists risked destroying the earth during nuclear tests and CERN

Mick West

Administrator
Staff member
[Admin: Originally posted by Trigger Hippie, but I thought it deserved a thread of its own]

even if it meant an unstoppable chain reaction that could have destroyed the entire world . . .

That's not the first time you've said something like this.

Not unlike the detonation of the first hydrogen bombs . . . the fear was an unending chain reaction . . . funny thing they did it anyway . . . Guess what . . . Edward Teller was part of the decision . . . same people who are IMO . . . possibly In charge of the geoengineering decision !!!!

Your premise that scientists would recklessly endanger the whole planet with geoengineering programs because they disregarded similar world-ending scenarios during the H-Bomb tests falls short of the mark.

In 1942 the decision was made to research a fission bomb. However, Edward Teller continued attempts to gain support for creating a much more powerful thermonuclear bomb (a fusion bomb). It was during those early years, when fusion was not well understood, before even the first controlled fission reaction, that Teller first speculated about how a fission bomb might ignite the atmosphere with a self-sustaining fusion reaction of nitrogen nuclei. (Teller developed a track record for overstating the likelihood of fusion reactions. Bethe, a key figure in the bomb's development, recounted how the H-Bomb could have been produced much sooner were it not for Teller miscalculating the likelihood of thermonuclear reactions.)

Anyway, back to 1942. Upon hearing of the prospect of an uncontrolled atmospheric reaction, Oppenheimer set Hans Bethe to look into the matter. Bethe, using early IBM digital computers to achieve his results, calculated that a fission reaction could not induce a thermonuclear reaction in the open atmosphere. Research resumed and the first A-Bomb was constructed.

During the Trinity test, Enrico Fermi recalled Teller's idea of igniting the atmosphere. In an attempt to relieve some tension, he started taking bets on whether the test would destroy the world, or merely glass the State of New Mexico.

Development of a fusion bomb began after the war. Soon the notion of igniting the atmosphere surfaced once again. Only this time it was speculated that a thermonuclear reaction could trigger the fusion of nitrogen nuclei in the atmosphere. In 1946, Teller's own calculations showed that the bomb was not large enough to trigger a cascade, and even if it were, other physical phenomena would disperse the energy required to sustain the reaction. He concluded the prospect was so improbable as to be considered impossible. Oppenheimer agreed.




This meme that mad scientists will risk the destruction of the world for the sake of their precious experiments persists to this day. A Scientific American article caused a big flap when it speculated the Relativistic Heavy Ion Collider might produce a doomsday scenario. This was seen yet again with the Large Hadron Collider.

Scientists were not careless when they evaluated the possibility of global destruction during the first nuclear bomb tests, nor are they careless when evaluating geoengineering experiments today. Some are advocating for more research on the subject so that if, at some time in the future, it is the only card left to play, we could engage in a responsible geoengineering program with the highest probability of success and the least damage.

As a side note:

"There was a fear that the detonation of that first bomb would also initiate the destruction of the world. This fear was based on the exceedingly small but finite probability that the explosion of this bomb would initiate an unstoppable chain reaction in the most common element in the world: hydrogen. Their fears were perhaps not totally unfounded, as a rumor persists that the energy liberated by that bomb exceeded the very best theoretical calculations by as much as twenty percent, begging the question 'where did it come from?'

http://www.scienceiq.com/Facts/AtomicAndHydrogenBombs.cfm
Content from External Source
The author you quoted made some fundamental errors. First of all, the most common element in the world is iron. Next, Teller considered the possibility of a self-sustaining reaction of nitrogen, not hydrogen. Scientists knew that even a 20% increase in yield could not ignite the atmosphere. Finally, the author tries to support his premise with a rumour!
 
Last edited:
Teller's gruff and ready reputation probably fueled the rumor . . . still, this was new territory . . . I bet he had some doubts nevertheless . . .
 
Bumping this. The Large Hadron Collider ('LHC') project began in the mid-1980s, and the first test of the mechanism occurred on 10 September 2008 (just to test the magnets, and stability of aim).

A quite excellent documentary was produced in 2013, released in wider distribution in early 2014: "Particle Fever".

And from the website "Rotten Tomatoes":
http://www.rottentomatoes.com/m/particle_fever/

The mention of the world-wide "fear" of the 'LHC' is referenced in the documentary.

"Official trailer":


(PS....at time 0:44....a time-lapse view with Alps in foreground, and an airliner making a contrail!! Really, REALLY doubt this was "photo-shopped"!! Just sayin'....) ;)


Add: If you happen to subscribe to the streaming and/or DVD "rental" service with NetFlix, then this is available (both formats). There are possibly other venues where it can be obtained for viewing online (and of course, the DVD is available for purchase, for those so inclined).




 
I didn't catch the doomsday reference.

Hmmm....referenced part-way into the video. (Went back to re-watch....NetFlix streaming video seems extraordinarily slow to update, when you "jump around" in the stream...so please bear with me....)....you probably didn't notice it because you are not a typical "CT believer".

Time reference in the documentary: 24:40 to (about) 27:30 (or so).
 
Actually also, a discussion (about 52:00 to 57:00) covers the concept of the "Multi-Verse" and....well, the physicist seems to be focused so much on these "other bubble universes" that could "kill us"....he neglects to realize that OUR OWN UNIVERSE is not really amenable to our Human form of life....AT ALL!!!!

If unprotected by environmental suits......We would die immediately on the Earth's Moon....or, any OTHER moon of any planet orbiting our Sun. We wouldn't last long on Mars, either...certainly not on Venus or Mercury.

Makes the "concept" (as espoused by various religions) of the "human as perfect" somewhat worthy of review? Needs some perspective, eh??

AND during that few minutes of segment....there is a VERY poignant message from (?...one of the physicists) about "eternity" and so-called "heaven"... (as told to him by his mother, when he was a child)....

I know that the version of "heaven" that is so simply described there? NOT my idea of a place I want to spend 'eternity' in.

But, I digress............
 
FYI, I remember a time travel test at CERN and I worked-myself-into a scientists forum (those working at CERN) & a running joke was the black bird that came through the portal and dropped a bread crumb, which destroyed some of the equipment. I have so many backup storage devices and some that have "failed," I have no desire to retrieve the data.

What I am interested in is particle physics from CERN as it relates to the internet (security and control "underneath/superior" to currently established telecom equipment/software and computer systems. Please tag me or message me any threads.
 
In regards to the God particle, hasn't some of the research advanced into spiral quantum physics; including the exponential ability for data transmission through fiber optics?
 
This one came up again because of something Stephen Hawking said.

http://www.dailymail.co.uk/news/art...e-destroy-universe-warns-Stephen-Hawking.html

Here's the quote from the preface of his new book:
The Higgs potential has the worrisome feature that it might become metastable at energies above 100bn gigaelectronvolts. This could mean that the universe could undergo catastrophic vacuum decay, with a bubble of the true vacuum expanding at the speed of light. This could happen at any time and we wouldn't see it coming
Content from External Source
Now, after the LHC's upgrade is finished, its highest energy will be about 6500 GeV per beam (let alone per particle), fifteen million times too low to accomplish this. The particle accelerator required to create a metastable Higgs boson would weigh more than the entire Earth, meaning any civilization able to do this would have to be pretty high up Type II on the Kardashev scale. Even then, that metastable particle destroying the universe depends on one particular model being correct, out of several competing models which CERN's results have not yet helped resolve.

You know, though, that model could be right. So keep that in mind for when we finish our first Dyson Sphere and start disassembling planets to build the terrorcollider (I assume all Type II and Type III civilizations would use supervillain naming rules): Safety limit goes at 99,999,999,999 GeV.
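For anyone who wants to check the "fifteen million times too low" arithmetic, here's a quick back-of-the-envelope sketch using the two energies quoted above (variable names are mine, purely for illustration):

```python
# Sanity check on "fifteen million times too low", using the two
# energies quoted in the post above.
hawking_threshold_gev = 100e9   # "100bn gigaelectronvolts" = 1e11 GeV
lhc_beam_gev = 6500.0           # post-upgrade LHC beam energy
ratio = hawking_threshold_gev / lhc_beam_gev
print(f"{ratio:.2e}")           # ≈ 1.54e7, i.e. roughly fifteen million
```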
 
"We wouldn't see it coming" is actually comforting.
Never see it coming and never realize it happened. Assuming this specific model is the most accurate, once this happens, the result is effectively the separation of mass from matter and an abrupt end of chemistry and gravity working the way we're used to. To quote Dr. Breen from Half Life 2, "When this reaches full power you will be destroyed in every way it is possible for a person to be destroyed. And several more that are fundamentally IMpossible!"

Of course, if the other models are the more accurate ones, the system will either collapse back to its natural apocalypse-free state, or we will have created a tiny bit of mass that has no matter to go with it, breaking a tiny piece of the universe but not destroying the rest. Breaking the universe is still Fall of Adam kind of stuff, but at least we can... I don't know, put it in a jar or something.
 
As a mere bicycle mechanic, I hesitate to question Stephen Hawking, but the Oh-My-God particle had 3×10^20 electron volts and didn't destroy the universe. https://www.fourmilab.ch/documents/OhMyGodParticle/
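The comparison is easy to sanity-check on paper: converting the Oh-My-God particle's energy into the same units as Hawking's quoted threshold (variable names are mine, for illustration only):

```python
# The Oh-My-God particle's energy, converted to the same units as
# the "100bn GeV" figure quoted from Hawking above.
oh_my_god_ev = 3e20                 # cosmic-ray energy in eV
hawking_threshold_gev = 100e9       # 100 billion GeV
oh_my_god_gev = oh_my_god_ev / 1e9  # 1 GeV = 1e9 eV
print(oh_my_god_gev / hawking_threshold_gev)  # 3.0: three times the threshold
```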
 
Of course, if the other models are the more accurate ones, the system will either collapse back to its natural apocalypse-free state, or we will have created a tiny bit of mass that has no matter to go with it...

...or it might be (and we will never be aware) that our actions create another universe, as part of the hypothesized "multi-verse" model....and after a certain period of time within it, one day an intelligent species will evolve (after so many billions of their 'cycles' (what we call 'years') and they will wonder how "it all started"...
 
Now, after the LHC's upgrade is finished, its highest energy will be about 6500 GeV per beam (let alone per particle), fifteen million times too low to accomplish this.

With all due respect (I mean no disrespect, and I thank you for your comments), I have a question, hindsight being 20/20: would you have been so emboldened to make such statements without having post-experimental scientific data (i.e. given the unknown variables of the experiments)? I would also like your scientific analysis of the atom bomb (i.e. supposedly some scientists feared they couldn't compute the potential effects of the experiment). Thank you :)
 
P.S. I thought his comment seemed ridiculous & I said so yesterday in social media, but in a different fashion. However, I didn't have the credentials, so I was hesitant to "call him out."
 
Whose? Dr. Hawking's?

Yes, Stephano's (or Steven as some call him) comment seemed ridiculous. Again though, I would really like feedback from those that are using hindsight as 20/20...... Lest we forget, we got to these 4 forces of nature by taking some unknown experimental risks. Is that not correct? I truly believe that science should slow down taking such risks without appropriate prudent/fiduciary stewardship.

 
Yes, Stephano's (or Steven as some call him)

I have never heard Dr. Stephen Hawking referred to as "Stephano". We're talkin' about this guy, right?:


(...maybe "Stephano" is some sort of inside joke? Private. For his personal friends and family?).

In any case, AFAIK Dr. Hawking's comments in RE: the 'Higgs boson' seem to refer to energies that are purely theoretical, and not likely to be actually attainable in our current Universe.

http://www.cambridge-news.co.uk/Pro...s-boson-8216/story-22898716-detail/story.html

EXCERPT:
However, the professor admits that a particle accelerator powerful enough to reach the critical 100bn GeV would need to be larger than the Earth.
Content from External Source
.....methinks this genius is having a bit of a laugh......
 
I've heard him called that, but only as one of the insults teapartiers attached to him when they were trying to paint him as an illegal immigrant (also Stefan, Stuhammad, Hawkov, and there was an Asian one I can't remember).

As for plausibility? Well, yes, he's not talking out of his ass: one of the several competing models really does approach a limit at immense energy levels, at which very strange things happen that shouldn't happen. In this event, there's then several models for what happens next, most of which don't involve the universe being destroyed. None of those models make a nice catchy intro to sell books with.

Now, the media's interpretation of that as him saying, "For God's sake stop poking the thing," somehow manages to be more absurd than when they thought he'd proved black holes don't exist (by proving that they don't violate conservation).

Physics is full of these kinds of limits, and Hawking likes to talk about them, as do Neil deGrasse Tyson and Phil Plait, all for the same reason: they sell books.

Another example is the relativistic speed limit: Any mass bearing object accelerated to the speed of light has infinite kinetic energy and any interaction with it will release infinite energy, radiating infinite light and heat into the universe and raising everything's temperature by infinity degrees.

When you take an event that can't actually happen and plug it into physics to see what would happen, the result is very often "infinite energy is released and the entire universe is destroyed."

Impossible events will have impossible consequences. This is what roughly two thirds of Randall Munroe's book What If? is about, like baseball pitches destroying cities, laser pointers vaporizing the moon, and civilization imploding after everybody in the world tried to go to Delaware at the same time.

The LHC might be able to narrow down the models when it comes back online next year, but probably won't be able to confirm or deny that undefined limit. There are no natural processes (real or theoretical) that can do it, either. A pair instability supernova falls several orders of magnitude short, and that's an event so energetic we didn't actually believe they were possible until we started to detect them.

Artificial processes can exceed natural ones, though, on a small enough scale, and in this case it's pretty easy to calculate what you need to test the hypothesis, since it's not quite infinite: you need a particle accelerator weighing more than Earth, and the entire output of a star larger than the sun to power it.

I'm actually not sure whether or not the entire solar system has enough rare earth metals to build this collider (let alone the Dyson sphere solar array to power it), but whether it does or not, we're talking about a project that a K2 civilization would have difficulty completing. A K2 civilization is a thing so far out there that words like "patently absurd" just don't cut it. The Galactic Empire from Star Wars is patently absurd, but is merely a mid range K1 civilization* - the contractors who built the Death Stars for them would not even know how to begin constructing this thing.

So, to answer the question: Is Hawking right? Maybe - that model's predictions match what we found when we observed the Higgs boson. So do several other models' predictions. If he is, is it a thing that can happen? No, and he didn't say it was. Does that mean he's not allowed to sell books designed to make science fun and witty for people without advanced physics degrees? ****ing of course not.



*-Nerd time: Their galactic reach may suggest a K3 civilization, but a K3 civilization is one that doesn't just control a galaxy, but can utilize all energy within that galaxy. A K2 civilization must utilize all energy within an entire star system, something that we don't see often in popular science fiction and not at all in Star Wars.
 
Another example is the relativistic speed limit: Any mass bearing object accelerated to the speed of light has infinite kinetic energy and any interaction with it will release infinite energy, radiating infinite light and heat into the universe and raising everything's temperature by infinity degrees.
Why don't we see this with neutrinos, then? Not saying they travel at the speed of light, since that was discounted a year or two ago and chalked up to an error of sorts. But why are neutrinos able to travel at 99% of the speed of light and still have almost zero mass? And with regards to the LHC, when they are firing these protons at almost the speed of light before crashing one into another, do we see their mass increase as they increase in speed?
 
We don't see that with neutrinos because (A) they have extremely low mass and (B) they don't travel at the speed of light. 99% of the speed of light is, as far as energy is concerned, a LONG way from 100%: the relativistic correction becomes increasingly crazy the closer you get to c, but only becomes infinite if you actually *reach* c, which is impossible.

http://www.wolframalpha.com/input/?i=relativistic kinetic energy calculator

Now, the calculations I'm about to play with aren't dealing with the mass of a neutrino, but a fairly large object of 1 kilogram, because that's the default unit and I don't want to get banned for breaking Wolfram Alpha again by converting the mass of a neutrino to kilograms:

Anyway, set mass to 1, and v=.99c, then click =. You get about 5x10^17j for energy.
Increase v to .999c, an increase of right about 1%, and e=2x10^18, a factor of 4. That .009c increase in speed took three times as much energy as the first 0.99c.
Now v=.9999c, e=6x10^18, a factor of three - meaning the last .0009c took twice as much energy as the first .999. 2/3 as much energy as the last step, but for 1/10 as much velocity.
Now let's skip a couple steps forward: v=.999999c and e=6x10^19. 10x increase in energy for about a 0.001% increase in speed.

It continues on from there, with more and more energy for less and less velocity, but as long as you're still talking about numberable fractions of c, you're still talking about relatively small amounts of energy (that 6x10^19j in the last calculation compares pretty well with the daily energy release of a major hurricane, and going back to the .99c calculation, that was "only" about as much energy as the Krakatoa eruption).
ref: http://en.wikipedia.org/wiki/Orders_of_magnitude_(energy)

At 10 9's (0.9999999999c), energy is up to 6x10^21, roughly comparable to the world's entire remaining petroleum reserve.
At 15 9's, it's 2x10^24, which is about equal to the total solar radiation that reaches Earth in one year.
At 19 9's, I broke Wolfram Alpha, but bringing it down to 18, the energy is comparable to the sun's total output in one second. Certainly a lot if it's about to hit the planet you live on, but as far as the universe is concerned, this is not a particularly large amount of energy.

Of course, we're talking about 1kg. A neutrino weighs a lot less than a kilogram, and even at 19 9's joules aren't a convenient unit of energy for it, like trying to weigh a guinea pig in metric tons. It's not going to destroy anything, but will certainly excite some electrons.

This is sort of like Zeno's Paradox, except real - you can't actually reach the speed of light because each progressive step towards it takes more energy than the last. You get half way there, then half of the rest of the way, then half of that and half of that, but never actually reach it. Each step has a calculable finite energy requirement; it's always that last step that costs infinity and breaks the equations.
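The Wolfram Alpha walk-through above can be reproduced with a few lines of Python; `kinetic_energy` is my own helper implementing the standard relativistic formula KE = (γ − 1)mc²:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def kinetic_energy(mass_kg: float, beta: float) -> float:
    """Relativistic kinetic energy, KE = (gamma - 1) * m * c^2,
    for a mass moving at beta fractions of the speed of light."""
    gamma = 1.0 / math.sqrt(1.0 - beta * beta)
    return (gamma - 1.0) * mass_kg * C * C

# Reproduce the 1 kg walk-through from the post above:
for beta in (0.99, 0.999, 0.9999, 0.999999, 1 - 1e-10):
    print(f"v = {beta}c -> KE ≈ {kinetic_energy(1.0, beta):.1e} J")
```

At beta = 0.99 this gives about 5e17 J, at 0.999 about 2e18 J, and so on, matching the figures quoted above; push beta all the way to 1 and the square root hits zero, which is the "last step costs infinity" point.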




Anyway, as for the increase in mass in the LHC... sort of. We don't weigh particles in fractions of a gram or anything like that, we actually "weigh" them by their energy in electronvolts. So, yes, we do see the increase in mass, but only indirectly by the increase in energy.
 
Anyway, as for the increase in mass in the LHC... sort of. We don't weigh particles in fractions of a gram or anything like that, we actually "weigh" them by their energy in electronvolts. So, yes, we do see the increase in mass, but only indirectly by the increase in energy.
We often hear about protons traveling at 99% the speed of light in the LHC, but what are these protons actually traveling at, since you seem to have a good understanding of what's happening there? Also, I read somewhere that there are on average 600 million collisions per second after the proton beams converge at one of 6 different detector sites along the LHC. How are they able to process that many collisions and determine which collisions are worth viewing?
 
I had to look that one up, so from Wikipedia:

When running at full design power of 7 TeV per beam, once or twice a day, as the protons are accelerated from 450 GeV to 7 TeV, the field of the superconducting dipole magnets will be increased from 0.54 to 8.3 teslas (T). The protons will each have an energy of 7 TeV, giving a total collision energy of 14 TeV. At this energy the protons have a Lorentz factor of about 7,500 and move at about 0.999999991 c, or about 3 metres per second slower than the speed of light (c).[35] It will take less than 90 microseconds (μs) for a proton to travel once around the main ring – a speed of about 11,000 revolutions per second. Rather than continuous beams, the protons will be bunched together, into 2,808 bunches, 115 billion protons in each bunch, so that interactions between the two beams will take place at discrete intervals never shorter than 25 nanoseconds (ns) apart. However it will be operated with fewer bunches when it is first commissioned, giving it a bunch crossing interval of 75 ns.[36] The design luminosity of the LHC is 10^34 cm^−2 s^−1, providing a bunch collision rate of 40 MHz.[37]
Content from External Source
Roughly 8 9's.

As for selecting the most relevant data, they've developed a few different computer systems over the years that help with that. I don't know a lot about them specifically, except that they've made major advancements in grid computing and distributed computing.
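The quoted Lorentz factor and "3 metres per second slower than light" figure can be double-checked with the large-γ approximation c(1 − β) ≈ c/2γ² (helper names below are mine, and the proton rest energy is the standard ~0.938 GeV value):

```python
# Double-check the quoted figures: gamma for a 7 TeV proton, and how
# far below c it travels.
C = 299_792_458.0           # speed of light, m/s
PROTON_MASS_GEV = 0.938272  # proton rest energy in GeV

def lorentz_factor(beam_energy_gev: float) -> float:
    """gamma = E / (m c^2), with both energies in GeV."""
    return beam_energy_gev / PROTON_MASS_GEV

def speed_deficit(gamma: float) -> float:
    """c(1 - beta) ≈ c / (2 gamma^2) for large gamma, in m/s."""
    return C / (2.0 * gamma * gamma)

g = lorentz_factor(7000.0)  # 7 TeV design beam energy
print(g)                    # ≈ 7460, close to the quoted "about 7,500"
print(speed_deficit(g))     # ≈ 2.7 m/s, close to the quoted "about 3 m/s"
```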
 
Honestly, as far as jobs go in the world of physics and engineering, working at the LHC must be up there... Just amazing when you think about the achievements made, the costs involved, and luckily academia and governments alike realizing the importance of such a "machine"
 
Sometimes governments realize the importance. There's an unfortunate tendency in governments to look at a project that's half paid for and say, "Wait, this won't be done for two more elections? What's the chances of that much time happening?" and pulling the plug. That's what happened to the SSC in Texas, which at first light would have been almost three times as powerful as we hope to upgrade the LHC.

A lot of people working at CERN right now were lining up to work with that beast.
 
First of all, The most common element in the world is iron.

Just wanna say, the most common element on Earth is NOT iron, it's oxygen, making up 47%. Iron only takes up 5%. Aluminium, at 8%, is more abundant than iron.
 
Just wanna say, the most common element on Earth is NOT iron, it's oxygen, making up 47%. Iron only takes up 5%. Aluminium, at 8%, is more abundant than iron.
That's for the Earth's crust. The Earth as a planet has more iron.

The original claim was that hydrogen was the most abundant element in "the world". It's the most abundant in the universe, so I suppose the correctness here hinges on what you mean by "the world".
 