The Large Hadron Collider
has been controversial since it was first proposed. The collider is believed to be capable of the energy levels necessary to create microscopic black holes. Initial estimates suggested that these black holes would evaporate in less than a billionth of a second. The first concern was: what if those calculations are wrong, and black holes are created that don't evaporate, but instead interact with the matter around them and grow until one consumes the entire Earth?
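To give a sense of scale for that "billionth of a second" estimate, here is a rough sketch using the standard semiclassical Hawking lifetime formula, t = 5120 π G² M³ / (ħ c⁴). This assumes ordinary four-dimensional gravity; the extra-dimension models under which the LHC could make black holes at all would change the numbers, so treat this purely as an illustration.

```python
# Rough sketch: semiclassical Hawking evaporation lifetime of a
# micro black hole, t = 5120*pi*G^2*M^3 / (hbar*c^4).
# Assumes standard 4D gravity; illustrative only.
import math

G = 6.674e-11       # gravitational constant, m^3 kg^-1 s^-2
hbar = 1.055e-34    # reduced Planck constant, J s
c = 2.998e8         # speed of light, m/s
eV = 1.602e-19      # joules per electron-volt

def evaporation_time(mass_kg):
    """Semiclassical lifetime (seconds) of a black hole of the given mass."""
    return 5120 * math.pi * G**2 * mass_kg**3 / (hbar * c**4)

# A black hole containing the LHC's full 14 TeV of collision energy as mass:
m = 14e12 * eV / c**2        # roughly 2.5e-23 kg
print(f"mass = {m:.2e} kg, lifetime = {evaporation_time(m):.2e} s")
```

On this semiclassical picture the lifetime comes out absurdly far below a billionth of a second, which is why the "it evaporates instantly" estimate is so confident; the worry in the paragraph above is precisely that this calculation might not apply.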
Initially, I considered this to be impossible. Cosmic rays hit the Earth's atmosphere at much higher energies than even this collider is capable of, and the Earth has survived that bombardment for the last 4.5 billion years. It seems unlikely that this collider will produce dangerous black holes if nature hasn't done so in 4.5 billion years.
However, tonight a guest on Coast to Coast AM pointed out something that caused me to reconsider my position. Cosmic rays impact the upper atmosphere with a huge amount of kinetic energy. Any microscopic black holes they create would inherit much of that momentum, would most likely have velocities well in excess of the Earth's escape velocity, and so would likely go right through the Earth, come out the other side, and keep on going. By contrast, the LHC, which collides two beams head-on, will produce particles with low velocity relative to the Earth, so any micro black holes that are created, if they don't evaporate, might hang around and swallow up the planet.
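The kinematics behind that argument can be sketched with standard relativistic formulas. For a beam particle of lab energy E hitting a stationary proton, the center-of-mass energy is √s = √(2·E·mₚ + 2·mₚ²), and the products are boosted by the Lorentz factor of the center-of-mass frame. The cosmic-ray energy below is an illustrative value, not a measured one.

```python
# Sketch of fixed-target (cosmic ray on atmosphere) vs head-on (LHC)
# collision kinematics. The cosmic-ray energy is illustrative.
import math

mp = 0.938e9          # proton rest energy, eV

def cm_energy_fixed_target(E_lab):
    """Center-of-mass energy (eV) for a beam proton of lab energy
    E_lab (eV) striking a proton at rest: s = 2*E_lab*mp + 2*mp^2."""
    return math.sqrt(2 * E_lab * mp + 2 * mp**2)

E_cosmic = 1e20                          # a very energetic cosmic ray, eV
sqrt_s = cm_energy_fixed_target(E_cosmic)
print(f"cosmic-ray sqrt(s) ~ {sqrt_s/1e12:.0f} TeV vs the LHC's 14 TeV")

# The catch: in the fixed-target case everything produced is boosted.
# Lorentz factor of the center-of-mass frame as seen in the lab:
gamma = (E_cosmic + mp) / sqrt_s
print(f"boost of the products: gamma ~ {gamma:.1e} (i.e. nearly light speed)")
# At the LHC the lab frame IS (nearly) the center-of-mass frame,
# so heavy products can be born almost at rest.
```

So the highest-energy cosmic rays actually exceed the LHC's collision energy, which supports the old "nature already did this" argument, while the enormous boost factor is exactly the loophole the guest described: nature's black holes, if any, would be moving far too fast to stick around.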
So now I am beginning to think perhaps this is not a good idea. I also have some questions about the scientific value of the project. Understanding why requires some background on how particle accelerator experiments are conducted. You accelerate particles and slam them into a target, or in this case two beams of particles slam into each other, and a huge spray of secondary particles results. The higher the beam intensity, or beam current, the more events per second occur. A huge number of detectors record the various particles that are produced and feed this information into very fast computers that try to make sense of it all.
Typically, when you do the experiment you are looking for a particular type of particle to result, and typically these particles are extremely short-lived: you never see them directly. What you actually see are tracks from the decay products, that is, what the particles decay into. So when you're looking for a particular theoretical particle, what you are actually looking for are three, four, five, or more particles that result from its decay, or an entire decay chain. But when you have high beam intensities and millions or billions of events per second, a problem arises: determining whether the various particles matching the decay profile of the particle you are looking for all originated from the same event, or from several independent events, in which case you end up with a false positive for the particle you are actually looking for.
There is a limit to the temporal resolution possible in the detection system. You can only make the time window so small, because electronics and computers are only so fast. The higher the beam current or intensity, the larger the number of collisions per second, and the higher the risk that independent events will match the decay signature of the particle you are looking for. Existing colliders are already pushing the limits of the computers' ability to sort out the detector data. The LHC will produce significantly more data, so there is a very real possibility that they're going to find virtually anything they look for, regardless of whether or not it actually exists.
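The coincidence problem described above can be made concrete with the classic accidental-coincidence formula from counting experiments: two independent random signals with rates r₁ and r₂ fall inside the same window of width τ at a rate of roughly 2·r₁·r₂·τ. All the rates and window sizes below are made-up illustrative numbers, not LHC specifications.

```python
# Sketch of accidental coincidences: how often do two UNRELATED decay
# fragments land in the same time window by pure chance?
# All numbers below are illustrative assumptions.

def accidental_rate(r1, r2, window):
    """Two-fold accidental coincidence rate (Hz) for two independent
    Poisson signals with rates r1, r2 (Hz) and a coincidence window
    of `window` seconds: R = 2 * r1 * r2 * tau."""
    return 2 * r1 * r2 * window

r_a = 1e4        # fragment type A seen 10,000 times/s (assumed)
r_b = 1e4        # fragment type B seen 10,000 times/s (assumed)
for tau in (1e-6, 1e-8, 1e-10):
    fakes = accidental_rate(r_a, r_b, tau)
    print(f"window {tau:.0e} s -> {fakes:.4f} fake pairs per second")
```

The point of the sketch is the scaling: the fake rate grows with the product of the signal rates, so doubling the beam intensity quadruples the accidentals while only doubling the real signal, and only a smaller time window can claw that back.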
Thus, if they find the Higgs boson, the so-called "God Particle", there is no guarantee that they've really found it, only probabilities, and the probability that a detected event is real and not just a coincidence of events isn't terribly high. So assuming the Earth isn't rapidly destroyed and we're all still around to discuss results, we'll have to take any data that is generated with a grain of salt. It seems a huge amount of money is being spent on something that will produce highly questionable results under the best of circumstances.
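The trials problem lurking in that worry is easy to quantify: even a tiny per-collision chance p of faking a given decay signature becomes near-certainty over enough collisions, since the probability of at least one chance match in N independent events is 1 − (1 − p)^N. Both numbers below are made-up illustrations, not estimates for any real analysis.

```python
# Illustration: a tiny per-collision fake probability compounds over
# billions of collisions. p and N are made-up illustrative numbers.
import math

def chance_of_fake(p, n_events):
    """Probability of at least one chance match in n_events independent
    trials: 1 - (1-p)^n, computed stably via log1p/expm1."""
    return -math.expm1(n_events * math.log1p(-p))

p = 1e-12                 # assumed per-collision fake probability
for n in (10**9, 10**12, 10**15):
    print(f"{n:.0e} events -> P(at least one fake) = {chance_of_fake(p, n):.3f}")
```

This is why experimenters demand very high statistical thresholds before claiming a discovery; whether those thresholds are high enough is exactly the question being raised here.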
There is also another way the LHC might destroy the Earth: through the creation of "strange matter". The idea here is that ordinary matter is composed of up and down quarks, but another type of matter is postulated that also includes "strange" quarks. Some theories suggest this strange matter may be even more stable than ordinary matter, and may interact with ordinary matter by absorbing it and converting it into more strange matter, a situation very similar to the black hole scenario, except that in this case the Earth would be rapidly converted into one big chunk of strange matter rather than swallowed by a black hole. The fact that we haven't observed any strange matter in nature, and that ordinary matter still exists in large quantities, argues against this being likely, but at least some theories suggest it is possible.
Beyond all of this, I am just less than happy with the way science is being done in this field, because instead of looking at the data and trying to create a theory that fits observation, they are searching for data that confirms their existing theories. When you have a machine that is essentially a big particle white-noise generator, in which it is possible to find ANY particle signature you look for, the results are really meaningless. They will be used to endorse an existing theory that might well be entirely wrong, and in doing so prevent us from looking for a correct theory and prevent science from advancing.