While the conditions required to produce nuclear energy usually involve extreme temperatures—think of the processes that power the sun—the theory of cold fusion holds that such a reaction is possible at room temperature. It’s a deceptively simple concept, but the implications are spectacular: if a nuclear reaction could occur at room temperature, then an abundance of energy could be created without the dangerous waste produced by nuclear power plants. This groundbreaking theory briefly seemed to have become a reality in 1989, when the electrochemists Martin Fleischmann and Stanley Pons published experimental results suggesting that they had achieved cold fusion—and the precious “excess energy” it was hoped to produce—in an experiment in which an electric current was run through heavy water using electrodes made of the metal palladium. The response to Pons and Fleischmann’s claims from the media and the scientific community was overwhelming. The experiments were hailed as a turning point in science, and it was briefly believed that cold fusion would make energy cheap, clean, and abundant.
How It Was Proven Wrong:
The fervor over cold fusion died down as soon as other scientists tried to replicate the experiment. Most failed to get similar results, and after their paper was closely studied, Fleischmann and Pons were accused not only of sloppy, unethical science but also of stretching the truth about their results. For years afterward, the idea of cold fusion was synonymous with fringe science. Still, despite the stigma attached to it, many have argued that there was never anything necessarily wrong with cold fusion as a theory. In recent years, scientists have once again begun experimenting with new ways of achieving a so-called “tabletop nuclear reaction,” with some even claiming to have achieved surprising success.