Read more: "Instant Expert 32: Nuclear energy"

An ion is an atom or molecule that has gained or lost one or more of its valence electrons, giving it a net positive or negative electrical charge. In other words, there is an imbalance between the number of protons (positively charged particles) and electrons (negatively charged particles) in the chemical species. The protons and neutrons in an atom's nucleus are bound together by the strong nuclear force. By fusing together the nuclei of two light atoms, or by splitting a heavy atom in a process called fission, we can release some of this binding energy. After more than half a century of research, fusion remains technologically elusive. Here we focus on fission, which is exploited in hundreds of reactors around the world; in 2011, it provided about 13.5 per cent of the world's electrical energy.

A brief history of fission

In December 1942 Enrico Fermi achieved the first self-sustaining nuclear chain reaction, harnessing the process that would be used to lay waste to the Japanese cities of Hiroshima and Nagasaki in August 1945. In the aftermath of this destruction, proponents of nuclear power made the case that splitting the atom could also be used for benevolent ends, supplying clean energy for all. When that same nuclear chain reaction is channelled into electricity generation, a single gram of uranium can provide 1 kilowatt of power for a year. To put this into context, that is roughly the rate at which the average UK resident used electricity in 2011. In the 1950s, nuclear power began to be used for commercial electricity generation as well as to drive submarines, navy ships and icebreakers, and people spoke glowingly of future nuclear-powered trains and aircraft. Enthusiasts proclaimed that electricity from a nuclear plant would be "too cheap to meter".
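The claim that a gram of uranium can supply about 1 kilowatt for a year can be sanity-checked with a back-of-envelope calculation. This is a minimal sketch, not from the article: it assumes the gram is pure uranium-235, and uses the standard textbook figures of roughly 200 MeV released per fission and a thermal-to-electric plant efficiency of about 35 per cent.

```python
# Back-of-envelope check: can one gram of uranium-235 sustain
# roughly 1 kW of electrical output for a year?
# All figures below are assumed textbook values, not from the article.

AVOGADRO = 6.022e23          # atoms per mole
MOLAR_MASS_U235 = 235.0      # grams per mole of U-235
MEV_PER_FISSION = 200.0      # typical energy released per fission
JOULES_PER_MEV = 1.602e-13   # conversion factor
PLANT_EFFICIENCY = 0.35      # assumed thermal-to-electric efficiency
SECONDS_PER_YEAR = 3.156e7

# Number of U-235 nuclei in one gram, and the thermal energy
# released if every one of them fissions.
atoms_per_gram = AVOGADRO / MOLAR_MASS_U235
thermal_joules = atoms_per_gram * MEV_PER_FISSION * JOULES_PER_MEV

# Convert to electrical energy, then to average power over a year.
electric_joules = thermal_joules * PLANT_EFFICIENCY
average_power_watts = electric_joules / SECONDS_PER_YEAR

print(f"Thermal energy per gram of U-235: {thermal_joules:.2e} J")
print(f"Average electrical power over one year: {average_power_watts:.0f} W")
```

The result comes out at a little under a kilowatt, so the article's round figure is consistent with these assumptions; the exact number shifts with the efficiency chosen and the fraction of nuclei that actually fission.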