Radiocarbon dating, also referred to as carbon dating or carbon-14 dating, is a method for determining the age of an object containing organic material by using the properties of radiocarbon, a radioactive isotope of carbon. The method was developed in the late 1940s at the University of Chicago by Willard Libby, who received the Nobel Prize in Chemistry for his work in 1960. It is based on the fact that radiocarbon (14C) is constantly being created in the atmosphere by the interaction of cosmic rays with atmospheric nitrogen. The resulting 14C combines with atmospheric oxygen to form radioactive carbon dioxide, which is incorporated into plants by photosynthesis; animals then acquire 14C by eating the plants. When the animal or plant dies, it stops exchanging carbon with its environment, and thereafter the amount of 14C it contains begins to decrease as the 14C undergoes radioactive decay. Measuring the amount of 14C in a sample from a dead plant or animal, such as a piece of wood or a fragment of bone, provides information that can be used to calculate when the animal or plant died.
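The decay relationship described above can be sketched in a few lines of code. This is an illustrative simplification, assuming the textbook half-life of carbon-14 (about 5,730 years) and a known fraction of the original 14C remaining; the function name is mine, and real laboratories additionally apply calibration curves to convert raw radiocarbon ages into calendar dates.

```python
import math

# Conventional half-life of carbon-14, in years.
HALF_LIFE_C14 = 5730.0

def radiocarbon_age(fraction_remaining: float) -> float:
    """Estimate years elapsed since death from the fraction of the
    original 14C still present, using the exponential-decay law
    N(t) = N0 * (1/2) ** (t / half_life)."""
    if not 0.0 < fraction_remaining <= 1.0:
        raise ValueError("fraction_remaining must be in (0, 1]")
    return -HALF_LIFE_C14 * math.log(fraction_remaining) / math.log(2)

# A sample retaining half of its original 14C died ~5,730 years ago;
# a quarter remaining implies roughly two half-lives, ~11,460 years.
print(round(radiocarbon_age(0.5)))   # 5730
print(round(radiocarbon_age(0.25)))  # 11460
```

Note that this uncalibrated "radiocarbon age" is only a first step: because atmospheric 14C levels have varied over time, the raw figure must be corrected against a calibration curve before it can be read as a calendar date.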
The Reliability of Radiocarbon Dating
Radiocarbon, or carbon-14, dating is probably one of the most widely used and best-known absolute dating methods. It was developed by J. R. Arnold and W. F. Libby in 1949, and has since become an indispensable part of the archaeologist's toolkit. Its development revolutionized archaeology by providing a means of dating deposits independent of artifacts and local stratigraphic sequences.
How Global Warming is Affecting the Accuracy of Radiocarbon Dating
Radiocarbon dating is a key tool archaeologists use to determine the age of plants and objects made with organic material. But new research shows that commonly accepted radiocarbon dating standards can miss the mark, calling into question historical timelines. Archaeologist Sturt Manning and colleagues have revealed variations in the radiocarbon cycle at certain periods of time, affecting frequently cited standards used in archaeological and historical research relevant to the southern Levant region, which includes Israel, southern Jordan, and Egypt. These variations, or offsets, of up to 20 years in the calibration of precise radiocarbon dating could be related to climatic conditions.
When news of an archaeological discovery is announced, we often hear how the age of the sample was determined using radiocarbon dating, otherwise simply known as carbon dating. Deemed the gold standard of archaeology, the method was developed in the late 1940s and is based on the idea that radiocarbon (carbon-14) is constantly created in the atmosphere by cosmic rays; it then combines with atmospheric oxygen to form CO2, which is incorporated into plants during photosynthesis. When a plant, or an animal that consumed the plant, dies, it stops exchanging carbon with the environment, and from then on it is simply a case of measuring how much of the carbon-14 has decayed to determine its age. But new research conducted at Cornell University could turn the field of archaeology on its head with the claim that there could be a number of inaccuracies in commonly accepted carbon dating standards. If this is true, then many of our established historical timelines are thrown into question, potentially requiring a rewrite of the history books.