How do geologists know how old things are?
Geologists routinely work in millions and billions of years, so normal calendars and methods of representing spans of time don't always work very well. Fortunately, being as clever as they are, geologists have managed to come up with other ways.
Ever since the first human looked at a piece of the earth and wondered about it, we have been assigning a relative order to the events which have taken place: this sandstone was deposited first, then this limestone, and then the whole stack was folded, and so on. These attempts to decipher the complexity of the earth and arrange it into definable units of time have evolved into the Relative Geologic Time Scale. This works great, and with some guidance from the Laws of Geology and GeoFantasy, it really helps geologists make sense out of what we see on the surface of the earth.
The problem still remains, though: we may know that the sandstone is older than the limestone, but by how much? In order to answer this, we need to know the ages of the sandstone and limestone, and be able to subtract. Subtraction is easy - the trick is knowing the ages. Unfortunately, there is no way to know the age of something just by looking, and it wasn't until Henri Becquerel discovered radioactivity in 1896 that a way was found to figure out how old rocks are. The first radiometric age dates were calculated in 1907 by a study of how long it takes for uranium to decay into lead. The science has evolved, and geologists routinely calculate "absolute" ages for all kinds of rocks and minerals. But what if our assumptions concerning the decay of unstable elements are wrong? As with everything else in geology, use the data as needed, but don't carve them in stone!
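The uranium-to-lead method mentioned above boils down to simple exponential-decay arithmetic: measure how many parent atoms remain and how many daughter atoms have accumulated, and solve for elapsed time. Here is a minimal sketch in Python, assuming a closed system (no lead present at crystallization, none gained or lost since) and the commonly cited U-238 half-life of roughly 4.47 billion years; the function name and sample numbers are illustrative, not from the original page.

```python
import math

def radiometric_age(parent, daughter, half_life):
    """Age of a sample from remaining parent atoms and accumulated
    daughter atoms, assuming all daughter atoms came from decay.

    Derivation: N(t) = N0 * exp(-k*t) with k = ln(2)/half_life,
    and N0 = parent + daughter, so t = ln(1 + daughter/parent) / k.
    """
    decay_constant = math.log(2) / half_life
    return math.log(1 + daughter / parent) / decay_constant

# Assumed half-life for U-238 -> Pb-206: ~4.468 billion years
T_HALF_U238 = 4.468e9

# A mineral with equal numbers of parent and daughter atoms
# has been decaying for exactly one half-life.
age = radiometric_age(parent=1.0, daughter=1.0, half_life=T_HALF_U238)
```

With equal parent and daughter counts, `age` comes out to one half-life, which is a handy sanity check: the math agrees with the intuitive meaning of "half-life."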