
5 answers

Ar/Ar dating is expensive, takes a long time to do, and is only accurate to within about 100,000 years.

2007-07-25 08:09:23 · answer #1 · answered by Gwenilynd 4 · 2 0

The range is 30,000 to 4.6 billion years. The method is less accurate for young rocks, because so little daughter product has accumulated, and it is limited to certain rock formations. A sub-method of the K/Ar method is the Ar/Ar method, which examines the ratio of two isotopes of argon and can yield an age spectrum. K/Ar and Ar/Ar are part of the same dating technique.

As for the link to Creation Science, an oxymoron if there ever was one, I couldn't possibly comment.

2007-07-27 13:58:18 · answer #2 · answered by Derek H 2 · 0 0

It's K/Ar dating, not Ar/Ar... The ratio of the potassium (K) parent isotope to the argon (Ar) daughter isotope is measured in the sample to be dated. Given that K decays radioactively to Ar at a known rate, the age of the sample can be calculated. It's only appropriate for certain types of rock, e.g. igneous rocks that are rich in potassic minerals like feldspar. The assumption is that at the time of rock formation, the ratio of parent to daughter isotope was 100% to zero. There are problems with dating K-rich metamorphic rocks that have been altered after their original formation. As you look at older and older rocks, the likelihood of some sort of alteration increases... so the answers from this method become more interpretative (bigger margins of error).
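The age calculation described above can be sketched in a few lines. This is a minimal illustration, not lab software: the decay-constant values are the commonly quoted ones for K-40, and it assumes zero initial argon and a closed system, exactly as the answer says.

```python
import math

# Commonly quoted decay constants for K-40, in decays per year.
# Only a fraction of K-40 decays (by electron capture) to Ar-40;
# the rest becomes Ca-40. These specific values are an assumption
# of this sketch, not something stated in the answer above.
LAMBDA_TOTAL = 5.543e-10  # total decay rate of K-40
LAMBDA_EC = 0.581e-10     # branch that produces Ar-40

def k_ar_age(ar40_radiogenic, k40):
    """Age in years from a measured 40Ar*/40K ratio, assuming no
    initial argon and no gain/loss of K or Ar since crystallisation."""
    ratio = ar40_radiogenic / k40
    return (1.0 / LAMBDA_TOTAL) * math.log(
        1.0 + (LAMBDA_TOTAL / LAMBDA_EC) * ratio
    )
```

Note how the formula encodes the assumptions the answer lists: a zero ratio gives age zero (no daughter at formation), and the result is only meaningful if the system stayed closed.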

It's subject to certain assumptions and limitations associated with any analytical technique, not just radiometric ones. In particular, the sample must not have been contaminated with more recent material. There are practical ways of mitigating contamination in the laboratory, but the risk can never be entirely removed. The technique involves the basic assumption that the rate of radioactive decay is constant throughout geological time. However, there is not a shred of evidence that the rate of radioactive decay varies.

ChasChas posts his usual ill-informed creationist pseudo-arguments. C14 dating: carbon is a highly adsorptive material, so it is readily prone to atmospheric contamination unless the sample has been preserved in anaerobic conditions and sampled in air-tight conditions. As for Mount St. Helens, nearly all of the material erupted was ash and dust derived from earlier (prehistoric or older) eruptions, having been preserved within the volcanic cone until the really big eruption in 1980 blew the whole thing apart. So it's hardly surprising that dating techniques for this material give considerably earlier dates!!!

2007-07-26 01:21:11 · answer #3 · answered by grpr1964 4 · 2 0

All radiometric dating methods rely on unprovable assumptions.
1. That the original amount of parent and daughter isotope is known.
2. That the rate of decay has been constant.
3. That no contamination has occurred.

There is good reason to suppose that radiometric methods are highly flawed. For example, diamonds, which are allegedly millions of years old, contain carbon-14, which should not be detectable after about 50,000 years. Likewise, all coal deposits contain C14.
On the other hand, rock from Mt St Helens (just decades old) was dated as millions of years old.
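For context, the 50,000-year figure mentioned above comes from ordinary exponential decay arithmetic: with a half-life of about 5,730 years, roughly 8 to 9 half-lives fit into 50,000 years, leaving only a fraction of a percent of the original C14. A minimal sketch of that calculation (the half-life value is the standard accepted one; everything else here is illustration, not part of either side's argument):

```python
# Standard accepted half-life of carbon-14, in years.
C14_HALF_LIFE = 5730.0

def c14_fraction_remaining(years):
    """Fraction of the original carbon-14 left after a given time,
    assuming simple exponential decay."""
    return 0.5 ** (years / C14_HALF_LIFE)
```

After 50,000 years the remaining fraction is well under 1%, which is why detectable C14 in a sample is normally attributed either to contamination or to the sample being much younger than 50,000 years.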

Take radiometric 'ages' with a big pinch of salt!

http://www.creationontheweb.com/content/view/3059/

2007-07-25 11:06:18 · answer #4 · answered by a Real Truthseeker 7 · 0 3

go the full way son, she will love it ;)

2007-07-25 07:37:55 · answer #5 · answered by kiteboylee 2 · 0 0
