The original post: /r/askscience by /u/CommercialSimple7026 on 2024-11-09 04:19:08.

This is probably kind of a dumb question, and I've seen it answered before, but I wanted more clarity. I have always wondered how we know that radiometric dating and other methods like carbon dating are accurate. I have already read answers such as that it follows a "rate of decay" and is like a "clock that was fully wound up at the start, but has now run down half way. If you watch how much time it takes per turn and how many turns the spring can take, you can figure out how long ago it was fully wound." But I don't find this answer sufficient (I could be dumb).

How do we know the rate of decay follows a particular pattern? How do we know it decays linearly, or exponentially, or in any set way at all, if we have not observed the entire decay process of the elements we are tracing? (Or even a fraction of it, since isotopes like uranium-235 have a half-life of about 700 million years.) In other words, is it possible that our dating methods could be completely wrong, since we evidently assume a set pattern for decay? Are we just guessing?

I am probably missing something huge, and I am incredibly ignorant about this topic, but I've just had this question nagging at me recently and am looking for an answer.
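For concreteness, the "set pattern" I mean is the exponential decay law that those answers seem to assume. I'm writing it out as I understand it, so I may well have it wrong:

```latex
% Exponential decay law as I understand it (please correct me if this is off):
% N(t)    = number of parent atoms remaining after time t
% N_0     = number of parent atoms at the start
% T_{1/2} = half-life (roughly 7 x 10^8 years for uranium-235)
N(t) = N_0 \left(\tfrac{1}{2}\right)^{t / T_{1/2}}
     = N_0 \, e^{-\lambda t},
\qquad
\lambda = \frac{\ln 2}{T_{1/2}}
```

So my question is really: how do we know atoms actually follow this curve over hundreds of millions of years, rather than some other curve, when we have only ever watched a tiny slice of it?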