No I don't, and that was not the point. The point still is: the rate of decay changes in a cycle of 33 days, and it was linked to the sun. You said it would normalize out, and I said that's assuming the sun remained the same for all those billions of years. Until any of you can prove that article wrong, you cannot use the argument that the rate of decay is a constant. Maybe you can use it to date stuff for 100,000 years, maybe a million, since it might've been constant for that period, but billions is stretching it, because there's no reference point to say it has always been this way, especially if the sun has an influence.
Originally posted by iterationzero
reply to post by vasaga
So you don't have a source that describes the effects of supernovae on terrestrial radioisotope decay rates?
Again, you can't see the irony. You're nothing more than a repeater. The last link you posted was from Wikipedia, and the source was basically old data, compared to the article I posted, which is not even a year old. So your sentence applies more to you than to me. The rate of decay can no longer be considered a constant. Live with it.
Originally posted by MrXYZ
Originally posted by vasaga
reply to post by iterationzero
Fine.
reply to post by MrXYZ
My god you really are blind. Never mind. Stay inside your biased bubble.
In other words, you can't refute the info I gave... and therefore ignore that information because it goes against your irrational belief
A number of experiments have found that decay rates of other modes of artificial and naturally-occurring radioisotopes are, to a high degree of precision, unaffected by external conditions such as temperature, pressure, the chemical environment, and electric, magnetic, or gravitational fields. Comparison of laboratory experiments over the last century, studies of the Oklo natural nuclear reactor (which exemplified the effects of thermal neutrons on nuclear decay), and astrophysical observations of the luminosity decays of distant supernovae (which occurred far away so the light has taken a great deal of time to reach us), for example, strongly indicate that decay rates have been constant (at least to within the limitations of small experimental errors) as a function of time as well.
On the other hand, some recent results suggest the possibility that decay rates might have a weak dependence (0.5% or less) on environmental factors. It has been suggested that measurements of decay rates of silicon-32, manganese-54, and radium-226 exhibit small seasonal variations (of the order of 0.1%), proposed to be related to either solar flare activity or distance from the sun. However, such measurements are highly susceptible to systematic errors, and a subsequent paper found no evidence for such correlations in a half-dozen isotopes and set upper limits on the size of any such effects. Separately, research at Purdue University indicates that the rate of radioactive decay may not be truly constant, but slightly influenced by solar flares due to variations in solar neutrino flux.
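The "it would normalize out" point being argued in this thread can be checked with a little arithmetic. Below is a minimal sketch (the function name is hypothetical; the 0.1% amplitude and roughly annual period are the figures from the paragraph above): if the decay constant oscillates as λ(t) = λ₀(1 + A·sin(2πt/T)), then integrating λ over time shows the oscillating part adds at most A·T/π of "effective time", no matter how old the sample is.

```python
import math

AMP = 0.001    # 0.1% oscillation amplitude, the size reported in the measurements
PERIOD = 1.0   # assume a roughly one-year cycle, in years

def apparent_age_error(true_age_years):
    """Age error (in years) caused by lambda(t) = lambda0*(1 + AMP*sin(2*pi*t/PERIOD)).

    Integrating lambda(t) from 0 to t gives lambda0*t plus an oscillating term;
    dividing that term by lambda0 yields the shift in the inferred age. The shift
    is bounded by AMP*PERIOD/pi and does not grow with the true age.
    """
    return AMP * (PERIOD / (2 * math.pi)) * (1 - math.cos(2 * math.pi * true_age_years / PERIOD))

# Even for a 4.5-billion-year-old sample, the age shift stays below
# AMP*PERIOD/pi, i.e. about 0.0003 years (a few hours):
print(apparent_age_error(4.5e9))
```

So under these assumptions a periodic 0.1% wobble cannot stretch or shrink a billion-year age estimate; only a sustained, long-term drift in the decay constant could do that, which is a different claim from the seasonal-variation papers.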
Originally posted by vasaga
No I don't, and that was not the point.
The point still is: the rate of decay changes in a cycle of 33 days, and it was linked to the sun. You said it would normalize out, and I said that's assuming the sun remained the same for all those billions of years. Until any of you can prove that article wrong, you cannot use the argument that the rate of decay is a constant. Maybe you can use it to date stuff for 100,000 years, maybe a million, since it might've been constant for that period, but billions is stretching it, because there's no reference point to say it has always been this way, especially if the sun has an influence.
Not to mention there's stuff like leakage when rocks come in contact with liquids blah blah but that's another story.
But, in any case, even if I don't have anything to provide regarding supernovae, is it really that preposterous to think that if a sun can influence decay rates, a supernova also can? Well, if it's billions of light years away, maybe not. And also, I never said it had an influence on terrestrial decay rates right now. We don't know what happened in the past. For all we know, decay rates were a lot faster or a lot slower in the past. That would throw the whole age of everything for a loop.
And oh btw, the only real calibration we have, at least for carbon-14, is tree rings. So it's reliable to a certain extent.
For other stuff, they use rocks with no radioactive elements as a reference, either from space or here on Earth. Space rocks are also a stretch. We don't know the decay rate outside of our solar system. I'm not even sure if they've tested the decay rates on other planets to determine if they're the same.
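For reference, the carbon-14 calibration being discussed rests on simple arithmetic: the age follows from the surviving fraction via t = ln(N₀/N)/λ, and tree rings are used to check the result against samples of independently known age. A minimal sketch (hypothetical function name; 5,730 years is the conventional carbon-14 half-life):

```python
import math

HALF_LIFE_C14 = 5730.0                  # carbon-14 half-life, in years
DECAY_CONST = math.log(2) / HALF_LIFE_C14  # lambda, per year

def c14_age(fraction_remaining):
    """Age in years from the surviving fraction of carbon-14: t = ln(N0/N) / lambda."""
    return math.log(1.0 / fraction_remaining) / DECAY_CONST

# A sample retaining 25% of its original carbon-14 is two half-lives old:
print(round(c14_age(0.25)))  # 11460
```

Because the usable signal fades after several half-lives, this method only reaches tens of thousands of years; the billion-year figures in this thread come from longer-lived isotopes (uranium-lead, potassium-argon), not carbon-14.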
If I'm talking complete nonsense, let me know WHY please. Thank you.
Originally posted by vasaga
reply to post by MrXYZ
You didn't do jack. The other two provided useful info. All you did was repeat your outdated info and then declare yourself the victor. But I can't expect anything else from a simple mind like you... But fine... Believe whatever you want to. I'm clearly a creationist and always wrong and completely biased and ask only stupid questions, and you're very smart, always say right things, and have the best thinking methods ever.
Scientific Fact No. 7 - Chromosome Count Proves Evolution is Wrong
There is no scientific evidence that a species can change the number of chromosomes within the DNA. The chromosome count within each species is fixed. This is the reason a male from one species cannot mate successfully with a female of another species. Man could not evolve from a monkey. Each species is locked into its chromosome count that cannot change. If an animal developed an extra chromosome or lost a chromosome because of some deformity, it could not successfully mate. The defect could not be passed along to the next generation. Evolving a new species is scientifically impossible. Evolutionists prove that getting a college education does not impart wisdom.
reply to post by edsinger
Scientific Fact No. 5 - DNA Error Checking Proves Evolution is Wrong
The scientific fact that DNA replication includes a built-in error checking method and a DNA repair process proves the evolutionary theory is wrong. The fact is that any attempt by the DNA to change is stopped and reversed.
Originally posted by linliangtai
reply to post by MrXYZ
All evolutionists believe the Earth is 4.5 billion years old. In fact, the Earth is forty million billion years old as mentioned in “God’s story” wretchfossil.blogspot.com...
God said there has never been a new star in the universe. That means our sun and Earth are as old as all the other stars. At present, scientists are unable to test rocks for ages over 500 billion years old.
All human civilizations proceeded in a circular rather than linear way. All civilizations went in small cycles of 12,000 years. Many such small cycles form a large cycle of about 50 million years. At the end of each large cycle there was a mass extinction event. The latest small cycle began 6,000 years ago for much of the Earth. That's possibly why some Creationists mistook the present 6,000-year-old civilization for a 6,000-year-old Earth.
Originally posted by linliangtai
reply to post by MrXYZ
Mud slinger.