No no, you don't quite get it, and I think I know why.
As you approach the speed of light, time slows down; at the speed of light, time = zero.
This means you would have no time in which to see "head lights" beaming out in front of you; you would have reached your destination instantly. A person not going the speed of light would see just one ray of light (you and your head light together), but the two would have different spectra (yours being longer in wavelength than the head light's).
Something to that effect.
This is why I gave a conceptual example at .99 light speed.
To you, the light travels away from you at light speed; however, to an observer, you are going .99 the speed of light and the head light is going light speed, so you and the head light are not separating anywhere near as fast as you perceive.
This is because of time dilation.
Your time runs "slower" than that of the person who is stationary. So the stationary person sees you following close behind your own head light.
You, on the other hand: 1 second for you is, say, 1 million years to that stationary observer (a deliberately exaggerated figure; see the note at the end).
So in your one second, you see the light travel exactly as far as light travels in one second from a stationary source.
This is because, measured against a stationary source, the light has had not 1 second but 1 million years in which to distance itself from you.
This is why, as the near-light-speed traveller, you see the light recede from you at the speed of light, even though you are moving nearly as fast as it is.
At light speed, it does not matter because time = 0.
This is how it differs from Newtonian physics (classical physics).
In classical physics, if you are travelling at 10 miles an hour and throw something at 10 miles an hour, you see it travel away from you at...10 miles an hour, not at 20 miles an hour (it is the stationary bystander who sees it doing 20).
However, with light, if you shine a light from your car, you do not see it going the speed of light minus your speed (C minus 100 km/h, for instance).
That naive picture is false. It is far too fast for you to tell by eye, but you see it going C, not C minus 100 km/h.
The person who is not moving sees it going C.
And as I explained above, this is because of "time dilation".
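To see the difference concretely, here is a minimal sketch (in Python, my own illustration, not from this thread) of the relativistic velocity-addition rule that replaces the simple Newtonian sum; the function name and the sample speeds are just assumptions for the example:

```python
# Einstein velocity addition: speed, as seen from the ground, of a
# thing launched at speed u (relative to its source) from a source
# moving at speed v. Newton would just say v + u.
C = 299_792_458.0  # speed of light in m/s

def add_velocities(v, u):
    return (v + u) / (1.0 + (v * u) / C**2)

mph = 0.44704                      # 1 mph in m/s
car_kmh = 100 / 3.6                # 100 km/h in m/s

# Ball at 10 mph thrown from a car at 10 mph: the classical 20 mph
# survives, because the correction is vanishingly small at car speeds.
print(add_velocities(10 * mph, 10 * mph) / mph)   # ~20.0 mph

# Light shone from a car at 100 km/h: still C, not C + 100 km/h.
print(add_velocities(car_kmh, C) - C)             # ~0 (up to rounding)
```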
In a car, throwing a ball, you see it go at the speed it was thrown, because you must subtract the speed at which you are "chasing", or following, it.
But if the ball travelled at light speed and you were at .99 light speed when you threw it, it would zip away from you at the full speed of light. And while only a few hours might pass for you before you reached the spot where the ball ended up, it would have been many years outside, and you would be many light years from where you threw it.
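To put rough numbers on that trip (my own sketch, assuming a 10-light-year journey at .99C and the standard dilation factor; none of these figures come from this thread):

```python
import math

def trip_times(distance_ly, v_over_c):
    """Years elapsed for the stationary observer and for the
    traveller (proper time) over a trip of the given length."""
    t_observer = distance_ly / v_over_c
    t_traveller = t_observer * math.sqrt(1.0 - v_over_c**2)
    return t_observer, t_traveller

obs, trav = trip_times(10.0, 0.99)
print(f"outside clocks: {obs:.2f} years")    # ~10.10 years
print(f"your clock:     {trav:.2f} years")   # ~1.42 years
```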
I'm going in circles now, but does the concept make sense?
RECAP:
When you go .99C and watch your head light travel for one year (on your clock), you see it move away from you at the speed of light, 1.0C, until it has gone 1 light year beyond your position.
The stationary observer must watch your "car" chasing the head light at .99 the speed of light for (say) a million years to see that same year of travel that you observe...but at the end of that wait, the head light's beam and your "car" will be exactly 1 light year apart in distance.
The miracle of time dilation.
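Here is a quick back-of-the-envelope sketch of that recap, one calculation per frame (my own illustration; in each frame the gap between "car" and beam simply grows at the difference of their speeds in that frame, and the ratio that falls out is not the bare time-dilation factor by itself, since it also folds in the changing distance):

```python
C = 1.0        # light speed: 1 light year per year
car = 0.99     # the "car", as a fraction of C
gap = 1.0      # target separation: 1 light year

# Traveller's frame: the car sits still, the beam recedes at C.
t_traveller = gap / (C - 0.0)
# Stationary frame: the beam only gains on the car at C - 0.99C.
t_observer = gap / (C - car)

print(t_traveller)   # 1.0  -> 1 year on the traveller's clock
print(t_observer)    # ~100 -> about 100 years, not a million,
                     #         on the stationary clock
```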
1 million years and 1 year are deliberately loose figures; the accurate comparison in time can be determined with the time dilation formula, which is in the 20th post of this thread.
*EDIT* The time dilation formula is not given in the 20th post, though it can be derived from it. The formula is:

T' = T * sqrt(1 - (v/c)^2)

Where T' = time elapsed for the moving traveller; T = time elapsed for the stationary observer; v = velocity of the traveller; c = velocity of light.
The square root is the part I was misremembering, but the basic point stands: as v approaches the speed of light, the factor sqrt(1 - (v/c)^2) approaches zero, and with it the traveller's elapsed time.
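And a quick numeric check of that formula (a throwaway sketch; it just evaluates the dilation factor at a few speeds):

```python
import math

def dilation_factor(v_over_c):
    """sqrt(1 - (v/c)^2): the fraction of observer time that
    elapses on the traveller's clock."""
    return math.sqrt(1.0 - v_over_c**2)

for beta in (0.5, 0.9, 0.99, 0.9999):
    f = dilation_factor(beta)
    print(f"v = {beta}c: traveller's 1 s = {1.0 / f:.2f} s outside")
# 0.5c -> 1.15 s, 0.9c -> 2.29 s, 0.99c -> 7.09 s, 0.9999c -> 70.71 s:
# the factor dives toward zero as v -> c, so the traveller's elapsed
# time heads toward zero, exactly as argued above.
```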
I would like someone well grounded in physics to determine whether or not this concept is accurate, half-accurate, or off-base.
[edit on 19-7-2004 by FreeMason]