
I launch a probe out into space and the timer on the probe is set to take photos and transmit them back every 5 seconds.

Following the theory of relativity, and assuming the probe is always travelling at the same speed, will the time difference between the photos received increase cumulatively, or will it stay at a constant difference, namely the 5 seconds plus however long the light takes to cover the distance the probe travelled away from Earth in those 5 seconds?

i.e. will the gap between photos received increase as it gets further away, or will the gap always be 5 seconds + (the time it takes the light to travel the distance the probe travelled in 5 seconds)?

2007-02-27 23:36:58 · 10 answers · asked by Paul M 5 in Science & Mathematics Astronomy & Space

10 answers

The gap will increase even though the photos are being taken at 5-second intervals, because the further away the probe travels, the longer its radio signal takes to reach Earth.

2007-02-27 23:42:45 · answer #1 · answered by jamand 7 · 0 3

The clock on the spaceship runs more slowly than its counterpart on Earth, so the faster the probe moves, the longer the gap between photos will be - not just because of the distance travelled, but because for every 5 seconds that passes on the probe, more than 5 seconds will have passed on Earth.

2007-02-28 07:59:22 · answer #2 · answered by SteveA8 6 · 0 0

The answer to your specific question.....

You will receive photos at 5-second intervals.

All that will change is the frequency [lower] / wavelength [higher] of the signal. You tell it to transmit at a certain frequency, and because it is moving away the received frequency is lower [the wavelength gets stretched because, in the time taken to emit one wavelength, the source moves away, hence the wavelength is stretched].

This happened with the Cassini probe. Someone stuffed up the message for how to start the mission. The frequency was too high for the processor to cope with that far out. Or something.

By slowing down the spacecraft they reduced the Doppler shift, which raised the received frequency, and the spacecraft could then follow its instructions. Mission saved.

The overall time lag is not the important thing; it's the time between the signals, and that is what depends on the speed.

2007-02-28 18:27:59 · answer #3 · answered by BIMS Lewis 2 · 0 0

The time difference between each signal being received should remain constant as long as the probe remains at a constant speed.

However, due to time dilation (see reference) the time between each signal being transmitted will not be 5 seconds in the reference frame of Earth (a travelling clock ticks slower). So the time difference between receiving each photo would be gamma*5 plus the time it takes the light to travel the distance the probe travels in gamma*5 seconds (again, see the reference for the equation for gamma relating the probe's speed to the speed of light).

I think that is right, special relativity is quite confusing.
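
As a rough numerical sketch of that reasoning (the 0.1c speed below is just a made-up example, not a claim about any real probe):

import math

C = 299_792_458.0        # speed of light, m/s
v = 0.1 * C              # hypothetical example probe speed
proper_interval = 5.0    # seconds between photos on the probe's own clock

# Lorentz factor: gamma = 1 / sqrt(1 - v^2/c^2)
gamma = 1.0 / math.sqrt(1.0 - (v / C) ** 2)

# In Earth's frame the probe emits every gamma*5 seconds, and each
# successive signal has an extra v * (gamma*5) metres to cover.
emit_interval_earth = gamma * proper_interval
receive_interval = emit_interval_earth + v * emit_interval_earth / C

print(f"gamma            = {gamma:.5f}")                                  # ~1.00504
print(f"emit interval    = {emit_interval_earth:.4f} s in Earth's frame")
print(f"receive interval = {receive_interval:.4f} s")

So the gap between received photos works out to gamma*5*(1 + v/c): bigger than 5 seconds, but it does not keep growing while the speed stays constant.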

2007-02-28 07:53:07 · answer #4 · answered by Mike 5 · 0 0

As was pointed out, the clock on the probe ticks more slowly: a tick of length t on the probe is measured on Earth as t/√(1 - v²/c²), where v is the probe's speed and c is 3×10⁸ m/s. But this is a constant value, so it won't change unless the probe's speed changes. It will take longer and longer to receive an individual image (since the probe is getting further and further away), but the time between images will remain constant.
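
A small sketch of that last point, using a made-up starting distance and speed (hypothetical numbers, nothing to do with a real probe): the one-way delay keeps growing, but the spacing between received photos stays the same.

C = 299_792_458.0      # speed of light, m/s
v = 30_000.0           # example probe speed, m/s (hypothetical, far below c)
d0 = 1.0e9             # example starting distance from Earth, m (hypothetical)
tick = 5.0             # seconds between photos (time dilation is negligible at this speed)

arrivals = []
for n in range(5):
    emitted_at = n * tick                 # when photo n is taken
    distance = d0 + v * emitted_at        # probe's distance at that moment
    arrivals.append(emitted_at + distance / C)

delays = [a - n * tick for n, a in enumerate(arrivals)]     # one-way delay of each photo
gaps = [b - a for a, b in zip(arrivals, arrivals[1:])]      # spacing between received photos

print("one-way delays:", [round(d, 6) for d in delays])     # keeps increasing
print("received gaps: ", [round(g, 6) for g in gaps])       # constant: 5 + v*5/c seconds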

HTH ☺

Doug

2007-02-28 08:47:39 · answer #5 · answered by doug_donaghue 7 · 1 0

The time between pictures would increase as a function of distance. The 5-second gap would only remain constant if the probe were stationary.

2007-02-28 07:45:04 · answer #6 · answered by Del Piero 10 7 · 0 0

If the space vehicle is moving away (receding from Earth) and its speed is much less than the speed of light, there is no need to apply the relativity formula.
What applies in this case is the Doppler effect formula.
The Doppler effect shows that as the space vehicle recedes, any radio wave transmitted experiences a change of frequency. So on Earth the signals are received at a lower frequency than the transmitted frequency. The velocity of the wave still remains the same, which is the speed of light.
So the first photos will arrive after a time equal to the distance of transmission divided by the speed of light = Distance/c.
After that, the increments of 5 seconds would increase in proportion to the distance, but only by small amounts. And the receiving radio will have to continually adjust its frequency due to the Doppler effect.
Therefore you are correct about the receiving time between transmission intervals changing.
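
For example, the non-relativistic Doppler formula for a receding transmitter is f_received = f_transmitted * c / (c + v); with a made-up speed and transmit frequency (both hypothetical):

C = 299_792_458.0    # speed of light, m/s
v = 30_000.0         # example recession speed, m/s (hypothetical)
f_tx = 8.4e9         # example transmit frequency, Hz (hypothetical, roughly X-band)

# Classical Doppler shift for a source receding at v << c
f_rx = f_tx * C / (C + v)

print(f"received frequency: {f_rx:,.0f} Hz")
print(f"shift downwards:    {f_tx - f_rx:,.0f} Hz")

The shift is small (well under a megahertz with these numbers), but the receiver does have to keep tracking it, which is the "continually adjust its frequency" point above.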

2007-02-28 08:55:41 · answer #7 · answered by goring 6 · 0 0

The gap will always remain 5 seconds plus the time the light takes to cover the distance travelled between shots.
The clock would stay the same; you are not accelerating.

2007-02-28 08:49:47 · answer #8 · answered by Billy Butthead 7 · 0 0

It will always take 5 seconds in the same medium (vacuum), barring any gravitational well it might encounter.

2007-02-28 09:15:33 · answer #9 · answered by Anonymous · 0 0

my guess is that the gap would get bigger and bigger.

2007-02-28 08:38:30 · answer #10 · answered by Prince_Krona 2 · 0 0
