Two clocks (clock A and clock B) are synchronised exactly, minute for minute and second for second. After 10 years have passed, there will surely be some deviation between the times they read, perhaps half a second or half a minute. How do I determine which one is accurate? How do I know whether clock A is the one running fast or slow, or whether it is clock B that is fast or slow? And supposing I take a third clock (clock C) as a reference, how do I determine whether that one is accurate as well? Any clue? Please justify.
2006-07-26 04:05:58 · 5 answers · asked by Anonymous in Science & Mathematics ➔ Physics
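Comparing A and B alone can only reveal their relative drift, not which one is wrong; with a third clock, though, the three pairwise differences over-determine the individual drifts. One standard way to formalize the "take a third clock" idea is the three-cornered hat estimate: if the clocks drift independently, each clock's own variance can be solved from the variances of the pairwise offsets. Below is a minimal Python sketch of that calculation; the variance figures are invented purely for illustration, and the independence assumption is essential.

```python
import math

def three_cornered_hat(var_ab: float, var_ac: float, var_bc: float):
    """Estimate each clock's individual variance from the variances of
    the pairwise offsets A-B, A-C, B-C, assuming independent drift."""
    var_a = 0.5 * (var_ab + var_ac - var_bc)
    var_b = 0.5 * (var_ab + var_bc - var_ac)
    var_c = 0.5 * (var_ac + var_bc - var_ab)
    return var_a, var_b, var_c

# Invented example figures: variances (seconds^2) of the accumulated
# pairwise offsets after the 10-year interval.
va, vb, vc = three_cornered_hat(var_ab=900.0, var_ac=871.0, var_bc=31.0)
for name, v in zip("ABC", (va, vb, vc)):
    # With noisy data an estimate can come out negative; clamp at zero.
    print(f"clock {name}: estimated drift spread ≈ {math.sqrt(max(v, 0.0)):.1f} s")
```

With these numbers the estimate pins most of the disagreement on clock A (spread ≈ 29.5 s) and very little on clock C (≈ 1.0 s). Note the sketch identifies which clock is least stable, not which one reads "true" time; absolute accuracy only has meaning against a defined standard, which is why the pedigree of the reference clock matters as much as its agreement with the others.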