A physical quantity $P$ is related to four observables $a$, $b$, $c$ and $d$ as follows:
$$P = \frac{a^{3} b^{2}}{\sqrt{c}\, d}$$
The percentage errors of measurement in $a$, $b$, $c$ and $d$ are 1%, 3%, 4% and 2%, respectively. Then the percentage error in the quantity $P$ is:
1. 12%
2. 13%
3. 14%
4. 15%
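A brief check of the expected result, assuming the form of the relation reconstructed above, $P = a^{3}b^{2}/(\sqrt{c}\,d)$: for a product of powers, the maximum fractional errors add, weighted by the magnitudes of the exponents.

$$\frac{\Delta P}{P} = 3\,\frac{\Delta a}{a} + 2\,\frac{\Delta b}{b} + \frac{1}{2}\,\frac{\Delta c}{c} + \frac{\Delta d}{d} = 3(1\%) + 2(3\%) + \tfrac{1}{2}(4\%) + 2\% = 13\%$$

Under that assumption the percentage error in $P$ is 13%, matching option 2.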
It is claimed that two caesium clocks, if allowed to run for 100 years free from any disturbance, may differ by only about 0.02 s. What does this imply for the accuracy of the standard caesium clock in measuring a time interval of 1 s?
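A rough order-of-magnitude estimate of the implied accuracy, taking 1 year $\approx 3.15 \times 10^{7}$ s (an approximation used only for this check):

$$\frac{\Delta t}{t} \approx \frac{0.02\ \text{s}}{100 \times 3.15 \times 10^{7}\ \text{s}} \approx 6 \times 10^{-12}$$

So the clock loses or gains no more than about $10^{-11}$ s in any 1 s interval, i.e. it is accurate to roughly 1 part in $10^{11}$.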