In an experiment to determine the acceleration due to gravity $g$, the formula used for the time period of a periodic motion is $T=2 \pi \sqrt{\frac{7(R-r)}{5 g}}$. The values of $R$ and $r$ are measured to be $(60 \pm 1)\ \mathrm{mm}$ and $(10 \pm 1)\ \mathrm{mm}$, respectively. In five successive measurements, the time period is found to be $0.52\ \mathrm{s}$, $0.56\ \mathrm{s}$, $0.57\ \mathrm{s}$, $0.54\ \mathrm{s}$ and $0.59\ \mathrm{s}$. The least count of the watch used for the measurement of the time period is $0.01\ \mathrm{s}$. Which of the following statement(s) is (are) true?
(A) The error in the measurement of $r$ is $10\%$
(B) The error in the measurement of $T$ is $3.57\%$
(C) The error in the measurement of $T$ is $2\%$
(D) The error in the determined value of $g$ is $11\%$
$A,B,C$
$A,B,D$
$B,C$
$A,C$
Solution
The observed values of the time period are $T_i = 0.52\ \mathrm{s},\ 0.56\ \mathrm{s},\ 0.57\ \mathrm{s},\ 0.54\ \mathrm{s}$ and $0.59\ \mathrm{s}$.
The mean time period, $T = \frac{\sum T_i}{5} = \frac{2.78}{5} = 0.556\ \mathrm{s} \approx 0.56\ \mathrm{s}$, rounded to the $0.01\ \mathrm{s}$ least count of the watch.
Magnitude of absolute error in each observation,
$|\Delta T_1| = |0.56 - 0.52| = 0.04\ \mathrm{s}$
Similarly, $|\Delta T_2| = 0.00\ \mathrm{s}$, $|\Delta T_3| = 0.01\ \mathrm{s}$, $|\Delta T_4| = 0.02\ \mathrm{s}$ and $|\Delta T_5| = 0.03\ \mathrm{s}$
Mean absolute error in time period,
$\Delta T_m = \frac{0.04 + 0.00 + 0.01 + 0.02 + 0.03}{5} = \frac{0.10}{5} = 0.02\ \mathrm{s}$
$\therefore$ Percentage error in $T$: $\frac{\Delta T_m}{T} \times 100 = \frac{0.02}{0.56} \times 100 = 3.57\%$, so option (B) is true and option (C) is false.
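As a quick numerical check, the statistics above can be reproduced with a short script. This is a minimal sketch in plain Python; the variable names are illustrative, not from the source:

```python
# Verify the mean period, mean absolute error and percentage error in T.
readings = [0.52, 0.56, 0.57, 0.54, 0.59]  # observed periods, in seconds

# Mean, rounded to the 0.01 s least count of the watch
T_mean = round(sum(readings) / len(readings), 2)          # 0.556 -> 0.56 s

# Absolute deviation of each reading from the mean
abs_errs = [round(abs(T_mean - T), 2) for T in readings]  # [0.04, 0.0, 0.01, 0.02, 0.03]

dT_mean = sum(abs_errs) / len(abs_errs)                   # 0.02 s
print(f"T = {T_mean} s, dT = {dT_mean:.2f} s")
print(f"Percentage error in T = {100 * dT_mean / T_mean:.2f} %")  # 3.57 %
```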
Percentage error in the measurement of $r$: $\frac{\Delta r}{r} \times 100 = \frac{1}{10} \times 100 = 10\%$, so option (A) is true.
Squaring the given relation, $T^2 = \frac{4\pi^2 \cdot 7(R-r)}{5g}$, and solving for $g$ gives $g = \frac{28 \pi^2 (R-r)}{5 T^2}$.
$\therefore$ Percentage error in the determined value of $g$ (absolute errors add when quantities are subtracted, so $\Delta(R-r) = \Delta R + \Delta r$):
$\frac{\Delta g}{g} \times 100 = \frac{\Delta R + \Delta r}{R - r} \times 100 + 2\,\frac{\Delta T_m}{T} \times 100$
$\Rightarrow \frac{\Delta g}{g} \times 100 = \frac{1+1}{60-10} \times 100 + 2(3.57)\% = 4\% + 7.14\% = 11.14\% \approx 11\%$, so option (D) is true.
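The propagation step can likewise be checked numerically. This is a minimal sketch in plain Python, assuming only the fractional-error rule written above:

```python
# Verify the propagated percentage error in g:
# dg/g = (dR + dr)/(R - r) + 2 * dT/T
R, dR = 60.0, 1.0   # mm
r, dr = 10.0, 1.0   # mm
T, dT = 0.56, 0.02  # s

err_Rr = (dR + dr) / (R - r)  # absolute errors add under subtraction: 2/50 = 0.04
err_T = 2 * dT / T            # T is squared in g, hence the factor of 2: 0.0714...

print(f"Percentage error in g = {100 * (err_Rr + err_T):.2f} %")  # 11.14 %
```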
Thus options (A), (B) and (D) are correct.