Signal Travel Time
1. **State the problem:**
A satellite orbits 1000 km above Earth's surface. At the satellite, the angle between the line to Earth's center and the line to a receiving dish on the surface is 27°. We need to find how long a signal traveling at 300,000,000 m/s takes to go from the satellite to the dish.
2. **Identify given data:**
- Radius of Earth, $R = 6370$ km
- Satellite altitude above Earth, $h = 1000$ km
- Distance from Earth's center to satellite, $r = R + h = 6370 + 1000 = 7370$ km
- Angle at satellite, $\theta = 27^\circ$
- Signal speed, $v = 300,000,000$ m/s
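To make the later arithmetic easy to check, here is a minimal Python setup of the given quantities; the variable names are my own and are not part of the problem.

```python
import math

R = 6370.0                  # Earth's radius, km
h = 1000.0                  # satellite altitude above the surface, km
r = R + h                   # Earth's center to satellite, km
theta = math.radians(27.0)  # angle at the satellite, converted to radians
v = 300_000_000.0           # signal speed, m/s

print(r)  # 7370.0 km
```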
3. **Find the distance from the satellite to the dish:**
The satellite, Earth's center, and dish form a triangle. The $27^\circ$ angle is at the satellite, between the side to Earth's center (length $r$) and the side to the dish, whose length $d$ is what we want.
Since the dish lies on Earth's surface, the side opposite this angle, from Earth's center to the dish, has length $R = 6370$ km.
Applying the Law of Cosines with $R$ as the side opposite the angle $\theta$ at the satellite:
$$R^2 = r^2 + d^2 - 2 r d \cos(\theta)$$
This is a quadratic equation in the unknown $d$:
$$d^2 - 2 r \cos(\theta)\, d + (r^2 - R^2) = 0$$
By the quadratic formula:
$$d = r \cos(\theta) \pm \sqrt{r^2 \cos^2(\theta) - (r^2 - R^2)} = r \cos(\theta) \pm \sqrt{R^2 - r^2 \sin^2(\theta)}$$
Substitute values, using $\cos(27^\circ) \approx 0.8910$ and $\sin(27^\circ) \approx 0.4540$:
$$r \cos(\theta) \approx 7370 \times 0.8910 \approx 6566.7 \text{ km}$$
$$r \sin(\theta) \approx 7370 \times 0.4540 \approx 3346.0 \text{ km}$$
$$\sqrt{R^2 - r^2 \sin^2(\theta)} \approx \sqrt{40,576,900 - 11,195,716} = \sqrt{29,381,184} \approx 5420.4 \text{ km}$$
The two roots are
$$d \approx 6566.7 + 5420.4 \approx 11,987 \text{ km} \quad \text{or} \quad d \approx 6566.7 - 5420.4 \approx 1146.3 \text{ km}$$
The larger root is where the signal's line of sight would exit the far side of the Earth, so the satellite-to-dish distance is the smaller root:
$$d \approx 1146.3 \text{ km}$$
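As a numeric check on this step, the sketch below (reusing the setup above, with variable names of my own choosing) solves the quadratic for $d$ and keeps the smaller root.

```python
import math

R, h = 6370.0, 1000.0
r = R + h                   # 7370 km, Earth's center to satellite
theta = math.radians(27.0)  # angle at the satellite

# Law of Cosines with R opposite theta: R^2 = r^2 + d^2 - 2*r*d*cos(theta),
# rearranged as d^2 - 2*r*cos(theta)*d + (r^2 - R^2) = 0 and solved for d.
b = 2.0 * r * math.cos(theta)
c = r * r - R * R
disc = math.sqrt(b * b - 4.0 * c)
d_far = (b + disc) / 2.0    # second intersection, on the far side of Earth
d_near = (b - disc) / 2.0   # satellite-to-dish distance

print(round(d_near, 1), round(d_far, 1))  # ~1146.2 km and ~11987.2 km
```

The hand calculation above gives 1146.3 km only because the cosine and sine were rounded to four decimal places; the two values agree to within a tenth of a kilometer.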
4. **Convert distance to meters:**
$$d \approx 1146.3 \times 1000 = 1,146,300 \text{ m}$$
5. **Calculate the time for the signal to travel distance $d$:**
$$t = \frac{d}{v} = \frac{1,146,300}{300,000,000} \approx 0.003821 \text{ seconds}$$
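The same result, checked end to end with the closed form $d = r\cos(\theta) - \sqrt{R^2 - r^2\sin^2(\theta)}$ from step 3 (again a sketch with my own variable names):

```python
import math

R, h = 6370.0, 1000.0
r = R + h                   # Earth's center to satellite, km
theta = math.radians(27.0)  # angle at the satellite
v = 300_000_000.0           # signal speed, m/s

# Smaller root of the Law of Cosines quadratic, in km, then converted to meters.
d_km = r * math.cos(theta) - math.sqrt(R * R - (r * math.sin(theta)) ** 2)
t = d_km * 1000.0 / v       # travel time, seconds

print(round(t, 6))  # ~0.003821 s, i.e. about 3.8 milliseconds
```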
6. **Round to the nearest thousandth:**
$$t \approx 0.004 \text{ seconds}$$
**Final answer:** The signal takes approximately **0.004 seconds** (about 3.8 milliseconds) to reach the dish.