
If the distance from an antenna on Earth to a geosynchronous communications satellite is 20,600 miles, given that there are 1.61 kilometers per mile and radio waves travel at the speed of light (3.0 × 10^8 meters/sec), how many milliseconds does it take for a signal from the antenna to reach the satellite?
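
The calculation is a chain of unit conversions followed by t = d / v. Below is a minimal sketch of that arithmetic in Python; the constants are taken directly from the problem statement, and the variable names are only illustrative.

```python
# Minimal sketch of the unit-conversion arithmetic from the problem statement.
MILES_TO_KM = 1.61        # given: kilometers per mile
SPEED_OF_LIGHT = 3.0e8    # given: speed of light in meters per second

distance_miles = 20600
distance_m = distance_miles * MILES_TO_KM * 1000   # miles -> km -> meters

time_s = distance_m / SPEED_OF_LIGHT               # t = d / v, in seconds
time_ms = time_s * 1000                            # seconds -> milliseconds

print(f"{time_ms:.1f} ms")   # prints 110.6 ms (about 111 ms)
```

With these numbers, 20,600 mi × 1.61 km/mi × 1000 m/km = 3.3166 × 10^7 m, and dividing by 3.0 × 10^8 m/s gives roughly 0.11 s, i.e. about 110 to 111 milliseconds for the signal to reach the satellite.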