
An airline runs a commuter flight between Portland, Oregon, and Seattle, Washington, which are 145 miles apart. An increase of 20 miles per hour in the average speed of the plane decreases the travel time by 12 minutes. What initial average speed results in this decrease in travel time? (Round your answer to one decimal place.)

Answer:

The speed of the airplane is the ratio of the distance traveled to the travel time.

The initial average speed of the airplane is approximately 110.8 mph

Reason:

Given parameters are;

The distance between Portland, Oregon and Seattle, Washington = 145 miles

The decrease in travel time when the speed is increased by 20 mph = 12 minutes

Required: The initial average speed of the airplane, v

Solution:

Let t (in hours) be the travel time at the initial average speed v (in mph). Then

[tex]t = \dfrac{145}{v}[/tex]

At the increased speed, the travel time is 12 minutes (12/60 of an hour) shorter:

[tex]t - \dfrac{12}{60} = \dfrac{145}{v + 20}[/tex]

Therefore;

[tex]\dfrac{145}{v} - \dfrac{12}{60} = \dfrac{145}{v + 20}[/tex]

Which gives;

[tex]\dfrac{145}{v + 20} - \dfrac{145}{v} + \dfrac{12}{60} = 0[/tex]

Multiplying through by v·(v + 20) and simplifying:

145·v - 145·(v + 20) + 0.2·v·(v + 20) = 0

0.2·v² + 4·v - 2,900 = 0

Solving with the quadratic formula:

[tex]v = \dfrac{-4 + \sqrt{4^2 - 4 \times 0.2 \times (-2900)}}{2 \times 0.2} \approx 110.8[/tex]

(The negative root is discarded, since a speed must be positive.)

The initial average speed of the airplane is v ≈ 110.8 mph
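The arithmetic above can be verified with a short script. This is a minimal sketch in Python, using only the standard library, that solves the quadratic and checks that the resulting speed really saves 12 minutes:

```python
import math

# Coefficients of 0.2*v**2 + 4*v - 2900 = 0, derived from
# 145/v - 145/(v + 20) = 12/60
a, b, c = 0.2, 4.0, -2900.0

# Quadratic formula; only the positive root is a physical speed
disc = math.sqrt(b**2 - 4*a*c)
v = (-b + disc) / (2*a)

print(round(v, 1))  # → 110.8 (initial average speed in mph)

# Sanity check: flying at v + 20 mph should save 12 minutes
saved_minutes = (145/v - 145/(v + 20)) * 60
print(round(saved_minutes, 6))  # → 12.0
```

The check confirms both the rounded answer of 110.8 mph and that it satisfies the original time condition.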

Learn more here:

https://brainly.com/question/11365874