Q:

Driving to your friend's house, you travel at an average rate of 35 miles per hour. On your way home, you travel at an average rate of 40 miles per hour. If the round trip took you 45 minutes, how far is it from your house to your friend's house?

Accepted Solution

A:
Answer: 14 miles

Explanation:

The relation that will be used to solve this problem is:

[tex]Distance = Velocity \times Time[/tex]

which can be rewritten as:

[tex]Time = \frac{Distance}{Velocity}[/tex]

Assume that:
- the distance from your house to your friend's house is d
- the time taken from your house to your friend's house is t₁
- the time taken from your friend's house to your house is t₂

1- From your house to your friend's house:
Average rate = 35 miles/hour
Therefore:
[tex]t_{1} = \frac{d}{35}[/tex]

2- From your friend's house to your house:
Average rate = 40 miles/hour
Therefore:
[tex]t_{2} = \frac{d}{40}[/tex]

3- Round trip:
We know that the round trip took 45 minutes, which is equivalent to 0.75 hours. This means that:

t₁ + t₂ = 0.75 hours

[tex]\frac{d}{35}+\frac{d}{40}=0.75\\ \\ \frac{40d + 35d}{1400}=0.75\\ \\ \frac{75d}{1400}=0.75\\ \\ 75d = 1050\\ d = 14 \text{ miles}[/tex]

Hope this helps :)
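As a quick sanity check, here is a minimal Python sketch (not part of the original answer) confirming that d = 14 makes the two leg times sum to the given 45 minutes:

```python
# Verify: with d = 14 miles, t1 + t2 should equal 0.75 hours (45 minutes).
d = 14          # distance in miles (the answer above)
t1 = d / 35     # time to friend's house at 35 mph, in hours
t2 = d / 40     # time back home at 40 mph, in hours
print(t1 + t2)  # prints 0.75
```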