A plane flies over city A and reaches city B, 115 miles away, in 15 minutes. Find the speed of the plane in miles per hour assuming the plane flew at a constant speed.
Divide the distance by the time to get the speed in miles per minute: 115/15 ≈ 7.67 miles per minute. Then convert to miles per hour by multiplying by 60, since there are 60 minutes in one hour: (115/15) × 60 = 460 miles per hour. (Equivalently, 15 minutes is a quarter of an hour, so 115 ÷ (1/4) = 460 mph. If you round 115/15 to 7.67 first, you get 7.67 × 60 = 460.2, which is slightly off due to rounding; the exact answer is 460 mph.)
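The arithmetic above can be sketched in a few lines of Python (variable names are my own, just for illustration):

```python
# Speed = distance / time, then convert minutes to hours.
distance_miles = 115
time_minutes = 15

miles_per_minute = distance_miles / time_minutes  # about 7.67
miles_per_hour = miles_per_minute * 60            # 60 minutes in an hour

print(miles_per_hour)
```

Doing the multiplication by 60 before rounding avoids the 460.2 rounding error.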