In Capstone, you might acquire position vs. time data for your simple harmonic oscillator that look like the example below. How could you measure the period of oscillation directly from the sinusoidal graph? A sine function has also been fit to these data, with parameters in the white box. In particular, the angular frequency of the fit function is ω = 5.19 rad/s. Compare the period you measured graphically to the period you would get from the ω value. If there is a discrepancy, can you explain it or reduce it by a more careful measurement? Capstone also displays max, min, and mean values for the data. These are max = 0.305 m, min = 0.199 m, and mean = 0.253 m. What is the amplitude of the oscillation?
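For reference, here is a worked check of the two computations the questions call for, using only the fit parameter and statistics quoted above (the graphically measured period itself must come from your own reading of the plot):

\[
T = \frac{2\pi}{\omega} = \frac{2\pi}{5.19\ \text{rad/s}} \approx 1.21\ \text{s},
\qquad
A = \frac{x_{\max} - x_{\min}}{2} = \frac{0.305\ \text{m} - 0.199\ \text{m}}{2} = 0.053\ \text{m}.
\]

As a consistency check, the midpoint \((x_{\max} + x_{\min})/2 = 0.252\ \text{m}\) agrees closely with the displayed mean of 0.253 m, as expected for an oscillation that is symmetric about its equilibrium position.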