You are throwing darts at a square target with opposite corners at (0,0) and (1,1). Each dart lands uniformly over the target. Let Y be the distance from the dart to the x-axis, and similarly let X be the distance from the dart to the y-axis. Let D = X + Y be the Manhattan distance from the dart to the origin. What is the probability that D ≤ 0.5? Express your answer as a fraction a/b in simplest form. What is the probability that 0.5 ≤ D ≤ 1? Express your answer as a fraction a/b in simplest form.
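Since the dart lands uniformly, each probability is the area of the corresponding region of the unit square (the triangle below the line x + y = 0.5, and the band between the lines x + y = 0.5 and x + y = 1). A quick Monte Carlo sketch can check a geometrically derived answer; the sample size N and seed below are arbitrary choices:

```python
import random

random.seed(0)
N = 200_000
count_low = 0   # darts with D <= 0.5
count_mid = 0   # darts with 0.5 <= D <= 1

for _ in range(N):
    x, y = random.random(), random.random()  # uniform point in the unit square
    d = x + y  # Manhattan distance from the dart to the origin
    if d <= 0.5:
        count_low += 1
    if 0.5 <= d <= 1:
        count_mid += 1

print(count_low / N)  # estimate of P(D <= 0.5), a triangle of area (1/2)(0.5)(0.5)
print(count_mid / N)  # estimate of P(0.5 <= D <= 1)
```

The first estimate should be near 1/8 = 0.125 (the triangle with legs of length 0.5) and the second near 3/8 = 0.375 (the half-square below x + y = 1 minus that triangle).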