A double-slit diffraction pattern is formed on a distant screen. If the separation between the slits decreases, what happens to the distance between interference fringes? Assume the angles involved remain small.

Answer:

The distance between the interference fringes increases.

Explanation:

For double-slit interference, the distance of the m-th order maximum from the centre of the pattern on a distant screen is

[tex]y=\frac{m \lambda D}{d}[/tex]

where [tex]\lambda[/tex] is the wavelength, D is the distance to the screen, and d is the separation between the slits. The distance between two consecutive fringes (orders m and m+1) is therefore

[tex]\Delta y = \frac{(m+1) \lambda D}{d}-\frac{m \lambda D}{d}=\frac{\lambda D}{d}[/tex]

which is independent of m and inversely proportional to the slit separation d. Therefore, when the separation between the slits decreases, the distance between the interference fringes increases.
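
As a quick numerical check (the values below are assumed for illustration only, they are not given in the problem): take [tex]\lambda = 600\,\text{nm}[/tex], [tex]D = 2.0\,\text{m}[/tex], and [tex]d = 0.50\,\text{mm}[/tex]. Then

[tex]\Delta y = \frac{\lambda D}{d} = \frac{(600\times 10^{-9}\,\text{m})(2.0\,\text{m})}{0.50\times 10^{-3}\,\text{m}} = 2.4\,\text{mm}[/tex]

Halving the slit separation to d = 0.25 mm doubles the fringe spacing to 4.8 mm, consistent with the inverse proportionality between [tex]\Delta y[/tex] and d.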