
When the magnetic flux through a single loop of wire increases by 30 Tm^2, an average current of 40 A is induced in the wire. Assuming that the wire has a resistance of 2.5 ohms, (a) over what period of time did the flux increase? (b) If the current had been only 20 A, how long would the flux increase have taken?



Answer:

(a). The time interval is 0.3 s.

(b). The time interval is 0.6 s.

Explanation:

Faraday's law says that for a single loop of wire the magnitude of the average emf [tex]\varepsilon[/tex] is

[tex](1). \: \: \varepsilon = \dfrac{\Delta \Phi_B}{\Delta t }[/tex]

and since, by Ohm's law,

[tex]\varepsilon = IR[/tex],

then equation (1) becomes

[tex](2). \: \:IR= \dfrac{\Delta \Phi_B}{\Delta t }.[/tex]
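Solving equation (2) for the time interval gives the form used in both parts below:

[tex]\Delta t = \dfrac{\Delta \Phi_B}{IR}.[/tex]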

(a).

We are told that the change in magnetic flux is [tex]\Delta \Phi_B = 30Tm^2[/tex], the current induced is [tex]I = 40A[/tex], and the resistance of the wire is [tex]R = 2.5\Omega[/tex]; therefore, equation (2) gives

[tex](40A)(2.5\Omega)= \dfrac{30Tm^2}{\Delta t },[/tex]

which we solve for [tex]\Delta t[/tex] to get:

[tex]\Delta t = \dfrac{30Tm^2}{(40A)(2.5\Omega)},[/tex]

[tex]\boxed{\Delta t = 0.3s}[/tex],

which is the period of time over which the magnetic flux increased.
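As a quick consistency check on the units, recall that [tex]1\:Tm^2 = 1\:Wb = 1\:V\cdot s[/tex] and [tex]1\:A\cdot\Omega = 1\:V[/tex], so

[tex]\dfrac{Tm^2}{A\cdot\Omega} = \dfrac{V\cdot s}{V} = s,[/tex]

confirming that the answer comes out in seconds.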

(b).

Now, if the current had been [tex]I = 20A[/tex], then equation (2) would give

[tex](20A)(2.5\Omega)= \dfrac{30Tm^2}{\Delta t },[/tex]

[tex]\Delta t = \dfrac{30Tm^2}{(20A)(2.5\Omega)},[/tex]

[tex]\boxed{\Delta t = 0.6s}[/tex],

which is a longer time interval than the one found in part (a). This makes sense: with half the current, Ohm's law gives half the emf, so the rate of change of flux [tex]\dfrac{\Delta \Phi_B}{\Delta t}[/tex] is also half as large, and the same [tex]30Tm^2[/tex] increase in flux therefore takes twice as long.
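For anyone who wants to double-check the arithmetic numerically, here is a minimal Python sketch (variable names are purely illustrative) of [tex]\Delta t = \Delta\Phi_B/(IR)[/tex] for both currents:

delta_phi = 30.0                 # change in magnetic flux, T*m^2
R = 2.5                          # resistance of the loop, ohms
for I in (40.0, 20.0):           # induced currents for parts (a) and (b), in amperes
    delta_t = delta_phi / (I * R)
    print(f"I = {I:g} A  ->  delta_t = {delta_t:g} s")   # prints 0.3 s, then 0.6 s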