Answer:
The answer is "[tex]0.193 \times 10^3 \ s[/tex]"
Explanation:
Please find the complete question in the attachment.
The file size [tex]= F = 2 \ Gbits = 2 \times 10^9 \ bits[/tex]
The server must upload [tex]N = 8[/tex] copies of the file, one to each of the 8 clients, so the total upload is [tex]8F[/tex] bits.
Server upload rate [tex]= u = 83 \ Mbps = 83 \times 10^6 \ bps[/tex]
Therefore, the minimum time for the server to upload all copies of the file:
[tex]= 8 \frac{F}{u} \\\\= \frac{(8 \times 2 \times 10^9)}{(83 \times 10^6)} \ s \\\\= 0.193 \times 10^3 \ s[/tex]
Download rate of the slowest client:
[tex]= d_{min} \\\\= \min\{d_1,d_2,d_3,d_4,d_5,d_6,d_7,d_8\} \\\\= 14\ Mbps \\\\= 14\times 10^6\ bps[/tex]
The minimum time for the slowest client to receive the file:
[tex]= \frac{F}{d_{min}}\\\\ = \frac{(2 \times 10^9)}{(14 \times 10^6)}\ s \\\\= 0.143 \times 10^3 \ s[/tex]
The distribution cannot finish before both the server's upload and the slowest client's download are complete, so the minimum distribution time is the larger of the two bounds:
[tex]= \max\{0.193 \times 10^3,\ 0.143 \times 10^3\}\ s \\\\= 0.193 \times 10^3 \ s[/tex]
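The two bounds above can be checked with a short script. This is a minimal sketch using the numbers quoted in the answer (N = 8 clients, F = 2 Gbits, u = 83 Mbps, d_min = 14 Mbps); the individual client rates d1..d8 are not given in the excerpt, so only the stated minimum is used.

```python
N = 8          # number of clients (file copies the server must upload)
F = 2e9        # file size in bits (2 Gbits)
u = 83e6       # server upload rate in bps (83 Mbps)
d_min = 14e6   # slowest client's download rate in bps (14 Mbps)

# Bound 1: the server must push N full copies through its upload link.
upload_bound = N * F / u        # ~192.8 s

# Bound 2: the slowest client must receive the whole file.
download_bound = F / d_min      # ~142.9 s

# The distribution time must satisfy both bounds, so take the larger.
D_cs = max(upload_bound, download_bound)
print(round(D_cs, 1))
```

Running this prints the upload bound, confirming that the server's upload link, not the slowest client, is the bottleneck here.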