Suppose that the RTT delay between a client and an HTTP server is 30 msecs; the time the server needs to transmit an object onto its outgoing link is 0.85 msecs; and any other HTTP message not containing an object has a negligible (zero) transmission time. Suppose the client again makes 100 requests, one after the other, waiting for the reply to each request before sending the next one.

Assume the client is using HTTP 1.1 with the If-Modified-Since header line, and that 50% of the requested objects have NOT changed since the client last downloaded them (before these 100 downloads are performed).

How much time elapses (in milliseconds) between the client transmitting the first request and the completion of the last request?
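A minimal sketch of the computation in Python, assuming the persistent HTTP 1.1 connection is already open (so no TCP handshake is counted), that a 304 Not Modified reply carries no object and therefore has negligible transmission time, and that an unchanged object costs one RTT while a changed object costs one RTT plus the object's transmission time:

```python
# Sketch: elapsed time for 100 sequential conditional GETs over a persistent
# HTTP 1.1 connection. Assumes the connection is already open (no handshake
# counted) and that 304 Not Modified replies have zero transmission time.

RTT_MS = 30.0          # round-trip time between client and server
TRANSMIT_MS = 0.85     # time to push one object onto the server's link
TOTAL_REQUESTS = 100
UNCHANGED = TOTAL_REQUESTS // 2   # 50% answered with 304 Not Modified
CHANGED = TOTAL_REQUESTS - UNCHANGED

# Unchanged object: one RTT only (304 reply, no body).
time_unchanged = UNCHANGED * RTT_MS
# Changed object: one RTT plus the object's transmission time.
time_changed = CHANGED * (RTT_MS + TRANSMIT_MS)

total_ms = time_unchanged + time_changed
print(total_ms)  # 100*30 + 50*0.85 = 3042.5 ms
```

Under these assumptions the elapsed time works out to 100 x 30 ms + 50 x 0.85 ms = 3042.5 ms.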