Assume a client's web browser is connected to a web server over a direct link. The RTT between the client and the server is 10 msec, and the time it takes the server to transmit an object onto its outgoing link is 2 msec. Now suppose the client already has the web page it wants cached locally, but the user constantly checks for updates: the client sends 20 conditional GET requests back to back -- that is, as soon as it receives a response (and, if the object has changed, the updated object), it sends the next request. If persistent HTTP is used and the cached object is up-to-date 40% of the time, how much time will elapse between sending the first request and the completion of the last one?
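One way to read the problem (a sketch of the expected-value interpretation, not an official solution): with persistent HTTP there is no per-request connection setup. When the cache is current (40% of requests) the server returns a short 304 Not Modified, costing roughly one RTT; when it is stale (60%) the server also transmits the updated object, costing RTT plus the 2 msec transmission time. Under those assumptions the expected total for 20 back-to-back requests is:

```python
# Expected elapsed time for 20 back-to-back conditional GETs over
# persistent HTTP. Assumptions: a 304 Not Modified reply costs one RTT;
# a stale-cache reply costs RTT + object transmission time; connection
# setup and request/header transmission times are neglected.
RTT = 10.0        # msec, round-trip time given in the problem
TRANSMIT = 2.0    # msec, time to push the object onto the outgoing link
N_REQUESTS = 20
P_FRESH = 0.4     # cached object is up-to-date 40% of the time

t_fresh = RTT                # 304 Not Modified: request out, short reply back
t_stale = RTT + TRANSMIT     # full updated object comes back
expected_per_request = P_FRESH * t_fresh + (1 - P_FRESH) * t_stale
expected_total = N_REQUESTS * expected_per_request

print(round(expected_total, 1))  # prints 224.0 (msec)
```

That is, each request costs 0.4 × 10 + 0.6 × 12 = 11.2 msec on average, so 20 requests take an expected 224 msec.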