Friday, August 24, 2012

What is the relationship between bandwidth and latency?


So what is the relationship between bandwidth and latency? If your Internet connection has adequate bandwidth, can latency still slow it down? Or does it? Just how exactly does latency affect your experience on the Internet? These are some of the most frequently asked questions, and what follows are some answers for both technical readers and industry professionals.

Latency is the time it takes data (packets) to get from point A (your home/modem) to point B (the destination). Latency arises from each of the "stops" your data has to make on the way to point B. These stops, called hops, are the various routers and, in some cases, Internet servers that handle and route the traffic along the path. The more hops that are added to the data path, the higher your latency becomes. The farther away point B is, the higher the latency you will usually experience, simply because there is more distance and more hops to cross. In addition, each of these hops can get busy, so to speak; the busier they get, the longer they take to respond to your traffic, and so the higher the latency.
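
If you want a rough feel for the latency to a given destination yourself, one quick trick is to time how long a TCP connection takes to open, since that handshake is essentially one round trip. The little Python sketch below does just that; the host name and port are only placeholders, and a real traceroute tool will also show you the individual hops along the way.

import socket
import time

def tcp_rtt(host, port=80, samples=5):
    # Time the TCP handshake; that round trip approximates the latency to the host.
    times = []
    for _ in range(samples):
        start = time.perf_counter()
        with socket.create_connection((host, port), timeout=5):
            pass
        times.append((time.perf_counter() - start) * 1000)
    return sum(times) / len(times)

print("approx. RTT to example.com: %.1f ms" % tcp_rtt("example.com"))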

Most file transfers over the Internet use TCP/IP. The receiver constantly sends acknowledgement messages (ACKs) back to the sender, letting it know that everything arrived intact or that packets must be resent. If the communication channel has high latency, that return trip takes too long, and the sender has to stop transmitting until the ACKs are received.
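
A rough way to see the effect: if the sender can only keep one window's worth of unacknowledged data in flight at a time, its throughput is capped at roughly the window size divided by the round-trip time, no matter how much raw bandwidth the link has. The window size and round-trip times in this Python sketch are made up for illustration.

def max_throughput_mbps(window_bytes, rtt_seconds):
    # One window of data per round trip is the best the sender can do.
    return (window_bytes * 8) / rtt_seconds / 1_000_000

window = 64 * 1024  # a classic 64 KB receive window
for rtt_ms in (10, 50, 200):
    cap = max_throughput_mbps(window, rtt_ms / 1000.0)
    print("RTT %3d ms -> at most %6.1f Mbps" % (rtt_ms, cap))

With a 64 KB window, 10 ms of latency allows about 52 Mbps, while 200 ms allows only about 2.6 Mbps on the very same link.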

TCP also has a mechanism called slow start. The sender has no idea of the end-to-end channel capacity, so slow start is designed to keep it from overwhelming slower intermediate links.
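
As a toy illustration, assuming the textbook behaviour of the congestion window doubling once per round trip during slow start, here is how many round trips it takes before the sender is moving data at anything close to full speed:

def slow_start_rounds(target_segments, initial_window=1):
    # Count round trips while the congestion window doubles each RTT.
    cwnd, rtts = initial_window, 0
    while cwnd < target_segments:
        cwnd *= 2
        rtts += 1
    return rtts

# 6 round trips to grow from 1 to 64 segments; at a 200 ms RTT that is
# over a second of ramp-up before the transfer reaches full speed.
print(slow_start_rounds(64), "round trips to reach a 64-segment window")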

Essentially, the bandwidth you pay for is the speed between you and your ISP; beyond that point, your ISP has no control.

In reality, latency may or may not be a problem. Since latency is the delay in getting information from point A to point B, it is much more of a problem for interactive applications than for large transfers.

With large transfers, if the bandwidth is sufficient, reliable, and properly configured, you will not notice much of a problem even on high-latency connections. Once the "pipe is filled", the data flows at full speed. As long as the ACK packets come back at intervals frequent enough that retransmission does not occur, the flow is constant and the delay really only affects the initial startup of the transfer.
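
To put some rough numbers on that (illustrative, not measurements): once the pipe is full, the total time for a big transfer is roughly one startup delay plus size divided by bandwidth, so even a tenfold increase in latency barely moves the total.

def transfer_seconds(size_mb, bandwidth_mbps, rtt_ms):
    # One round trip to get started, then the pipe runs at full speed.
    return rtt_ms / 1000.0 + (size_mb * 8) / bandwidth_mbps

for rtt in (20, 200):
    t = transfer_seconds(size_mb=100, bandwidth_mbps=50, rtt_ms=rtt)
    print("100 MB at 50 Mbps with %d ms RTT: about %.2f s" % (rtt, t))

That works out to about 16.02 seconds versus 16.20 seconds.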

However, with interactive applications, that initial delay is what can really kill you. As an exaggerated example, say you have 1 second of latency and sending a packet takes 1 second. If you send a file that is 10 packets long, the total connection time is 11 seconds. If instead you send a single packet, wait for a single-packet response, and do this twice, the total connection time is 8 seconds, yet you have sent only 40% as much traffic.
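
Working through those toy numbers in a few lines (1 second of latency, 1 second per packet):

latency, per_packet = 1, 1

# Bulk transfer: stream 10 packets back to back, pay the latency once.
bulk_time = latency + 10 * per_packet       # 11 seconds

# Interactive: send 1 packet, wait for a 1-packet reply, and do that twice.
one_exchange = (latency + per_packet) * 2   # 4 seconds per request/response
interactive_time = 2 * one_exchange         # 8 seconds
packets_sent = 4                            # only 40% of the bulk traffic

print(bulk_time, interactive_time, packets_sent / 10.0)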

The Web is a kind of traffic in between the two. It is not typically a large transfer, but it is not as highly interactive as an online game either. Typical page traffic is short bursts of requests (where latency hurts), followed by long periods of inactivity while you look at the page. There are some tricks you can use to reduce the problem. Proxy servers and prefetch utilities can "preload" pages for you: while you are looking at the current page and the connection is sitting idle, the prefetcher downloads the pages linked from it. When you request one of them, the hope is that the page is already cached and can be displayed much faster. If it is not, you are no worse off than having to wait for it to load normally. This works well for mostly static pages, but for dynamic pages (e.g. Google Maps) a prefetcher does not work as well, or at all. Also, checking that your browser is using an adequate number of connections can make things better.
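
A bare-bones sketch of the prefetch idea looks something like the following; the URLs and the in-memory cache are purely illustrative, and a real proxy or browser prefetcher is far more sophisticated.

import concurrent.futures
import urllib.request

cache = {}

def fetch(url):
    # A failed prefetch just means the page loads the normal way later.
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            cache[url] = resp.read()
    except OSError:
        pass

def prefetch(linked_urls):
    # Download linked pages in the background while the connection sits idle.
    with concurrent.futures.ThreadPoolExecutor(max_workers=4) as pool:
        list(pool.map(fetch, linked_urls))

prefetch(["https://example.com/", "https://example.org/"])
print({url: len(body) for url, body in cache.items()})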

The bottom line is that there is a relationship between bandwidth and latency. But it may or may not be a problem...
