Could you please tell me what factors determine network latency?
And what exactly is the unit of bandwidth? Some say hertz, others say (giga/mega/kilo)bits per second. Also, the TCP/IP Guide uses latency × bandwidth to calculate the capacity of, say, an ADSL line. How can that work if bandwidth is measured in hertz?
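To make the latency × bandwidth product concrete, here is a minimal sketch of that calculation (the link numbers below are made up for illustration, not taken from any real ADSL line):

```python
# Sketch of the bandwidth-delay product: latency x bandwidth gives the
# amount of data that can be "in flight" on the link at once.
# The example figures (1.5 Mbit/s, 50 ms) are hypothetical.

def bandwidth_delay_product_bits(bandwidth_bps: float, latency_ms: float) -> float:
    """Bits in transit at once: bandwidth (bit/s) x latency (converted to s)."""
    return bandwidth_bps * latency_ms / 1000

bdp = bandwidth_delay_product_bits(1_500_000, 50)
print(bdp)      # 75000.0 bits
print(bdp / 8)  # 9375.0 bytes "in flight"
```

Note that the bandwidth here is the digital data rate in bit/s, not the analog bandwidth in hertz; the two are related but not the same quantity.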
By the way, I would also like to ask about the theoretical data rate versus the real one (throughput). For example, the FPT Mega Style ADSL pack claims its ideal download speed is 1.5 Mbps, which should mean the real data rate never exceeds 1.5 Mbps. But in reality the number even reaches more than 200 KB/s (1.6 Mbps), admittedly with the help of a download accelerator.
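The comparison above rests on a bytes-to-bits conversion, which can be sketched like this (assuming the SI convention 1 KB = 1000 bytes, as marketing figures usually do):

```python
# Sketch of the KB/s -> Mbit/s conversion behind the 200 KB/s figure.
# Assumes 1 KB = 1000 bytes and 1 byte = 8 bits (SI/marketing convention).

def kbytes_per_s_to_mbps(kb_per_s: float) -> float:
    return kb_per_s * 1000 * 8 / 1_000_000

print(kbytes_per_s_to_mbps(200))  # 1.6 -> indeed above the advertised 1.5 Mbps
```

If 1 KB = 1024 bytes were assumed instead, 200 KB/s would come out even higher (about 1.64 Mbps), so either way the observed rate exceeds the advertised one.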