A new study by researchers at Rensselaer Polytechnic Institute, AT&T Labs, and the University of Nevada, Reno suggests that an Internet where all traffic is treated identically would require significantly more capacity than one in which differentiated services are offered.
Findings from the study were presented June 22 at the Fifteenth IEEE International Workshop on Quality of Service (IWQoS 2007) in Evanston, Ill. IWQoS is a premier workshop on quality of service research, featuring rigorously reviewed technical sessions and papers.
As the Internet becomes more crowded with high-bandwidth applications and content, a wide-ranging debate is taking place about the issue of “network neutrality,” which involves both economic and technical aspects. One aspect of the debate involves whether application traffic that requires performance assurances (e.g., VoIP) could be serviced differently, or what the impact would be if all traffic were to be treated in an undifferentiated manner.
“We wanted to take one piece of the overall debate and approach it quantitatively,” said principal investigator Shivkumar Kalyanaraman, professor of electrical, computer, and systems engineering at Rensselaer. “The study makes clear that there are substantial additional costs for the extra capacity required to operate networks in which all traffic is treated alike while still meeting the assured performance specified in service level agreements (SLAs) for the traffic that needs it.”
Using computer models, the researchers compared the current “best-effort” approach with a tiered model that separates information into two simple classes — one for most types of information and another for applications requiring service level assurance for high-bandwidth content like video games, telemedicine, and Voice over Internet Protocol (VoIP).
The study was meant to answer one basic question, according to Kalyanaraman: “If I want to meet the needs of applications that require service level assurances, how much more capacity do I need?”
The additional capacity needed for an undifferentiated network compared to a differentiated network is referred to as the Required Extra Capacity. The study estimates that the Required Extra Capacity in even modestly loaded networks could approach 60 percent. At times of heavy demand on the network, the Required Extra Capacity in an undifferentiated network could amount to an additional 100 percent or more of the total capacity required when differentiation is permitted.
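The Required Extra Capacity described above is, in essence, the gap between the capacity an undifferentiated network needs to meet the strictest service-level target for all traffic and the capacity a two-class network needs when only the assured class gets that target. A minimal sketch of that comparison follows; the function name and the input figures are hypothetical illustrations, not values from the study itself:

```python
def required_extra_capacity(cap_undifferentiated: float,
                            cap_differentiated: float) -> float:
    """Extra capacity an undifferentiated network needs, expressed as a
    fraction of the capacity a differentiated (two-class) network needs
    to satisfy the same service-level assurances."""
    return (cap_undifferentiated - cap_differentiated) / cap_differentiated


# Hypothetical capacities (arbitrary units) chosen only to mirror the
# study's reported ranges: ~60% extra at modest load, 100%+ under heavy load.
modest_load = required_extra_capacity(160.0, 100.0)
heavy_load = required_extra_capacity(210.0, 100.0)

print(f"modest load: {modest_load:.0%} extra capacity")  # 60% extra
print(f"heavy load: {heavy_load:.0%} extra capacity")    # 110% extra
```

The percentages here are illustrative only; the study derives its figures from detailed network models, not from a single ratio like this.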
“Clearly, an undifferentiated network in this context is less efficient and more expensive,” said coauthor K.K. Ramakrishnan of AT&T Labs. “We believe understanding the real impacts of the alternative strategies is important as the debate about network architecture unfolds.”
Jason Gorss | EurekAlert!