A new study by researchers at Rensselaer Polytechnic Institute, AT&T Labs, and the University of Nevada, Reno suggests that an Internet where all traffic is treated identically would require significantly more capacity than one in which differentiated services are offered.
Findings from the study were presented June 22 at the Fifteenth IEEE International Workshop on Quality of Service (IWQoS 2007) in Evanston, Ill. IWQoS is a premier workshop on quality of service research, featuring rigorously reviewed technical sessions and papers.
As the Internet becomes more crowded with high-bandwidth applications and content, a wide-ranging debate is taking place about “network neutrality,” a question with both economic and technical dimensions. One technical aspect of the debate is whether application traffic that requires performance assurances (e.g., VoIP) should be serviced differently, and what the impact would be if all traffic were instead treated in an undifferentiated manner.
“We wanted to take one piece of the overall debate and approach it quantitatively,” said principal investigator Shivkumar Kalyanaraman, professor of electrical, computer, and systems engineering at Rensselaer. “The study makes clear that there are substantial additional costs for the extra capacity required to operate networks in which all traffic is treated alike while still carrying traffic whose performance must be assured as specified in service level agreements (SLAs).”
Using computer models, the researchers compared the current “best-effort” approach with a tiered model that separates information into two simple classes — one for most types of information and another for applications requiring service level assurance for high-bandwidth content like video games, telemedicine, and Voice over Internet Protocol (VoIP).
The study was meant to answer one basic question, according to Kalyanaraman: “If I want to meet the needs of applications that require service level assurances, how much more capacity do I need?”
The additional capacity needed for an undifferentiated network compared to a differentiated network is referred to as the Required Extra Capacity. The study estimates that the Required Extra Capacity in even modestly loaded networks could approach 60 percent. At times of heavy demand on the network, the Required Extra Capacity in an undifferentiated network could amount to an additional 100 percent or more of the total capacity required when differentiation is permitted.
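To make the idea of Required Extra Capacity concrete, here is a minimal back-of-the-envelope sketch in Python. It is not the model used in the study (the article does not describe the authors' methodology); it simply assumes M/M/1 queueing on a single link, a mean-delay target for the assured class, strict (preemptive) priority scheduling in the differentiated case, and illustrative traffic loads and utilization limits chosen for this example only.

```python
# Back-of-the-envelope illustration only (NOT the study's model):
# estimate the extra capacity an undifferentiated link would need versus
# a two-class priority link, under simple M/M/1 queueing assumptions.
# All parameter values below are assumed for illustration.

def undifferentiated_capacity(lam_assured, lam_best_effort, delay_target):
    """Single FIFO queue: every packet must meet the assured-class delay.

    M/M/1 mean sojourn time T = 1 / (C - lam_total) <= delay_target
    implies C >= lam_total + 1 / delay_target.
    """
    lam_total = lam_assured + lam_best_effort
    return lam_total + 1.0 / delay_target

def differentiated_capacity(lam_assured, lam_best_effort, delay_target,
                            max_utilization=0.9):
    """Two classes with preemptive priority: the assured class effectively
    sees only its own load, so T_assured = 1 / (C - lam_assured) <= target.
    Capacity must also keep total utilization below max_utilization so
    best-effort traffic remains stable.
    """
    c_for_delay = lam_assured + 1.0 / delay_target
    c_for_stability = (lam_assured + lam_best_effort) / max_utilization
    return max(c_for_delay, c_for_stability)

if __name__ == "__main__":
    lam_assured = 200.0      # assured-class load, packets/s (assumed)
    lam_best_effort = 800.0  # best-effort load, packets/s (assumed)
    delay_target = 0.001     # 1 ms mean-delay target for assured traffic

    c_undiff = undifferentiated_capacity(lam_assured, lam_best_effort, delay_target)
    c_diff = differentiated_capacity(lam_assured, lam_best_effort, delay_target)
    rec = (c_undiff - c_diff) / c_diff
    print(f"Undifferentiated capacity: {c_undiff:.0f} packets/s")
    print(f"Differentiated capacity:   {c_diff:.0f} packets/s")
    print(f"Required Extra Capacity:   {rec:.0%}")
```

With these assumed loads and a 1 ms delay target, the sketch yields a Required Extra Capacity of roughly two-thirds, the same order of magnitude as the figures quoted above, although that numerical agreement is coincidental and depends entirely on the illustrative parameters.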
“Clearly, an undifferentiated network in this context is less efficient and more expensive,” said coauthor K.K. Ramakrishnan of AT&T Labs. “We believe understanding the real impacts of the alternative strategies is important as the debate about network architecture unfolds.”
Jason Gorss | EurekAlert!