The growing demand for bandwidth from data-intensive applications, such as online video and cloud gaming, and the increasing commercial relevance of the Internet have stimulated a global debate on the viability of the Internet's economic model and the role of net neutrality regulation.
The European Commission's consultation process in the second half of 2010, which drew over 300 responses, shows the vivid interest of policy makers and regulators, industry, and the general public in this matter in Europe. “However, a thorough analysis of the implications of net neutrality regulation on some possible Internet business models adapted to the different market conditions in Europe, foremost European access regulation, is missing,” states Hans W. Friederiszick, coauthor of the study “Assessment of a sustainable Internet model for the near future.”

In this independent academic study, authored by Hans W. Friederiszick, Jakub Kaluzny, Simone Kohnz, and Lars-Hendrik Röller and commissioned by Deutsche Telekom, ESMT Competition Analysis (CA) presents four Internet business models and deliberates on the potential regulatory implications of each. Each business model focuses on a different aspect, such as congestion management, new services, or consumer choice, thus covering a broad universe of potential future scenarios. Each model affects congestion, innovation, and consumer and social welfare, all of which a regulatory approach should take into account. Since it is difficult to predict with any certainty which business models will dominate in the future, the prudent course is for authorities to apply an informed “wait and see” approach: closely monitoring market developments and reacting forcefully to any emerging competitive threats rather than acting pre-emptively and thereby preventing some beneficial business models from developing.
Lars-Hendrik Röller, ESMT president and senior advisor at CA, said, “European regulators should carefully consider the economic consequences of regulation of the Internet. This report provides them with comprehensive insights on how different business models are affected by different regulatory frameworks.”
The four profitable Internet business models identified by the authors:
The “Congestion Based Model” tackles congestion problems through congestion-based pricing; however, no quality differentiation is introduced. Specifically, in this business model ISPs are assumed to charge content providers higher prices for traffic in peak periods than in off-peak periods. For example, the cost for a provider of movie downloads could be significantly higher when an end user downloads an HD movie during the peak evening period than when the download takes place in the early morning hours or at any point within a 24-hour window. End users in this business model can choose between flat rates with differentiated data caps.
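As a purely hypothetical illustration (not taken from the study), the peak/off-peak charging scheme described above might look like the following sketch; the rates, peak hours, and movie size are invented for the example:

```python
# Hypothetical sketch of congestion-based pricing for content-provider
# traffic. All rates and peak hours below are invented for illustration.

PEAK_HOURS = range(18, 23)   # assumed evening peak: 18:00-22:59
PEAK_RATE = 0.05             # assumed price per GB in peak periods
OFF_PEAK_RATE = 0.01         # assumed price per GB off-peak

def traffic_charge(gigabytes: float, hour: int) -> float:
    """Charge a content provider for delivering traffic at a given hour."""
    rate = PEAK_RATE if hour in PEAK_HOURS else OFF_PEAK_RATE
    return gigabytes * rate

# A 4 GB HD movie delivered at 20:00 (peak) vs. 04:00 (off-peak):
print(traffic_charge(4.0, 20))  # peak-period delivery costs more
print(traffic_charge(4.0, 4))   # off-peak delivery costs less
```

The point of the mechanism is only that the same traffic carries a higher price in congested periods, giving content providers an incentive to shift deliverable traffic off-peak.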
The “Best Effort Plus Model” preserves the traditional best effort network for existing services and assumes that content providers and end users are priced as in the status quo if they operate on the best effort level. However, these restrictions do not apply to innovative future services, for which pricing and minimum service requirements follow individual negotiations between the eyeball ISP and the content provider.
The “Quality Classes – Content Pays Model” stresses the perceived need of different applications for various degrees of quality of service and offers different quality classes open to different applications. Unlike in the previous business model, the quality classes encompass all services, including currently available traditional services. Depending on its requirements, a content provider could purchase the transit quality class most appropriate for its type of content. For example, a content provider offering HD movie streaming or gaming services requiring low latency would purchase a more expensive premium quality class to ensure the quality of experience for end users. End users would still pay a uniform flat rate in this model and experience the quality as chosen by the content provider.
The “Quality Classes – User Pays Model” focuses on consumer choice of higher quality levels and offers multiple quality classes for end users, designed to match their different usage patterns. For example, end users who frequently use interactive applications might choose the quality class most suitable for such applications, i.e., one that offers low delay and jitter.
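To make the quality-class idea in the two models above concrete, here is a hypothetical sketch of matching an application's requirements to the cheapest class that meets them; the class names and the delay/jitter bounds are invented for the example, not drawn from the study:

```python
# Hypothetical sketch: matching applications to quality classes.
# Class names and latency/jitter bounds are invented for illustration.

QUALITY_CLASSES = [
    # (name, max one-way delay in ms, max jitter in ms), priciest first
    ("premium",  30,  5),   # e.g. online gaming, video conferencing
    ("standard", 80, 20),   # e.g. HD streaming with buffering
    ("basic",   200, 50),   # e.g. web browsing, email
]

def choose_class(tolerated_delay_ms: int, tolerated_jitter_ms: int) -> str:
    """Pick the cheapest class that still meets the application's needs.

    Classes are ordered from most to least stringent, so we scan from the
    end (cheapest) and take the first one within both tolerances.
    """
    for name, delay, jitter in reversed(QUALITY_CLASSES):
        if delay <= tolerated_delay_ms and jitter <= tolerated_jitter_ms:
            return name
    return QUALITY_CLASSES[0][0]  # fall back to the most stringent class

print(choose_class(50, 10))    # interactive application
print(choose_class(250, 60))   # delay-tolerant application
```

In the “Content Pays” variant the content provider would run this selection and bear the cost; in the “User Pays” variant the end user would, based on his or her own usage pattern.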
To download the study: http://www.esmt.org/info/latest