The growing demand for bandwidth for data-intense applications, such as online videos and cloud gaming, and the increasing commercial relevance of the Internet have stimulated a global debate on the viability of the Internet's economic model and the role of net neutrality regulation.
The consultation process of the European Commission in the second half of 2010, which resulted in over 300 responses, shows the vivid interest of policy makers and regulators, industry, and the general public in this matter in Europe. “However, a thorough analysis of the implications of net neutrality regulation on some possible Internet business models adapted to the different market conditions in Europe, foremost European access regulation, is missing,” states Hans W. Friederiszick, coauthor of the study “Assessment of a sustainable Internet model for the near future.”

In the independent academic study, written by Hans W. Friederiszick, Jakub Kaluzny, Simone Kohnz, and Lars-Hendrik Röller and commissioned by Deutsche Telekom, ESMT Competition Analysis (CA) presents four Internet business models and deliberates on the potential regulatory implications of each. Each business model focuses on a different aspect, such as congestion management, new services, or consumer choice, thus covering a broad universe of potential future scenarios. Each model affects congestion, innovation, and consumer and social welfare, all of which a regulatory approach should take into account. Since it is difficult to predict with any certainty which business models will dominate in the future, prudence requires authorities to apply an informed “wait and see” approach: closely monitoring market developments and reacting forcefully to any emerging competitive threats rather than acting pre-emptively and thereby preventing some beneficial business models from developing.
Lars-Hendrik Röller, ESMT president and senior advisor at CA, said, “European regulators should carefully consider the economic consequences of regulation of the Internet. This report provides them with comprehensive insights on how different business models are affected by different regulatory frameworks.”
The authors identify four profitable Internet business models:
The “Congestion Based Model” tackles congestion problems through congestion-based pricing; however, no quality differentiation is introduced. Specifically, in this business model ISPs are assumed to charge content providers higher prices for traffic in peak periods than in off-peak periods. For example, the cost for a provider of movie downloads could be significantly higher when an end user downloads an HD movie during the peak evening period than in the early morning hours or within a 24-hour period. End users in this business model can choose among flat rates with differentiated data caps.
The “Best Effort Plus Model” preserves the traditional best effort network for existing services and assumes that content providers and end users are priced as in the status quo if they operate at the best effort level. However, these restrictions do not apply to innovative future services, for which pricing and minimum service requirements follow individual negotiations between the eyeball ISP and the content provider.
The “Quality Classes – Content Pays Model” stresses the varying quality-of-service needs of different applications and offers distinct quality classes for them. Unlike in the previous business model, the quality classes encompass all services, including currently available traditional services. Depending on its requirements, a content provider could purchase the transit quality class most appropriate for its type of content. For example, a content provider offering HD movie streaming or gaming services requiring low latency would purchase a more expensive premium quality class to ensure the quality of experience for end users. End users would still pay a uniform flat rate in this model and experience the quality chosen by the content provider.
The “Quality Classes – User Pays Model” focuses on consumer choice for higher quality levels and offers multiple quality classes for end users, designed to match their different usage patterns. For example, end users who frequently use interactive applications might choose the quality class most suitable for such applications, i.e., one that offers low delay and jitter.
To download the study: http://www.esmt.org/info/latest