A report on the OECD/BEREC Workshop on Interconnection and Regulation
I presented at an OECD/BEREC workshop that was held on 20 June 2012 in Brussels, and I’d like to share some personal impressions and opinions from this workshop.
The OECD/BEREC workshop was a policy-oriented peering and exchange forum. It was not a conventional operational peering forum where the aim is to introduce potential peers to each other and facilitate peer-based interconnection of network operators, but a workshop that involved both network operators and various national and EU regulators, together with input from the OECD. This was a forum that allowed regulators and network operators to discuss issues of interconnection, both as a means of informing various national regulatory roles within the EU, and a means of allowing operators to understand regulatory perspectives and objectives.
BEREC is the Body of European Regulators for Electronic Communications. It has an interesting work program published on its web site, including analysis of issues such as transparency and the quality of the service provided to consumers.
So what is “new” in the interconnection and peering environment in 2012? As far as I can tell the answer is pretty much nothing! I observed at this workshop much the same conversations that network operators have had for the past two decades. Larger network operators with significant market power generally tend to protect their market position by entering into peering arrangements in a highly selective manner, while attempting to position those they perceive to be smaller operators into the relative position of customers of their services. Smaller operators take a different tack, and make extensive use of peering to optimize their relative position, increasing the richness and reach of their peering arrangements as a means of reducing their dependency on transit providers. In this sense nothing much has changed in this environment. The dynamics and motivations of the individual players in the interconnection space are fairly constant. While the fortunes of individual players have waxed and waned over the years, and this is reflected in their changing individual roles in the interconnection environment, the overall shape of the interconnection environment, and more fundamentally the efficiency and effectiveness of this largely deregulated interconnection market, has continued unaltered. The “sameness” of the conversations perhaps reflects an observation that this system is indeed operating effectively, allowing common market forces to act as a counter-balance to individual efforts that could otherwise have distorted the environment and impacted the end user’s quality of experience or the retail costs of the consumer service portfolio. Perhaps this “sameness” is a sign that the environment is a healthy one at present.
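The customer/peer/transit distinction that underlies these negotiations has a direct technical expression in routing: it is conventionally enforced in BGP export policy, where routes learned from customers are announced to everyone, while routes learned from peers or transit providers are announced only to customers. A minimal sketch of this conventional “valley-free” export rule (the function and labels are illustrative, not any particular operator’s configuration):

```python
# Sketch of the conventional "valley-free" BGP export rule that encodes
# the commercial customer/peer/provider relationships discussed above.
# Relationship labels are from this network's own point of view.

def export_route(learned_from: str, announce_to: str) -> bool:
    """Return True if a route learned from one neighbour class may be
    announced to another, under the usual commercial export policy."""
    if learned_from == "customer":
        # Customer routes earn revenue: announce them to everyone.
        return True
    # Routes learned from peers or providers are announced only to
    # customers; announcing them to another peer or to a provider
    # would amount to giving away free transit.
    return announce_to == "customer"

# A peering link carries only the two parties' customer routes,
# whereas a transit (provider) link carries the full table downstream.
assert export_route("customer", "peer") is True
assert export_route("peer", "peer") is False
assert export_route("provider", "customer") is True
```

This is why the “tier” position matters commercially: whether a given neighbour is classed as a customer or a peer determines which routes flow across the link, and hence who pays whom.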
There are, however, a couple of new considerations in this conversation that I noted at this workshop.
The first is the entry of Content Distribution Networks (CDNs) into the environment. CDN operators typically present their content at regional exchange points and attempt to enter into peering arrangements with all network operators who are present at the exchange point. Unlike the transit and access networks, CDNs have no peering “tier” position to protect, and for that reason they tend to be ready and enthusiastic to peer with all comers, large and small. At times, and in certain markets, CDNs have encountered resistance from the larger network operators, who evidently have the perception that to peer with a CDN would either compromise their established tier position in the peering hierarchy, or that a peering arrangement is contrary to their perception of the CDN as a logical customer of their network. One of the better illustrations of the tensions over the modern-day CDN is Netflix, whose CDN delivers high volumes of video content through the operators’ access networks. The access network operators have attempted to make the case that the imposition of this massive content load has caused them to incur further expenses to increase the capacity of their access network infrastructure, while the CDN itself continues to demand free access to their networks. The CDN position, on the other hand, was well illustrated by a CDN operator at the workshop, who characterized some EU network operators’ resistance to peering with a CDN as akin to a letter delivered from the US to an EU destination by having the sender carry the letter by hand on a flight across the Atlantic at the sender’s expense, pay for a taxi to reach the front door of the letter’s destination address, and then have the building’s doorman demand money to deliver the letter to the first floor within the building!
It’s true that the relative positions of content providers and access networks have seesawed over the past few decades. Initially, content providers were seeking a sustainable financial model for the provision of content on the network; they eyed the access networks’ revenues and claimed that without their content, end consumers had no incentive to purchase an access product. At one stage they were demanding a share of the access revenues from the access providers. But that was many years ago, and these days the content providers have established a secure, and some would argue (including many stock markets) highly lucrative, financial model that sustains their activities. It is now the turn of the access networks to plead poverty and set forth their claim for a share of the content providers’ revenue streams! One of the side effects of this is the current issue of the forms of interconnection between these CDNs and the larger access providers.
The second new topic I heard in the conversation at the workshop is the influence of the parallel discussion in this industry over the International Telecommunications Regulations (ITRs), which has, in turn, revived some of the more enduring myths in this industry, and these surfaced at BEREC. I am referring to the European Telecommunications Network Operators (ETNO) position, which advocated the use of inter-provider Quality of Service mechanisms as a means of introducing a concept of inter-provider network transactions, with associated transactional accounting and the imposition of transaction accounting settlement rates on Internet interconnections. Another large provider at the workshop also voiced a desire to see the Internet head towards a general position that was termed “sender pays”: notionally, the end user who generated a packet is expected to fund its delivery to the packet’s intended destination. If this sounds like a simple reformatting of the desire on the part of the access network operators to compel content providers to share their wealth and pay access providers for the privilege of accessing the providers’ customers, or yet another round of the crude forms of attempted coercion that content and carriage players have tried out on each other for many years, that’s because it is precisely that! The content providers are concerned that in some markets the access providers are assuming positions of clear market dominance, and they argue that a regulatory problem arises if this position of market power is used to impose terms of competition and interaction on other actors in their market space. Again this is not a novel argument, and all players use well-rehearsed lines, but some access network operators appear to hold out the hope that if they can invoke the ITRs to bolster their case, then they would emerge in an advantaged position in their negotiations with the content delivery networks and content providers.
For an industry that is supposedly based firmly in technology and engineering, I find myself constantly surprised at the extent to which industry actors can base a negotiation position upon incomplete and poorly understood technical concepts, or even rely completely on technical mythology, rather than basing their positions on the more mundane practical constraints of the feasibility of cost-effective engineering approaches. Specifically, I am referring to the continuing voices of confident expectation on the part of some players that they can transform either all, or their part, of the Internet’s interconnection environment into something more aligned to the historical telephony model, replete with concepts of “sender pays” and QoS interconnects, in the hope that such a realignment would better serve their perceived self-interest. But maybe such expressions are more about kite flying and posturing than an expression of determined intent, as a practical examination of the actual nature of interconnections in the Internet shows a relatively uniform landscape of customer/provider or peering arrangements behind most interconnections, and no substantive evidence that inter-provider QoS, inter-provider MPLS VPNs, inter-provider multicast, or even inter-provider NGN architectures are any more than a collection of myths in this space.
Perhaps it’s just me, and perhaps I am increasingly intolerant of this kind of opportunistic posturing that attempts to portray as viable what are more along the lines of ill-conceived and inefficient adornments to the common substrate of an amazingly efficient and brutally simple IP architecture. However, perhaps I’m not alone in holding that view (although others may not be so extreme in their dismissal of these proposals). The workshop’s reception of the ETNO position on inter-provider QoS and the generic stance of “sender pays” was certainly chilly at this particular meeting, and indeed could even be termed distinctly frosty!
It also appeared to me that ETNO itself is increasingly aware that there is a risk that maintaining this public stance undermines, to some extent, their own considerable credibility and authority in the industry. There were some mitigating statements from ETNO in this meeting, in response to a request for ETNO to consider withdrawing its proposal to the WCIT ITR review process.
In terms of the larger regional regulatory agenda, the OECD/BEREC workshop was illustrative of an evolution in the regulatory stance, from the historical role of arbitrator between industry actors to that of advocate for consumer interests. This is, I think, reflective of the change in this industry, where the sustained level of competition in all of its aspects has empowered the end consumer to an extent not seen previously. I would argue that in many ways this is a unique development in telecommunications, and while not without its attendant risks in passing over to market-based dynamics functions that were previously well within a regulatory purview, it has been one where the accumulation of benefits to both the end consumer and to national economies is clearly evident. BEREC itself is working on documents relating to issues in interconnection, competition issues, practices that may cause harm to end users, and matters relating to the quality of the service that is delivered to consumers. The difference in the regulatory role between the very highly regulated environment of telephony and the largely liberalized Internet environment is evident here.
There is also awareness that one of the more critical risk factors in this market-driven environment is the creation of “bottlenecks” in the delivery of services to customers. Such bottlenecks admit the introduction of “gatekeepers,” which, in turn, admit the potential to impose rentals on those parties who are forced to pass their services through the bottleneck. If there is a failure of competitive pressure in the access market, there is a significant risk of such forced distortions appearing in the market, where these bottlenecks are exploited to extract scarcity rentals from the parties forced to pass their services through such points of constraint and imposed third-party control.
BEREC’s position, and the OECD position on the whole, has been largely one that is strongly in favour of market-based remedies, and reflects a reluctance to place the regulator back into the position of being the service facilitator. In my opinion this is a well-informed and insightful position, and one that matches the larger landscape of the Internet, which itself has largely decoupled the roles of carriage and content services. The outcomes, as observed in the development of the EU market for digitally mediated services and telecommunications, are very encouraging. Transit prices continue to fall, local exchange points provide a rich mesh of high quality connectivity, and the market is open to innovative models for service delivery, as evidenced by the recent introduction of the CDN sector, as an example.
There are always issues in such dynamic environments, and the Internet is no exception. Economies of scale can be seen to be at work in many aspects of the Internet. It was noted that just 200 entities provide (or aggregate) some 50% of the network’s delivered content. These entities, the “hypergiants” of content, expose an underlying issue in the interconnection environment: this is not necessarily a set of negotiations between equals, and forcing functions may be present within such negotiations. It was also noted that CDNs now account for around 50% of a retail ISP’s traffic profile, and therefore the relationship between CDNs and access providers is now a critical component of the Internet’s interconnection structure.
There is also some effort on the part of a number of national regulatory bodies to better understand the “consumer experience.” For example, the SamKnows programs in the US and the UK represent a somewhat novel form of industry self-monitoring that enjoys the active support of the national regulatory body. There is some interest in understanding the issues related to headline sync speed and data throughput, and in measuring the consumer experience in terms of stability and streaming performance. Should the interconnection regime look to optimize particular applications? Should access networks attempt to optimize their operating parameters to support a streaming content model, or should this be left to the application, with the networks instead optimizing bulk throughput? The BEREC program can essentially ground this work in a well-researched analytical framework, but the larger policy objectives of these kinds of performance monitoring exercises are unclear.
The final session of the day was focussed on the ITRs. ETNO presented its proposal, and, as noted above, encountered a somewhat frosty reception from the workshop’s participants. Indeed, such was the level of concentration on this particular topic that Rod Beckstrom’s comments relating to the new gTLDs in the DNS passed without comment. I gathered the strong impression that within this particular forum there is a strong commitment to market-based mechanisms in a liberalized regulatory framework for telecommunications services. While there is a certain degree of almost whimsical acceptance of the continued existence of a regulatory framework around telephony as just a fact of life, there is a strong undercurrent of resistance to expanding this regulatory framework into the Internet, even within the more constrained form of looking at just the area of voice over IP applications and their interaction with the legacy telephone system. I have covered the ITR topics in other recent articles, so I won’t repeat that here. I will note that the impression I gathered from both the regulators and the industry actors who attended this workshop was that there was a general level of support for the notion that the imposition of regulatory-inspired overheads upon the current arrangements for Internet services in Europe would be highly regressive and best not contemplated. With some small exceptions, there was a strong degree of confidence in the market-based mechanisms from most industry actors, and a general sense of beneficial consumer and economic outcomes on the part of the regulatory agencies from the European sector.