FG3 report

The third focus group at the SESERV Athens Workshop involved a discussion around interconnection agreements and monitoring, with an emphasis on technologies promoting collaboration between ISPs for QoS-aware service provision. In particular, the participants were asked to express their views on the socio-economic challenges that would appear if a framework proposed by the ETICS research project were a candidate for adoption. The technologies under the spotlight enrich the current best-effort Internet by allowing ISPs to collaborate during service management and establish QoS-aware network services.

The session was moderated by a SESERV partner, who was briefed to adopt a middle-ground approach: encouraging discussion and maintaining focus without too much interruption.

After a taxonomy of stakeholder roles was presented by the moderator, all fourteen participants were asked to select the most appropriate role based on their experiences or preferences. The table below gives the actual distribution of roles among the focus group participants, indicating that all roles apart from the Infrastructure Provider had at least one representative. Please note that one participant had declared the Infrastructure Provider role (last-mile provider) but was acting as an Edge ISP. We believe this did not negatively affect the discussion, since significant overlap between the interests of the two roles was expected for this particular case study.

Stakeholder role           Number of participants per role   Terms used
Connectivity Providers     2                                 EDGE, TRANSIT
Information Providers      1                                 GAMING PROVIDER
Infrastructure Providers   0                                 -
Users                      2                                 GAMER-1, GAMER-2
Policy makers              2                                 REGULATOR, ADMINISTRATIVE-AUTHORITY
Content owners             2                                 CONTENT-1, CONTENT-2
Technology Makers          4                                 PROJECT-1, PROJECT-2, VENDOR-1, VENDOR-2

The focus group continued with a presentation of one aspect of the ETICS project, introducing the “technologies promoting collaboration between ISPs for QoS-aware service provision” that the session was asked to discuss.

The focus group continued with the technology maker representative (PROJECT-1) introducing the motivation for such a framework, stating that the existing marketplace structure provides no incentives for collectively providing Quality of Experience (QoE) to end-users, which creates several tussles and consequently uncertainties for the Future Internet. He then presented the proposed technology through a scenario in which gamers request premium-quality access to a gaming server, and continued by describing a number of possible coordination schemes for realizing this case study. These alternative architectural schemes define (a) whether atomic offers are made available to the consortium of Network Service Providers (NSPs) proactively or on demand, and (b) which entity stitches those atomic network services (or Assured Service Quality goods) together to form the end-to-end path.
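To make the composition step concrete, the sketch below is a minimal illustration only: the offer structure, names, prices and delay figures are our own assumptions rather than ETICS specifications. It shows how per-segment atomic offers could be stitched into a single end-to-end path that respects a delay bound; which entity performs this stitching, and whether offers are collected proactively or on demand, is exactly the architectural choice outlined above.

```python
# Minimal sketch (not ETICS code): stitching per-segment "atomic" offers from
# NSPs into one end-to-end Assured Service Quality (ASQ) path. All names,
# prices and delays are illustrative assumptions.
from dataclasses import dataclass
from itertools import product

@dataclass(frozen=True)
class AtomicOffer:
    nsp: str          # network service provider publishing the offer
    src: str          # ingress point of the segment
    dst: str          # egress point of the segment
    delay_ms: float   # promised one-way delay for the segment
    price: float      # price asked for the segment

def stitch(offers, src, dst, max_delay_ms):
    """Return the cheapest chain of atomic offers from src to dst whose
    summed delay stays within the end-to-end bound (brute force for clarity)."""
    best = None
    for length in (1, 2, 3):                    # chains of up to three segments
        for chain in product(offers, repeat=length):
            connected = (chain[0].src == src and chain[-1].dst == dst and
                         all(a.dst == b.src for a, b in zip(chain, chain[1:])))
            if not connected:
                continue
            delay = sum(o.delay_ms for o in chain)
            price = sum(o.price for o in chain)
            if delay <= max_delay_ms and (best is None or price < best[0]):
                best = (price, chain)
    return best

offers = [
    AtomicOffer("EDGE",    "gamer", "ix-1",    8.0, 1.0),
    AtomicOffer("TRANSIT", "ix-1",  "ix-2",   12.0, 2.5),
    AtomicOffer("EDGE-2",  "ix-2",  "gameSrv", 6.0, 1.2),
]
print(stitch(offers, "gamer", "gameSrv", max_delay_ms=30.0))
```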

After the architecture had been presented, it was agreed to focus on a particular implementation scheme characterised by the introduction of a Facilitator function to manage and control routing across various ISPs in support of QoS (and QoE) expectations of end-users, in this case gamers.

Given the technological background of most participants, the presentation of the framework raised a number of questions among the different stakeholders, initially focused on the practicalities of collaboration and the assurance of end-to-end quality:

PROJECT-1: And obviously there is also the issue of the performance of the system and the experience of the users and how well-provisioned the paths will be, which will depend on the amount of cooperation versus competition these facilitators will have.

including some scepticism that appropriate collaboration could be achieved:

[PROJECT-1 continues] So these facilitators obviously represent sometimes a customer [but on occasion] also they represent a user, thus they represent both sides of the market. Networks are those who offer their resources and users are those who drive the demand and they have to offer resources also to the competing facilitators. So would there be simple market strategies such as ‘always offer worse performance to market rivals so that their customers are displeased and decide to move away’?

But such concerns were not seen as the most significant. Instead, there was a deep concern that smaller ISPs would be disadvantaged.

EDGE: […] I play the role of Edge ISPs, the smaller ones, and frankly speaking I feel very uncomfortable with this situation. Because, let’s say I am an Edge-ISP so I rely on a Transit-ISP  for getting my connectivity to the game server and I’m not part of any Facilitator. So I feel I have totally lost control and power. And I would [not] consider this as a fair, let’s say, solution and I would encourage actually the regulator to come up with a different suggestion on how to implement this.

This ended with a plea for intervention from the regulator. However, the same stakeholder then begins to explore other possible solutions:

[EDGE continues] One idea that I could possibly accept is the case of one facilitator where the different NSPs are shareholders. So I feel that I have access and some control and I can share some revenues of this facilitator company. If I just see this picture I feel more uncomfortable, I can be very flexible but that is just…

Unfortunately, neither this idea (for instance, which revenue-sharing scheme would be considered a fair one) nor the call to the regulator was developed further at this stage. Instead, in what proved to be a regular feature of the focus group, a dominant voice effectively tries to deal with a misunderstanding about the role of Facilitators:

PROJECT-1: [...] Theoretically nothing blocks a certain NSP from having its own facilitator, representing just its own network. Facilitator is just an entity for composing an SLA chain. That’s it.

and the possible effects of current market shares on future market structure:

PROJECT-1: Now whether it is meaningful to do this in a business environment then, OK, this is a different question. I can see clearly that a well-established network would have higher probability of attracting more customers because it already has customers. So yes I agree that smaller networks will have less chances for establishing [a facilitator] that will be attractive. But I’m saying that this would also be the case in the current market to compete for something different than net service.

Picking up on this, the moderator attempts to re-engage EDGE and draw out any other pertinent ideas:

FG3Mod: So EDGE can you think of a strategy to help you participate in this market as a facilitator?
EDGE:  Rather than allowing only large NSPs to play the role of facilitator we have to come up with a different model just to allow the smaller players to feel that they can participate in this new situation. I’m sure they don’t feel comfortable with the current situation and because this is something new, they want to make it right this time. So, if I can change, somehow, the change towards what they feel is better for them (the smaller ISPs), or pushing the regulator, I would try to do so since I’m in their position.

If the smaller ISPs feel disadvantaged, with their only recourse being via the regulator, then perhaps they need to join together and take the initiative themselves. They would effectively bypass the large carriers and be able to establish a certain degree of autonomy:

PROJECT-1: Just to add something to the discussion, they could team-up multiple smaller operators and create their own facilitator. And actually in today’s Internet you have something similar, which is called ‘donut peering’ in the US: multiple small edge ISPs that do not want to buy transit connectivity from the tier-1 providers. So what they do is go to all interconnection points and establish peering agreements amongst themselves. So basically, they bypass tier-1 providers. So you can imagine such business strategies being applied here.

The Edge ISP agreed that this would be a good strategy for making small ISPs more powerful, even though the US market is very different from the European one. However, the Edge ISP still believes regulatory intervention is the way to go:

EDGE: I’m trying to learn from history. So back in 1992, […], one of the baby bells which was an edge ISP, managed to push the regulator and change the situation in US, with the telecom act of 1996 and eventually edge ISPs took over the entire market. This is the path I would suggest.

For the smaller ISPs, then, there are major concerns about how the introduction of a Facilitator role would affect their ability to control their own commercial destiny. Without regulatory intervention, it appears that smaller ISPs are likely to retreat from the market, or be forced to collaborate with other small ISPs to increase their control during QoS path setup.

Irrespective of what goes on in the network, when it comes down to it, the end-users are really only concerned with the quality of their experience:

GAMER-1: I want to come in. As the gamer, the end-user pays a flat-rate or whatever, but then you recognized that I am prepared to pay more for a good experience. But [what] you guys are talking about is the QoS and looking at different ways, different actors to engage. I’m not interested in that; I’m interested in this stickiness, so what you are gonna give me? So as said I’m going to give you my custom and I will pay an extra rate for QoE. How does this help me, how does this guarantee me, am I going to get a good experience, whether I am at home or visiting my family in a different area?

The point is immediately lost; instead, the network provider refers to post hoc contracts which would be invoked for the benefit of the user. The end-user, though, tries to bring it back to his own level.

PROJECT-1: I guess you would have an SLA and if it [is] violated then the NSP will pay a penalty. So it is in his interest to deliver you what he promised by means of […]
GAMER-1: You tell me I have to pay a premium rate, which I’m happy to do but how does this ensure me that my premium rate guarantees me the service?

The network provider insists that appropriate SLA management is the way to handle all eventualities. To some degree, this appears to be a shift of responsibility: with an appropriate measure and appropriate management the problem will be resolved.

PROJECT-1: That’s a matter of SLA definition. There should be an SLA that [says] you have the right to use that gaming service and this should have a delay of that ms, jitter of that ms, and in case of net violation you should be rewarded by means of a penalty that the actor you have the SLA with will pay you. Basically, the idea is that for any ASQ good you have also an associated monitoring mechanism and an SLA penalty. So once you don’t get what you paid for then you will be compensated.
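To illustrate the mechanism being described, the following sketch captures the basic logic of an SLA with delay and jitter bounds, a monitoring check, and a penalty paid when the bounds are violated. The thresholds, the jitter proxy and the flat penalty are purely illustrative assumptions, not figures from the session.

```python
# Illustrative sketch only: an SLA with delay/jitter bounds, a monitoring
# check over measured samples, and a penalty owed on violation.
from dataclasses import dataclass
from statistics import mean, pstdev

@dataclass
class SLA:
    max_delay_ms: float    # one-way delay the provider commits to
    max_jitter_ms: float   # delay variation the provider commits to
    penalty: float         # compensation owed to the customer per violation

def check_sla(sla, delay_samples_ms):
    """Return the penalty owed for a monitoring interval (0.0 if compliant)."""
    observed_delay = mean(delay_samples_ms)
    observed_jitter = pstdev(delay_samples_ms)   # simple jitter proxy
    violated = (observed_delay > sla.max_delay_ms
                or observed_jitter > sla.max_jitter_ms)
    return sla.penalty if violated else 0.0

gaming_sla = SLA(max_delay_ms=40.0, max_jitter_ms=5.0, penalty=2.0)
print(check_sla(gaming_sla, [35.2, 38.9, 61.4, 44.0]))   # delay bound exceeded
```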

SLAs are really an external instrument overlaid on what is actually happening in the network. For them to be effective there needs to be some kind of management in place. The regulator attempts to open up this point:

ADMINISTRATIVE-AUTHORITY: What’s the monitoring mechanism?
PROJECT-1: Just a way to look in the network...
ADMINISTRATIVE-AUTHORITY: No what is it actually?

Other stakeholders take up the discussion though, thinking first of the issue of responsibility – who pays the penalty for SLA violation – and then turning to the thorny issue of how and what to monitor. Perhaps this was behind the regulator’s question.

PROJECT-2: So in case of a contract violation, either you have redundant policies within each ISP and everyone shares the responsibility, or there are delicate mechanisms to track the performance of each ISP.
TRANSIT: As a NSP of course we are monitoring our traffic, the bad thing is that we have different ideas of monitoring traffic. The mobile side has different measurement tools because they focus very much on the radio link, and the fixed side has other monitoring tools that we have to bring them together. We interconnect with other partners; they still have different tools. So it’s a landscape of many different tools.

Ultimately, end-users will probably have to have some kind of SLA monitoring tool which allows them to check that the premium service they have asked and paid for is actually being provided.

One very significant factor for reputable ISPs in delivering the promised quality is fear of public opinion:

TRANSIT: So, the press is having its role, of bringing transparency into the market. We are fighting very hard to find what they measure and where they are focusing on. If the press is focusing on facebook usage I think I would have done it myself, but we are focusing on the top applications. […] And if we interconnect with other carriers we look at quality, how we perceive it, and ask for their monitoring concepts […] and choose someone who fits our standards.

The connectivity provider acknowledges that they need to be responsive to how others perceive the service they offer, and be mindful that this may involve changing the metrics they use. But the end-user clearly picks up on the reference to the press and the social networking site Facebook, specifically as a forum in which to be heard.

GAMER-1: So this means that you care about your reputation and me, as an end-user, should make sure that press represents me. So your incentive to be a top provider …
TRANSIT: The end-user has quite power by blogs or similar platforms.

Social networks are therefore a powerful medium for the regulation and control of Internet services:

[TRANSIT continues] There is quite a power on the social web in criticizing business, and especially the gamers they are very technical savvy and they really find out and you cannot cheat them. They have the time enough to run very deep traceroute and similar things and would discover everything and they would post it to the public. So the Internet has to make things transparent.

The discussion then shifts away from end-users and returns to issues at the network level. One of the vendors raised the need for standardised interfaces for performing SLA monitoring, and mentioned that particular technologies can limit what can be monitored:

VENDOR-1: If you are talking to many networks you have to make sure that you provide inter-vendor interoperability and when there is a fault you need to be able to decide from which side it is coming from. Because talking about SLAs between a network operator and end-users, it is also coming down to vendors for identifying whose fault it is.

Furthermore, Content Providers identified potential lock-in as a very important issue. They appeared to be satisfied with the existing best-effort Internet, where all competing providers receive similar quality and connectivity costs are more predictable.

CONTENT-1: We are not in control [in this particular scenario]. Facilitator is supposed to be neutral. We are happy with the BE effort, where quality is the same and we can control the prices, especially since there is the fear of ISPs having more lock-in power.

It appears that the Edge and smaller ISPs are not alone in identifying fears for their existing business; the network equipment manufacturer (NEM) perceives a similar risk:

VENDOR-2: My problem is that I already sell to these guys. I’m wondering what the potential customers might be. So, I’m now wondering do I want to sell to all of them or do I want to sell to these facilitators instead? [So, do I] stay with my existing customers selling this same entity, [or is there] some new entity? This is my first question.

As evidenced, the ramifications of introducing the facilitator function (and its implementation details) are very complex. The original tussle – assuring QoS for real time services across multiple networks – and its potential resolution via the advent of a facilitator to manage collaboration between and across the different networks has implications beyond the movement of traffic across different connection environments.


FG3K: [The] Facilitator does not have its own boxes; it is a business entity
PROJECT-1: We can see [the] facilitator as a server that accumulates offers from the underlying networks and does some SLA composition by means of stitching SLA offers together.
VENDOR-2: [So I could be] selling the server, the software. I’m not going to do that unless this is more profitable than selling capacity to the people or to the ISP. I have to make sure that I don’t kill my own business.

Despite the new opportunity – selling servers and software – the NEM remains conservative, wanting to protect the existing market until it becomes clear that a new market would be just as profitable. The NEM then starts to explore more technical aspects of his current plight.


PROJECT-1: I guess the traditional market for ISPs is larger, I don’t see an issue here.
VENDOR-2: That’s what I’m worried about. I’m worried about my existing profits. I need to understand whether demand for less delay overlaps with routing problem or a capacity problem. If that’s a routing problem I need to understand whether I can solve that problem. My existing routing solution might not be as good for low-delay routing; and so I’m just saying that I may come up with a new product and not make any more money in the end. Anyway routing is routing, you cannot make low-delay routing anyway.

There is a genuine dichotomy for the NEM: developing a new product, but with a risk that it will not sell and will not help bridge the profitability gap.


PROJECT-2: It is not that your product is not good, I’m just addressing another problem.
VENDOR-2: What are the parameters that contribute to low delay?
PROJECT-2: If you want to calculate the shortest path your definition of the metric, which can be a cost metric, hops, whatever, but if this metric is not robust then the shortest path may not be […] the right metric, so there is no competition between vendors.
VENDOR-2: Yes but I’m worried. I’m trying to work out how I can make the most from all customers. What if this delay is queuing time or delay inside the network. If I was selling discriminating capacity I could sell it for a higher price than non-discriminating capacity to make sure I won’t be out of business.
[…]
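The exchange above turns on the choice of link metric in the shortest-path computation. As a purely illustrative sketch (the topology and delay figures are invented, not taken from the discussion), the same algorithm returns different routes when links are weighted by hop count versus by measured delay, which is the gap between a traditional routing product and ‘low-delay routing’:

```python
# Hedged illustration: the same shortest-path algorithm (Dijkstra) gives
# different routes depending on the link metric used as the weight.
import heapq

def dijkstra(graph, src, dst, metric):
    """graph: {node: [(neighbour, attrs), ...]}; metric picks the link weight."""
    queue, seen = [(0.0, src, [src])], set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == dst:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for neighbour, attrs in graph.get(node, []):
            if neighbour not in seen:
                heapq.heappush(queue, (cost + metric(attrs), neighbour, path + [neighbour]))
    return None

# Invented topology: the two-hop route has fewer hops but higher total delay.
graph = {
    "gamer":   [("transit", {"delay_ms": 30.0}), ("edge", {"delay_ms": 5.0})],
    "edge":    [("ix", {"delay_ms": 6.0})],
    "ix":      [("server", {"delay_ms": 7.0})],
    "transit": [("server", {"delay_ms": 25.0})],
}
print(dijkstra(graph, "gamer", "server", metric=lambda a: 1))              # hop count
print(dijkstra(graph, "gamer", "server", metric=lambda a: a["delay_ms"]))  # delay
```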

Again, the technology provider doesn’t share the NEM’s concerns, perhaps relying on adoption by smaller vendors to trigger technology take-up.

However, the NEM has looked at different options: developing a different product, shifting the customer base (from ISP to facilitator), and even reviewing any underlying market requirement that has yet to be addressed. Ultimately, though, the NEM is concerned about the potentially detrimental effect the proposed facilitator would have on his business.


VENDOR-2: Those are all my concerns. Whether the facilitator role would harm my market for existing equipment.

The moderator attempts to encourage the NEM towards a more pragmatic approach, based more closely on the technical solution proposed at the outset.


FG3Mod: Could you think of a way to react in this case? For example would not designing ETICS-based equipment be a candidate strategy, others possibly?

The NEM does indeed look creatively for solutions, manipulating the technology to help protect his investment.


VENDOR-2:  Customer lock-in. Do I make it with standardized interface or proprietary interface? My customers would like a standard product. But, then I have to worry about the regulators.

The NEM has little room to manoeuvre and may ultimately fall foul of the same regulator that the Edge ISP wanted to engage to help protect his business. Regulatory intervention is therefore a two-edged sword: on the one hand it helps to balance the market, but on the other it constrains independent responses to market changes because of anticipated legislation.

So we are left with an interesting take on regulation and public opinion in the third focus group. Traditionally, regulation has been presented in the context of tussle methodology as the ultimate arbiter for conflict resolution: regulators have the power to impose solutions when an impasse is reached. However, in this focus group it became clear that such intervention may not be necessary, because Internet stakeholders have understood how regulators are expected to react and may abandon anti-competitive strategies beforehand. Furthermore, we are seeing that it is perhaps not the regulator but the “technical[ly] savvy” end-user who may dictate the results, as in the case of reputable ISPs who want to meet end-users’ demands by proactively managing their networks.

The third SESERV Athens workshop keynote, almost in passing, noted that the telecoms regulators were of little real benefit and often did not have any independent means to corroborate what operators claimed; instead, it is the subjective response of end-users to varying performance that provides the information about traffic management. End-users may indeed demand appropriate monitoring and SLA management as discussed, but perhaps it is time to involve the suitably informed end-user as a key stakeholder and arbiter.

