Why Orchestration Needs AI
Posted on: 28th April 2017
The great promise of virtualization – SDN and NFV in particular – has been the ability to make rapid, even instantaneous, change to networks. The freedom to expand resources seamlessly, without limitation. The replacement of physical work conducted in the field (planned for, budgeted for and resourced ahead of time) with electronic control, delivered “on demand” and “just in time”.
From the very first, the industry recognised the need for an Orchestrator – a capability that would provide overarching control of virtual resources, turning service demand into the instructions required to satisfy it.
The problem is, that vision was incomplete.
The work to define an industry standard for Orchestration, so that surrounding systems could interoperate, was necessary and worthwhile. But it also left critical questions out of scope:
- When should I use VNF A vs VNF B?
- Which VNFs represent a critical point of service failure?
- When will it be profitable to migrate physical services to virtualized alternatives?
- How will this all work on networks of continental scale?
As a result, while ETSI MANO was a springboard towards a realization of the vision, operators also continued to develop beyond it to meet the end goal of scalable business automation. (AT&T’s ECOMP paper does not include a single reference to AI, though much of its public discussion about its virtualization effort does).
MANO provides a standardised basis on which network change can be effected. But it does not (nor did it seek to) provide a basis for determining which network change should be effected.
In a world in which network change was slow and expensive (that is, the conventional world of dedicated-function devices and physical assets), telcos had the time to draw up policies, design conventions, rules of thumb.
But virtualization could change that completely.
The challenge of virtualization is not to imagine physical devices replaced by software equivalents. The average smartphone or connected TV gives enough of a reference point to understand that. The challenge of virtualization is to imagine a business capable of continuous adaptation to both external market and internal network conditions, continuously optimizing based on information (data) but driven by business objectives. And that is a paradigm that has already been played out in other industries, from pharmaceuticals to manufacturing, leveraging Artificial Intelligence.
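To make the idea concrete, the kernel of "optimizing based on data but driven by business objectives" can be sketched as a tiny decision function: given measured metrics for candidate VNFs, the business objective is just a set of weights that can be retuned without touching the network data. This is a minimal, hypothetical illustration – the VNF names, metrics, and scoring scheme are invented for the example and are not part of MANO, ONAP, or any vendor API.

```python
# Illustrative sketch only: VNF names, metrics, and weights are
# hypothetical, not drawn from any MANO/ONAP interface.
from dataclasses import dataclass

@dataclass
class VnfOption:
    name: str
    cost_per_hour: float   # operating cost of this VNF flavour
    latency_ms: float      # measured service latency

def choose_vnf(options, cost_weight=0.5, latency_weight=0.5):
    """Pick the option that best balances cost against latency.

    Metrics are normalised so the business objective (the weights)
    can be changed independently of the network measurements.
    """
    max_cost = max(o.cost_per_hour for o in options)
    max_lat = max(o.latency_ms for o in options)

    def score(o):
        return (cost_weight * o.cost_per_hour / max_cost
                + latency_weight * o.latency_ms / max_lat)

    return min(options, key=score)

options = [
    VnfOption("vnf-a", cost_per_hour=2.0, latency_ms=40.0),
    VnfOption("vnf-b", cost_per_hour=3.5, latency_ms=15.0),
]

# A cost-driven objective selects the cheaper VNF...
print(choose_vnf(options, cost_weight=0.9, latency_weight=0.1).name)  # vnf-a
# ...while a latency-driven objective selects the faster one.
print(choose_vnf(options, cost_weight=0.1, latency_weight=0.9).name)  # vnf-b
```

In practice the "weights" would themselves be learned and continuously revised from market and network data – which is exactly where AI enters the picture.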
The core activity of a telco – service and network design – is not part of any industry standard reference model for Orchestration. Yet automating it is critical to the success of the drive to virtualize networks. Insightful metrics alone can’t deliver that.
Today we’re announcing the release (from BOCO Inter-Telecom) of a new NFV Orchestrator, based on the excellent foundation of Open-O and ONAP, enhanced with the only technology capable of making intelligent, business-driven design choices at speed and scale: AI.