Does Network Virtualization Make Your Head Spin?
Posted by Jonathan Tombes on Nov 1, 2018

For nearly two decades, I’ve tracked trends at the SCTE-ISBE Cable-Tec Expo. This year’s event in Atlanta provided many chances to do so, including the Light Reading breakfast session on “Virtualizing the Cable Architecture.” My high-level takeaway from this Oct 23 gathering: Virtualization is in a holding pattern, at least as far as cable is concerned.
It is not surprising that an industry once focused exclusively on physical gear has trouble with this topic. After all, experts define virtualization as the creation of separate virtual instances of hardware and operating systems on a single physical server. In other words, the concept is defined in contrast to hardware itself. That made me all the more interested in what the software experts on this panel had to say.
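To make that definition concrete, here is a minimal sketch, an illustration of the general idea rather than anything discussed on the panel. It assumes the libvirt Python bindings and a local QEMU/KVM hypervisor, and simply lists the separate guest OS instances sharing one physical server.

```python
# Minimal sketch: enumerate the virtual machines (separate OS instances)
# running on a single physical host. Assumes the libvirt Python bindings
# and a local QEMU/KVM hypervisor; the URI and setup are illustrative,
# not tied to any vendor stack mentioned in this article.
import libvirt


def list_guests(uri="qemu:///system"):
    conn = libvirt.open(uri)  # connect to the hypervisor on this server
    try:
        for dom in conn.listAllDomains():
            state = "running" if dom.isActive() else "shut off"
            # Each domain is its own virtual instance of hardware plus OS,
            # yet all of them share the same physical machine.
            print(f"{dom.name()}: {state}")
    finally:
        conn.close()


if __name__ == "__main__":
    list_guests()
```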
Needed: Virtualization Business Drivers
Pete Koat, CTO of Incognito Software, which provides operators with service orchestration for residential broadband, characterized the progress of virtualization as “two steps forward, one step backward.” He said the lack of business drivers has largely confined this technology to the lab and that “fractured” approaches to virtualization pose other impediments.
The fractures have occurred in the realm of standards and industry organizations. One model is the Open Network Automation Platform (ONAP). Designed to accelerate the development of a virtual network functions (VNF) ecosystem, it consolidated two separate efforts and has the support of AT&T, China Mobile and several dozen of the largest cloud and network operators, including Comcast. Another approach is Open Source MANO (OSM). (MANO = management and orchestration.) It aligns with European Telecommunications Standards Institute (ETSI) models; its adopters include British Telecom and Telefonica.
The cable industry may lean toward ONAP, given Comcast’s support, but the divergence suggests that the category is still a work in progress. “Reference architectures are good,” said Oren Marmur, VP and head of NFV at Amdocs, a software and services provider to communications and media companies. “But there are too many standards.”
A proposed Software-Defined Networking (SDN)-NFV framework, attributed to the SDN NFV World Congress, illustrates the complexities. Shared by Light Reading moderator Alan Breznick in his introductory comments, it is a headache-inducing eye chart. (See image above.)
CableLabs, Cox and DAA
For its part, CableLabs is a founding member of another group, Open Platform for NFV (OPNFV), a collaborative project initiated in 2014 under the Linux Foundation. A few years later, CableLabs launched the SDN and NFV Application Platform and Stack (SNAP), its own effort to accelerate and ease the adoption of network virtualization within the cable industry.
While engaged in testing and development, CableLabs serves the needs of its practical-minded members. “NFV is not innovation,” said Don Clarke, CableLabs’ principal architect of network technologies, reminding the audience in Atlanta that new technology is not an end unto itself. “It’s what you do with NFV that’s innovation.”
The right business goals are key, but it takes more. Marmur underscored the cultural implications of a shift toward software. Koat said that any such change would need to involve people and process, not just technology.
In his keynote, Jeff Finkelstein, Executive Director of Advanced Technology for Cox Communications, shared a projected 10-year access technology evolution. The Cox plan for getting to symmetrical, multi-gigabit data services involves moving from DOCSIS 3.1 to Remote PHY to Full Duplex DOCSIS 3.1 to targeted PON. What about virtualization?
Start with DAA
The migration toward a Distributed Access Architecture (DAA), in which PHY and possibly MAC functionality moves to the edge of the network, is where virtualization could enter the picture, as the sketch below suggests. Incidentally, a five-hour pre-conference session on DAA, held the day before this breakfast session, was heavily attended. (And at last year’s Expo, the pre-conference session on Remote PHY drew twice as many attendees as expected.)
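As a rough illustration of why DAA and virtualization come up together, here is a small Python sketch, my own simplification rather than a Cox or CableLabs model, of where the DOCSIS MAC and PHY functions sit in a traditional integrated CCAP versus the Remote PHY and Remote MACPHY flavors of DAA. Once the functions left in the headend are pure software, they become natural candidates for running as virtualized workloads.

```python
# Rough sketch of Distributed Access Architecture (DAA) options: which DOCSIS
# functions stay in the headend/hub and which move out to the fiber node.
# A simplified teaching model, not an implementation of any standard.
from dataclasses import dataclass, field


@dataclass
class AccessArchitecture:
    name: str
    headend_functions: list = field(default_factory=list)  # candidates for virtualization
    node_functions: list = field(default_factory=list)     # pushed to the network edge


ARCHITECTURES = [
    AccessArchitecture(
        "Integrated CCAP",
        headend_functions=["DOCSIS MAC", "DOCSIS PHY", "RF modulation"],
        node_functions=["analog optics only"],
    ),
    AccessArchitecture(
        "Remote PHY",
        headend_functions=["DOCSIS MAC"],  # the MAC core can run as software
        node_functions=["DOCSIS PHY", "RF modulation"],
    ),
    AccessArchitecture(
        "Remote MACPHY",
        headend_functions=["routing / orchestration"],
        node_functions=["DOCSIS MAC", "DOCSIS PHY", "RF modulation"],
    ),
]

for arch in ARCHITECTURES:
    print(f"{arch.name}:")
    print(f"  headend (virtualization candidates): {', '.join(arch.headend_functions)}")
    print(f"  node (edge): {', '.join(arch.node_functions)}")
```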
That seems about right. The industry needs to get a grip on DAA first. Then it will be able to see how virtualization fits in.