SDN networks transform Big Data into information capital

Bithika Khargharia, senior engineer, Vertical Solutions and Architecture at Extreme Networks | Oct. 16, 2013
SDN has huge potential to build the intelligent, adaptive network that Big Data analytics requires. Because the control plane is separated from the data plane, SDN provides a well-defined programmatic interface through which software intelligence can program networks that are highly customizable, scalable and agile, meeting the requirements of Big Data on demand.

SDN can configure the network on-demand to the right size and shape for compute VMs to talk to one another optimally. This directly addresses the biggest challenge facing Big Data, a massively parallel application: slow processing speeds. Processing is slow because most compute VMs in a Big Data application spend a significant amount of time waiting for massive volumes of data to arrive during scatter-gather operations before they can begin processing. With SDN, the network can create secure pathways on-demand and scale capacity up during scatter-gather operations, thereby significantly reducing the waiting time and hence the overall processing time.
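
As a rough illustration of what "programming the network on-demand" could look like, the sketch below asks an SDN controller to provision a temporary bandwidth guarantee for shuffle traffic between mapper and reducer hosts through a northbound REST API. The controller address, endpoint path and payload fields are hypothetical placeholders, not any particular vendor's API.

```python
# Sketch: request extra capacity for a scatter-gather (shuffle) phase,
# then let the controller tear it down after a time-to-live expires.
# The controller URL, endpoint and payload schema are hypothetical.
import requests

CONTROLLER = "http://sdn-controller.example.com:8181"

def provision_shuffle_path(mappers, reducers, mbps, duration_s):
    """Request a guaranteed-bandwidth path set for shuffle traffic."""
    policy = {
        "name": "hadoop-shuffle-boost",
        "endpoints": [{"src": m, "dst": r} for m in mappers for r in reducers],
        "min_bandwidth_mbps": mbps,   # scale capacity up for the burst
        "ttl_seconds": duration_s,    # released automatically afterwards
    }
    resp = requests.post(f"{CONTROLLER}/policies/paths", json=policy, timeout=5)
    resp.raise_for_status()
    return resp.json()["policy_id"]   # hypothetical response field

if __name__ == "__main__":
    pid = provision_shuffle_path(
        mappers=["10.0.1.11", "10.0.1.12"],
        reducers=["10.0.2.21"],
        mbps=2000,
        duration_s=300,
    )
    print("installed shuffle policy", pid)
```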

This software intelligence, which is fundamentally an understanding of what the application needs from the network, can be derived with much precision and efficiency for Big Data applications. The reason is two-fold: 1) the existence of well-defined computation and communication patterns, such as Hadoop's Split-Merge or Map-Reduce paradigm; and 2) the existence of a centralized management structure that makes it possible to leverage application-level information, e.g. Hadoop Scheduler or HBase Master.
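
To show how that centralized management structure can be tapped, the sketch below polls a YARN ResourceManager's REST API for running MapReduce applications and flags jobs whose progress suggests the communication-heavy shuffle phase is imminent. It assumes a Hadoop 2.x deployment with the ResourceManager web service reachable; the 50% progress threshold is an arbitrary illustration, not part of any standard.

```python
# Sketch: pull application-level state from Hadoop's central management
# (here a YARN ResourceManager) so a controller application can anticipate
# the shuffle phase and pre-provision the network for it.
import requests

RM = "http://resourcemanager.example.com:8088"   # assumed ResourceManager address

def running_mapreduce_apps():
    """Return RUNNING applications as reported by the ResourceManager."""
    resp = requests.get(f"{RM}/ws/v1/cluster/apps",
                        params={"states": "RUNNING"}, timeout=5)
    resp.raise_for_status()
    apps = (resp.json().get("apps") or {}).get("app", [])
    return [a for a in apps if a.get("applicationType") == "MAPREDUCE"]

def jobs_nearing_shuffle(threshold=50.0):
    """Flag jobs whose overall progress suggests the shuffle is imminent."""
    return [a["id"] for a in running_mapreduce_apps()
            if a.get("progress", 0.0) >= threshold]

if __name__ == "__main__":
    print("jobs to pre-provision for:", jobs_nearing_shuffle())
```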

With the aid of the SDN controller, which has a global view of the underlying network (its state, its utilization, etc.), the software intelligence can accurately translate the application's needs by programming the network on-demand.
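
To make the controller's "global view" concrete, the toy model below treats the topology the controller has learned as a weighted graph, with link weights standing in for current utilization, and selects the least-loaded path between two top-of-rack switches; a real controller would then install flow entries along that path. The node names and utilization figures are made-up examples.

```python
# Toy model: the controller's global view as a weighted graph.
# Picking the minimum-weight path approximates "steer the shuffle
# over the least-loaded links".
import networkx as nx

topology = nx.Graph()
topology.add_edge("tor-1", "spine-1", utilization=0.70)
topology.add_edge("tor-1", "spine-2", utilization=0.20)
topology.add_edge("tor-2", "spine-1", utilization=0.30)
topology.add_edge("tor-2", "spine-2", utilization=0.25)

def least_loaded_path(src_tor, dst_tor):
    """Return the path whose links are currently least utilized."""
    return nx.shortest_path(topology, src_tor, dst_tor, weight="utilization")

print(least_loaded_path("tor-1", "tor-2"))
# e.g. ['tor-1', 'spine-2', 'tor-2']; the controller would then
# program flow entries along these hops on demand.
```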

SDN also offers other features that assist with the management, integration and analysis of Big Data. New SDN-oriented technologies, including the OpenFlow protocol and the OpenStack orchestration platform, promise to make network management easier, more intelligent and highly automated. OpenStack enables the set-up and configuration of network elements with far less manual effort, and OpenFlow assists in network automation, providing greater flexibility to support new pressures such as data center automation, BYOD, security and application acceleration.
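
As one example of the kind of automation OpenFlow enables, the sketch below uses the open-source Ryu controller framework to install, on each switch that connects, a rule placing Hadoop shuffle traffic on a dedicated QoS queue. The queue ID and the shuffle port (13562, the Hadoop 2.x default) are deployment-specific assumptions, and the queue itself would have to be configured on the switch separately.

```python
# Sketch of OpenFlow-based automation with the Ryu controller framework:
# when a switch connects, install a rule that sends Hadoop shuffle traffic
# to a dedicated queue so the scatter-gather burst gets guaranteed service.
from ryu.base import app_manager
from ryu.controller import ofp_event
from ryu.controller.handler import CONFIG_DISPATCHER, set_ev_cls
from ryu.ofproto import ofproto_v1_3


class ShuffleQoS(app_manager.RyuApp):
    OFP_VERSIONS = [ofproto_v1_3.OFP_VERSION]

    @set_ev_cls(ofp_event.EventOFPSwitchFeatures, CONFIG_DISPATCHER)
    def switch_features_handler(self, ev):
        dp = ev.msg.datapath
        ofp, parser = dp.ofproto, dp.ofproto_parser

        # Match TCP traffic destined to the (assumed) shuffle port 13562.
        match = parser.OFPMatch(eth_type=0x0800, ip_proto=6, tcp_dst=13562)

        # Place matching packets on queue 1 and forward them normally.
        actions = [parser.OFPActionSetQueue(1),
                   parser.OFPActionOutput(ofp.OFPP_NORMAL)]
        inst = [parser.OFPInstructionActions(ofp.OFPIT_APPLY_ACTIONS, actions)]

        dp.send_msg(parser.OFPFlowMod(datapath=dp, priority=100,
                                      match=match, instructions=inst))
```

Because the rule is pushed from the controller rather than configured box by box, the same policy follows the application across every OpenFlow-enabled switch in the fabric.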

From a scale standpoint, SDN also plays a critical role in developing network infrastructure for Big Data, facilitating streamlined management of thousands of switches, as well as the interoperability between vendors that lays the groundwork for accelerated network build-out and application development. OpenFlow, a vendor-agnostic protocol that works with any vendor's OpenFlow-enabled devices, enables this interoperability, unshackling organizations from the proprietary solutions that could hinder them as they work to transform Big Data into information capital.

As the powerful implications and potential of Big Data become increasingly clear, ensuring that the network is prepared to scale to these emerging demands will be a critical step in guaranteeing long-term success. It is clear that a successful solution will leverage two key elements: the existence of patterns in Big Data applications and the programmability of the network that SDN offers. From that vantage point, SDN is indeed poised to play an important role in enabling the network to adapt further and faster, driving the pace of knowledge and innovation.
