Back in March, which now seems like ages ago, Aruba Networks announced the release of Aruba ESP, the industry’s first cloud-native platform designed to automate, unify, and secure the Edge. Why does today’s world need this new platform? What are its secret powers for your network? And how does it work?
According to IDC, 55 billion devices will be connected within the next two years, and they are expected to generate 79.4ZB of data by 2025. Combine that with the shift to work-from-home and distributed workforces, and there is a definite need for the right tools to keep pace. With this much data at the Edge, today’s networks, and the teams that manage them, are struggling to keep up.
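To put those IDC figures in perspective, here is a quick back-of-envelope calculation (purely illustrative, using decimal units; real devices obviously vary enormously in how much data they produce):

```python
# Rough averaging of the IDC figures cited above:
# 79.4 ZB of data across 55 billion connected devices.
ZB = 10**21  # zettabyte, in bytes (decimal units)
TB = 10**12  # terabyte, in bytes

total_data_bytes = 79.4 * ZB
connected_devices = 55e9

per_device_tb = total_data_bytes / connected_devices / TB
print(f"~{per_device_tb:.2f} TB per device")  # roughly 1.44 TB per device
```

Even averaged out, that is well over a terabyte per device, which is why the teams managing these networks feel the strain.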
Organizations need to ensure they have the right network foundation while staying ready for the next big technology transition or event. This is where Aruba ESP comes in. Aruba ESP combines AIOps, Zero Trust Security, and a Unified Infrastructure.
It helps IT by offering services at the Edge that include onboarding, provisioning, orchestration, analytics, location, and management, all accessed through Aruba Central. The SaaS consumption model enables rapid deployment and provides unified management, AIOps, and security. Through Central, network admins can use AI insights to quickly identify, troubleshoot, and resolve issues before they occur.
Several new innovations are included within the Aruba ESP platform.
Recently, new enhancements were announced that help unify IoT, IT, and Operational Technology (OT) networks, enabling customers to quickly adapt to changing environments and user requirements. Unifying these networks enables hyper-aware facilities that are safer, more adaptive, and more productive. This is a big leap beyond what basic connectivity and machine learning-based monitoring can achieve.
These enhancements are integral to sensing, analyzing, and reacting to device data and contextual information. Virtually every subsystem, from machine inputs and outputs (I/O) on a manufacturing floor to multimedia devices in the CEO suite, can be accommodated. Solutions are available for education, enterprise, healthcare, hospitality, industrial, manufacturing, retail, transportation, and government applications.
Some use cases with Aruba ESP-based hyper-awareness include smart buildings, industrial/manufacturing facilities and the broader Intelligent Edge.
Aruba ESP produces AI-powered insights with greater than 95% accuracy, helping to automatically improve communications and visibility across and among IoT, IT, and OT networks.
I’m certain anyone reading this is well aware of the statistics about data growth and how it is impacting storage requirements across all industries. This isn’t a new challenge in our industry, but the conversation has an added twist when we consider the impact of IoT. We commonly read about companies experiencing anywhere from 20% to 50% year-over-year growth. The terms “exploding!”, “explosive”, and “exponential” usually appear in articles about data growth (and now I’ve used all three in one post). While this growth continues to be spurred on by the traditional sources of business data (databases, documents, email, etc.), we are seeing even greater capacity requirements generated by IoT devices.
For this post, when I speak of data from IoT, I am lumping together data generated by security video, temperature gauges, vibration sensors, connected cars… you get the idea. In fact, according to some sources, IoT data is growing 10x faster than traditional data sets, and IDC estimates IoT devices will grow to 28.1 billion by 2020. So the data collected from these devices, and the storage solutions needed to maintain it, will become increasingly important.
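The 20% to 50% year-over-year growth figures cited above compound quickly. A short sketch makes the point (the 100TB starting capacity is an illustrative assumption, not a figure from this post):

```python
def projected_capacity(base_tb: float, annual_growth: float, years: int) -> float:
    """Project storage capacity assuming constant year-over-year growth."""
    return base_tb * (1 + annual_growth) ** years

# Illustrative: a 100TB environment today, projected 5 years out
# at the commonly reported growth rates of 20% and 50% per year.
for rate in (0.20, 0.50):
    tb = projected_capacity(100, rate, 5)
    print(f"{rate:.0%}/yr -> {tb:.0f} TB after 5 years")
# 20%/yr -> 249 TB after 5 years
# 50%/yr -> 759 TB after 5 years
```

At the high end, capacity needs grow more than sevenfold in five years, which is why scale-out designs matter.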
Among our clients, we see tremendous growth in the need for storage to maintain security surveillance video. Beyond simply providing a place for video streams to be written, our clients are analyzing the video, using software to perform anomaly detection, facial recognition, and more. I recently shared a couple of posts, written by two of my colleagues at Zunesis, that expand on this topic. And analytics aren’t limited to video. The value of IoT devices is that they capture data at the edge, where it is happening, and this is true across all IoT devices. Once the data is collected, software can analyze it to derive meaning beyond the raw data points and, in many cases, produce actionable insights. So storage for IoT data needs to reach large scale quickly and offer performance that allows analytics in near real time. And, of course, it still needs to provide the reliability and availability expected of any business-critical data.
To meet the storage requirements defined above, HPE has created a hardware platform and partnered with two software-defined storage (SDS) companies to provide scale-out storage solutions that grow from a couple hundred terabytes to petabytes while delivering the reliability and performance required of data generated by the ever-expanding number of IoT devices. The hardware is part of the HPE Apollo family; the software comes from SDS providers Qumulo and Scality. Here is a summary of each solution component:
The Apollo Family of systems from HPE are each designed to provide compute, storage, and networking that meet the needs of both scale-up and scale-out requirements. They are targeted at workloads supporting Big Data, analytics, object storage and high-performance computing.
The scale-out compute part of the HPE Apollo System portfolio includes the Apollo 2000 System for hyperscale and general-purpose scale-out computing and the Apollo 4000 System Family for Big Data analytics and object storage, while the Apollo 6000 and 8000 Systems are designed to support HPC and supercomputing. Density, ease of management (all incorporate HPE iLO management), and efficient rack-scalability are features shared by all members of the portfolio.
Qumulo is a software-defined scale-out NAS that scales to billions of files with a flash-first design. With the Apollo/Qumulo solution, you can scale from 200TB to over 5PB of usable capacity. This solution uses advanced block-level erasure coding and up-to-the-minute analytics for actionable data management. The file services provided by Qumulo are also supported in the public cloud, currently on Amazon Web Services.
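The post doesn’t specify the exact coding parameters, but the efficiency argument behind block-level erasure coding is easy to illustrate. For a scheme with k data blocks and m parity blocks, raw capacity consumed per usable byte is (k+m)/k while still tolerating m simultaneous block failures; the 6+2 layout below is a hypothetical example for comparison with 3-way replication:

```python
def erasure_overhead(k: int, m: int) -> float:
    """Raw capacity consumed per byte of usable data
    for an erasure-coded layout of k data + m parity blocks."""
    return (k + m) / k

# Hypothetical 6+2 layout: survives any 2 block failures.
ec = erasure_overhead(6, 2)   # ~1.33x raw capacity per usable byte
replication = 3.0             # 3-way replication also survives 2 failures

print(f"6+2 erasure coding: {ec:.2f}x raw capacity")
print(f"3-way replication:  {replication:.2f}x raw capacity")
```

For the same failure tolerance, erasure coding here consumes less than half the raw capacity of replication, which is what makes petabyte-scale deployments economical.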
Use cases include:
Scality is a software-defined scalable object storage solution that supports trillions of objects in a single namespace. With the Apollo/Scality solution, you can scale to over 5PB of usable capacity. The access and storage layers can be scaled independently to thousands of nodes, which can be accessed directly and concurrently with no added latency.
Use cases include:
So, yes, the data footprint is growing and won’t be slowing down anytime soon. If your data set falls outside the traditional business data sets and requires scale-out storage that supports large numbers of files and fast, actionable analysis, then you probably need to look beyond traditional scale-up storage to solutions purpose-built for these large-scale workloads. HPE Apollo, Qumulo, and Scality are a great starting point for your research.
The term “Intelligent Edge” is used in many ways, but perhaps the best way to think of it is as a place. The edge is where the action is. It’s a manufacturing floor, a building, a campus, a city, your house, a telecommunications outpost, a sports arena, or, in other words, where the “things” are in the Internet of Things (IoT). IoT data originates remotely, often from equipment at the edge, enabling immediate access to the data and immediate control of “things”.
The edge is “intelligent” because now there’s technology in these places that’s smart, connected, computational, and controlling. Crucially, the Intelligent Edge provides analytics capabilities that were formerly confined to on-premises or cloud data centers.
The media and telecom industries face growing distribution pressures from increased video resolution, new formats, expanding bandwidth, and the need for better security and reliability. As a result, telecom service providers are placing sophisticated compute and control systems in businesses and homes. These distributed intelligent edges make the services more competitive and improve customer experiences.
Edge computing can yield immediate insights from edge data at relatively low cost, and it improves further when enterprise-class compute, storage, and management shift from the data center out to the edge. Organizations can leverage compute at the edge to act on data right where it is generated.
The key to HPE’s IoT strategy is edge computing, combining OT and IT into one appliance. In June of 2016, HPE announced the Edgeline EL1000 and EL4000 series of devices, which allow more processing power to be deployed in the field, near the connected devices that generate data. Both devices accommodate the same m510 or m710x ProLiant cartridges used in HPE’s signature Moonshot chassis, with room for one cartridge in the EL1000 and four in the EL4000. Both appliances have integrated wireless networking.
In Las Vegas, Zunesis recently showcased AI-based image and video analytics applications that run on the Edgeline devices (in tandem with the customer’s VMS platform). The features and use cases we showcased are far-reaching.
Also exciting is HPE’s new Global IoT Innovation Lab – Asia Pacific (APAC), located at HPE’s APAC headquarters in Singapore. It is one of four such labs globally and offers immersive Edge Experience Zones that demonstrate practical IoT use cases for industries such as oil & gas, manufacturing, engineering, healthcare, retail, smart cities, and more.
For more details, check out this great article on the new Lab – http://www.techtradeasia.info/2018/02/hpe-opens-singapore-based-global-iot.html.