In 2024, the number of edge devices is projected to hit a staggering 17.08 billion. And according to IDC, the volume of data generated at the edge will skyrocket at a compound annual growth rate of 34% through 2027, outpacing both core data centers and endpoint devices. This surge is ushering in a new era of edge computing that will fundamentally reshape our data-driven world.

Edge computing is no longer a peripheral concept but a central player in our global network infrastructure. Though this advancement brings processing closer to data sources for increased efficiency, it also introduces a series of concerning security challenges. Understanding these vulnerabilities is essential to maintaining a robust security posture in increasingly complex network environments.

Five Critical Edge Computing Security Considerations:

  1. Increased attack surface: As the edge grows, so does your business’s attack surface. With billions of devices operating at the edge, malicious actors have more potential entry points than ever before, and the number keeps growing. Without robust and responsive security strategies that can quickly adapt to threats at any point in your network, you risk falling victim to cyberattacks that can lead to data breaches, operational disruptions, and significant financial and reputational damage.
  2. Data privacy and compliance: With processing occurring closer to where data is generated, you must not only manage the sheer volume of incoming data but also ensure it complies with a complex web of data protection laws that vary by region and industry (e.g., GDPR, HIPAA, CCPA) to avoid breaches that could compromise customer privacy and your company’s integrity.
  3. Network security: As data moves between edge devices and the central network, it becomes more vulnerable to unauthorized access or interception. Robust encryption and secure communication protocols are essential to protect data in transit at every point of vulnerability and keep confidential information safe.
  4. Device and endpoint management: With the sheer volume and variety of devices in your edge environment, centralized management and security monitoring have become more complex, but also more critical. To maintain integrity across the network, you need to continuously identify, authenticate, and monitor the status of each device, while also ensuring it meets security compliance requirements across its lifecycle.
  5. Scalability of security solutions: As edge computing environments grow, security strategies must evolve with them. Traditional security measures are no longer sufficient. Today, a more flexible and intelligent system is required—one that can quickly learn from and respond to new threats as they emerge.
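To make the encryption point in item 3 concrete, here is a minimal sketch of protecting data in transit with TLS using Python's standard `ssl` module. The gateway hostname is a hypothetical placeholder, not a real endpoint:

```python
import socket
import ssl

def secure_channel(host: str, port: int = 443) -> ssl.SSLSocket:
    """Open a TLS-encrypted connection to an edge gateway, verifying
    its certificate against the system trust store."""
    context = ssl.create_default_context()             # CERT_REQUIRED + hostname checks
    context.minimum_version = ssl.TLSVersion.TLSv1_2   # refuse legacy protocol versions
    raw = socket.create_connection((host, port))
    # server_hostname enables SNI and certificate hostname verification
    return context.wrap_socket(raw, server_hostname=host)

# Hypothetical usage: wrap every device-to-core upload in a call like
#   sock = secure_channel("gateway.example.com")
```

The key design choice is that certificate verification and a modern minimum protocol version are enforced by default, so an intercepting middlebox cannot silently downgrade the connection.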

Edge computing offers tremendous potential to drive efficiency and innovation. But if your edge environment is not adequately protected, you’re opening the door to new, highly damaging threats that could undermine your entire network. With Aruba ClearPass, a robust network access solution that secures connectivity across an organization, you can safely realize anywhere, anytime connectivity while still maintaining a strong security posture that protects your business.

For more information about how Zunesis can help you secure your edge environment and embrace its many benefits using Aruba ClearPass, contact us here.






Back in March, which seems like ages ago, Aruba Networks announced the release of Aruba ESP. It’s the industry’s first cloud-native platform designed to automate, unify, and secure the Edge. Why the need for this new platform in today’s world? What are its secret powers for your network? And how does it work?


Why the Need for Aruba ESP?

According to IDC, 55 billion devices will be connected within the next two years, and they are expected to generate 79.4 ZB of data by 2025. Combine that with the shift to work-from-home and distributed workforces, and there is a definite need for the right tools to keep pace. With this much data at the Edge, today’s networks and the teams that manage them are struggling to keep up.

Organizations need to ensure they have the right network foundation while being ready for the next big technology transition or event. This is where Aruba ESP comes in. Aruba ESP combines AIOps, Zero Trust Security, and a Unified Infrastructure.


What can Aruba ESP do?



It helps IT with the following:

  • Identify and resolve issues quickly, preempting problems before they impact the business.
  • Protect against advanced threats from a vanishing security perimeter.
  • Monitor and manage thousands of wired, wireless and WAN devices across campus, branch, data center, or remote worker locations.
  • Quickly deploy network services at scale to support changing business needs.
  • Allow continued infrastructure investment in the face of uncertain financial changes.

Aruba ESP offers services at the Edge that include onboarding, provisioning, orchestration, analytics, location, and management, all accessed through Aruba Central. The SaaS consumption model enables rapid deployment and provides unified management, AIOps, and security. Through Central, network admins can use AI insights to quickly troubleshoot, identify, and resolve issues before they occur.


Significant innovations within Aruba ESP

Several new innovations are within the Aruba ESP platform:

  • Cloud-native management for any size enterprise: The industry’s only controller-less, cloud-based platform providing full-stack management and operations for wired, wireless, and SD-WAN infrastructure across campus, data center, branch, and remote-worker locations of any size, consumable on-premises or in the cloud.
  • Simplified daily operations with unified infrastructure: The latest version of Aruba Central has simplified navigation, advanced search, and contextual views.
  • Reduced resolution time with AI and automation: Aruba’s new AI Insights reduces troubleshooting time by identifying hard-to-see network configuration issues and providing root-cause, prescriptive recommendations and automated remediation to continuously optimize network operations.
  • AI-powered IT efficiencies: AI Search enables IT teams to eliminate “swivel chair” investigations, and AI Assist uses event-driven automation to collect and post all relevant data for both the internal help desk and the Aruba Technical Assistance Center (TAC).
  • Granular visibility across applications, devices and the network: User-centric analytics from User Experience Insight help identify client, application, and network performance issues faster.
  • Extension of next-gen switching to distributed and mid-size enterprises: The Aruba CX 6200 switch series brings built-in analytics and automation capabilities to every network edge where user and device connectivity occurs, generating insights that can inform better business outcomes.
  • Ongoing innovation with new Developer Hub: A comprehensive resource for developers that includes Aruba APIs and documentation to streamline the development of innovative, next-generation edge applications leveraging the open Aruba ESP platform.

Recently, new enhancements were announced that help unify IoT, IT, and Operational Technology (OT) networks, enabling customers to quickly adapt to changing environments and user requirements. Unifying these networks enables hyper-aware facilities that are safer, more adaptive, and more productive. This is a big leap forward over what can be achieved with basic connectivity and machine learning-based monitoring.

These enhancements are integral to sensing, analyzing, and reacting to device data and contextual information. Virtually every subsystem can be accommodated, from machine inputs and outputs (I/O) on a manufacturing floor to multimedia devices in the CEO suite. Solutions are available for education, enterprise, healthcare, hospitality, industrial, manufacturing, retail, transportation and government applications.


Some Use Cases

Some use cases with Aruba ESP-based hyper-awareness include smart buildings, industrial/manufacturing facilities and the broader Intelligent Edge.

Hyper-aware smart buildings for enterprises, education, healthcare, hospitality, retail, and government:

  1. Building control and digital twin enablement: Identify sub-optimized processes, recommend operational enhancements, and monitor the trajectory of energy usage needed for proactive interventions.
  2. Context-aware, real-time integrated emergency response and notification: Actively communicates with tenants, visitors, and staff. 4D graphics enable first responders to quickly see where people are within buildings.
  3. Seamless extension of the 5G footprint with Wi-Fi: Mobile operators can extend their 5G footprint into the building, seamlessly powering Wi-Fi calling using Aruba Air Slice technology.

Hyper-aware industrial facilities:

  1. Migrating from break/fix to proactive maintenance: Machinery sensors monitor equipment to identify points of failure and send notifications before failures happen, improving productivity, reliability, and efficiency.
  2. Reducing mean time to repair with location services: Provides site occupants with turn-by-turn navigation to a destination without human assistance.
  3. Monitoring personnel and asset safety: Can deliver real-time 3D situational awareness by tracking the location of people and assets.  It can integrate with automated ventilation, geofencing, and vehicular navigation systems.

Aruba ESP produces AI-powered insights with greater than 95% accuracy. It helps automatically improve communications and visibility across and among IoT, IT, and OT networks.

Have more questions about Aruba ESP? Attend our webinar on September 30th or reach out to one of our account reps to learn more.

Future of Data Growth


I’m certain anyone reading this is well aware of the statistics about data growth and how it is impacting storage requirements across all industries. This isn’t a new challenge in our industry, but the conversation does have an added twist when we consider the impact of IoT. We commonly read about companies experiencing anywhere from 20% to 50% year-over-year growth. The terms “exploding!”, “explosive”, and “exponential” are usually found in articles associated with data growth (and now I’ve used all three in one post). While this data growth continues to be spurred on by the traditional sources associated with business data (databases, documents, email, etc.), we are seeing even greater capacity requirements being generated by IoT devices.


For this post, when I speak of data from IoT, I am lumping together data generated by security video, temperature gauges, vibration sensors, connected cars… you get the idea. In fact, according to some sources, IoT data is growing at 10x the rate of traditional data sets. And IDC estimates IoT devices will grow to 28.1 billion by 2020. So, data collected from these devices, and the storage solutions needed to maintain this data, will become increasingly important.
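To see how quickly those year-over-year rates compound, here is a small sketch; the 200 TB starting point and 40% growth rate are illustrative values I chose, not figures from the analysts cited above:

```python
def project_capacity(current_tb: float, annual_growth: float, years: int) -> float:
    """Compound a storage footprint forward by `years` years
    at a fixed annual growth rate (e.g. 0.40 for 40%)."""
    return current_tb * (1 + annual_growth) ** years

# Illustrative: a 200 TB footprint growing 40% per year for 5 years
# reaches 200 * 1.4**5 ≈ 1075.6 TB — past the petabyte mark.
```

At these rates a capacity plan that looks comfortable today can be exhausted within a single hardware refresh cycle, which is why scale-out designs matter.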


Storage for IoT Data


Among our clients, we see tremendous growth in the need for storage to maintain security surveillance video. Beyond simply providing a place for video streams to be written, our clients are analyzing the video, utilizing software to perform anomaly detection, facial recognition, etc. I shared a couple of posts recently, written by two of my colleagues at Zunesis, that expand on this topic. And analytics isn’t isolated to video only. The value of IoT devices is that they capture data at the edge, where it is happening, and this is true across all IoT devices. Once collected, software can perform analysis of the data to derive meaning beyond the data points and, in many cases, produce actionable insights. So, storage for IoT data needs to reach large scale quickly and have performance characteristics that allow analytics in near real time. And, of course, this storage still needs to provide the reliability and availability associated with any business-critical data.
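As a toy illustration of the kind of near-real-time analytics described above, here is a rolling z-score anomaly detector over a sensor stream. The window size and threshold are arbitrary illustrative values, not parameters of any product mentioned in this post:

```python
from collections import deque
from statistics import mean, stdev

def make_anomaly_detector(window: int = 20, threshold: float = 3.0):
    """Return a checker that flags readings more than `threshold`
    standard deviations away from the mean of recent readings."""
    recent = deque(maxlen=window)

    def check(value: float) -> bool:
        is_anomaly = (
            len(recent) >= 2               # need a baseline first
            and stdev(recent) > 0
            and abs(value - mean(recent)) > threshold * stdev(recent)
        )
        recent.append(value)               # fold the reading into the baseline
        return is_anomaly

    return check
```

Because the detector keeps only a small fixed window in memory, it can run on a constrained edge device and raise an alert the moment a reading spikes, rather than after the data lands in central storage.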


To meet the storage requirements defined above, HPE has created a hardware platform and partnered with two software defined storage (SDS) companies to provide solutions for scale-out storage that will grow from a couple hundred terabytes to petabytes and provide both the reliability and performance required of the data generated by the ever-expanding number of IoT devices. The HPE hardware is part of the HPE Apollo family. The software that utilizes this hardware comes from Software Defined Storage providers, Qumulo and Scality. Here is a summary for each of these solution components:


The Apollo Family of compute and storage:


Each system in the HPE Apollo Family is designed to provide compute, storage, and networking that meet both scale-up and scale-out requirements. They are targeted at workloads supporting Big Data, analytics, object storage, and high-performance computing.



The scale-out compute portion of the HPE Apollo System portfolio includes the Apollo 2000 System for hyperscale and general-purpose scale-out computing and the Apollo 4000 System Family for Big Data analytics and object storage, while the Apollo 6000 and 8000 Systems are designed to support HPC and supercomputing. Density, ease of management (all incorporate HPE iLO management), and efficient rack-scalability are features shared by all members of the portfolio.


Qumulo File Fabric (QF2)


Qumulo is a software-defined scale-out NAS that scales to billions of files in a flash-first design. With the Apollo/Qumulo solution, you can scale from 200 TB to over 5 PB of usable capacity. This solution uses advanced block-level erasure coding and up-to-the-minute analytics for actionable data management. The file services provided by Qumulo are also supported in the public cloud, currently on Amazon Web Services.



Use cases include:

  • Media & Entertainment
  • Security Video
  • Life Sciences & Medical Research
  • Higher Education
  • Automotive
  • Oil & Gas
  • Large Scale Online/Internet
  • Telco/Cable/Satellite
  • Earth Sciences


Scality RING


Scality is a software-defined scalable object storage solution that supports trillions of objects in a single namespace. With the Apollo/Scality solution, you can scale to over 5 PB of usable capacity. The access and storage layers can be scaled independently to thousands of nodes that can be accessed directly and concurrently with no added latency.



Use cases include:

  • Media & Entertainment
  • Security Video
  • Financial Services
  • Medical Imaging
  • Service Providers
  • Backup Storage
  • Public Sector


So, yes, the data footprint is growing and won’t be slowing down anytime soon. If your data set is outside the traditional business data sets and requires scale-out storage that supports large numbers of files and the ability to perform actionable analysis quickly, then you probably need to look beyond traditional scale-up storage and consider solutions purpose-built for these large-scale workloads. HPE Apollo, Qumulo, and Scality are a great starting point for your research.

The term “Intelligent Edge” is used in many ways, but perhaps the best way to think of it is as a place. The edge is where the action is. It’s a manufacturing floor, a building, a campus, a city, your house, a telecommunications outpost, a sports arena, or in other words, where the “things” are in the Internet of Things (IoT). IoT data originates remotely, often from equipment at the edge, enabling immediate access to the data and immediate control of “things.”


The edge is “intelligent” because now there’s technology in these places that’s smart, connected, computational, and controlling.  Crucially, the Intelligent Edge provides analytics capabilities that were formerly confined to on-premises or cloud data centers.


The media and telecom industries face growing distribution pressures from increased video resolution, new formats, expanding bandwidth, and the need for better security and reliability. As a result, telecom service providers are placing sophisticated compute and control systems in businesses and homes. These distributed intelligent edges make the services more competitive and improve customer experiences.


Seven Reasons To Compute at the Edge


Edge computing can yield immediate insights from edge data at relatively low cost. Edge compute can be improved by shifting enterprise-class compute, storage, and management from the data center out to the edge. Organizations can leverage compute at the edge to:


  1. Minimize Latency: There are many applications that require immediate insight and control. For some mission-critical functions, compute must take place at the edge because any latency is intolerable.
  2. Reduce Bandwidth: Sending big data back and forth from things to the cloud can consume enormous bandwidth. Edge computing is the easiest solution to this problem.
  3. Lower Cost: Even if bandwidth is available, it can be costly. Efficiency is an important element of any corporate IoT strategy.
  4. Reduce Threats: When you transfer data across the campus, state, country or ocean, it is simply more prone to attacks and breaches. Processing data at the edge can reduce security vulnerabilities.
  5. Avoid duplication: If all the data is collected and sent to the cloud, there will likely be some equipment duplication in memory, storage, networking equipment and software. If this duplication is not needed, then the associated increases in capital and operating expenditures are unwarranted.
  6. Improve reliability: Even without any nefarious activity from hackers, data can be corrupted on its own. Retries, drops and missed connections will plague edge-to-data-center communications.
  7. Maintain Compliance: Laws and corporate policies govern the remote transfer of data.
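The bandwidth and cost arguments (points 2 and 3) usually come down to summarizing data at the edge instead of shipping every raw sample upstream. A minimal sketch, with the summary fields chosen purely for illustration:

```python
from statistics import mean

def summarize_batch(readings: list[float]) -> dict:
    """Collapse a batch of raw sensor readings into a compact
    summary record suitable for forwarding to the cloud."""
    return {
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "mean": round(mean(readings), 3),
    }

# A batch of 1,000 raw samples becomes one 4-field record,
# roughly a 250x reduction in the number of values sent upstream.
```

The raw samples can still be retained locally (or discarded, subject to the compliance point in item 7), while the central network receives only the summary it needs for trending and alerting.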


HPE Edgeline at the Intelligent Edge


The key to HPE’s IoT strategy is edge computing, combining OT and IT into one appliance. In June of 2016, HPE announced the Edgeline EL1000 and EL4000 series of devices, which allow more processing power to be deployed into the field near the connected devices that generate data. Both devices accommodate the same M510 or M710x ProLiant cartridges used in HPE’s signature Moonshot chassis, with room for one cartridge in the Edgeline EL1000 appliance and four cartridges in the EL4000. Both appliances have integrated wireless networking.


HPE Edgeline E1000



HPE Edgeline E4000


In Las Vegas, Zunesis recently showcased AI-based image and video analytics applications that run on Edgeline devices (in tandem with the customer’s VMS platform). The features and use cases are far-reaching; here are a few key features we showcased:


  • Immediate identification of key image elements, such as people or an abandoned bag.
  • Processing video nearer the point of capture (the camera), eliminating the need to transfer large amounts of video data into the data center or cloud, thus reducing bandwidth costs, accelerating reaction time, and lowering the risk of corruption or espionage.
  • Engagement geometry for smart video surveillance that can direct camera viewpoints and aggregate video data for a particular location.
  • Compression of long videos into minutes by superimposing all moving elements over the same unchanging background; the technology is ideal for identifying popular items or routes.
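The last feature depends on separating moving elements from a static background. Here is a toy sketch of the underlying idea, simple frame differencing on grayscale frames; this illustrates the general technique, not the showcased product’s actual algorithm:

```python
def changed_pixels(prev, curr, threshold=30):
    """Return the (row, col) positions whose brightness changed by more
    than `threshold` between two grayscale frames, represented as
    lists of lists of 0-255 integers. Pixels that change belong to
    moving elements; everything else is the unchanging background."""
    return [
        (r, c)
        for r, row in enumerate(curr)
        for c, value in enumerate(row)
        if abs(value - prev[r][c]) > threshold
    ]
```

Real systems refine this with background models and noise filtering, but the core loop, comparing each frame to a reference and keeping only what moved, is what makes superimposing motion over a static background possible.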


Also exciting is HPE’s new Global IoT Innovation Lab – Asia Pacific (APAC), located at HPE’s APAC headquarters in Singapore. It is one of four such labs globally, offering immersive Edge Experience Zones that demonstrate practical IoT use cases for industries such as oil & gas, manufacturing, engineering, healthcare, retail, smart cities, and more.


For more details, check out this great article on the new Lab – http://www.techtradeasia.info/2018/02/hpe-opens-singapore-based-global-iot.html.

