
The Edge Is the Remedy for Sluggish Computing in Higher Education

What Is Edge Computing, and How Are Universities Using It?

Edge computing is a distributed computing strategy that places compute and storage capacity at the edge of the network, where and when it is needed. It isn’t an alternative to the cloud; rather, it complements a cloud architecture. If the data center and network core are the center of the network, a typical “edge” would be an on-campus building (or, more likely, an entire remote campus). That’s the edge of the network, and edge computing architectures put compute and storage capacity out there, far from the center.

The classic use case for edge computing is IoT, and the easiest-to-understand IoT example is security cameras. When cameras are installed or upgraded, edge computing can help correct a few predictable but often ignored design problems.

As colleges and universities invest more in video monitoring, the number of cameras on campus has skyrocketed. Security teams love to install 4K cameras that record at 30 frames per second without considering the 20 megabits per second of data throughput each camera will generate when people are moving around.

How Edge Computing Improves Campus Physical Security

In a traditional, centralized architecture — whether in an on-campus data center or in the cloud — everything is sent to a network video recorder. It’s true that a campus with 10-gigabit-per-second interbuilding links can handle a lot of cameras, but bandwidth isn’t the only consideration; streaming all that data to storage creates a real challenge and runs up serious costs.
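
To see why, it helps to run the numbers. The quick Python calculation below uses only the figures already cited: 20 megabits per second per camera and a 10-gigabit-per-second inter-building link. The 500-camera campus is an illustrative assumption, not a real inventory.

# Back-of-envelope math using the article's figures: 20 Mbps per 4K camera
# and a 10 Gbps inter-building link. The 500-camera campus is hypothetical.
CAMERA_MBPS = 20      # per-camera throughput when the scene is busy
LINK_MBPS = 10_000    # 10 Gbps inter-building link

# Cameras that saturate the link, before any other traffic is counted
print(f"Cameras per link: {LINK_MBPS // CAMERA_MBPS}")                # 500

# Continuous recording: 20 Mbps = 2.5 MB/s, about 216 GB per camera per day
gb_per_camera_day = CAMERA_MBPS / 8 * 86_400 / 1_000
print(f"Per camera: ~{gb_per_camera_day:.0f} GB/day")                 # ~216

# Central storage for the hypothetical 500-camera campus
print(f"500 cameras: ~{500 * gb_per_camera_day / 1_000:.0f} TB/day")  # ~108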

Storing all that video isn’t impossible, though. Higher education IT teams have experience with high-performance computing, and they certainly can build the storage subsystems and CPU capacity to handle thousands of cameras. But that’s a poor use of resources: HPC hardware is an expensive way to solve a problem that an edge computing architecture can handle at a fraction of the price.

An alternative architecture uses edge computing, keeping the video stream for a building within that building and handling detection and alerts close to the cameras. Anything of interest is streamed immediately to a central security operations center, with the raw footage stored at the edge, just a click away. Edge computing is a great fit for this application. And if remote buildings lack fiber connections to the main campus, edge computing helps there too, because cameras won’t compete with users for bandwidth or degrade the user experience with added latency and congestion.
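
A minimal sketch of that pattern in Python follows, assuming a generic RTSP camera and OpenCV’s stock background subtractor for detection. The camera URL and SOC endpoint are placeholders, and a production deployment would use a real video management system rather than this loop; the point is only that the heavy stream never leaves the building unless someone asks for it.

# Minimal sketch of the edge pattern: raw video stays in the building, cheap
# motion detection runs next to the cameras, and only small event alerts
# cross the backbone to the SOC. Both URLs are hypothetical placeholders.
import cv2        # OpenCV for capture and background subtraction
import requests   # any HTTP client would do

CAMERA_URL = "rtsp://camera-01.building-a.example.edu/stream"  # placeholder
SOC_ALERT_URL = "https://soc.example.edu/api/alerts"           # placeholder

def monitor() -> None:
    cap = cv2.VideoCapture(CAMERA_URL)
    subtractor = cv2.createBackgroundSubtractorMOG2()  # simple motion model
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # Full-resolution frames would be written to edge storage here.
        mask = subtractor.apply(frame)
        if cv2.countNonZero(mask) > 50_000:  # crude "something moved" gate
            # Only a small alert, not the 20 Mbps stream, leaves the building.
            requests.post(SOC_ALERT_URL, timeout=5,
                          json={"camera": CAMERA_URL, "event": "motion"})

if __name__ == "__main__":
    monitor()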

Edge computing can also be a win for video applications in the cloud by offering lower storage and computing costs, lower bandwidth fees and less congestion between the campus and the cloud.

Edge Computing Optimizes Technology Throughout Campus

Video isn’t the only IoT application where edge computing makes sense. Smart building sensors, for example, monitor power, temperature, humidity, presence, light and more at frequent intervals, generating a steady stream of readings that rarely change. Using edge computing to pre-process that monitoring data distributes the load and makes central monitoring and reporting more responsive.
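
One common pre-processing technique here is a deadband filter: the edge node forwards a reading only when it drifts meaningfully from the last value it sent. A minimal sketch, with an illustrative 0.5-degree threshold:

# Sketch of edge pre-processing for slowly changing sensor data: forward a
# reading only when it leaves a deadband around the last value sent, so
# stable readings never cross the backbone. The threshold is illustrative.
class DeadbandFilter:
    def __init__(self, threshold: float):
        self.threshold = threshold
        self.last_sent: float | None = None

    def should_forward(self, value: float) -> bool:
        if self.last_sent is None or abs(value - self.last_sent) >= self.threshold:
            self.last_sent = value
            return True
        return False

# Temperature sampled often at the edge; only three of six readings forward.
temps = DeadbandFilter(threshold=0.5)
for reading in (21.0, 21.1, 21.2, 21.7, 21.6, 22.3):
    if temps.should_forward(reading):
        print("forward", reading)   # prints 21.0, 21.7, 22.3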

And it’s not just IoT that can benefit. IT applications such as network switch monitoring also fit an edge computing architecture, and many monitoring platforms already support remote data collection agents for exactly this reason. Today, IT teams often must trim the network status information they collect to reduce the load on central databases and monitoring consoles. With a distributed model built on edge computing, they can collect every imaginable performance statistic at frequent intervals, a huge timesaver when it comes to debugging and troubleshooting network performance complaints.
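
A sketch of what such an agent might look like is below. Here, poll_switch() is a stand-in for whatever SNMP or streaming-telemetry call a real agent would make, and the polling and roll-up intervals are illustrative.

# Sketch of a distributed collection agent: poll at a fine interval, keep the
# full-resolution samples at the edge for troubleshooting, and ship only
# periodic roll-ups to the central console. The poll is simulated here.
import random
import statistics
import time

def poll_switch(switch: str) -> float:
    """Stand-in for an SNMP/telemetry poll; returns simulated utilization %."""
    return random.uniform(0, 100)

def send_to_central_console(summary: dict) -> None:
    print("rollup:", summary)  # placeholder for the uplink to the console

def run_agent(switch: str, poll_seconds: float = 10, rollup_every: int = 30) -> None:
    local_samples: list[float] = []   # full detail stays at the edge
    while True:
        local_samples.append(poll_switch(switch))
        if len(local_samples) % rollup_every == 0:
            recent = local_samples[-rollup_every:]
            send_to_central_console({
                "switch": switch,
                "avg_util": round(statistics.mean(recent), 1),
                "max_util": round(max(recent), 1),
            })
        time.sleep(poll_seconds)

# Example (runs indefinitely): run_agent("switch-01.example.edu", poll_seconds=1)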

Edge computing is not a particular piece of software or hardware you install, but a design model for IT teams that delivers benefits by putting compute and storage capacity at a distance from central data centers.

But edge computing also comes with costs: There are servers and storage systems to be purchased, installed, managed, monitored, upgraded and secured at the edge of the network. IT teams should make a careful cost-benefit analysis before they dive into edge computing.

Not every application, and not every building, is perfect for edge computing. But having edge computing as part of your toolkit can deliver a better user experience, higher application performance and more efficient use of resources.
