What Does Edge Computing Mean?
Edge computing is the practice of processing data as close to its source as possible in order to reduce network latency by minimizing communication time between clients and servers.
In an edge computing network architecture, data that was traditionally sent to a central data center or remote cloud service is processed locally. In many cases the data is processed on the originating device itself and only the most important data is ever transferred off the device.
In addition to facilitating real-time data processing, the benefits of edge computing include:
- Improved response time - data doesn’t have to travel to and from a remote data center for processing.
- Bandwidth optimization - only the most important data needs to be transferred over the network.
- Security optimization - less sensitive data travels over the network, reducing the exposure footprint if traffic is intercepted.
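The "only the most important data" idea can be made concrete with a short sketch. This is a hypothetical illustration, not a real edge SDK: the function names (`is_significant`, `thin`) and the threshold value are assumptions chosen for the example. The device evaluates each sensor reading locally and forwards only the anomalous ones, which is what drives the bandwidth and response-time benefits above.

```python
# Hypothetical edge-side filtering: process readings on the device and
# forward only the ones that matter. All names and the threshold are
# illustrative assumptions, not a real edge-computing API.

ANOMALY_THRESHOLD = 75.0  # assumed cutoff, e.g. degrees Celsius


def is_significant(reading: float) -> bool:
    """Decide locally whether a reading is worth sending upstream."""
    return reading > ANOMALY_THRESHOLD


def thin(readings: list[float]) -> list[float]:
    """Keep only significant readings; everything else stays on-device."""
    return [r for r in readings if is_significant(r)]


sensor_data = [21.4, 22.0, 78.3, 21.9, 90.1, 22.2]
to_upload = thin(sensor_data)
print(to_upload)  # only the two anomalous readings leave the device
```

In a real deployment the upload step would be a network call to a gateway or cloud endpoint; here it is reduced to a `print` so the filtering logic stays in focus.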
The technologies that are driving edge computing include the Internet of Things (IoT), software-defined networking (SDN), fifth generation wireless (5G) networking and blockchain.
Techopedia Explains Edge Computing
Edge computing works in various ways and contributes to IT architectures in different capacities. It is a popular means of improving network efficiency and strengthening security for business systems.
In the early days of big data, a consistent philosophy emerged: that best practices, in most cases, involved routing data to a central data warehouse, where it would be stored, retrieved, analyzed and transformed. This remained the dominant model until recently, when "edge" data collection started to arise as a practical alternative.
To collect data near the edge of a network, businesses look beyond the data warehouse and consider how to gather and analyze data near its source. An excellent example is IoT systems, where it may not be practical to funnel large volumes of device or sensor data into the data warehouse.
The pursuit of edge analytics is gaining ground in IoT architectures and other types of enterprise systems. Because companies can “thin” data or otherwise cull data results, edge data collection and analytics can help with issues such as network congestion and latency.
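The "thinning" described above can also take the form of on-device aggregation: rather than shipping every raw sample, an edge node collapses each window of samples into one compact summary before upload. The sketch below is a hypothetical illustration; the function names and window size are assumptions, not part of any real edge analytics product.

```python
# Hypothetical edge aggregation: summarize a high-rate sensor stream
# on-device so far fewer bytes cross the network. Names and the window
# size are illustrative assumptions.

def summarize(window: list[float]) -> dict:
    """Reduce a window of raw samples to a compact summary."""
    return {
        "min": min(window),
        "mean": sum(window) / len(window),
        "max": max(window),
    }


def aggregate(stream: list[float], window_size: int = 4) -> list[dict]:
    """Collapse the raw stream into per-window summaries before upload."""
    return [
        summarize(stream[i:i + window_size])
        for i in range(0, len(stream), window_size)
    ]


samples = [10.0, 12.0, 11.0, 13.0, 50.0, 52.0, 51.0, 53.0]
print(aggregate(samples))  # 8 raw samples shrink to 2 summaries
```

Eight samples become two small records, which is the kind of reduction that eases the network congestion and latency issues mentioned above.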
An intelligent device has its own computing capability, so it can process data as close to its source as possible. While this is useful when an instantaneous transfer of information is essential, it also expands the attack surface: intelligent devices at the edge can become entry points for cybersecurity threats.
To protect this new type of network node, many organizations are turning to Secure Access Service Edge (SASE), which combines software-defined wide area network (SD-WAN) capabilities with network security services. The SASE framework includes capabilities such as cloud access security brokers (CASBs), Zero Trust and next-gen firewalls as a service (FWaaS) in a single cloud service model.
Edge Computing vs. Fog Computing vs. MEC Computing
A lack of agreed-upon standards has complicated the way edge computing services are being marketed.
Although "edge" seems to be the most popular way of describing the concept of extending the cloud to the point where data originates, the competing labels Fog Computing and MEC Computing are also being used by vendors -- sometimes as synonyms.
To avoid confusion, network architects recommend using the label Edge Computing when discussing the general concept of reducing latency between the data's source and supporting compute/storage resources.
The label Fog Computing should be used when data is sent to a nearby gateway server for processing. Fog Computing is often associated with Cisco. Gateways may also be referred to as Fog servers or Fog nodes.
The label Multi-Access Edge Computing (MEC) should be used when discussing the open standards framework for edge computing that is being developed by the European Telecommunications Standards Institute (ETSI), a nonprofit standards group. The framework is designed to ensure developers have access to a consistent set of APIs.