We are now quite accustomed to the concept of a Content Delivery Network, commonly known as a CDN; that concept, together with advances in technology, has led to a new era known as Edge Computing. Edge computing is a paradigm that brings computation and storage closer to the location where they are required, in order to improve response times and save bandwidth.
CDNs were developed in the late 1990s to serve web and video content from edge servers deployed close to users, and evolved in the early 2000s to host applications and application components at the edge server.
The growing number of IoT devices at the edge of the network is producing massive amounts of data to be processed at data centers, pushing network bandwidth requirements to the limit. Despite improvements in network technology, data centers cannot guarantee acceptable transfer rates and response times, which can be a critical requirement for many applications.
The aim of Edge Computing is to move the computation away from data centers towards the edge of the network, exploiting smart objects, mobile phones, or network gateways to perform tasks and provide services on behalf of the cloud. By moving services to the edge, it is possible to provide content caching, service delivery, storage, and IoT management resulting in better response times and transfer rates. At the same time, distributing the logic in different network nodes introduces new issues and challenges as follows:
- Privacy And Security: The distributed nature of this paradigm introduces a shift from the security schemes used in cloud computing. Not only should data be encrypted, but different encryption mechanisms should be adopted, since data may transit between many distributed nodes before reaching the cloud.
- Scalability: Scaling across heterogeneous devices, with differing performance and energy constraints, must cope with highly dynamic conditions and with connections that are less reliable than the more robust infrastructure of cloud data centers. In addition, security requirements may introduce further latency in the communication between nodes, which may slow down the scaling process.
- Reliability: Management of failover is crucial. If a single node goes down and becomes unreachable, users should still be able to access a service without interruptions. Moreover, edge computing systems must provide actions to recover from a failure and alert the user about the incident. To this aim, each device must maintain the network topology of the entire distributed system, so that error detection and recovery become easily applicable.
- Speed: Edge computing brings analytical computational resources close to the end users and therefore reduces communication latency. A well-designed edge platform can significantly outperform a traditional cloud-based system.
- Efficiency: Due to the proximity of the analytical resources to the end users, sophisticated analytical and Artificial Intelligence tools can run at the edge of the system. This placement at the edge increases operational efficiency and brings further advantages to the system.
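The reliability challenge above, selecting a reachable node and falling back when one goes down, can be sketched as follows. This is a minimal illustration, not a production failover scheme: the node registry, the simulated health check, and all names and latency figures are hypothetical assumptions, and a real system would probe nodes over the network.

```python
# Hypothetical registry of edge nodes plus a cloud fallback.
# In a real deployment, `healthy` and `latency_ms` would come from
# periodic heartbeat probes, not static values.
NODES = [
    {"name": "edge-a", "healthy": False, "latency_ms": 12},
    {"name": "edge-b", "healthy": True,  "latency_ms": 18},
    {"name": "cloud",  "healthy": True,  "latency_ms": 95},
]

def probe(node):
    """Simulated health check; a real one would send a heartbeat request."""
    return node["healthy"]

def select_node(nodes):
    """Pick the lowest-latency healthy node, falling back to the cloud.

    If no node at all is reachable, raise so the caller can alert the user,
    as the Reliability requirement above demands.
    """
    healthy = [n for n in nodes if probe(n)]
    if not healthy:
        raise RuntimeError("no reachable node; alert the user")
    return min(healthy, key=lambda n: n["latency_ms"])

print(select_node(NODES)["name"])  # edge-a is down, so edge-b is chosen
```

Keeping the full node list on every device mirrors the idea that each node maintains the topology of the distributed system, so failure detection and recovery stay local decisions.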
Edge application services reduce the volumes of data that must be moved, the consequent traffic, and the distance that data must travel. That provides lower latency and reduces transmission costs. Computation offloading for real-time applications, such as facial recognition algorithms, showed considerable improvements in response times, as demonstrated in early research. Another use of the architecture is cloud gaming, where some aspects of a game could run in the cloud, while the rendered video is transferred to lightweight clients running on devices such as mobile phones, VR glasses, etc. This type of streaming is also known as pixel streaming.