Wednesday, October 9, 2024

The Future of Decentralized Computing

What is Edge Computing? Edge Computing is a distributed approach to networking that brings computation closer to where data is generated and processed. Data sources are everywhere, because apps keep generating data that can be captured and processed. The apps installed on your phone, for instance, often have their data processed on computational devices outside of it. The idea is to bring the computation as close to the source as possible.

Edge Computing has become popular because of its minimal latency, meaning computation is completed in a very short span of time. It is especially useful for streaming, be it on-demand services like Netflix, Hulu, HBO Max, Disney+, Amazon Prime Video, and Apple TV+, or live streaming like Instagram Live, live events on YouTube, or sports events broadcast on various platforms as they happen in real time.

We have come to expect a certain standard of streaming quality, without lags or delays. A viewer in a distant location, say India, can live stream a UFC match happening in Vegas with virtually no delay. This is made possible through Edge Computing, and all of it happens over the cloud.

Let's look at some examples to understand the concept better. Take a security system. If you own a single property, the security system is simple: you may have a couple of cameras at the front door and in your backyard, and they stream live visuals that are easy to track and process. Now consider that you are in charge of many such properties - apartments, condos, townhouses, or even corporate buildings. There will be hundreds if not thousands of cameras and a much wider area to secure. The footage from all these cameras is captured 24/7, whether or not there is any motion in the monitored areas. Transferring this raw footage in real time from Source X to Source Y over the Internet consumes a lot of bandwidth, much of it unnecessarily. It can also overload the computing server where the raw footage is dumped for processing and cause computing delays. You cannot expect the same speed you would expect from a streaming service; in such cases the computation may take minutes or even hours. This could work to your detriment: by the time the footage of something untoward reaches you, the damage may already have been done.

Now consider a situation where a server runs an algorithm that selects only the footage that contains motion. If there is a layer between the data-capturing devices (in this case, the cameras) and the computation servers, it can do this pre-computation - checking for motion within the footage - and send only that footage to the main computational servers, which then trigger the security alarm if there is an issue.

This whole process saves a lot of load on the servers while also using less data bandwidth. The main benefit is that the streaming stays smooth, without delays or lags. This process of building a logic layer at the source, where pre-computation happens before the data is moved to the final destination, is the basic premise of Edge Computing.
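To make the idea concrete, here is a minimal sketch of such a motion pre-filter in Python. It assumes grayscale frames arriving as numpy arrays; the threshold value and the `upload` callback are hypothetical stand-ins for real tuning and a real network send.

```python
import numpy as np

# Hypothetical tuning value: the mean pixel delta that counts as motion.
MOTION_THRESHOLD = 25.0

def has_motion(previous_frame: np.ndarray, current_frame: np.ndarray) -> bool:
    """Cheap pre-computation done at the edge: compare consecutive frames."""
    delta = np.abs(current_frame.astype(np.int16) - previous_frame.astype(np.int16))
    return float(delta.mean()) > MOTION_THRESHOLD

def edge_filter(frames, upload):
    """Forward only the frames that contain motion to the central server.

    `frames` is any iterable of grayscale frames (2-D numpy arrays);
    `upload` is a placeholder callback standing in for the network send.
    """
    previous = None
    for frame in frames:
        if previous is not None and has_motion(previous, frame):
            upload(frame)  # only motion footage consumes bandwidth
        previous = frame
```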

What is Cloud Computing, in contrast? Cloud Computing is the method of using remote servers hosted on the internet, instead of physical servers, to store and process data. Earlier in the internet era, one would do all such tasks on premises. Every web application, whether something like Photoshop or a food-delivery or rideshare app, needs to be hosted on a server. Before the advent of the cloud, hosting generally happened on premises. "On premises" here means physical servers, owned and maintained by the people or companies running the app, that they use to host their apps and software.

People used physical servers for different reasons - lack of trust in an emerging technology, wanting to keep their data where they could see it, or even because cloud services were expensive at the time. One had to buy high-end machinery and hardware to make this work, but the application could be deployed and the software installed on premises.

Over the years this practice became both expensive and outdated: the bigger your application got, the more real estate you needed to house the ever-growing servers. Hence Cloud Computing was invented, wherein, as an app owner or a company, you do not have to worry about infrastructure and can host your app on a remote server maintained by another company. You only need to make sure the application is executable; the framework, architecture, and databases are all hosted remotely. When you need to access any of this information, all you need to do is raise a request through code. The best part about the cloud is that it lets you scale your application seamlessly, with better storage services, databases, and networking capabilities. Then there are concepts like Platform as a Service (PaaS), Software as a Service (SaaS), Infrastructure as a Service (IaaS), and so on.
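As an illustration of "raising a request through code," here is a minimal sketch using AWS S3, a storage service from one of the cloud providers mentioned later in this post. The bucket and object names are hypothetical; running it requires an AWS account with credentials configured for boto3.

```python
import boto3  # AWS SDK for Python; requires configured AWS credentials

# Hypothetical bucket and object names, for illustration only.
s3 = boto3.client("s3")

# Store data on a remote server instead of an on-premises machine.
s3.put_object(
    Bucket="example-app-data",
    Key="reports/2024-10.csv",
    Body=b"date,orders\n2024-10-01,42\n",
)

# Later, "raise a request through code" to read it back.
response = s3.get_object(Bucket="example-app-data", Key="reports/2024-10.csv")
print(response["Body"].read().decode())
```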

The first major difference between Edge and Cloud Computing is programming. Cloud Computing is a centralized system where instructions, queries, and requests come in from various platforms - mobile apps, web apps, desktop apps, and so on. One would typically program the system over the cloud using a specific programming language, depending on the platform making the computational request, whereas in Edge Computing one could use different implementation strategies for web applications, mobile apps, desktop apps, or IoT devices.

Second, Edge Computing needs a comprehensive security strategy, because the pre-computational layer at the source could potentially be manipulated and is prone to malicious activity, which could disrupt the transfer of information. Cloud Computing does not need such extensive measures on your part, because security is generally taken care of by the companies providing the service.

Additional security can be added to Cloud platforms too, but it is easier to implement there and requires less effort in general. It is therefore very important to factor in strict security requirements and a sophisticated security architecture when building an Edge Computing system at the source.

Third is latency - the time that elapses between a client initiating contact or making a query and receiving a response from the computational source. As discussed earlier, the reason Edge Computing gained popularity is its minimal latency. In Cloud Computing, since the computation happens far away from the source, sometimes hundreds or thousands of miles away, there can be a lag in responses. This makes for a bad user experience, especially since we have come to expect a certain standard of streaming and immediate processing of any kind of instruction. Sometimes avoiding delays can be a matter of security too.

But in Edge Computing, since the computation happens close to the source, lags are minimal, imperceptible, or nonexistent, depending on the application. In simpler terms, Edge Computing is used to process time-sensitive data, whereas Cloud Computing can be used for data that is not time-critical, such as diagnostic data.
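One simple way to see this difference is to time a round trip to each destination. Below is a rough sketch; both URLs are hypothetical placeholders for a nearby edge node and a distant cloud region.

```python
import time
import urllib.request

def round_trip_ms(url: str) -> float:
    """Latency: time between sending a request and receiving the full response."""
    start = time.perf_counter()
    urllib.request.urlopen(url, timeout=10).read()
    return (time.perf_counter() - start) * 1000.0

# Hypothetical endpoints: a nearby edge node versus a distant cloud region.
print("edge :", round_trip_ms("http://edge-node.local/health"), "ms")
print("cloud:", round_trip_ms("https://far-region.example.com/health"), "ms")
```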

Fourth is operations. In Cloud Computing, all of the computation happens at the destination - a computational system built over the cloud. In Edge Computing, pre-computation happens at the source before the main computation is done at the destination. And when we talk about adding an extra computational layer at the source, it doesn't always have to be built from scratch; sometimes it comes built into a device, like a mobile phone or laptop, or into an app itself.

Let's look at the pros and cons of Edge Computing first. In Edge Computing there is very little latency, since the pre-computation happens at the source, and it requires less data bandwidth. That makes Edge Computing desirable for seamless streaming, enhanced security, collecting health data through wearable devices, creating smart homes, and much more.

The second advantage is security. Since most of the data is processed at the source and not all of it is transferred to a cloud location, the exposure to data leaks is greatly reduced. It is also straightforward to apply traditional encryption and access protection between Edge and Cloud servers, which makes the system less vulnerable and more resilient to attacks.
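As a sketch of what such protection between an edge device and a cloud server might look like, here is a simple HMAC scheme using Python's standard library. The pre-shared key is a hypothetical example; a real deployment would use proper key provisioning and transport encryption (such as TLS) on top of this.

```python
import hashlib
import hmac
import json

# Hypothetical pre-shared key provisioned on both the edge device and the cloud server.
SHARED_KEY = b"edge-device-7f3a-secret"

def sign_payload(payload: dict) -> dict:
    """Edge side: attach an HMAC so the cloud can verify the data's origin."""
    body = json.dumps(payload, sort_keys=True).encode()
    signature = hmac.new(SHARED_KEY, body, hashlib.sha256).hexdigest()
    return {"body": payload, "signature": signature}

def verify_payload(message: dict) -> bool:
    """Cloud side: reject messages that were tampered with in transit."""
    body = json.dumps(message["body"], sort_keys=True).encode()
    expected = hmac.new(SHARED_KEY, body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, message["signature"])

# Example: a motion alert signed at the edge and verified in the cloud.
alert = sign_payload({"camera": 12, "event": "motion", "ts": 1707000000})
assert verify_payload(alert)
```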

The third advantage is quick decision-making. Because communication time is minimal, Edge Computing facilitates real-time decisions - in the security camera example, it helps address security threats promptly, since alarms can be triggered more quickly when the data flow is smoother.

Fourth, Edge is more cost effective: since less data bandwidth is used, transmission costs are lower.
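A back-of-the-envelope calculation makes the bandwidth argument concrete. The camera count, bitrate, and motion fraction below are illustrative assumptions, not figures from the post.

```python
# Back-of-the-envelope bandwidth comparison (illustrative numbers only).
CAMERAS = 200
BITRATE_MBPS = 4          # assumed per-camera stream bitrate
MOTION_FRACTION = 0.05    # assumed share of footage that actually contains motion

always_on = CAMERAS * BITRATE_MBPS            # every camera uploads everything
edge_filtered = always_on * MOTION_FRACTION   # only motion footage is uploaded

print(f"raw upload : {always_on} Mbps")       # 800 Mbps
print(f"edge upload: {edge_filtered} Mbps")   # 40.0 Mbps
```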

The number one disadvantage of Edge Computing is control and reliability. The very thing that makes Edge Computing desirable - computation happening closer to the source - can also be a disadvantage, because it is a decentralized mechanism that needs human intervention. It also requires skilled implementers who understand how it works.

The second disadvantage is compatibility. Edge Computing may not be the right fit for every use case: large amounts of data are generated, and sometimes the pre-computation may not work as expected. Additionally, there are fewer compatible devices on the market right now, although this may change in the future.

The third disadvantage is security breaches. Although Edge Computing helps against data leaks, not every device has built-in authentication and security capabilities, so devices are susceptible to data tampering at local points of the Edge network.

The fourth disadvantage is loss of data. Not all data makes it to the computational servers, because some of it is discarded by the Edge devices. What if a device is faulty and does not capture the data as expected? That leads to a loss of information.

Next, let's look at the pros and cons of Cloud Computing. The number one advantage is availability. If an Edge device fails, there is no way to reach the data it was handling, but the Cloud is a completely centralized system: no matter where you are located, even if there is a lag or downtime, there will eventually be a response, so data recovery is easier. The chances of total failure are very, very low.

The second advantage is cost. Cloud services have become widely adopted, and they are an affordable alternative to on-premises data warehousing and storage because you don't have to worry about real estate for the servers. Moreover, many services are packaged in bundles, which has made Cloud Computing more accessible in general. Edge Computing, because of the need to build the pre-computation layer, can be more expensive than Cloud Computing.

The third advantage is data storage. We used to store extra data on external hard drives, which were susceptible to damage or theft - and that is just at the individual level. Businesses work with humongous amounts of data, and they shouldn't have to worry about storing it securely. The Cloud provides that space, and at cheaper rates too. If your storage needs grow by a few terabytes, you can always request more space from the providers without any of your systems crashing under the extra load. This is not really possible with Edge devices, which may not be able to handle such huge data loads.

The fourth advantage is security. Data warehouses are high-security facilities managed by the Cloud service providers and monitored 24/7, so security breaches and data leaks are rare.

There are disadvantages to Cloud Computing as well. Breaches due to vulnerabilities are first and foremost. If a data warehouse comes under attack from hackers who breach the security lines by exploiting single points of vulnerability in the system, your data is at risk of being stolen or tampered with. The biggest names in the Cloud industry are AWS, GCP, and Azure; if they come under a cyber threat of any kind, or if their data warehouses are physically damaged by any means, it could affect your data. Since these are human-built systems, there is always a risk of something being missed. This could cause loss of revenue, loss of customer trust, and, more importantly, loss of the customers' data itself.

The second disadvantage with the Cloud is downtime. The system can undergo reboots, network issues, outages, overheating, and downtime, which may cause lags in response and impact businesses negatively.

Take, for example, the major Netflix outage in the recent past caused by AWS being down; it took hours to bring the service back up. Or take the time the New York subways stopped because another outage severely impacted their computers. Food delivery apps crash too, leaving people unable to order food for hours. In 2020, COVID testing had to be halted because the servers were down.

Cloud downtime can have real-world repercussions. This is why multi-cloud adoption is gaining popularity, wherein a business hosts its data with multiple Cloud providers.
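A very simplified sketch of the multi-cloud idea: keep mirrored endpoints with more than one provider and fail over between them. The URLs below are hypothetical placeholders.

```python
import urllib.request

# Hypothetical mirrored endpoints hosted with two different cloud providers.
PROVIDERS = [
    "https://app.provider-a.example.com/api/orders",
    "https://app.provider-b.example.com/api/orders",
]

def fetch_with_failover(urls=PROVIDERS) -> bytes:
    """Try each provider in turn so one cloud outage does not take the app down."""
    last_error = None
    for url in urls:
        try:
            return urllib.request.urlopen(url, timeout=5).read()
        except OSError as error:  # covers network failures, timeouts, HTTP errors
            last_error = error
    raise RuntimeError("all cloud providers are unreachable") from last_error
```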

So overall, Cloud Computing and Edge Computing are two related yet different technologies, and they are not, at least at the moment, interchangeable.

From the Scaler USA YouTube video "Edge Computing versus Cloud Computing," February 13, 2023.

