Whenever a new term is born into the world of technology, analysts and consultants alike cringe just a little bit. In technology, change is a constant. While most who choose a career in technology embrace change, there is also a keen awareness that we might have been better off if some advances had never occurred. Each time a new concept is introduced, it is researched, analyzed, and piloted to see if it will deliver us to the promised land. History has proven that when we skip this kind of due diligence, the impact can be catastrophic. One such pivotal change has been the migration of enterprise applications to the cloud. While this shift has taken a while, most enterprise organizations have now embraced it, though not without growing pains. As a result of those growing pains, one of today’s hot tech topics is edge computing. Edge computing refers to the use of a middleware layer that brings processing power, and even storage, closer to the source of data. So yes, once again we are moving to a distributed data model, and there are some very good reasons to do so.
Reduction of latency
When we migrate to a cloud environment, data storage, along with the computing power required for the work our applications do, has historically moved to a datacenter managed by a service provider. The resources provided at that datacenter are shared among a family of other customers. Since most cloud application service providers are application-specific, periods of heavy processing, such as an end-of-month payroll run, can result in slower response times, or latency. However, if our service provider has embraced edge computing, the processing for each client can be distributed to a geographically closer location and not shared so heavily among the masses. This alleviates the drag on processing power at one centralized location and helps reduce latency.
Another cause of latency is the time and bandwidth required to move large amounts of data for processing. When the data is stored in a more geographically convenient location, it can be retrieved and processed more efficiently, once again helping to alleviate any potential latency.
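To see why geography alone matters, consider a back-of-envelope estimate of round-trip time. The sketch below is purely illustrative: the 200,000 km/s figure is the rough speed of light in fiber, and the distances and overhead value are hypothetical assumptions, not measurements from any particular provider.

```python
# Illustrative sketch: how geographic distance contributes to latency.
# All figures are rough assumptions for the example, not vendor data.

def round_trip_ms(distance_km, overhead_ms=10):
    """Estimate network round-trip time for a request.

    Light in fiber travels at roughly 200,000 km/s (about 2/3 of c),
    so each kilometer of one-way distance costs about 0.005 ms, and a
    round trip doubles that. overhead_ms is an assumed fixed cost for
    routing, queuing, and processing.
    """
    propagation_ms = (2 * distance_km) / 200_000 * 1000
    return propagation_ms + overhead_ms

# A centralized datacenter 4,000 km away vs. an edge site 50 km away:
central = round_trip_ms(4_000)  # ~50 ms
edge = round_trip_ms(50)        # ~10.5 ms
print(f"central: {central:.1f} ms, edge: {edge:.1f} ms")
```

Even before any queuing or shared-resource contention, the physics of distance alone makes the nearby edge site several times faster to reach, which is why moving processing closer to the data source pays off.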
When it comes to implementing enterprise-wide applications, there is one objective that is always top of mind. As part of any project, particularly a cloud migration, we want to ensure that our stakeholders have a positive experience. Keep in mind that we also live in a time when instant gratification is an expectation. If your stakeholders are used to local computing, they are most likely used to seeing an immediate response when they click on pretty much anything. When we move to a centralized cloud, there may be large distances involved or massive sharing of the computing resources. As noted above, these can mean a delay before we are rewarded with the result we requested, and that is a sure-fire way to receive negative feedback from those who matter.
Reduced bandwidth cost
When processing is done closer to the source, or at “the edge,” it means that data does not have to be sent to the cloud to be processed. When all processing is done in the cloud, imagine the amount of traffic that needs to travel back and forth. When the super-highway is busy, there are accidents and traffic jams. Before the adoption of edge computing, to avoid issues with traffic, we would purchase more bandwidth so that our highway was bigger, wider, faster. But that turned out to be a losing proposition, as the more bandwidth we purchased, the more traffic we sent.
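A quick way to see the traffic effect is to estimate how much data crosses the WAN each month when everything goes to the cloud versus when an edge layer filters or aggregates most of it locally. The numbers below are hypothetical assumptions chosen for illustration, not figures from the article or any provider.

```python
# Illustrative sketch of WAN traffic savings when an edge layer
# processes most raw data locally. All numbers are hypothetical.

def monthly_wan_gb(raw_gb_per_day, edge_reduction=0.0, days=30):
    """GB sent over the WAN per month.

    edge_reduction is the assumed fraction of raw data the edge layer
    handles locally (0.0 means everything travels to the cloud).
    """
    return raw_gb_per_day * days * (1 - edge_reduction)

# Suppose our sites generate 500 GB of raw data per day:
cloud_only = monthly_wan_gb(500)                     # 15,000 GB/month
with_edge = monthly_wan_gb(500, edge_reduction=0.9)  # ~1,500 GB/month
print(f"cloud-only: {cloud_only:.0f} GB, with edge: {with_edge:.0f} GB")
```

Under these assumptions, the edge layer cuts the monthly WAN volume by an order of magnitude, which is the bandwidth-cost argument in miniature: instead of buying a wider highway, we send fewer cars onto it.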
Reliance on a single point of failure is reduced
When processing is brought closer to the end user, reliance on a datacenter in a different city, or country, is reduced. If we must communicate directly with the cloud environment for any processing, should any of the service providers along the way experience a technical difficulty, there is a risk that all processing will be temporarily halted. Edge computing introduces alternative routes that can be taken to ensure delivery of the final desired result, so a single point of failure no longer halts progress. When more localized processing is engaged, work can continue because we are no longer tethered to one specific internet service provider; data processing can continue via the middleware layer.
Responsibility needs to be assigned
The cloud and edge computing need to work together in harmony to create an environment that relieves the enterprise of the cost of ownership while still providing the speed and power end users have grown to expect since the birth and death of on-premises datacenters. While many cloud service providers offer data hosting and processing power at a centralized location, the demands of today’s applications mean that centralized processing can introduce latency, which is problematic in times of heavy processing. While some of the larger service providers are investing in edge computing, niche cloud players that offer software-as-a-service are not necessarily ready to take the investment plunge into this market. That leaves the enterprise at a severe disadvantage, as we could find that our investment has left us with slower processing, unacceptable latency, and very unhappy stakeholders. The alternative would be for the enterprise itself to invest in edge computing, but that would mean not only additional capital investment but also ongoing operational costs for maintenance and support.
It’s time for a chat
We wanted the cloud. It seemed the ultimate way to reduce infrastructure investment and resourcing costs. As our need for technology increased, so did our need for processing efficiency. At the same time, we began to question the efficiency of centralized processing. In our research, we discovered that processing closer to the source of data is more efficient, and this saw the birth of edge computing. It has become apparent that edge computing will require a substantial investment. But by whom? It’s time to have a conversation with cloud service providers to find out if edge computing is on their roadmap. If not, as our processing needs increase, it may be important to consider this when next updating the strategic plan.