The term 'edge computing' refers to computing that pushes intelligence, data processing, analytics, and communication capabilities down to where data originates, that is, at network gateways or directly at endpoints. The aim is to reduce latency, make networks and operations more efficient, and improve service delivery and the user experience. By extending computing closer to the data source, edge computing enables latency-sensitive applications, offers greater business agility through better control and faster insights, lowers operating expenses, and makes more efficient use of network bandwidth.

There have been three major computing revolutions in industrial applications: mainframe, client-server, and cloud computing. Taking up where these paradigms left off, edge computing is establishing itself as a foundational technology for industrial enterprises, offering shorter latencies, robust security, responsive data collection, and lower costs.