July 2021 - Edge Computing | Cloud Computing

Edge: Counter-Revolution or Evolution?

On behalf of the eco Academy, Anne Noelling provides an introduction to the concept of the edge and how companies can begin to develop edge computing scenarios.


In the past, data center issues were considered a special topic for pure IT specialists. But with buzzwords like ‘cloud’ and, more recently, ‘edge’,[1] perspectives have shifted dramatically in recent years. Today, everyone knows what the cloud is, while the ‘edge’ is not yet so broadly understood. In this context, the buzzword ‘fog’ should also be mentioned for the sake of completeness. Another term to be considered here is the ‘real-time Internet’, whose performance is measured in milliseconds of ‘latency’ (delay). Low latency is needed not only in industrial environments but also in other areas, be it healthcare, logistics, or the consumer sector – for example, when it comes to real-time apps or gaming.

When it comes to the cloud, one frequently heard assumption is that large cloud providers – and, in particular, the hyperscalers with their super-clouds – will steal the thunder from the small players in the industry. But for some years now, there have also been reports that data centers are becoming more and more geographically distributed, that the cloud is becoming the edge, and that economies of scale will also be achieved in a decentralized manner in the future, possibly with high-level interconnection of processing capacity along highways and at mobile base stations. The bottom line is that the migration of data centers – or processing capacity – to the ‘edge’ is something that is recognized by almost all experts – but answers vary on the question of the extent to which this is happening.

Implementing edge scenarios

In the practical implementation of edge scenarios by companies, Dr. Simon F. Rüsche, Managing Director of IPN - IT Precision Network GmbH, advises that “in addition to purely technical aspects of edge and cloud computing, strategic commercial issues must also be taken into account in the design of the architecture. Initial experience can be garnered from previous hybrid cloud architectures”. However, alongside these strategic considerations and the other questions a company needs to answer for its own implementation, many technical questions remain unresolved. After all, one study postulates that at least 50% of cloud applications ‘actually’ require low or very low latencies already today, and – although it offers high bandwidths and low signal propagation times – 5G will not enjoy gapless coverage for a long time yet. The conclusion being drawn is that processing will need to shift closer and closer to the location of usage.

But will 5G be indispensable for successful use of the edge? If processing capacities are moving closer to the edge, then there also needs to be a fast connection at the edge. Since this is not possible everywhere via fixed networks – and is, in fact, impossible in some potential application scenarios, such as the autonomous platooning of truck convoys – then a mobile connection with sufficient bandwidth is essential. On this note, experts are already discussing the need for 6G for the edge. 

Use cases – where and in what form could the edge add value? 

The advantages of processing data at the edge are particularly vivid in the example of predictive maintenance. The edge enables data processing and storage directly on the factory floor, as close to the machine as possible. Industrial companies try to detect and analyze errors and unplanned variations in production lines as quickly as possible, ideally before errors become apparent. At the edge, the IoT data – e.g., data recorded by sensors on the machine – can be analyzed directly and with the lowest possible latencies. Necessary maintenance or repairs can be carried out directly and production downtimes prevented.
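To make this more concrete, the following is a minimal sketch in Python of how such an edge-side check might look. It is purely illustrative: the window size, the three-sigma threshold, and the sample vibration values are assumptions for the sake of the example, not taken from any specific product or study.

# Minimal sketch of edge-side anomaly detection for predictive maintenance.
# Window size, threshold, and sensor values are illustrative assumptions.
from collections import deque
from statistics import mean, stdev

WINDOW_SIZE = 50          # number of recent readings kept on the edge node
SIGMA_THRESHOLD = 3.0     # flag readings more than 3 standard deviations out

readings = deque(maxlen=WINDOW_SIZE)

def check_reading(value: float) -> bool:
    """Return True if the new sensor value looks anomalous.

    The decision is made locally on the edge node, so no round trip to a
    central cloud is needed and latency stays in the low-millisecond range.
    """
    is_anomaly = False
    if len(readings) >= 10:  # require a minimal history before judging
        mu, sigma = mean(readings), stdev(readings)
        if sigma > 0 and abs(value - mu) > SIGMA_THRESHOLD * sigma:
            is_anomaly = True
    readings.append(value)
    return is_anomaly

# Example: a vibration spike on an otherwise stable machine is flagged locally.
for v in [1.0, 1.1, 0.9, 1.0, 1.05, 0.95, 1.0, 1.1, 0.9, 1.0, 5.0]:
    if check_reading(v):
        print(f"Anomaly detected at the edge: {v} -> schedule maintenance")

In a real deployment, only the flagged events (rather than the full sensor stream) would typically be forwarded to a central system, which is precisely the bandwidth- and latency-saving pattern the edge makes possible.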

Many more new questions are emerging around the edge, such as what edge security should entail. Another question is scalability: the interconnection of local resources and the creation of a ‘federated data center’ versus the use of concentrated monopoly structures. In view of the variety of assessments and quite contradictory studies on the subject of the edge, a thorough analysis of the respective use case is recommended.

On 11.8.21, an expert roundtable on the ‘edge’, organized by the eco Akademie and the German ict + medienakademie, will be held online and on location (in German). The program and registration form can be found at www.medienakademie-koeln.de.

[1] The term ‘edge’ represents a decentralized architecture, located – as the word indicates – at the edge of the network; the term therefore also stands for resource-efficient processing of data directly on site, without a permanent connection to the network. At least some of the data processing takes place on site, rather than in the cloud. In this context, it is also worth mentioning ‘fog’, although definitions for this vary. What the definitions have in common is that the data is processed on so-called ‘fog nodes’. There is no consensus as to whether the ‘fog’ will replace the ‘edge’ or should be seen as another layer between the ‘edge’ and the cloud.

Anne Noelling is a Personnel Consultant at HAPEKO Hanseatisches Personalkontor Deutschland GmbH, a role she took up in July 2021. Before this, with her background in Media Studies, she was Director Corporate Information at the deutsche ict + medienakademie, working closely with the eco Association’s eco Academy. Until 2019, she undertook a range of roles in project leadership, management, and consulting, including work as a marketing consultant.