With forecasts of 50 billion IoT devices by 2030, Nife’s founder shares why businesses must take note of edge computing

About 22 billion Internet of Things (IoT) connected devices were in use worldwide at the end of 2018, according to Statista, the German company specialising in market and consumer data.

With the consumption of consumer electronics growing steadily, Statista forecasts that around 50 billion IoT devices will be in use worldwide by 2030. These devices are expected to form a massive web of interconnected hardware spanning everything from smartphones to kitchen appliances. While cloud computing will continue to play a vital role in modern network architecture for enterprises, edge computing will open up exciting possibilities: it lets businesses process data closer to its source. It also means businesses will have to rethink their IT infrastructure strategy.

The principles behind edge computing are relatively straightforward, but the substantial benefits of this fresh approach to network architecture are a little more complex.

Edge computing: An overview
Traditional cloud computing networks are highly centralised: data is gathered at the outermost edge, i.e. end-user devices, and transmitted back to central servers for processing. This architecture grew out of necessity, because most devices near the edge lacked the computational power or storage capacity to analyse and process the data they collected. Even as more devices connected to networks over cellular and Wi-Fi, their hardware capabilities limited their functionality.

Take the example of a drone used in agriculture to collect visuals of the crop landscape and the environment. The visuals collected by the drone are often stored on the device and processed later; processing is an afterthought. If the data were retrieved and processed immediately, corrective actions could be taken far sooner.

However, that challenge is being addressed today, with IoT devices able to gather, store, and process more data. This in turn opens up an opportunity for businesses to optimise their networks by moving processing power closer to where data is gathered: the network edge, the area where a device or local network interfaces with the internet.

So, when edge computing comes into play, data doesn’t have to travel all the way back to a central server before the device can act. Edge computing networks can therefore significantly reduce latency and enhance the end-user experience of an application or device. To illustrate how edge computing differs from traditional cloud computing, consider a video surveillance use case: a corporate office secured with dozens of high-definition IoT video cameras. Cameras with no local processing power continuously stream their raw video feeds to a cloud server, where the output from every camera is passed through a motion-detection application to flag clips of interest. This places significant strain on the business’ internet bandwidth and on the cloud infrastructure, which has to process the footage from all the cameras simultaneously.

Now, if the motion-detection computation happens at the network edge, as is the case in edge computing, each camera can run the motion-detection application on its internal processing unit and send footage to the cloud server only as needed. This significantly reduces bandwidth usage, because most of the camera footage never has to travel to the cloud. The cloud server now stores only the critical footage, reducing the chances of it being overloaded.
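As a rough illustration of this edge-side filtering, the sketch below simplifies camera frames to flat lists of grayscale pixel values and keeps footage on the device unless the change between consecutive frames crosses a motion threshold. The frame format and threshold are illustrative assumptions, not any specific camera’s API.

```python
# Sketch of edge-side motion filtering: frames are analysed locally and
# only "interesting" footage leaves the device. Frames here are hypothetical
# flat lists of grayscale pixel values (0-255).

THRESHOLD = 10.0  # mean per-pixel change that counts as motion (illustrative)

def mean_abs_diff(prev, curr):
    """Average absolute pixel difference between two grayscale frames."""
    return sum(abs(a - b) for a, b in zip(prev, curr)) / len(curr)

def frames_to_upload(frames):
    """Yield only frames whose change from the previous frame exceeds the
    motion threshold -- everything else stays on the device."""
    prev = frames[0]
    for curr in frames[1:]:
        if mean_abs_diff(prev, curr) > THRESHOLD:
            yield curr  # this frame would be sent to the cloud server
        prev = curr

# Simulated feed: two static frames, then motion enters the scene.
static = [50] * 64
moving = [50] * 32 + [200] * 32
feed = [static, static, moving, moving]

uploads = list(frames_to_upload(feed))
print(len(uploads))  # → 1: only the frame where motion appears is uploaded
```

In this toy run, three of the four frames never leave the camera, which is exactly the bandwidth saving the paragraph above describes; a real deployment would use a proper video pipeline, but the filtering principle is the same.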

Relevance of edge computing for business use cases
The speed and flexibility edge computing affords in handling data create an exciting range of possibilities for businesses. It has use cases across a wide range of applications and services: smart content delivery networks (CDNs), drone surveillance, healthcare, and supply chain monitoring, to name a few. It can bring content to life with a superior viewing experience, especially for the interactive ads and plugins commonly used in e-commerce. And given that drones are increasingly used for wildlife monitoring, industrial infrastructure inspections, and even heat maps of agricultural land, edge computing becomes extremely important and relevant: an edge network can help make sense of large data sets, reduce cost, and provide real-time insights.

An interesting point to note is that while edge computing is synonymous with low latency, not all edge use cases fall into that category. In AR/VR or manufacturing robotics, for instance, the device offloads collected data to a nearby edge node, allowing it to be lighter, extending its battery life, and making it more affordable.

Edge cloud vs Edge computing: What’s the difference?
Conversations on edge computing often use the term edge cloud interchangeably. In reality, they refer to slightly different things. Edge computing relates primarily to the physical compute infrastructure positioned on the spectrum between the device and the hyperscale cloud, supporting various applications. The edge cloud, by contrast, is the virtualised infrastructure and the business models built on top of that computing framework.

Like the cloud, the edge cloud is flexible and scalable. Unlike static, on-premise servers, it can absorb an unplanned increase in user activity and move workloads accordingly. With access to many servers, the edge cloud becomes an elastic resource pool with the ability to scale applications both in testing and in production. Just as the cloud accelerated new business ventures, so too will the edge cloud pave the way for new opportunities.

Where is the Edge?
To understand edge computing, one must understand where the edge really is. Though fundamental, “edge” is often a misplaced term that gets used in different scenarios. The answer: the edge spans anywhere between the end device and the cloud or the internet. The device edge is the computing power on the device itself; the network edge is the compute found at the edge of the network.

Telco edge computing is distributed computation managed by a telco operator between the network edge and the customer edge. The network edge has become especially prominent with 5G rollouts.

At the network edge, where telco edge computing is prominent, there are multiple potential locations for compute, both on and off the public network. These sites include cell towers, street cabinets, and network aggregation points in the access and core network; they could also be customer premises. A telco’s strategy for placing edge compute infrastructure depends on factors including its current network architecture, its virtualisation roadmap (the plan to build data centre facilities for network applications), and application demand – primarily the use cases towards which the telco has a strategic inclination. For instance, for intelligent out-of-home ads on display kiosks and billboards, the ads need to be displayed within a fraction of a second. Cell towers close to the customer edge cover a wide area, receive information at extremely fast speeds, and capture the right audiences as they move past the screen; the wider coverage also keeps a display connected to a single cell at any given time.

While the adoption of cloud computing by enterprises has grown significantly over the last few decades, edge computing is set to revolutionise digital transformation for the Industrial Internet of Things (IIoT). Moreover, with 5G roadmaps in place, the growth of edge computing will accelerate.

The adoption of edge computing has been expansive in the last few years. The edge was once associated mainly with content delivery networks (CDNs) serving static content, but now, with more people working remotely and more global teams, there is a greater push towards intelligent, dynamic edge computing for applications. With growing industrialisation, more connected devices, and the ease of moving to a scalable edge ecosystem, adoption is accelerating like never before, aided by edge consortiums such as OpenEdgeComputing, LFEdge and the Kinetic Edge Alliance.

As a startup, Nife is leveraging this opportunity to build a global public edge cloud that delivers a faster, more powerful internet experience for individuals and enterprises. We do this by aggregating edges – telcos, CDNs, managed-services data centres, ISPs and clouds from all across the world – to enable applications to move closer to the end-user or device. The application owner decides the KPIs (key performance indicators), such as desired latency, performance, data, and cost; Nife factors in all these requirements when determining where to launch an application to meet its scalability, resiliency, latency, and mobility needs. With access to over 300 locations (700+ regions) worldwide, Nife is ready to roll out the beta version of its public edge cloud, and we couldn’t be more excited. With 5G in sight, we aim to onboard more customers and partners to our edge ecosystem. And, as a woman entrepreneur, Nife’s acceptance into NetApp ExcellerateHER, NetApp’s B2B deep-tech accelerator programme for women entrepreneurs, strengthens my hope of seeing greater momentum driving the future of the edge cloud.
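To make the idea of KPI-driven placement concrete, here is a hypothetical Python sketch – not Nife’s actual algorithm – that scores candidate edge locations against an application’s declared KPI weights and picks the best one. The location names, metrics, and weights are all invented for illustration.

```python
# Hypothetical KPI-driven placement: score candidate edge locations against
# an application's declared weights and choose the lowest-scoring one.
# All names, metrics, and weights below are made up for illustration.

def score(location, weights):
    """Lower is better: weighted sum of latency and cost,
    minus a small bonus for available capacity."""
    return (weights["latency"] * location["latency_ms"]
            + weights["cost"] * location["cost_per_hour"]
            - weights["capacity"] * location["free_capacity"])

def place(app_weights, candidates):
    """Pick the candidate location with the lowest weighted score."""
    return min(candidates, key=lambda loc: score(loc, app_weights))

candidates = [
    {"name": "telco-tower", "latency_ms": 5, "cost_per_hour": 0.9, "free_capacity": 2},
    {"name": "regional-dc", "latency_ms": 20, "cost_per_hour": 0.4, "free_capacity": 8},
    {"name": "hyperscale-cloud", "latency_ms": 80, "cost_per_hour": 0.2, "free_capacity": 50},
]

# A latency-sensitive app (e.g. the billboard-ads case) weights latency heavily,
# so placement lands on the compute nearest the customer edge.
latency_first = {"latency": 1.0, "cost": 0.1, "capacity": 0.01}
best = place(latency_first, candidates)
print(best["name"])  # → telco-tower
```

Shifting the weights towards cost would instead steer the same application towards the cheaper regional or hyperscale options, which is the trade-off the paragraph above describes.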

Blog Source: Yourstory