Possibilities at the Edge:
Putting Intelligence Where Your Data Is
Trend Article
The Future of Data Analytics Is Shifting From the Cloud to the Network Edge to Drive Real-Time Decision Making
The major technology trend of the past decade has been mass migration to the cloud. The economies of scale and breadth of online services have meant that organisations of all sizes have adopted cloud services for a variety of IT functions, to such an extent that modern approaches to building and running applications are now described as “cloud native.”
But, for businesses that want to stay ahead in the data race, centralising everything inside massive cloud data centers is becoming limiting. The arrival of 5G networks and a boom in connected devices as part of the Industrial Internet of Things (IIoT) will produce vast quantities of real-time data—all of which will need to be rapidly analysed to inform timely business decisions. In a world of emerging technologies and powerful new analytics models, speed is as critical as accuracy—and in this world, the cloud is going to fall short.
According to Gartner1, while only about 10 percent of enterprise-generated data is created and processed outside a traditional data center or cloud, this figure is expected to soar to 75 percent by 2025. Santhosh Rao, Gartner’s Senior Research Director, concludes organisations are therefore going to have to consider a decentralised approach: “As the volume and velocity of data increases, so too does the inefficiency of streaming all this information to a cloud or data center for processing.”
75%
The proportion of enterprise-generated data that will be created and processed outside a traditional data center and cloud by 2025 according to Gartner1
This means making a potentially game-changing shift: away from the cloud towards edge computing.
Oliver Schabenberger, Executive Vice President and Chief Technology Officer at analytics firm SAS, argues the edge should be the starting point for enterprise organisations. This is because everything generating data outside of a data center and connected to the Internet is at the edge.2
“That includes appliances, machines, automobiles, streetlights, smart devices at home, locomotives, pets, and healthcare equipment,” he says.
For data scientists, shifting intelligence to the point of collection opens up a new world of possibilities. For starters, it presents the opportunity to finally realise the potential of IIoT, using connected devices to collect many different types of data and learn from them without having to sort them first. This allows data scientists to capture insights from things like wind turbines, doors, or streetlights without knowing in advance what they are looking for.
But more immediately it raises the tantalising prospect of three major benefits: faster response, greater scalability due to processing being distributed around the network, and cost savings by minimising the bandwidth used. All of this adds up to being able to push new boundaries in analytics and do more, faster.
“The ability to do intelligence or knowledge discovery at the point of data collection is critical in many applications now.”
Kirk Borne
Principal Data Scientist and Executive Advisor, Booz Allen Hamilton
The Need for Speed
Of all the potential benefits of edge analytics, the speed of data processing is the most pressing. Applications such as streaming media, IIoT, and VR and AR all require large amounts of data to be delivered with very low latency. If a sub-second response time is required, then waiting on a round trip to the cloud is not a viable option. As a result, the potential of some of the most exciting emerging technologies can only be truly realised at the edge.
Kirk Borne, Principal Data Scientist and Executive Advisor at Booz Allen Hamilton, says the race to keep up with the pace of data accumulation is driving the shift to the edge in data science. “The ability to do intelligence or knowledge discovery at that point of data collection is actually critical in so many applications now. You no longer have the luxury of bringing data back to your business center and spending a year analysing it.” Self-driving cars are a case in point. Autonomous test vehicles operated by Waymo (formerly Google’s self-driving car project) have been estimated to generate between 11TB and 152TB of sensor data every day.3 Sending data back and forth across a network takes at least 150–200 milliseconds,4 assuming the connection is reliable enough.
152 TB
The amount of sensor data reportedly collected every day by some of Waymo’s self-driving test cars.3
Edge devices are key to ensuring these vehicles take their place as the ultimate IIoT devices, communicating with the environment around them and, crucially, with other vehicles on the road to make split-second decisions. There is no margin for error or lag when reacting to ever-changing road conditions. Pausing for instructions from a cloud server remotely analysing data is simply not an option for self-driving vehicles. Only the edge can deliver the rapid decision-making required to make autonomous cars a safe, viable prospect.
Less glamorous examples are no less critical. For instance, a control system operating an industrial machine will need to stop immediately if a human is too close.5 The smallest lag can be the difference between life and death.
The edge also presents opportunities to improve safety by using real-time data in aerospace.
“Airplane engine manufacturers want to have some sort of reliable heavy-duty compute on the plane itself,” says Todd Mostak, CEO and Co-founder of analytics firm OmniSci. “Because if there is a problem, [at present] there may not always be the opportunity to send back however many terabytes the plane generates in flight, and people want to do some analysis and anomaly flagging actually on the plane.”
Even where the immediacy of decision-making is less critical, the cumulative business risk of delay can quickly mount up. For example, supermarket chains trying to reduce store waste may turn to the edge to give store managers real-time insight into the operational performance and temperatures of fridges across multiple store locations, so they can act promptly to protect goods in the event of a failure.6
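To make the fridge example concrete, here is a minimal sketch, in Python, of the kind of rule an in-store edge gateway might run against a stream of temperature readings. The device names, threshold, and grace period are hypothetical illustrations, not any retailer's actual implementation.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Optional

MAX_SAFE_TEMP_C = 5.0                 # hypothetical safe limit for a chilled cabinet
GRACE_PERIOD = timedelta(minutes=10)  # ignore brief door-open spikes

@dataclass
class Reading:
    fridge_id: str
    temp_c: float
    timestamp: datetime

class FridgeMonitor:
    """Runs locally on an in-store edge gateway, so no cloud round trip is needed."""

    def __init__(self) -> None:
        self._breach_started: dict = {}  # fridge_id -> time the current breach began

    def process(self, reading: Reading) -> Optional[str]:
        """Return an alert message if a sustained temperature breach is detected."""
        if reading.temp_c <= MAX_SAFE_TEMP_C:
            self._breach_started.pop(reading.fridge_id, None)  # back in range
            return None
        started = self._breach_started.setdefault(reading.fridge_id, reading.timestamp)
        if reading.timestamp - started >= GRACE_PERIOD:
            return (f"ALERT: {reading.fridge_id} above {MAX_SAFE_TEMP_C} C "
                    f"since {started:%H:%M}, now {reading.temp_c:.1f} C")
        return None

# Example: the second sustained high reading triggers a local alert to the store manager.
monitor = FridgeMonitor()
t0 = datetime(2021, 6, 1, 9, 0)
for minutes, temp in [(0, 6.2), (12, 6.8)]:
    alert = monitor.process(Reading("store-42/fridge-07", temp, t0 + timedelta(minutes=minutes)))
    if alert:
        print(alert)
```

Only the alert itself, or a periodic summary, would then need to leave the store, rather than every raw reading.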
Shaping Customer Experiences
In the Media and Entertainment industry, the edge also enables businesses to meet growing demand for improved, more personalised customer experiences. Data scientists are under increasing pressure to translate user data into valuable insights faster through recommendation engines and sentiment analysis. This is the reason leaders like Netflix, Uber, and Amazon are choosing to develop their business models around the edge.
High-speed 5G networks are only expected to ramp up user expectations for flawless customer experiences on the go. With some network operators estimating performance anywhere from 10 to 1,000 times faster than current networks, users will be hungry for anytime experiences such as augmented reality or streaming 4K video to mobile devices—and this demands the speedier delivery of data that only the edge can provide.
150–200
The number of milliseconds it is estimated to take, at minimum, for data to travel back and forth across a cellular network, assuming a strong network connection.16
The Future of Data Science at the Edge
Kirk Borne
Principal Data Scientist and Executive Advisor, Booz Allen Hamilton
“To manage fast-moving data of all kinds of variety, you need edge devices that are at the point of data collection and are specific to the type of data collection, whether it’s imaging data or streaming data, like cyber network data, whatever is at the edge. The ability to do intelligence or knowledge discovery at that point of data collection is critical in so many applications now.”
Jared Dame
Director of AI and Data Science, Z by HP
“Edge includes mobile workstations and devices like the Jetson Nano and the Intel Movidius chips. All of those things will be out on the edge collecting, filtering, and making data so that the cloud isn’t the endpoint for everything. This means you’re not hung up waiting for all the data to be processed up to the cloud in order to make a business decision.”
The Numbers Game
One implication of this shift in focus is that the hardware requirements for edge computing are going to change. Vendors are responding to the appetite for innovation at the edge by moving from simple embedded devices and microcontrollers towards more powerful systems such as workstations or even micro data centers.7
For businesses handling vast amounts of IIoT data, the edge offers a way to combat the cost of transmitting data back to a data center or the cloud. This is a point made by Seagate’s Jeff Nygaard, Executive Vice President and Head of Operations, Products and Technology.
“It’s not free to move data through a pipe from endpoint to edge to cloud; it costs money to send data through that pipeline. The idea that you should really only be moving data if you need to move the data is something you should be thinking about,” he says.8
Edge computing allows data to be filtered and processed before it is sent to the cloud, reducing the volume of data transmitted and the network costs that would otherwise be incurred. This opens up a world of possibilities for businesses to realise the full potential of IIoT devices, tapping previously unviable data sources for business-critical insights in real time.
“It’s not free to move data through a pipe from endpoint to edge to cloud; it costs money to send data through that pipeline. The idea that you should really only be moving data if you need to move the data is something you should be thinking about.”
Jeff Nygaard
Executive Vice President and Head of Operations, Products and Technology, Seagate
As an example, an oil rig operating in a remote location in the ocean may have thousands of sensors producing large amounts of data, most of which may be redundant. There is unlikely to be a reliable internet connection, and streaming all of this information back to base via a satellite data connection would be prohibitively expensive. Instead, a local edge computing system could compile the data and send daily reports to a central data center or cloud for long-term storage, so only the important data gets transmitted.9
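As a rough illustration of that pattern, the Python sketch below collapses a day of raw readings into a compact summary on the rig's edge system, so only the summary and any out-of-range readings would need to cross the expensive satellite link. The sensor names, alarm limits, and simulated data are invented for the example.

```python
import random
import statistics
from collections import defaultdict

# Hypothetical alarm limits per sensor type; readings outside them are worth sending in full.
ALARM_LIMITS = {
    "wellhead_pressure_bar": (150.0, 350.0),
    "pump_vibration_mm_s": (0.0, 7.0),
}

def summarise_day(readings):
    """Collapse raw (sensor_id, sensor_type, value) readings into one summary row per sensor.

    Returns (summary, anomalies); only these would be transmitted off the rig.
    """
    by_sensor = defaultdict(list)
    anomalies = []
    for sensor_id, sensor_type, value in readings:
        by_sensor[(sensor_id, sensor_type)].append(value)
        low, high = ALARM_LIMITS[sensor_type]
        if not low <= value <= high:
            anomalies.append((sensor_id, sensor_type, value))
    summary = {
        sensor: {
            "count": len(values),
            "min": round(min(values), 2),
            "max": round(max(values), 2),
            "mean": round(statistics.fmean(values), 2),
        }
        for sensor, values in by_sensor.items()
    }
    return summary, anomalies

# Simulate a day of readings from two sensors, then show how little needs to leave the rig.
raw = [("rig-1/P-101", "wellhead_pressure_bar", random.gauss(250, 20)) for _ in range(5000)]
raw += [("rig-1/V-07", "pump_vibration_mm_s", abs(random.gauss(3, 1))) for _ in range(5000)]
summary, anomalies = summarise_day(raw)
print(f"{len(raw)} raw readings reduced to {len(summary)} summary rows "
      f"and {len(anomalies)} anomalous readings for transmission")
```

The daily report could also carry the anomalous readings themselves, so the central data center still sees everything that matters without paying to move everything that does not.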
Edge computing thus means implementing a more distributed architecture, which is key to delivering scalability: data processing takes place as close as possible to the source of the data, making it more feasible for business users to gain real-time insights from the data being gathered. For IT leaders and Data Strategists, this strengthens decision-making by avoiding the network failures and delays that can hamper performance in a centralised architecture.10
But edge and cloud are not mutually exclusive. A new hybrid model is emerging in which the edge and cloud work together. A smart factory is one example: data is not funnelled directly to the cloud, as valuable factory-floor context may be lost in the transition. Instead, data is analysed at the edge, where that context is still available, and filtered results are sent to the cloud during off-peak hours, when timeliness is no longer a concern.11
Possibilities for Data Science at the Edge
Jim Duarte, Data Scientist and Principal at LJ Duarte and Associates, believes the shift to the edge will be enthusiastically embraced by data scientists—especially by engineers who place predictive models at the point of control.
“More and better edge applications will be possible as better analytics create better models and the scalability of technology will put it in more locations,” he predicts. “Complex workflows will have better controls to minimise constraints for improving optimisation. More robust edge analytics will be created as controllers can be scaled down and be put in hostile environments because their smaller size will enhance the ability to protect them and keep them functioning longer.”
For data scientists, performing sophisticated data analysis at the edge will enable prediction-based outcomes that empower businesses to achieve greater efficiencies and cost savings.
As an example, consider a large piece of expensive industrial machinery, such as an earth mover. Machine learning could be used to identify and map the relationships between the machine’s sub-assemblies and components, then monitor those components and run simulations to predict their future state.12
Such a model would not only be able to predict future failures in the machine, but also the probable time of failure. In the past, the calculations to do this would simply have been too complex. With this capability, the company could schedule maintenance to avoid unplanned downtime, as well as manage its spares inventory much more efficiently.
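The basic mechanics can be sketched very simply. The Python snippet below fits a trend line to a slowly degrading health signal from a single hypothetical component, a bearing's vibration level, and extrapolates when that trend will cross a failure threshold. A real system would use far richer models across many sub-assemblies, as described above; the threshold, wear rate, and data here are invented.

```python
import random
import statistics  # statistics.linear_regression requires Python 3.10+

FAILURE_THRESHOLD = 12.0  # hypothetical vibration level (mm/s) at which the bearing is deemed failed

def hours_to_failure(hours, vibration, threshold=FAILURE_THRESHOLD):
    """Fit a linear degradation trend and extrapolate when it crosses the failure threshold.

    Returns the estimated remaining operating hours, or None if no upward trend is detected.
    """
    fit = statistics.linear_regression(hours, vibration)  # least-squares slope and intercept
    if fit.slope <= 0:
        return None
    crossing_hour = (threshold - fit.intercept) / fit.slope
    return max(crossing_hour - hours[-1], 0.0)

# Simulate 200 hours of readings from a slowly wearing bearing:
# a 4 mm/s baseline, wear of 0.03 mm/s per hour, plus measurement noise.
hours = list(range(200))
vibration = [4.0 + 0.03 * h + random.gauss(0, 0.3) for h in hours]

remaining = hours_to_failure(hours, vibration)
print(f"Estimated time to failure: roughly {remaining:.0f} operating hours")
# An edge device on the machine could raise a maintenance work order well before
# the predicted failure, instead of waiting for unplanned downtime.
```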
Driving Better Business Decisions
Edge analytics opens up a whole new realm of possibilities in machine learning applications. In financial services, for example, data scientists can train algorithms to assess in real time if a transaction is unusual for a particular customer, and request verification or even just block it if there is at least a 95 percent probability of it being fraudulent.13
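A bare-bones version of that decision rule might look like the following Python sketch, in which a model trained on historical transactions (here, synthetic stand-in data) is evaluated where the payment is handled, and the 95 percent threshold drives the block-or-verify choice. The features and thresholds are illustrative assumptions, not any bank's actual policy.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic training data: [amount relative to the customer's norm, km from usual location].
legit = rng.normal(loc=[1.0, 5.0], scale=[0.5, 5.0], size=(500, 2))
fraud = rng.normal(loc=[8.0, 400.0], scale=[3.0, 200.0], size=(500, 2))
X = np.vstack([legit, fraud])
y = np.array([0] * 500 + [1] * 500)

# In practice the model would be trained centrally and deployed to the edge device.
model = LogisticRegression(max_iter=1000).fit(X, y)

BLOCK_THRESHOLD = 0.95   # block outright at 95%+ estimated fraud probability
VERIFY_THRESHOLD = 0.50  # otherwise ask the customer to verify

def decide(transaction_features):
    """Return 'approve', 'verify', or 'block' for a single transaction."""
    p_fraud = model.predict_proba([transaction_features])[0, 1]
    if p_fraud >= BLOCK_THRESHOLD:
        return "block"
    if p_fraud >= VERIFY_THRESHOLD:
        return "verify"
    return "approve"

print(decide([1.1, 3.0]))    # looks like the customer's normal spending: approve
print(decide([9.5, 800.0]))  # large amount, far from home: almost certainly blocked
```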
Machine learning algorithms are also well suited to the underwriting and credit-scoring tasks that are so common in finance and insurance, helping employees make speedier and more accurate decisions.
Healthcare is another field where analytics driven by machine learning is expected to make a huge impact by driving significant efficiencies in terms of time and cost—but also, crucially, by saving lives. From helping pathologists make a quicker and more accurate diagnosis from medical images, to identifying patients that might benefit from new types of treatments or therapies, healthcare professionals now regard AI as the future of healthcare.
“Today when you go to hospital and have an MRI, the data is there immediately, but it takes probably a week until you get the result,” says Amit Marathe, Director of AI & ML at Inseego Corp. “I see healthcare shifting towards more real-time analytics.”
One future application might be a hypothetical emergency medical responder (EMR) system that could run predictive algorithms at the same time as a doctor is examining the patient. In this vision, the system would display the real-time diagnosis, pathology results and treatment options, as well as each option’s potential effectiveness.
“Edge computing has an exciting future, extending way beyond applications such as IoT, self-drive vehicles, and industrial automation.”
Darren Seymour-Russell
Head of Data Science, Mudano.
The Rise of Intelligent Devices
These advances are propelling AI outside of the data center and into devices and machines we use in our work and our everyday lives.15 Deloitte predicts a coming “era of pervasive intelligence” that will be marked by a proliferation of AI-powered smart devices able to recognise and react to sights, sounds and other patterns. Increasingly, data scientists will be able to train machines to learn from experiences, adapt to changing situations, and predict outcomes.
“Edge computing has an exciting future, and we see this extending way beyond the ‘traditional’ applications such as IIoT, self-drive vehicles and industrial automation, and into areas in which a personalised user experience is key,” predicts Darren Seymour-Russell, Head of Data Science at data consultancy, Mudano.
For example, customers might be offered services or concessions tailored individually for them during secure banking interactions. With their intelligence embedded rather than living in the cloud, such devices will enable all kinds of applications that require instantaneous response and robust performance even when connectivity is poor or not available.
Outlook
Data is increasingly generated at the edge rather than in the data center, and many applications demand a real-time response, so processing also needs to happen at the edge.
Businesses need to evaluate how and where data is collected and used in their organisation, and whether performing analytics at the edge of the network might significantly boost performance while avoiding data congestion. Edge computing does not replace the cloud; rather, it extends its reach, providing a local point of presence and reducing bandwidth costs.
Key Takeaways
Processing at the edge reduces latency and cuts the costs associated with network traffic
Analytics at the edge is vital where a rapid response is required, or bandwidth is limited
There is a noticeable shift from a pure cloud strategy to a hybrid strategy that combines edge and cloud