I’ve been to the Edge: Internet of Things (IoT), Fogging, Artificial Intelligence (AI), and Edge Computing

Aug 28, 2019 | blog, IoT, Artificial Intelligence (AI), Fogging

Perhaps no other topic in the Industrial Internet of Things (IIoT, often associated with Industry 4.0) has received so much attention as so-called “edge computing”, and no wonder. Industrial demands for real-time or near-real-time monitoring, response, and control often do not allow data to flow from remote sensors to a cloud computing resource, be analyzed there, and then return as information and perhaps corrective actions for a piece of equipment. IoT edge computing moves data analysis and processing closer to the data source. Smart devices at the edge can process critical data fragments and provide a quick real-time response, avoiding the delay of sending data over the internet to the cloud and waiting for the cloud to respond.

If cloud computing is really “large servers somewhere far away that I can lease on demand”, edge computing is the opposite: local industrial computers with significant processing and analytic capabilities located very close to where sensors are deployed, which, come to think of it, is just what we used to call “distributed computing.”  If cloud computing resembles client-server architectures (big mainframe servers) linked by networks (the internet or local area networks) to personal computers, then edge computing is “closer to the ground”, closer to the sensor.  These days many people are trying to merge sensors with tiny Arm or RISC-V embedded processors, but those processors are limited in capability and performance, with the focus tending to fall on battery operation.  Far from being a weather phenomenon, fog computing is really about putting compute power where it is needed most.

That’s why having powerful but industrial-grade computers at the edge makes sense these days. Putting significant intelligence as close to the edge as possible means pushing artificial intelligence as close to the data source as possible.  Combined with analytics, this AI capability is often called “fog computing.”

Fog computing often consists of multiple sensors feeding data to local, relatively powerful industrial PCs, where both sensor data integration and sensor fusion can occur, and where data analysis can be performed very quickly.  Of course, this can be done in the cloud or on local on-premises servers, but there are good use cases for doing it at the edge:

  • Lower latency for better autonomous operation
  • Increased resiliency in case the network goes down
  • Discrete subnetting for increased security
  • Virtualization
  • Saving money by only storing exceptions and serious alerts in the cloud
  • Reducing staffing costs or making the experts you have more efficient
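Several of these benefits come from the same basic pattern: handle routine readings locally and let only exceptions travel upstream. Here is a minimal sketch of that idea; the threshold values, sensor names, and the `send_to_cloud` stub are illustrative assumptions, not a real uplink API.

```python
# Hypothetical edge-side filter: routine readings stay in a local log,
# only out-of-range exceptions are forwarded to the cloud.

NORMAL_RANGE = (40.0, 80.0)  # e.g., acceptable wellhead pressure in PSI

def send_to_cloud(alert):
    # Stand-in for a real uplink (MQTT, HTTPS, etc.)
    print(f"uplink: {alert}")

def process_reading(sensor_id, value, local_log):
    local_log.append((sensor_id, value))   # everything is retained locally
    low, high = NORMAL_RANGE
    if value < low or value > high:        # only exceptions leave the edge
        send_to_cloud({"sensor": sensor_id, "value": value, "status": "ALERT"})
        return "alert"
    return "ok"

local_log = []
process_reading("well-07-psi", 55.2, local_log)   # routine: stays local
process_reading("well-07-psi", 97.8, local_log)   # exception: goes to cloud
```

The local log can be summarized or discarded on whatever schedule the operator chooses, so cloud storage and bandwidth are spent only on events worth a person's attention.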

The first one, lower latency, is especially important if you hope to employ some degree of artificial intelligence in the loop from sensors to data collection to analysis to action.  Simple if-then-else “rules engines” have been around for many years in market segments such as building automation and industrial automation, and it is natural to want to extend that to IoT so that fewer people can do much more work.  But if you have to wait for cloud computing resources to deliver automated analysis and autonomously activate, say, a valve, the response time may be measured in seconds or minutes – too slow to act.
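The kind of if-then-else rules engine described above can be sketched in a few lines. This is an illustrative toy, not any particular product's API; the rule conditions, sensor fields, and action names are all assumptions. The point is that the whole sensor-to-action loop runs on the edge computer, with no cloud round trip in the critical path.

```python
# Minimal sketch of an edge-resident if-then-else rules engine.
# Each rule pairs a condition on the latest sensor snapshot with an action name.

rules = [
    (lambda s: s["pressure_psi"] > 90, "close_valve"),
    (lambda s: s["temp_c"] > 85,       "start_cooling"),
]

def evaluate(snapshot):
    """Return the list of actions triggered by one sensor snapshot."""
    return [action for condition, action in rules if condition(snapshot)]

# High pressure, normal temperature: only the pressure rule fires.
actions = evaluate({"pressure_psi": 95, "temp_c": 60})
```

Because evaluation is just local function calls on in-memory data, the loop closes in microseconds rather than the seconds or minutes a cloud round trip might take.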

The last reason, reducing staffing costs, is not a matter of trying to squeeze more work out of fewer people.  Rather, it can be an artifact of historical labor shortages.  A good example is the oil and gas industry.  Back in the 1980s, a huge number of talented and knowledgeable oil patch workers lost their jobs as the industry went through massive layoffs.  Now there is a real shortage of certain skills, since there are far fewer on-site experts such as “treaters” to monitor drilling and production sites.  Oil and gas companies are pulling these treaters into network operations centers (NOCs), essentially “war rooms” where one treater can monitor dozens or even hundreds of wells at a time, while the edge computers (“foggers”) deal with common, more minor issues.  IoT monitoring shows them status instantly, in real time, so they can concentrate on more serious issues.

Figure 1:  Oil and Gas Field Monitoring for Predictive and Preventative Maintenance using Corvalent CAT-APM


Corvalent has more on intelligent IoT applications, but the most important news is that we take this very seriously for our customers.  For now, know that Corvalent’s new powerful edge computers, secure gateways, and data collection systems allow fewer, more powerful computers to collect data from a larger number of sensors, perform data analysis, apply AI, and then autonomously take action.  They also transmit that information back to the cloud or server-based CAT-APM system, giving treaters, operators, and engineers valuable time-series information for trend analysis. Powerful x86/x64 Intel-based industrial PCs at the edge are ideal in many cases, reducing cloud costs, accelerating reaction speeds, and allowing robust local control for routine operations.


About the Author

Alan R. Weiss
Alan R. Weiss has Fortune 500, international, and start-up experience in IoT, including semiconductors, computers, networking, software, and enterprise IT applications. With hardware and software experience ranging from advanced IoT healthcare (telehealth) systems to Industrial IoT dashboards and applications, Alan is the Senior Product Manager for IoT and Industrial Computing at Corvalent Corporation. Contact: alan.weiss@corvalent.com
