To make the Internet of Things (IoT) valuable, users need to capture more precise insights at the edge and make intelligent decisions in real time in the cloud.
Data scientists feel the pressure to help businesses understand the signals hidden in the vast and diverse IoT data stream. Companies need to decipher these signals to deliver critical outcomes: enhancing the customer experience, improving equipment effectiveness, and driving operational excellence.
But if users rely on batch scoring and other techniques for analyzing data at rest, they are hamstrung by the need to stream, store, and then score the data. Not only is this time-consuming, it also delays real-time decision-making, which hampers the business's ability to accelerate performance.
What steps can users take to rapidly convert IoT data into valuable insights for their business?
Users need to capture more precise insights at the edge and make intelligent decisions in real time in the cloud. They also want to use the system of their choice to quickly and precisely ingest, understand, and act on massive and diverse volumes of IoT data in real time. None of this can be done without streaming analytics and machine learning capabilities.
Here are some thoughts to consider along the journey toward helping a business extract the most value from its IoT data:
When thinking about “ingest,” consider that IoT data arrives at high speed, takes many forms, and is emitted from multiple sources. Users need flexible ways to connect to these sources that support the speed and volume of IoT data, along with tools that support various data formats and protocols and are optimized for high-speed ingestion. Solutions need to include connectors and adaptors for streaming data as well as static data.
Streaming data sources typically include IoT devices such as machines in a factory, connected vehicles, wearables, and customer browsing, interaction, and purchasing behaviour. Static data sources are often overlooked but represent a treasure trove of information an organization already has but most likely has not fully tapped. This static data can enhance the events that originate from streaming sources to provide a richer set of data to analyze.
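The enrichment step described above can be sketched in a few lines. This is a minimal, hypothetical example; the field names, machine IDs, and metadata are assumptions for illustration, not from any specific product.

```python
# Static reference data an organization already has in-house
# (hypothetical machine metadata used to enrich streaming events).
machine_metadata = {
    "press-01": {"line": "A", "install_year": 2015, "max_temp_c": 90},
    "press-02": {"line": "B", "install_year": 2019, "max_temp_c": 95},
}

def enrich(event: dict) -> dict:
    """Join a streaming event with static metadata for richer analysis."""
    meta = machine_metadata.get(event["machine_id"], {})
    return {**event, **meta}

# A streaming event as it might arrive from a factory machine
event = {"machine_id": "press-01", "temp_c": 93.5, "ts": "2023-05-01T10:00:00Z"}
enriched = enrich(event)
# The enriched record now carries context (line, temperature limit)
# that the raw stream alone could not provide.
print(enriched["line"], enriched["temp_c"] > enriched["max_temp_c"])
```

The same pattern applies whether the static source is a dictionary in memory, a database table, or a master-data file: the join gives each in-flight event the context needed for analysis.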
Activate your treasure trove of data through understanding
Processing video, audio, and text is often necessary to gain the insights needed for sound decisions, so tools must support those data types and the techniques that handle them. Many different methods can be used to understand the information, and combining these techniques is essential. Tools need a wide range of capabilities, including algorithms that can be applied to streaming data, integration with machine learning, and AI techniques that allow models to be trained offline and then deployed for in-stream scoring. These capabilities can be combined in real-time analyses to discover events of interest.
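The train-offline, score-in-stream pattern mentioned above can be sketched as follows. The "model" here is a toy logistic scorer with hard-coded weights standing in for a model trained offline; the features, weights, and threshold are all invented for illustration.

```python
import math

# Weights that would, in practice, come from an offline training job.
PRETRAINED_WEIGHTS = {"temp_c": 0.04, "vibration_mm_s": 0.3}
BIAS = -4.0
THRESHOLD = 0.5

def score(event: dict) -> float:
    """Apply the offline-trained model to one in-flight event."""
    z = BIAS + sum(w * event.get(f, 0.0) for f, w in PRETRAINED_WEIGHTS.items())
    return 1.0 / (1.0 + math.exp(-z))  # logistic score in [0, 1]

def events_of_interest(stream):
    """Yield only the events whose score crosses the alert threshold."""
    for event in stream:
        if score(event) > THRESHOLD:
            yield event

stream = [
    {"id": 1, "temp_c": 60.0, "vibration_mm_s": 2.0},
    {"id": 2, "temp_c": 95.0, "vibration_mm_s": 4.5},
]
flagged = [e["id"] for e in events_of_interest(stream)]
print(flagged)  # only the high-temperature, high-vibration event
```

The key point is the separation of concerns: training happens offline on data at rest, while scoring is a cheap per-event computation that keeps up with the stream.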
The purpose is to act.
Users need to act once an event of interest has been discovered. It’s not good enough to identify events and log them somewhere. The point of ingesting these events and applying real-time analyses is to react faster.
React faster so healthcare providers can enhance patient outcomes. Retailers can deliver a differentiated customer experience. Energy companies can predict machine failures before they occur, and manufacturers can detect objects and classify them immediately. No matter the use case, detection is just the first step.
The real value is in the ability to act. The reaction can be an alert sent to an operator to investigate a problem, or a technician dispatched to resolve a potential issue before it becomes a catastrophic failure. This means support is needed for resolution, so users can apply business rules and create workflows – enabling cases to be created, routed, dispatched, and resolved. Actions can be taken by humans, or they can be automated feedback loops that control machines to optimize operations or reduce wear and prolong machine life.
The objective is to quickly and precisely ingest, understand, and act on massive and diverse volumes of IoT data in real time. This has a significant impact on the business and allows people to take the actions necessary to make substantial improvements.
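A rules-driven dispatcher like the one described above might look like this sketch. The severities, rule names, and handlers are assumptions chosen for the example, not a real product's API.

```python
def act_on(event: dict) -> str:
    """Route a detected event to an automated response or a human workflow."""
    if event["severity"] == "critical":
        # Automated feedback loop: e.g. throttle the machine immediately
        return f"auto: reduce load on {event['machine_id']}"
    elif event["severity"] == "warning":
        # Human workflow: create a case and dispatch a technician
        return f"case: dispatch technician to {event['machine_id']}"
    # Low-severity events are still recorded, but trigger no action
    return "log only"

print(act_on({"machine_id": "press-01", "severity": "critical"}))
# → auto: reduce load on press-01
```

In a production system the branches would call a control interface or a case-management API rather than returning strings, but the shape is the same: detection feeds rules, and rules feed actions.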
Cloud-based software solutions for industrial applications
A growing number of technology suppliers are offering cloud-based software for automation, control, and instrumentation applications. It remains the responsibility of end users to decide where automation software should reside. Should they “own” the software in a traditional sense, or does software-as-a-service (SaaS) or platform-as-a-service (PaaS) make more sense for particular applications?
Automation systems are now asked to do more in an environment where flexibility and agility in plant operations are critical. Industry 4.0 is driving demand for “intelligent” production. Its focus is smart objects, autonomous products, and improved decision-making processes using new technologies from the information technology (IT) domain.
With the rise of connectivity in the industrial enterprise, it is essential to avoid situations where a company’s intellectual property (IP) and critical infrastructure could be made public. The nature of modern control platforms, and facilities themselves, adds to this concern. The industrial automation sector introduced Modbus in 1979 and Ethernet by the early 1990s. These standards simplified communications but opened the door to potential bad behaviour.
Industrial manufacturers have been slow to adopt robust security because of the requirements for reliability, stability, and longevity, as well as tight budgets. Many industrial control systems (ICSs) run for 30 or more years with minimal hardware or software changes. The use of Ethernet, over various media, for plant networks has the potential to expose automation systems in any location, whether in the cloud or on-site, to unauthorized access. (On-site architectures are also frequently called on-premises, or on-prem for short.)
Rise of cloud computing
Recently, there has been a significant change in the control and automation landscape due to trends associated with the Industrial Internet of Things (IIoT) and the continued improvement of existing technologies, such as mobile, wireless, cloud, and cybersecurity.
Cloud computing has emerged as an accepted computing paradigm in areas such as banking and enterprise systems worldwide. These systems rely on concentrating critical processes behind robust physical security, continuously managed firewalls, intrusion detection, and encryption. They use defence-in-depth strategies to protect IP. Cloud-based industrial solutions have generally been limited to higher automation hierarchy levels, such as enterprise management.
Significant benefits associated with cloud computing include:
- The availability of standardized development and test/simulation environments cuts the cost of setting up and configuring infrastructure.
- The flexible use of distributed engineering resources permits multi-project and multi-user configuration, independent of location.
- Users are free to focus on core competencies in running their assets while reducing the on-site physical footprint, hardware, software, and maintenance.
- Expertise is available to keep the system up to date and apply the right cybersecurity solutions to keep it safe and protect intellectual property.
- A demand-oriented pricing model ties investment costs to actual use.
Growth of SaaS, PaaS services
SaaS is a software distribution model in which the developer hosts applications and makes them available to customers over the internet. Incorporating SaaS in the process control world means adding data collection, integration, and distribution capabilities beyond the limits of most existing in-house systems.
SaaS can expand access to plant data, which supports real-time monitoring of processes. It also delivers the Big Data needed to drive predictive maintenance programs. It enables facility-wide or enterprise-wide visibility of real-time key performance indicators (KPIs) and dashboards, focusing attention on what’s essential while leveraging the value of new or existing supervisory control and data acquisition (SCADA) investments.
When properly implemented, SaaS can mean significant cost savings over the traditional approach of software ownership. This cloud service model minimizes hardware and software setup costs even as it delivers redundancy and high availability, which allows maintenance on running applications. End users are freed from managing and controlling the underlying IT infrastructure. Security, networking, computing, and all software licenses are packaged into a monthly or annual fee, eliminating or significantly reducing capital expenditures (CapEx). Instead of a large one-time outlay, organizations pay for what they use and often have the flexibility to add or drop services as needed.
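The CapEx-to-OpEx shift can be made concrete with a back-of-the-envelope comparison. All figures below are hypothetical, chosen purely to illustrate the structure of the two cost models, not vendor pricing.

```python
YEARS = 5  # comparison horizon

# Traditional ownership: upfront CapEx plus annual maintenance
capex_upfront = 250_000       # servers, licenses, networking, setup (assumed)
annual_maintenance = 40_000   # support staff, patches, hardware refresh (assumed)
ownership_total = capex_upfront + annual_maintenance * YEARS

# SaaS: one subscription fee covering security, compute, and licenses
annual_subscription = 75_000  # assumed monthly/annual fee, annualized
saas_total = annual_subscription * YEARS

print(f"ownership: {ownership_total:,}  saas: {saas_total:,}")
# → ownership: 450,000  saas: 375,000
```

Whether SaaS actually comes out cheaper depends entirely on the real figures and the horizon; the point of the model is that SaaS replaces a large upfront investment with a predictable recurring fee.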
Another benefit of SaaS is that the automation services provider usually includes software application updates. The provider works with the customer to coordinate the installation of release updates and patches to all installed software, as appropriate to the application, and ensures they function correctly. The provider takes on this added responsibility because keeping customers on the newest software lowers support costs while improving security.
PaaS is ideal for efficiently providing programming environments and developer tools to industrial organizations that develop and test software and database applications. It provides a complete and centralized development environment that is accessible on demand.
Some cloud-based environments will include sophisticated simulation environments for thoroughly testing project applications before moving them to the production system. Software and database applications can be custom applications industrial organizations have used in the past but are now deploying on virtual machines (VMs) in the cloud. They can also be applications built from scratch in the cloud using the automation service provider’s platform and tools.
In some cases, the same software can be used for SaaS and PaaS applications. Control and process engineers might use a PaaS model to develop the application and SaaS for their production environment. For instance, automation, process control, and SCADA software, traditionally only offered in the customer’s facility, are available as an off-process development and simulation environment (Open VEP) or as SCADA software optimized to provide enterprise-level reliability and security to monitor and control widely distributed assets.
Putting such software into a data centre with direct high-speed connectivity to telecom and internet networks enables fast, reliable connections to all remote devices and visualisation across the overall business.
Cloud-based SCADA systems
The traditional SCADA approach has been wholly owned and on-site, requiring dedicated support staff with heavy capital and operational expenditures (OpEx). These costs cover security and firewalls, networking equipment, as well as physical computer servers and software. However, the on-site approach offers limited flexibility and ties up valuable resources better used elsewhere. Perhaps more importantly, these on-site solutions need to be updated every 4 to 5 years. Migrating existing solutions to new hardware is often more complicated than the initial installation, especially if they were not initially virtualized.
To help customers meet operational and business challenges, leading automation suppliers have developed SCADA software for a hosted cloud environment. New cloud-based SCADA systems represent the natural progression of software in the era of IIoT. Locating functionality in the cloud means end users can move from a capital model to a predictable OpEx model. They can have a functioning SCADA system within days.
On-process production systems are available for remote SCADA applications such as those found in upstream oil & gas, alternative energy industries, or any application where the user wants to monitor and control geographically dispersed assets. They can collect information from field or mobile devices regardless of location. Such systems help operations with highly distributed assets and a workforce not located at one site.
The software is being used within the oil and gas industry to monitor and control wellheads, compressor stations, and pipelines, and in the renewable energy market for solar and wind power generation. While very different, these applications all share the need to collect data from distributed assets, enable graphical visualization of production constraints, and allow remote control and troubleshooting to optimize production.
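At its core, the pattern these applications share is reading data from dispersed assets and flagging constrained production. A minimal sketch, with invented asset names, tags, and values:

```python
# Hypothetical latest readings collected from geographically dispersed assets.
site_readings = {
    "wellhead-TX-12": {"flow_bbl_d": 410, "target_bbl_d": 500},
    "wellhead-ND-03": {"flow_bbl_d": 515, "target_bbl_d": 500},
    "compressor-OK-7": {"flow_bbl_d": 0, "target_bbl_d": 0},  # no target set
}

def constrained(readings: dict) -> list:
    """Return asset IDs producing below target, for visualization and alerts."""
    return [
        asset for asset, r in readings.items()
        if r["target_bbl_d"] and r["flow_bbl_d"] < r["target_bbl_d"]
    ]

print(constrained(site_readings))
# → ['wellhead-TX-12']
```

In a real deployment the readings dictionary would be fed by telemetry from field devices rather than hard-coded, but the constraint check and alerting logic are the same.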
This configuration makes SCADA functionality available as a scalable, cloud-based service. This reduces the on-site physical footprint, hardware, software, and maintenance. It also lowers the cost of entry, enabling smaller companies with limited capital to bring secure operations online. Operators are free to focus on optimizing processes instead of handling hardware and software updates, system upgrades, technology migrations, and ongoing maintenance.
Virtual engineering tools can provide an off-process system for developing, testing, and validating a process control system. Process or control engineers responsible for making improvements or additions to an existing control system typically use such a system. The goal is to optimize controls, and the appearance and functionality of operator screens, before they are incorporated into the live production system.
PaaS software can be accessed by personnel from anywhere in the world, on an automation system at any release, configuration, and size required. The plans are developed to include all security, firewalls, networking, and licenses, eliminating the need for IT architecting, hardware and software purchase, on-site installation, maintenance, backups, support, and upgrades. There is also a reduction in the physical space, air conditioning, and physical security necessary to keep an off-process system operational in a production facility.
Implementing a cloud service model
Manufacturers must ask if it makes sense to move part of the control and monitoring software to a cloud environment or remain on-site. Implementing a cloud service model might not be imperative for organizations with one production site. However, manufacturers with multiple facilities can collect data from different locations as part of an overall cloud approach. Some manufacturers might deploy it as a supervisory system to visualize the current status of the company’s priorities to coordinate resources and make better enterprise-level decisions.
Traditional on-site hosting has the advantage of speed and simplicity. The cost of a remote services license is also eliminated. On the other hand, cloud hosting allows for increased collaboration and access to control data without having to manage software licenses on individual computers. Employees can do their jobs from anywhere yet still be restricted to their unique scope of responsibility. Others can see the progress made on a particular project and continue working on the current version.
The flexibility of using distributed engineering resources facilitates breaking the automation task up between specialists focusing on their areas of expertise, regardless of location. Companies can significantly reduce travel costs while enabling personnel to work remotely to bring new products to market faster. This also allows experts to support any facility to minimise downtime while building consistency across multiple facilities.
The cloud versus on-site decision often hinges on what is being controlled and where it is located. There is a growing need for employees to be coordinated in their work processes instead of operating autonomously with the software on their computers, often with poorly orchestrated backups of critical programming. Manufacturers running facilities in one state or country will likely want their cloud-based solution hosted within their geographical region. However, if it is necessary to monitor widely or globally distributed assets, it may make sense to use several cloud strategies with fail-over from one location to another.
As cloud computing continues to gain popularity, major automation suppliers offer both cloud hosting and on-site control software. One reason for having both is to comply with regulations and meet critical data needs. There are pros and cons to each approach. Customers can partner with an automation technology vendor to choose the right path based on operations and business requirements; the overall strategy might be a mix of both. Automation knowledge and experience are crucial to making the best decision for the company.
Perhaps the most significant opportunity for cloud technology is remote monitoring for enterprise operations. For instance, a cloud solution employing SaaS is appropriate when corporate management wants to monitor the status of all plant locations and smaller assets. This approach provides one set of dashboards, key performance indicators (KPIs), and other reporting capabilities, giving a consistent view of the enterprise’s current situation. It allows data from hundreds of facilities to be aggregated and presented graphically to direct stakeholders’ attention throughout the organization and help them make better decisions.
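The enterprise roll-up described above is, at heart, an aggregation over per-facility metrics. A small sketch, where the facility names and KPI figures are invented for illustration:

```python
from statistics import mean

# Hypothetical per-facility KPIs as they might arrive from each site.
facility_kpis = {
    "plant-berlin": {"oee": 0.81, "downtime_h": 12},
    "plant-austin": {"oee": 0.74, "downtime_h": 30},
    "plant-osaka": {"oee": 0.88, "downtime_h": 5},
}

def enterprise_view(kpis: dict) -> dict:
    """Aggregate facility KPIs into a single dashboard-ready summary."""
    return {
        "avg_oee": round(mean(f["oee"] for f in kpis.values()), 2),
        "total_downtime_h": sum(f["downtime_h"] for f in kpis.values()),
        "worst_site": min(kpis, key=lambda k: kpis[k]["oee"]),
    }

print(enterprise_view(facility_kpis))
```

The summary deliberately surfaces the worst-performing site alongside the averages: the value of the enterprise view is directing attention, not just reporting totals.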
Despite concerns about the security of cloud-hosted data for the industrial plant environment, major technology providers have deployed rigorous defence-in-depth strategies to protect software deep within multiple layers of physical and cybersecurity. This gives security experts time to recognize and eliminate intrusions before they impact crucial control assets. Flexible solutions incorporating additional firewalls and advanced encryption to maintain secure access to the customer’s control infrastructure and intellectual property also can be developed to enhance security.