As information technology and automation technology continue to converge, cloud-based communication and data services are increasingly used in industrial automation projects.
Beyond the scope of conventional control tasks, applications such as big data, data mining and condition or power monitoring enable the implementation of superior, forward-looking automation solutions.
New Beckhoff hardware and software products for Industry 4.0 and IoT make such advanced solutions simple to implement.
Definition of business objectives
Industry 4.0 and Internet of Things (IoT) applications do not start with the underlying technology; the work begins much earlier. When implementing IoT projects, it is critically important to first examine the corporate business objectives and establish the benefits the company expects to gain from these projects.
From an automation provider perspective, there are two distinct categories of customers that can be defined: machine manufacturers and their end customers – in other words, the end users of the automated machines.
In the manufacturing sector in particular, there is an obvious interest in reducing in-house production costs, both through efficient and reliable production control and also by reducing the number of rejects produced.
The traditional machine manufacturer pursues very similar objectives, and above all is interested in reducing the cost of the machine while maintaining or even increasing production quality.
Optimising the machine's energy consumption and production cycles, as well as enabling predictive maintenance and fault diagnostics, can also be rewarding goals.
The last two points in particular offer the machine manufacturer a solid basis to establish services that can be offered to end customers as an additional revenue stream.
Ultimately, of course, both customer categories want the same thing: a more attractively designed machine or product and increased competitiveness in the marketplace.
Collecting, aggregating and analysing process data
The process data used during production provides a foundation for creating added value and for achieving the above-mentioned business objectives.
This includes machine values that are recorded by sensors and transmitted via a fieldbus to the PLC. Using the condition monitoring libraries integrated in the TwinCAT 3 automation software, this data can be analysed directly on the controller to monitor the status of a system, thereby reducing downtime and maintenance costs.
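The condition monitoring functions themselves are provided as TwinCAT 3 libraries used from the control program; the following Python sketch is only a language-neutral illustration of the kind of evaluation meant here, with hypothetical sample values and an example threshold.

```python
# Illustrative only: shows the type of condition monitoring evaluation described above,
# computing an RMS value from a hypothetical vibration signal and checking it against a limit.
import math

vibration_mm_s = [0.8, 1.1, 0.9, 1.4, 1.2, 0.7, 1.0, 1.3]   # samples from a fieldbus-connected sensor
rms = math.sqrt(sum(v * v for v in vibration_mm_s) / len(vibration_mm_s))

ALARM_THRESHOLD = 1.5   # example limit in mm/s, chosen for illustration
if rms > ALARM_THRESHOLD:
    print(f"Vibration RMS {rms:.2f} mm/s exceeds threshold - schedule maintenance")
else:
    print(f"Vibration RMS {rms:.2f} mm/s within normal range")
```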
However, where there are several distributed controllers in production areas, it may not be sufficient to analyse data from a single controller.
The aggregated data from multiple or even all controllers in a production system or a specific machine type is often needed to perform sufficient data analysis and make an accurate analytical statement about the overall system.
However, the corresponding IT infrastructure is required for this purpose.
Previous implementations focused on the use of a central server system within the machine or corporate network that was equipped with data storage, often in the form of a database system.
This allowed analysis software to access the aggregated data directly in the database to perform corresponding evaluations.
Although such an approach to realise data aggregation and analysis in production facilities worked well, it also presented a number of problems, since the required IT infrastructure had to be made available first.
This clearly gives rise to high hardware and software costs for the corresponding server system, but the personnel costs should not be overlooked either.
Due to the increasing complexity involved in networking production systems, especially with large numbers of distributed production locations, skilled personnel are necessary to successfully perform the implementation in the first place. To complicate matters, the scalability of such a solution is very low.
Ultimately, the physical limits of the server system are reached at some point, whether in terms of available memory, CPU power, or the performance and storage required for analyses. Extending a system with new machines or controllers therefore often meant extensive manual conversion work: the central server system had to grow along with it in order to handle and process the additional data volume.
The path to the public cloud
Cloud-based communication and data services now avoid the aforementioned disadvantages by providing the user with an abstract view of the underlying hardware and software systems.
Instead, only the use of the respective services has to be considered; all maintenance and update work on the IT infrastructure is performed by the provider of the cloud system, whether it is a public or private cloud.
Public cloud service providers, such as Microsoft Azure or Amazon Web Services (AWS), provide users with a range of services from their own data centres.
These range from virtual machines, where the user retains control of the operating system and the applications installed on it, to abstracted communication and data services that can be integrated into an application.
The latter also includes access to machine learning algorithms, which can make predictions and perform classifications regarding specific data states on the basis of certain machine and production information.
The algorithms obtain the necessary data with the aid of the communication services. These services are usually based on communication protocols that follow the publish/subscribe principle, and the resulting decoupling of all communicating applications offers definite advantages.
On one hand, the various communication participants no longer need to know each other – in other words, any time-consuming disclosure of address information is reduced. All applications communicate via the central cloud service.
On the other hand, data communication with the cloud service, via the message broker, involves a purely outgoing communication connection from the perspective of the terminal device – regardless of whether data is sent (publish) or received (subscribe).
The advantages this offers for configuring the IT infrastructure are immediately clear: no incoming communication connections have to be configured, for example in firewalls or other network terminals.
This significantly reduces IT infrastructure set-up time and maintenance costs. The transport protocols used for data communication, such as MQTT and AMQP, are exceptionally lean and standardised.
In addition, various security mechanisms can also be anchored here, such as encryption of the data communication and authentication with the message broker.
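As a minimal sketch of such a connection, the following Python example uses the paho-mqtt client library to publish a machine value over an encrypted, purely outgoing MQTT connection. The broker address, topic, credentials and payload are placeholder values and not part of any Beckhoff product.

```python
# Minimal sketch: publishing a machine value to an MQTT message broker over TLS.
# paho-mqtt 1.x style client; version 2.x additionally requires a CallbackAPIVersion argument.
import json
import paho.mqtt.client as mqtt

client = mqtt.Client(client_id="machine-01")
client.username_pw_set("device-user", "device-password")  # authentication towards the broker
client.tls_set()                                          # encrypt the connection using the system CA certificates

client.connect("broker.example.com", 8883)                # purely outgoing connection, no inbound firewall rule needed
client.loop_start()

payload = json.dumps({"machine": "press-01", "temperature": 73.4})
info = client.publish("plant/press-01/telemetry", payload, qos=1)  # publish to a topic; subscribers stay unknown to the sender
info.wait_for_publish()

client.loop_stop()
client.disconnect()
```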
The standardised OPC UA communication protocol has likewise embraced the added value of publish/subscribe-based communication scenarios, and appropriate steps have been taken to integrate this communication principle into the specification. As a result, an additional standard besides MQTT and AMQP is available as a transport mechanism to the cloud.
The private cloud
However, such publish/subscribe mechanisms are not only used in public cloud systems; they can also be used in the company or machine network. In the case of MQTT and AMQP, the infrastructure required for this purpose can be installed and made available easily on any PC in the form of a message broker.
This means that M2M scenarios can be implemented and terminal devices such as smartphones can be connected to the controller. Access to these devices can additionally be secured by means of firewall systems.
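On the receiving side, a terminal device in the machine network, for example a visualisation PC, could subscribe to controller data via a locally installed broker such as Mosquitto, as in the following sketch; the host name and topic filter are again only example values.

```python
# Minimal sketch: subscribing to controller data on a message broker in the local machine network.
# paho-mqtt 1.x style callbacks; broker host and topic filter are example values.
import paho.mqtt.client as mqtt

def on_connect(client, userdata, flags, rc):
    client.subscribe("plant/+/telemetry")          # receive the data of all machines below "plant/"

def on_message(client, userdata, msg):
    print(f"{msg.topic}: {msg.payload.decode()}")  # hand the values over to a dashboard or M2M logic

client = mqtt.Client(client_id="dashboard-01")
client.on_connect = on_connect
client.on_message = on_message
client.connect("broker.machine.local", 1883)       # broker installed on any PC in the network
client.loop_forever()
```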
The extensions of the OPC UA specification with regard to publish/subscribe will also simplify the configuration and use of 1:N communication scenarios within a machine network in the future.
Products for Industry 4.0 and IoT
Beckhoff provides users with a wide variety of components for simple and standardised integration into cloud-based communication and data services.
The IoT products within the TwinCAT 3 automation software platform offer varied functionalities for exchanging process data by means of standardised publish/subscribe-based communication protocols and for accessing special data and communication services of public cloud service providers.
Corresponding services can be hosted in public cloud systems, such as Microsoft Azure or Amazon Web Services (AWS), but can be used just as effectively in private cloud systems.
These IoT functions can either be accessed via special function blocks directly from the control program or configured outside the control program via an application called the "TwinCAT IoT Data Agent".
The data to be transmitted can be selected easily via a graphical configurator and configured for transfer to a specific service.
A major advantage here is that the data agent also allows integration of cloud-based services in older, existing TwinCAT systems.
The process data can also be exchanged here via the standardised OPC UA communication protocol, which means that data from non-Beckhoff systems can likewise be sent.
An additionally available smartphone app enables mobile display of a machine's alarm and status messages.
If I/O signals are to be forwarded directly without a control program, then Beckhoff's newly announced EK9160 IoT Bus Coupler allows I/O data to be parameterised via an easy-to-configure website on the device for sending to a cloud service.
The Bus Coupler then sends the digital or analogue I/O values to the cloud service independently.
An IoT coupling station consists of an EK9160 and a virtually limitless number of powerful and ultra-fast EtherCAT Terminals.
The data is sent in a user-friendly, standardised JSON format to the cloud service and can also be transmitted in encrypted form if required.
Extended mechanisms are provided as well, such as local buffering of the I/O data in the event of an interrupted Internet connection and a monitoring function for the connected fieldbuses. The I/O signals can thus be collected not only via EtherCAT, but also via other fieldbuses such as CANopen or PROFIBUS.
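On the cloud side, such a JSON message can be processed with any standard JSON library. The payload layout in the following Python sketch is purely illustrative; the actual message schema of the EK9160 is defined in its documentation.

```python
# Minimal sketch: evaluating an I/O message on the receiving side.
# The payload structure shown here is hypothetical and only illustrates the principle.
import json

raw = """
{
  "Timestamp": "2017-03-01T10:15:00Z",
  "Values": {
    "Channel1.Temperature": 73.4,
    "Channel2.Pressure": 2.1,
    "Channel3.DoorClosed": true
  }
}
"""

message = json.loads(raw)
for name, value in message["Values"].items():
    print(f'{message["Timestamp"]}  {name} = {value}')
```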
Analytics and machine learning
Once the data has been sent to a public or private cloud service, the next question is how the data can continue to be processed. As previously mentioned, many public cloud providers offer various analytics and machine learning services that can be used for further examination of process data.
Moreover, Beckhoff offers its own analytics platform for users to take advantage of, namely TwinCAT Analytics. This platform provides the relevant mechanisms for data analysis: all process-related machine data is recorded precisely and cyclically, so that machine processes can be captured in full.
Depending on the requirements, this data can be stored for evaluation either locally on the machine or within a public or private cloud solution.
TwinCAT Analytics uses TwinCAT IoT to connect to cloud solutions, ensuring seamless data communication. Generally speaking, this provides the power to create new business ideas and models for the machine manufacturer and respective end customers to capitalise on.
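As a simple illustration of the kind of evaluation such a platform enables, the following sketch applies a threshold check to cyclically recorded process data, regardless of whether the records come from local storage or from a cloud service. All field names and values are invented for the example.

```python
# Minimal sketch: a simple evaluation of cyclically recorded process data.
# In practice such records would be read from local storage or a cloud data service.
import statistics

records = [
    {"machine": "press-01", "cycle_time_ms": 812, "energy_kwh": 0.42},
    {"machine": "press-01", "cycle_time_ms": 845, "energy_kwh": 0.44},
    {"machine": "press-02", "cycle_time_ms": 798, "energy_kwh": 0.40},
    {"machine": "press-02", "cycle_time_ms": 1190, "energy_kwh": 0.61},  # conspicuous cycle
]

mean_cycle = statistics.mean(r["cycle_time_ms"] for r in records)

# Flag cycles that are clearly slower than average as candidates for closer inspection
for r in records:
    if r["cycle_time_ms"] > 1.2 * mean_cycle:
        print(f'{r["machine"]}: cycle time {r["cycle_time_ms"]} ms deviates from mean {mean_cycle:.0f} ms')
```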
Conclusion
Industry 4.0 and IoT are on everyone's minds, and these concepts matter wherever the realisation of innovative new business models places demands on the underlying infrastructure.
This also drives the increased convergence of IT and automation technologies. Cloud-based data services can help implement such automation projects, as they save the machine manufacturer or end customer from having to provide the corresponding IT expertise.
With TwinCAT IoT and the EK9160 cloud bus coupler, Beckhoff provides customers with two new product series for integrating such cloud-based data services quickly and easily into the control project.
Additionally, TwinCAT Analytics supports such projects with a powerful analytics platform that facilitates comprehensive analysis of the recorded process data.