About Us

Bridging security gaps in IT and OT systems


AEI is a B2B engineering solutions and services provider in the area of industrial IoT and automation. We consult, train, and create, integrate, and maintain digitization solutions for OEMs, silicon vendors, MES/ERP providers, smart factories, and others. A number of our off-the-shelf solutions target:

– OT network node configuration and management

– Secure OT/IT gateways for legacy systems

– Data aggregation from OT to IT systems

More information at https://www.aeicor.com/global/technology/industrial-iot

Because our solutions bridge the OT and IT systems in an enterprise, we have to account for the cyber security vulnerabilities of the traditional apparatus typical of any industry, and take a deliberate design approach that closes the security gaps along with the information gaps. In other words, greater proliferation of digitization across industries will always go hand in hand with, and drive demand for, greater attention to cyber security.

Chidrupaya has over ten years of experience in responsible product and solution design, life-cycle management, and engineering consulting in IoT, automation, industrial networking, smart sensor networks, low-power devices, mission-critical systems, industrial safety, and security systems. He has provided engineering consultation, technology training, and product design to clients in India, the EU, and the United States, including silicon manufacturers, OEMs, and IT solution providers.

Why Now?

This realization is not something that has become apparent only in the last few years; security considerations have always been part of design decisions, although their importance has varied across industries due to their inherent nature. For example, consider a vertical manufacturing management system or any IoT platform you are familiar with. Typically, the edge nodes (or field devices) that interact directly with the environment via sensors and actuators are connected to compute platforms (gateways or brokers), which then connect to IT platforms to exchange information. Depending on the industry, only a part of this network is exposed to the outside world, e.g., via public infrastructure and wireless networks; and, more often than not, only these exposed components were considered in security development lifecycle practices. Although a potential security loophole can be exploited at any point in a networked system, the inherently disconnected nature of such deployments meant most of them escaped any penalty for this oversight.

That has changed in today's scenario, especially with the aggressive digitization of virtually every industry, driving deeper with each passing year. This has increased the demand for ever more closely connected components that allow granular access to process-level information across all deployment geographies, provide flexibility in configuration and integration for highly scalable systems, and offer methods to aggregate, digest, and represent this data on the IT platform. It has also increased the probability that any single component in the entire system is vulnerable to security threats.

Now you may ask, "Is this going to affect me?" That depends on the industry under consideration, and ultimately you will have to decide for yourself. But the current digitization revolution (or Industry 4.0) is likely to affect, positively in my opinion, every part of this huge machinery eventually; it is a question of not if but when. So the real question we are posed with is whether we want to be ahead of the curve in this evolution. There is no doubt that there has been a fundamental shift in how we visualize and access information as users, managers, and administrators, and it is only going to evolve further; holding back would be akin to swimming against the tide.

So, how do we get started?

First, we need to recognize that the days of fractured integrations and disparate components are numbered; it is going to be a world of highly integrated, accessible machines, bringing its own set of advantages and drawbacks. Future systems (if "future" is still the right word) will make information access seamless and reliable and platforms dynamic and scalable, while adding to the complexity and overhead of configuration and security. Adapting to this change will be challenging, more so for some geographies than others. An encouraging factor is that a large collective effort is already underway to bridge this gap and help every sector of the ecosystem adapt as smoothly as possible, in terms of, but not limited to, standardization of information exchange formats, protocols, configuration, security and encryption, hardware support, and so on. There is an ongoing effort to make these available to the community via open-source or easily accessible third-party options. More importantly, to get started there has to be a change in how we approach design decisions.

As we progress on this digitization journey, a number of design facets will need to be addressed, including the aforementioned communication, information representation, compliance, and security; all of these will play critical roles in how well suited a design is for future market demands. While usability will remain the primary driving factor, this article anchors the conversation on the security aspects of OT and OT-IT intermediates.

Standardization and compliance

In the digitization evolution, there will be less and less space, if any, for fractured components inside a connected system. The future of infrastructure rests on information consistency, device congruency, indefinite scalability, and flexibility. Standardization is unavoidable for interoperability, and modern standards inherently address the security aspects of such a scalable system at each layer. So the first step of the design must be choosing the right standard, not just for the product architecture but for development practices as well.

Data Security

The first and foremost challenge is ensuring data security, and not just data over the air but also data at rest on any component. This involves choosing the right encryption (e.g., SSL/TLS), with authorized access-control systems in place and reliable third-party validation. Any token, name-sake encryption or validation will leave vulnerable points exposed in the network; this is especially true for gateway devices and low-footprint IoT nodes, which typically do not include the necessary hardware support for higher-level security. It is critical to choose a standard and hardware that provide this support.
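As an illustration of the over-the-air half of this, the sketch below shows a client connection that refuses to proceed without certificate validation. It uses Python's standard ssl module only; the host name, port, CA bundle path, and payload are hypothetical placeholders, not part of any specific product.

```python
# A minimal sketch of protecting "data over the air": a TLS connection that
# validates the server certificate before sending anything. Host, port, and
# CA file are illustrative assumptions.
import socket
import ssl

GATEWAY_HOST = "gateway.plant.example"   # hypothetical gateway address
GATEWAY_PORT = 8883

# create_default_context() enables certificate verification and host name
# checking by default; "name-sake" encryption typically has these disabled.
context = ssl.create_default_context(cafile="plant-root-ca.pem")
context.minimum_version = ssl.TLSVersion.TLSv1_2

with socket.create_connection((GATEWAY_HOST, GATEWAY_PORT)) as raw_sock:
    with context.wrap_socket(raw_sock, server_hostname=GATEWAY_HOST) as tls_sock:
        print("negotiated:", tls_sock.version(), tls_sock.cipher())
        tls_sock.sendall(b'{"node": "edge-07", "temp_c": 71.4}')
```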

Identification

The next step is to authenticate the identity of each component or device in the network. In an ever-dynamic, scalable system, it is critical that each component's identity is authenticated and verified before and during its operational life cycle, i.e., during configuration and data exchange. This may involve issuing unique digital certificates or cryptographic key pairs. It is vital for edge nodes (or field devices) and gateways (or data brokers) in both client-server and talker-listener arrangements, and it requires careful assessment of the associated hardware support and performance penalties.
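As a rough illustration of per-device identity, the following sketch generates a key pair and a self-signed certificate for a single edge node. It assumes the third-party Python cryptography package; the device name, validity period, and file name are hypothetical, and in production the certificate would be issued by a plant or vendor CA rather than self-signed.

```python
# A minimal sketch of issuing a per-device identity (assumes the third-party
# "cryptography" package). Device name and validity period are illustrative.
import datetime
from cryptography import x509
from cryptography.x509.oid import NameOID
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import ec

# One key pair per device; the private key should never leave the device
# or its secure element.
device_key = ec.generate_private_key(ec.SECP256R1())

name = x509.Name([x509.NameAttribute(NameOID.COMMON_NAME, "edge-node-0042")])
certificate = (
    x509.CertificateBuilder()
    .subject_name(name)
    .issuer_name(name)                      # self-signed here; a plant CA in practice
    .public_key(device_key.public_key())
    .serial_number(x509.random_serial_number())
    .not_valid_before(datetime.datetime.utcnow())
    .not_valid_after(datetime.datetime.utcnow() + datetime.timedelta(days=365))
    .sign(device_key, hashes.SHA256())
)

with open("edge-node-0042.pem", "wb") as f:
    f.write(certificate.public_bytes(serialization.Encoding.PEM))
```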

Legacy support

While designing the system, a practical consideration is integration: how to support legacy systems and components. To the extent their hardware allows, legacy devices should be updated to the latest standards. Where that is not possible, bridges between legacy and current systems are needed, with gateways handling the interoperability and security of the brown-field components and ensuring compliance; in these scenarios, accessibility and scalability may need to be sacrificed in favour of security.

The following are two examples describing each of these cases. OPC UA is one of the industry standards that provides a framework for consistent information exchange between IT and OT systems, from field devices to compute nodes and ERPs, and avoids information fragmentation between multiple vendor systems running on existing protocols such as Profinet, EtherCAT, Ethernet POWERLINK, etc. Multiple compliant third-party and open-source OPC UA stacks are available to system designers. From its original form to the current one, these stacks have progressively added security provisions to address the aforementioned aspects in real-world systems, so over the years our deployments have had to update and adapt the integration strategy, from early adopters to the early majority.

Example I: For a smart-factory early adopter, we had to create an IT-OT bridge from the factory to the ERP system, where the OT side included legacy components and we also had to deploy newer infrastructure for recently commissioned machinery. Our solution was a real-time data-aggregator gateway service on secure hardware, which acted as a data broker between the legacy OT components and the IT platform over SSL, without exposing any of the legacy OT components to the public network (i.e., sacrificing accessibility in favour of security). At the same time, the latest OT components, which adhered to the security standards, were made accessible directly from the IT system via the same gateway service. The gateway hardware itself supported current-generation security and acted as a firewall between the public network and the legacy OT hardware components, without compromising compliance with standards.
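A minimal sketch of this aggregator-gateway pattern is shown below, using only the Python standard library. The endpoint path, certificate file names, and sampled values are illustrative assumptions, not a replica of the deployed service; the legacy poll is stubbed out.

```python
# A minimal sketch of the aggregator-gateway pattern: legacy OT data is polled
# locally and exposed to the IT side only over mutually authenticated TLS.
import json
import ssl
from http.server import BaseHTTPRequestHandler, HTTPServer

def poll_legacy_devices():
    # Placeholder: a real gateway would read from the isolated OT segment
    # (serial, fieldbus, vendor API) that is never exposed directly.
    return {"line_1_temperature_c": 74.2, "line_1_spindle_rpm": 1180}

class AggregateHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path != "/aggregate":
            self.send_error(404)
            return
        payload = json.dumps(poll_legacy_devices()).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(payload)))
        self.end_headers()
        self.wfile.write(payload)

if __name__ == "__main__":
    server = HTTPServer(("0.0.0.0", 8443), AggregateHandler)
    # TLS termination at the gateway: IT-side clients must present a client
    # certificate, and the legacy devices stay behind the gateway.
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
    ctx.load_cert_chain("gateway-cert.pem", "gateway-key.pem")
    ctx.verify_mode = ssl.CERT_REQUIRED
    ctx.load_verify_locations("it-clients-ca.pem")
    server.socket = ctx.wrap_socket(server.socket, server_side=True)
    server.serve_forever()
```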

Example II: Early adopters among remote-maintenance software providers typically used MQTT brokers to access field networks and device information exposed through OPC UA servers. When no ready-to-use secure OPC UA talker-listener components existed, our gateway service allowed the MQTT brokers to collect data from OPC UA servers through a client-server service over SSL, and then publish it to other MQTT nodes. The start-up, configuration, key exchange, and information exchange all had to be secured while adding only a marginal performance penalty.
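A stripped-down sketch of such a relay is shown below. It assumes the third-party asyncua and paho-mqtt Python packages; the endpoint URL, node id, topic, and CA file are hypothetical placeholders rather than the deployed configuration, and a real service would also secure the OPC UA session itself.

```python
# A minimal sketch of an OPC UA -> MQTT relay (assumes "asyncua" and
# "paho-mqtt"). Reads one value over OPC UA and republishes it over MQTT/TLS.
import asyncio
import json

from asyncua import Client                  # OPC UA client (opcua-asyncio)
from paho.mqtt import publish               # one-shot MQTT publish helper

OPCUA_ENDPOINT = "opc.tcp://192.168.10.5:4840"   # hypothetical OPC UA server
NODE_ID = "ns=2;s=Line1.Temperature"             # hypothetical node id
MQTT_BROKER = "broker.plant.example"             # hypothetical broker

async def relay_once() -> None:
    async with Client(url=OPCUA_ENDPOINT) as opcua_client:
        value = await opcua_client.get_node(NODE_ID).read_value()
    payload = json.dumps({"node": NODE_ID, "value": value})
    # Publish over TLS (port 8883); the CA file pins the broker's certificate.
    publish.single(
        "plant/line1/temperature",
        payload,
        hostname=MQTT_BROKER,
        port=8883,
        tls={"ca_certs": "plant-root-ca.pem"},
    )

if __name__ == "__main__":
    asyncio.run(relay_once())
```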

And as systems and software evolved, so did the integration approach. In time, the open-source OPC UA stack S2OPC, which targets safety and security compliance over OPC UA, provided secure talker-listener components that can now be deployed to achieve a better performance balance.

Ensuring availability

Apart from data security, there is also a need to ensure that the system and its components are always available, especially in mission-critical systems. A common security challenge is denial-of-service (DoS) attacks, and the system has to include preventive mechanisms to ensure that critical components remain accessible and able to share information. This also includes preventing identity theft of any component by an outside attacker, who could manipulate traffic volume without needing access to the encryption, as well as immediately identifying and isolating the sources of such attacks.
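One simple preventive mechanism is per-client rate limiting at the gateway. The sketch below implements a token bucket in plain Python; the thresholds are illustrative assumptions, and a real deployment would combine this with network-level isolation and monitoring of flagged sources.

```python
# A minimal sketch of per-client token-bucket rate limiting, so a single
# flooding source cannot starve the gateway. RATE and BURST are illustrative.
import time
from collections import defaultdict

RATE = 10.0       # tokens (requests) replenished per second, per client
BURST = 20.0      # maximum burst size

_buckets = defaultdict(lambda: {"tokens": BURST, "last": time.monotonic()})

def allow_request(client_id: str) -> bool:
    bucket = _buckets[client_id]
    now = time.monotonic()
    bucket["tokens"] = min(BURST, bucket["tokens"] + (now - bucket["last"]) * RATE)
    bucket["last"] = now
    if bucket["tokens"] >= 1.0:
        bucket["tokens"] -= 1.0
        return True
    return False   # over the limit: drop, log, and flag the source for isolation

if __name__ == "__main__":
    accepted = sum(allow_request("edge-node-0042") for _ in range(100))
    print(f"accepted {accepted} of 100 back-to-back requests")
```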

Continuous Improvement

The design approach to building components also needs to include continuous improvement, whether in the design, production, or deployment stage, because the definition of and requirements for system security are continuously evolving. As digitization spreads, instances of cyber-attacks will increase; and with much of the future's conflict expected to play out in cyberspace, designers will be consistently playing catch-up with rapidly evolving demands. It is good design strategy to plan development in increments, from a business standpoint as well as with evolving threats in mind, so that continuous upgrades and patches can be rolled out to each layer of the system to keep it protected. This includes firmware authentication for over-the-air updates, especially on edge nodes, to prevent malicious third-party updates.
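As an illustration of firmware authentication, the sketch below verifies an update image against a vendor signature before it is applied. It assumes the third-party Python cryptography package and Ed25519 signing; the file names and the baked-in public key are hypothetical.

```python
# A minimal sketch of verifying an OTA firmware image before flashing it
# (assumes the "cryptography" package and an Ed25519 vendor signing key).
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PublicKey

def verify_firmware(image_path: str, signature_path: str, pubkey_path: str) -> bool:
    """Return True only if the image was signed by the vendor's private key."""
    with open(pubkey_path, "rb") as f:
        vendor_key = Ed25519PublicKey.from_public_bytes(f.read())   # 32-byte raw key
    with open(image_path, "rb") as f:
        image = f.read()
    with open(signature_path, "rb") as f:
        signature = f.read()
    try:
        vendor_key.verify(signature, image)
        return True
    except InvalidSignature:
        return False

if __name__ == "__main__":
    if verify_firmware("firmware.bin", "firmware.sig", "vendor-ed25519.pub"):
        print("signature valid: safe to flash")
    else:
        print("signature invalid: reject update")
```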

With these general guidelines in mind, there also has to be a realization, and subsequent acceptance, that security is one of the primary aspects of product and system design. With progress, the number and dimensions of the challenges will grow, and designers will have to level up to play the probabilities game. The probabilities of risk will play out differently across the ecosystem, depending on the industry type (e.g., the telecom sector faces greater challenges than the manufacturing sector), its criticality to economic function (e.g., power distribution systems), and the vulnerability of different economies. But as digitization unifies the automation backbone across the globe, so will the security threats, irrespective of geography.

Author: Chidrupaya Samanta Ray

Company: Accessible Engineering Innovation Corporation Pvt Ltd
