WisePlant – A WiseGroup Company
The State of Knowledge

The State of Knowledge and Risk Management in Industrial Cybersecurity (ISA/IEC-62443-3-2)

The state of knowledge is always evolving and changing. There are countless fields of study and research, and new discoveries and advancements are made every day. It can be difficult to keep up with all of the latest developments, but it is important to stay informed and curious in order to continue learning and growing.

The state of knowledge refers to the current level of understanding and information available in a particular area of study or field. It encompasses all of the theories, facts, and concepts that have been discovered and accepted as true by the scientific and academic communities, as well as any ongoing research and exploration.

In many fields, the state of knowledge is constantly changing as new discoveries are made and new technologies are developed. For example, in the field of medicine, new treatments and therapies are constantly being developed based on the latest research findings. In the field of astronomy, new telescopes and instruments are allowing scientists to observe and study the universe in ways that were previously impossible.


The state of knowledge is also influenced by societal and cultural factors, such as the funding and support given to different areas of research and the values and beliefs of the general public. This can sometimes lead to biases and gaps in knowledge, which can be addressed through continued research and exploration.

Overall, the state of knowledge is always in flux, but it is essential for progress and growth in all areas of human endeavor. Industrial cybersecurity is no exception. Managing risk is, at its core, the right knowledge properly applied.

The state of knowledge is not only limited to scientific and academic fields but also extends to practical knowledge and skills that are acquired through experience, such as cooking, carpentry, or gardening. These areas of knowledge may not be as rigorously researched and studied as academic disciplines, but they are still important for individuals to learn and master for personal and societal reasons.

In addition, the state of knowledge is influenced by the availability and accessibility of information. It can also be pushed or distracted away by someone else’s interests, hidden agendas, or simply wrong decisions; failing is also part of the learning process. The rise of the internet and digital technologies has made it easier than ever before to access information from around the world. However, this also means that there is a vast amount of information available, and it can be difficult to determine which sources are reliable and accurate.

Another aspect of the state of knowledge is the concept of “unknown unknowns.” These are things that we don’t even know that we don’t know yet (the missing pieces of the puzzle). This means that there may be gaps in our understanding of a particular area of study that we are not even aware of. As such, it is important for researchers and scholars to remain open-minded and curious, always willing to explore new possibilities and ideas. This leads to a question we will return to: how do we measure effectiveness and efficiency?

Overall, the state of knowledge is a complex and ever-changing concept that encompasses many different fields and areas of study. It is essential for individuals and society as a whole to continue to learn and grow, in order to advance our understanding of the world around us and improve our lives.

So, how is this related to Industrial Cybersecurity and Risk Management? Well, industrial cybersecurity is a relatively new field, and there is much more to come.

Let me explain a bit about it. Let’s go through some concerns and rationales.

Let me challenge you with a first question about flipping a coin. Heads or tails? What are the chances that you guess the correct outcome before the coin is flipped, assuming the coin is in good condition and no tricks are used? Your answer should be 50%, because both outcomes are equally likely before the flip.

Now let me change the situation. I ask you the same question after flipping the coin and letting you see the outcome. What are the chances that you give the correct answer? You might feel a bit suspicious. Hmm. OK, I said no tricks. Your answer should be 100%. Yes, that is correct.
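As a toy illustration (not part of any standard), the two coin scenarios can be simulated in a few lines of Python. The function names are made up for this sketch:

```python
# Toy simulation: how the state of knowledge changes the probability
# of a correct guess. Purely illustrative.
import random

def guess_before_flip(trials=100_000):
    """Guess before the flip: no knowledge of the outcome."""
    hits = sum(random.choice("HT") == random.choice("HT") for _ in range(trials))
    return hits / trials  # converges to 0.5

def guess_after_flip(trials=100_000):
    """Guess after seeing the outcome: full knowledge."""
    hits = 0
    for _ in range(trials):
        outcome = random.choice("HT")
        guess = outcome  # we saw the result, so we simply repeat it
        hits += guess == outcome
    return hits / trials  # always 1.0

print(round(guess_before_flip(), 1))  # ≈ 0.5
print(guess_after_flip())             # 1.0
```

Nothing about the coin changed between the two runs; only the information available to the guesser did.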

What changed? The state of knowledge. “Knowledge is the key to making good decisions.” What other methods do you know for making decisions?

There are many ways to make a decision, depending on the situation and the preferences of the decision-makers. Here are some common methods for making decisions:

Pros and Cons: This method involves listing the advantages and disadvantages of each option and comparing them to determine which one has the most benefits.

Cost-Benefit Analysis: This method involves weighing the costs and benefits of each option to determine which one has the greatest overall value.

Multi-Criteria Decision Analysis: This method involves evaluating options based on multiple criteria or factors, such as cost, time, quality, and risk, and assigning weights to each factor to determine the best option.

Decision Trees: This method involves mapping out the possible outcomes of each option and their probabilities, and selecting the option that has the highest expected value.

Group Decision-Making: This method involves getting input and feedback from a group of people with diverse perspectives and expertise, and using a consensus-based approach to reach a decision. Examples include multidisciplinary activities such as a CyberHAZOP, CyberPHA, CyberLOPA, … (RAGAGEP-based, or whatever you prefer to call them).

Intuition: This method involves relying on your instincts or “gut feeling” to make a decision, based on your past experiences and knowledge.

Ultimately, the best method for making a decision will depend on the nature of the decision, the available information, the knowledge, and the preferences of the decision-makers.
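Two of the methods above reduce to simple arithmetic. The sketch below shows a decision-tree expected-value calculation and a weighted multi-criteria score; the option names, probabilities, and weights are invented for illustration:

```python
# Minimal sketches of two decision methods; all numbers are made up.

def expected_value(outcomes):
    """Decision tree: sum of (probability x value) over possible outcomes."""
    return sum(p * v for p, v in outcomes)

def weighted_score(scores, weights):
    """Multi-criteria analysis: weighted sum of per-criterion scores."""
    return sum(scores[c] * w for c, w in weights.items())

# Decision tree: pick the option with the highest expected value.
options = {
    "option_a": [(0.7, 100), (0.3, -50)],  # EV = 55
    "option_b": [(0.9, 40), (0.1, 0)],     # EV = 36
}
best = max(options, key=lambda o: expected_value(options[o]))
print(best)  # option_a

# MCDA: weight cost, time, quality, and risk, then score an option 1-10.
weights = {"cost": 0.4, "time": 0.2, "quality": 0.3, "risk": 0.1}
scores = {"cost": 8, "time": 6, "quality": 9, "risk": 7}
print(round(weighted_score(scores, weights), 2))  # 7.8
```

The point is not the arithmetic itself but that both methods make the decision criteria explicit, which is exactly what the state of knowledge is about.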

Finding the way to do the right things right.

Referring to ISA/IEC-62443-2-1/3-2, one of the most important activities within a risk management process is the risk assessment phase, where risk is evaluated and decisions are made, preferably and hopefully good decisions, right?

The general formula for risk is R (Risk) = P (Probability) × I (Impact). Almost everyone agrees, as this is the risk formula used across many risk disciplines.

When dealing with cybersecurity, “we, the practitioners” (ISA99) introduce a variation to the formula, which becomes R (Risk) = Probability × Vulnerability × Consequence (Impact). This formula is more adequate for cybersecurity, but…
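As plain arithmetic, the two formulas above can be sketched as follows. The 1-5 scales are an assumption for illustration; real assessments use the plant's own scales:

```python
# Hedged sketch of the two risk formulas above, using made-up 1-5 scales.

def risk_classic(probability, impact):
    """General formula: R = P x I."""
    return probability * impact

def risk_cyber(probability, vulnerability, consequence):
    """Cybersecurity variation: R = P x V x C."""
    return probability * vulnerability * consequence

print(risk_classic(3, 4))   # 12
print(risk_cyber(3, 2, 4))  # 24
```

The formulas look trivially comparable, but as the next paragraphs argue, the real divergence lies in how each organization estimates the factors, not in the multiplication.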

“When you look at the formula, almost everyone agrees” appears to be a true statement; sadly, it can also be false. In practice, everyone is using a different method for calculating risk and making decisions. Choose any organization you know that has developed a “standard” – yes, standard in quotes – a set of beautifully written guides, a popular maturity model, a profound set of hundreds or maybe a thousand controls, etc.; you name it (NERC, NIST, DOE, etc.). Take some time to understand how each of these organizations calculates risk and makes decisions (I have done my own research on this, and they all fall into three main categories). Do the same with the vendors of products and systems and the providers of security services that you know. While almost everyone agrees on the formula, everyone calculates and decides differently. So, the fact is: nobody agrees!

In the case of ISA99, in ISA/IEC-62443-3-2 the “Probability × Vulnerability” term is replaced with a methodology combined with the probability “P” obtained from the operational risk matrix in use at the plant under consideration. I think the explanation of this topic in the standards is not good enough and should be improved. I will get into this later.
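To make the idea of an operational risk matrix concrete, here is a generic lookup sketch. The band names, thresholds, and 4×4 shape are invented for illustration and are not the methodology defined in ISA/IEC-62443-3-2; each plant uses its own matrix:

```python
# Generic illustration of reading a qualitative risk band from an
# operational risk matrix. Bands and layout are invented, not from
# ISA/IEC-62443-3-2.

# Rows: likelihood 1 (rare) .. 4 (frequent).
# Columns: severity 1 (minor) .. 4 (severe).
RISK_MATRIX = [
    ["low",    "low",    "medium",    "high"],
    ["low",    "medium", "medium",    "high"],
    ["medium", "medium", "high",      "very high"],
    ["high",   "high",   "very high", "very high"],
]

def risk_level(likelihood, severity):
    """Look up the risk band for a likelihood/severity pair (1-based)."""
    return RISK_MATRIX[likelihood - 1][severity - 1]

print(risk_level(2, 3))  # medium
print(risk_level(4, 4))  # very high
```

Reusing the matrix the plant already applies to operational hazards is what ties cyber risk decisions to decision criteria the site already understands and accepts.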

Now, the big question is: are all the different organizations’, vendors’, and service providers’ methods equally effective and efficient? Are all of them mitigating the risk sufficiently? Hmmm. Of course not! They are not. Is there a way to measure effectiveness and efficiency? And what about measuring sufficiency? (As mentioned, I did my own research on this.)

To think: flipping a coin would be better than using the wrong method. By using the wrong method for making decisions you are guaranteeing that all of your decisions will be consistently wrong, no matter how much money you have to spend, or no longer have. By flipping a coin, at least you have a chance to get it right.

Finding the way to do the right things right. Without the appropriate knowledge, you can do (a) the wrong things wrong, or (b) the wrong things perfectly well (repeatedly) with a lot of money; with the appropriate knowledge, you can do (c) the right things wrong, or (d) the right things right. Which path would you choose?

As with the coin, the state of knowledge is the answer. Dealing with ISA/IEC-62443 while working with many customers (and learning), I went through these questions over and over for years. Doing this in South America is lonely work, like working in a ghost town. In some ways that is better, because it is like a virgin space, and it can be better to work in a virgin space than in a vicious one.

So, what makes ISA/IEC-62443-3-2 so unique?

It doesn’t matter how much money you can spend or how many controls you can implement: IT security alone does not protect the plant and will always be insufficient. Additional countermeasures, by design, need to be implemented.

Preventing cyber-incidents is like a sprint race to see who arrives first: us, protecting the systems, or the hackers, compromising them. Non-intentional actions can also create an undesired incident. Every time a new vulnerability is published or a new threat is created, we are all running behind. This is a hacker’s game; no matter the color of the hat, white or black, it is always “the hacker’s business”. So, stop playing the fool. Short-term decisions permanently drain the customer’s money.

A cyber-incident is an undesired event that happens to the control system when at least one of its cyber-assets gets compromised. A consequence is an undesired event that may happen to the plant. A cyber-incident may or may not lead to a consequence; it depends on the design of the control system and the plant.

Traditional IT security focuses on preventing cyber-incidents, while ISA/IEC-62443-3-2 focuses on preventing consequences. This can be achieved by creating a resilient design, tolerant of cyber-incidents, that prevents consequences from happening even when cyber-incidents occur.

It is like functional safety, where the design of the plant and its systems is improved to tolerate failure. Every device, whether mechanical, electrical, or electronic, will fail someday. The same happens in cybersecurity.

Pretending that control systems will never suffer a cyber-incident is absurd and utopian. It will happen. Security alone does not meet the needs of plant safety, and a broader approach needs to be implemented. ISA/IEC-62443-3-2 has it all. Decisions are based on knowledge. Long-term decisions invest wisely and save the customer’s budget.

Target Security Levels (SL-T) help reduce the probability of cyber-incidents by hardening zones and conduits, while compensating countermeasures help prevent consequences from happening, drawing on the ingenuity of knowledgeable professionals. Every “system-plant” is unique, so the solution should be too. Templates do not suffice for an IACS.

The missing pieces of the puzzle

When making decisions, it’s important to be aware of common pitfalls that can lead to poor decisions. Here are some common pitfalls to avoid:

  • Confirmation Bias: This is the tendency to look for information that confirms our existing beliefs and ignore information that contradicts them. To avoid this, make sure to consider all relevant information and seek out diverse viewpoints.
  • Overconfidence Bias: This is the tendency to be overly confident in our own judgment and abilities, leading us to underestimate risks and overestimate our chances of success. To avoid this, seek out feedback from others and consider multiple perspectives.
  • Anchoring Bias: This is the tendency to rely too heavily on the first piece of information we receive, even if it’s not relevant or accurate. To avoid this, gather multiple sources of information and consider them all equally.
  • Sunk Cost Fallacy: This is the tendency to continue investing in a project or decision because we’ve already invested time, money, or effort in it, even if it no longer makes sense to do so. To avoid this, consider the current and future costs and benefits of a decision, rather than focusing on past investments.
  • Groupthink: This is the tendency for groups to prioritize harmony and agreement over critical thinking and dissent, leading to poor decision-making. To avoid this, encourage open and honest communication, and encourage dissenting opinions.
  • Emotions: This is the tendency to make decisions based on emotions rather than facts and logic. To avoid this, take a step back and consider the situation objectively, and seek out data and evidence to guide your decisions.

By being aware of these common pitfalls, you can make more informed and effective decisions.

About the author: Maximillian G. Kon, ISA Qualified Instructor, ISA Groups Member
