- Evolving Knowledge: The state of knowledge is always changing with new discoveries and advancements in various fields.
- Influences: Societal and cultural factors, as well as the availability of information, impact the state of knowledge.
- Decision-Making: Effective decision-making methods include pros and cons, cost-benefit analysis, decision trees, and group decision-making.
- Industrial Cybersecurity: Emphasizes the importance of managing risk and making informed decisions to prevent cyber-incidents and consequences.
The State of Knowledge
The state of knowledge is always evolving and changing. Researchers and scientists make new discoveries and advancements in countless fields of study every day. It can be difficult to keep up with all the latest developments, but it is important to stay informed and curious to continue learning and growing.
The state of knowledge refers to the current level of understanding and information in a specific area of study or field. It encompasses all the theories, facts, and concepts that have been discovered and accepted as true by the scientific and academic communities, as well as any ongoing research and exploration.
In many fields, new discoveries and modern technologies constantly change the state of knowledge. For example, researchers constantly develop new treatments and therapies in the field of medicine based on the latest research findings. In the field of astronomy, new telescopes and instruments let scientists observe and study the universe in previously impossible ways.
The state of knowledge is also influenced by societal and cultural factors, such as the funding and support given to different areas of research and the values and beliefs of the public. Continued research and exploration can sometimes address biases and gaps in knowledge.
Overall, knowledge constantly changes, yet it is crucial for progress and growth in every area of human endeavor. Industrial cybersecurity is no exception: managing risk means using the right knowledge effectively.
Knowledge covers not only scientific and academic fields but also includes practical skills gained through experience, like cooking, carpentry, or gardening. These areas of knowledge may not have been researched and studied as rigorously as academic disciplines, but they are still important for individuals to learn and master for personal and societal reasons.
In addition, the state of knowledge is influenced by the availability and accessibility of information. We can also be pushed or distracted away from it by someone else’s interests, hidden agendas, or simply wrong decisions; failing is part of the learning process, too. The internet and digital technologies now allow us to access information from around the world more easily than ever. A vast amount of information is available, but it can be difficult to determine which sources are reliable and accurate.
Another aspect of the state of knowledge is the concept of “unknown unknowns”: things we do not yet realize we are unaware of (the missing pieces of the puzzle). There may be gaps in our understanding of a particular area of study that we do not even realize exist. Researchers and scholars must stay open-minded and curious, always exploring new possibilities and ideas. This brings us to a question I will return to later: how do we measure effectiveness and efficiency?
Overall, knowledge is a complex and ever-changing concept that covers many fields and areas of study. It is essential for individuals and society to continue to learn and grow, to advance our understanding of the world around us and improve our lives.
So, how does this relate to Industrial Cybersecurity and Risk Management? Well, industrial cybersecurity is a young field, and there is a lot still to come.
Let me explain a bit about it. Let’s go through some concerns and rationales.
Let me challenge you with a first question about flipping a coin: heads or tails? What are the chances that you guess the correct outcome before the coin is flipped, assuming the coin is in good condition and no tricks are used? Your answer should be 50%, because both outcomes are equally likely before the flip.
Now let me change the situation. I ask you the same question again after flipping the coin and letting you see the outcome. What are the chances that you give the correct answer? You might feel a bit suspicious. Hmm. OK, I said no tricks. Your answer should be 100%. Yes, that is correct.
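To make the coin example concrete, here is a minimal Python sketch (purely illustrative, not from any standard): the only thing that changes between the two questions is the guesser’s state of knowledge.

```python
import random

# Simulate guessing a fair coin flip in two situations:
# (1) guessing before the flip, with no knowledge of the outcome, and
# (2) guessing after the flip, once the outcome has been seen.
TRIALS = 100_000
correct_before = 0
correct_after = 0

for _ in range(TRIALS):
    outcome = random.choice(["heads", "tails"])

    blind_guess = random.choice(["heads", "tails"])  # no knowledge yet
    correct_before += blind_guess == outcome

    informed_guess = outcome  # the state of knowledge has changed
    correct_after += informed_guess == outcome

print(f"Correct before the flip: {correct_before / TRIALS:.1%}")  # ~50.0%
print(f"Correct after the flip:  {correct_after / TRIALS:.1%}")   # 100.0%
```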
What changed? The state of knowledge. “Knowledge is the key to making good decisions.” What other methods do you know for making decisions?
There are many ways to decide, depending on the situation and the preferences of the decision-makers. Here are some common methods for making decisions:
- Pros and Cons: This method lists the advantages and disadvantages of each option and compares them to find the one with the most benefits.
- Cost-Benefit Analysis: This method weighs the costs and benefits of each option to find the one with the highest value.
- Multi-Criteria Decision Analysis: This method evaluates options based on factors like cost, time, quality, and risk, assigning weights to each factor to find the best option.
- Decision Trees: This method maps out the possible outcomes of each option and their probabilities, then selects the option with the highest expected value (see the short sketch after this list).
- Group Decision-Making: This method involves getting input and feedback from a group of people with diverse perspectives and expertise and using a consensus-based approach to decide. Examples are multidisciplinary activities such as a CyberHAZOP, CyberPHA, CyberLOPA, … (RAGAGEP-based, whatever you like to call them).
- Intuition: This method involves relying on your instincts or “gut feeling” to decide, based on your past experiences and knowledge.
The best method for deciding will depend on the nature of the decision, the available information, the knowledge, and the preferences of the decision-makers.
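As a rough illustration of the decision-tree method mentioned above, here is a minimal Python sketch. The options, probabilities, and values are all hypothetical, invented for the example; they do not come from any real assessment.

```python
# Decision-tree style comparison: for each option, sum the value of each
# possible outcome weighted by its probability, then pick the option with
# the highest expected value.
options = {
    "Option A": [(0.7, 100), (0.3, -50)],  # (probability, value) pairs
    "Option B": [(0.4, 200), (0.6, -20)],
}

def expected_value(outcomes):
    """Expected value = sum of probability * value over all outcomes."""
    return sum(p * v for p, v in outcomes)

for name, outcomes in options.items():
    print(f"{name}: expected value = {expected_value(outcomes):.1f}")

best = max(options, key=lambda name: expected_value(options[name]))
print(f"Best option by expected value: {best}")  # Option B (68.0 vs 55.0)
```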
Finding the way to do the right things right.
Referring to ISA/IEC-62443-2-1/3-2, one of the main and most important activities within a risk management process is the risk assessment phase, where risk is evaluated and decisions are made, preferably and conceivably good decisions, right?
The general formula for risk is R (Risk) = P (Probability) x I (Impact). Almost everyone agrees, as it is the risk formula used by many risk disciplines.
When dealing with cybersecurity, “we, the practitioners” (ISA99) introduce a variation to the formula, which becomes R (Risk) = Probability x Vulnerability x Consequence (Impact). This formula is more adequate for cybersecurity, but…
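The two formulations can be put side by side in a small sketch. The 1-5 ordinal scores below are my assumption for illustration only; ISA99 does not prescribe any particular scale here.

```python
# Two common risk formulations, side by side.
def risk_classic(probability: float, impact: float) -> float:
    """Classic formulation: R = P x I."""
    return probability * impact

def risk_cyber(probability: float, vulnerability: float,
               consequence: float) -> float:
    """Cybersecurity variation: R = P x V x C."""
    return probability * vulnerability * consequence

# Example with hypothetical 1-5 scores:
print(risk_classic(4, 5))    # 20
print(risk_cyber(4, 3, 5))   # 60 -- vulnerability now scales the result
```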
“When you look at the formula, mostly everyone agrees” looks like a true statement; sadly, it can also be false. In practice, everyone uses a different method for calculating risk and making decisions. Choose any organization you might know that has developed a “standard” (yes, standard in quotes): a set of beautifully written guides, a popular maturity model, a profound set of hundreds or maybe a thousand controls; you name it (NERC, NIST, DOE, etc.). Take some time to understand how each of these organizations calculates risk and makes decisions (I have done my own research on this, and they all fall into three main categories). Do the same with the vendors of products and systems and the providers of security services you might know. While almost everyone agrees on the formula, everyone calculates and decides differently. So, the fact is: nobody agrees!
In the case of ISA99, in ISA/IEC-62443-3-2 the “Probability x Vulnerability” part is replaced with a methodology combined with the probability “P” obtained from the operational risk matrix in use at the plant under consideration. I think the standards do not explain this topic well enough, and they should improve it. I can get into this later.
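The standard does not prescribe code, but the idea of reusing the plant’s operational risk matrix can be sketched roughly as follows. The levels, labels, and rankings below are hypothetical, not taken from ISA/IEC-62443.

```python
# Sketch of the idea only: the plant's existing operational risk matrix
# maps a likelihood level and a severity level to a risk ranking.
RISK_MATRIX = {
    # (likelihood, severity): risk ranking
    (1, 1): "low",    (1, 2): "low",    (1, 3): "medium",
    (2, 1): "low",    (2, 2): "medium", (2, 3): "high",
    (3, 1): "medium", (3, 2): "high",   (3, 3): "high",
}

def rank_risk(likelihood: int, severity: int) -> str:
    """Look up the risk ranking in the plant's operational risk matrix."""
    return RISK_MATRIX[(likelihood, severity)]

print(rank_risk(2, 3))  # "high"
```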
Now, the big question is: are all these different methods from organizations, vendors, and service providers equally effective and efficient? Are they all mitigating risk sufficiently? Hmmm. Of course not! They are not. Is there a way to measure effectiveness and efficiency? And what about measuring sufficiency? (As mentioned, I did my own research on this.)
To think: flipping a coin would be better than using the wrong method. By using the wrong method for making decisions, you guarantee that all your decisions will be consistently wrong. It doesn’t matter how much money you have to spend, or no longer have (path (b) below). By flipping a coin, at least you have a chance of getting it right.
Finding the way to do the right things right. Without the appropriate knowledge, you can do (a) the wrong things wrong, or (b) the wrong things perfectly well (repeatedly) with a lot of money; with the appropriate knowledge, you can do (c) the right things wrong, or (d) the right things right. Which path would you choose?
As with the coin, the state of knowledge is the answer. Dealing with ISA/IEC-62443 and working with many customers (and learning), I went through these questions over and over for years. Doing this in South America is lonely work, like working in a ghost town. In some ways it is better, because it is like a virgin space. Can it be better to work in a virgin space rather than a vicious one?
So, what makes ISA/IEC-62443-3-2 so unique?
It doesn’t matter how much money you can spend or how many controls you can implement: IT security alone does not protect the plant and will always be insufficient. Additional countermeasures, by design, need to be implemented.
Preventing cyber-incidents is like a sprint race: who arrives first, us protecting the systems or the hackers compromising them? Unintentional actions can also create an undesired incident. We all fall behind every time someone publishes a new vulnerability or creates a new threat. This is a hacker’s game; no matter the color of the hat, white or black, it is always “the hacker’s business”. So, stop playing the fool with short-term decisions that permanently drain the customer’s money.
A cyber-incident occurs when an unwanted event affects the control system because at least one of its cyber-assets is compromised. A consequence is an undesired event that may happen to the plant. A cyber-incident may or may not lead to a consequence. It depends on the control system and plant design.
Traditional IT security focuses on preventing cyber-incidents, while ISA/IEC-62443-3-2 focuses on preventing consequences. A resilient design achieves this: it tolerates cyber-incidents and prevents consequences even when incidents occur.
As in functional safety, the design of the plant and its systems is improved to tolerate failure. Every device, whether mechanical, electrical, or electronic, will fail someday. The same happens in cybersecurity.
Pretending that control systems will never suffer a cyber-incident is absurd and utopian. It will happen. Security alone does not meet the needs of plant safety, and a broader approach needs to be implemented. ISA/IEC-62443-3-2 has it all. Decisions are based on knowledge: long-term decisions that invest wisely and save the customer’s budget.
Security Levels (SL-T) reduce the probability of cyber-incidents by strengthening zones and conduits, while compensating countermeasures prevent consequences, drawing on the expertise of knowledgeable professionals. Every “system-plant” is unique, so the solution should be too. Templates alone do not suffice for an IACS.
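To illustrate the difference between reducing incidents and preventing consequences, here is a LOPA-flavored sketch with hypothetical numbers: a cyber-incident only becomes a consequence if every independent safeguard between the compromised asset and the plant fails on demand.

```python
# LOPA-flavored sketch (hypothetical numbers, not from the standard).
incident_frequency = 0.1  # assumed cyber-incidents per year for a zone

# Probability of failure on demand (PFD) of each independent,
# non-hackable safeguard (e.g., a mechanical relief valve).
safeguard_pfds = [0.1, 0.01]

# A consequence requires the incident AND the failure of every safeguard.
consequence_frequency = incident_frequency
for pfd in safeguard_pfds:
    consequence_frequency *= pfd

print(f"Incident frequency:    {incident_frequency} / year")
print(f"Consequence frequency: {consequence_frequency:.1e} / year")  # ~1e-4
```

The point of the sketch: even if the incident frequency stays the same, well-chosen compensating countermeasures can drive the consequence frequency down by orders of magnitude.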
The missing pieces of the puzzle
When making decisions, it’s important to be aware of common pitfalls that can lead to poor decisions. Here are some common pitfalls to avoid:
- Confirmation Bias: This is the tendency to look for information that confirms our existing beliefs and ignore information that contradicts them. To avoid this, make sure to consider all relevant information and seek out diverse viewpoints.
- Overconfidence Bias: This is the tendency to be overly confident in our own judgment and abilities, leading us to underestimate risks and overestimate our chances of success. To avoid this, seek out feedback from others and consider multiple perspectives.
- Anchoring Bias: This is the tendency to rely too heavily on the first piece of information we receive, even if it’s not relevant or accurate. To avoid this, gather multiple sources of information and consider them all equally.
- Sunk Cost Fallacy: This is the tendency to continue investing in a project or decision because we’ve already invested time, money, or effort in it, even if it no longer makes sense to do so. To avoid this, consider the current and future costs and benefits of a decision, rather than focusing on past investments.
- Groupthink: Groups tend to prioritize harmony and agreement over critical thinking and dissent, which leads to poor decision-making. To avoid this, encourage open and honest communication and welcome dissenting opinions.
- Emotions: This is the tendency to make decisions based on emotions rather than facts and logic. To prevent this, step back and view the situation objectively. Look for data and evidence to guide your decisions.
By being aware of these common pitfalls, you can make more informed and effective decisions.
Don't forget to subscribe to OT Connect Newsletter - The News That Matters.
Take advantage of the "Cybersecurity Awareness Month" exclusive discounts on training before October 31st.
Get Involved & Participate!