A vulnerability worse than Log4j (and it can blow up facilities and shut down the grid)

Joe Weiss is one of the most renowned ICS cybersecurity experts in the US. At one time, he worked closely with Robert M. Lee, one of the most influential gurus in the industry and CEO of Dragos. They have since gone their separate ways, but both remain worth listening to.

One of Joe’s notable posts from a few weeks ago, “A vulnerability worse than Log4j (and it can blow up facilities and shut down the grid)”, was published on his Unfettered Blog on January 2.

Not wanting to spoil the somewhat clickbait-titled but nonetheless worthwhile content, I will highlight only one aspect of this post.

Joe has long been ringing the alarm bell about the vulnerability of devices at the lowest levels of the Purdue model – Level 1, and especially Level 0. Given this, it is not surprising that he is attempting to formulate precautionary rules – or, as Joe puts it, “laws” – for the cybersecurity of process measurements.

Joe Weiss’s Law 1 on the cybersecurity of control systems:

Process measurement integrity =
Authorization + Authentication + Accuracy.

The integrity of process measurement ensures that:

  • all changes are made only by those who are authorised to do so (Authorization),
  • the signal actually comes from the sensor (Authentication),
  • the sensor measurement accounts for deviations, whether unintentional or malicious (Accuracy).
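As a toy sketch of how Law 1’s three properties might be checked in software: the shared key, authorised-user list, and measurement range below are illustrative assumptions, not from Joe’s post or any standard.

```python
import hashlib
import hmac

SHARED_KEY = b"demo-key"              # hypothetical pre-shared key between sensor and receiver
AUTHORIZED_USERS = {"ops_engineer"}   # hypothetical list of roles allowed to make changes

def authorized(user: str) -> bool:
    """Authorization: only listed roles may make changes."""
    return user in AUTHORIZED_USERS

def authenticated(value: float, tag: str) -> bool:
    """Authentication: the signal must carry a valid MAC proving it came from the sensor."""
    expected = hmac.new(SHARED_KEY, repr(value).encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag)

def accurate(value: float, lo: float = 0.0, hi: float = 150.0) -> bool:
    """Accuracy: the measurement must fall within the physically plausible range."""
    return lo <= value <= hi

def measurement_integrity(user: str, value: float, tag: str) -> bool:
    # Law 1: integrity holds only if all three properties hold.
    return authorized(user) and authenticated(value, tag) and accurate(value)

# A reading signed by the sensor passes; a spoofed tag fails.
good_tag = hmac.new(SHARED_KEY, repr(98.6).encode(), hashlib.sha256).hexdigest()
print(measurement_integrity("ops_engineer", 98.6, good_tag))  # True
print(measurement_integrity("ops_engineer", 98.6, "bogus"))   # False
```

The point of the sketch is the conjunction: lose any one of the three terms and the integrity of the measurement is gone.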

Joe stated that (in the US) neither cybersecurity, reliability, nor safety standards address the integrity of process measurements.

Joe Weiss’s Law 2 on the cybersecurity of control systems:

Garbage in from process sensors = Garbage out from networks

Or, to paraphrase the somewhat loose wording:

Faulty input information from a process sensor
necessarily causes faulty output information from the system.

The “garbage” mentioned in Joe’s text can be:

  • unintentional (e.g. sensor drift, operator error, manufacturing defects, etc.),
  • or malicious (physical or cyber) information.

Joe’s “law” should of course be read with the caveat that, where the system has built-in “self-protection” functions (e.g. plausibility checking), erroneous input information does not necessarily cause erroneous output information.
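Such a “self-protection” function can be illustrated with a hypothetical plausibility filter; the measurement range and the rate-of-change limit below are invented for the example.

```python
RANGE = (0.0, 120.0)   # hypothetical physically possible span of the measurement
MAX_STEP = 5.0         # hypothetical maximum credible change between consecutive samples

def plausible(prev: float, current: float) -> bool:
    """A sample is plausible if it is in range and did not jump impossibly fast."""
    lo, hi = RANGE
    return lo <= current <= hi and abs(current - prev) <= MAX_STEP

def filter_readings(samples: list[float]) -> list[float]:
    """Forward plausible samples; hold the last good value otherwise."""
    out, last_good = [], samples[0]
    for s in samples:
        if plausible(last_good, s):
            last_good = s
        out.append(last_good)   # garbage in no longer means garbage out
    return out

# The implausible 999.0 spike is replaced by the last good value.
print(filter_readings([50.0, 51.0, 999.0, 52.0]))  # [50.0, 51.0, 51.0, 52.0]
```

This is exactly the kind of check that breaks the strict garbage-in/garbage-out chain, though it cannot defeat a spoofed value that stays within plausible bounds.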

Joe Weiss’s Law 3 on the cybersecurity of control systems – the Commonsense Risk Index (CRI) – presented line by line, with the logical relationships highlighted:

IF process measurement integrity is compromised,
AND the sensor(s) can cause
OR contribute to catastrophic failures,
AND the risk is high,
THEN it must be addressed expeditiously.

(E.g. high reliability requirement for measurements and signals from a nuclear power plant circulating pump.)
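The CRI rule is a simple boolean conjunction, which a hypothetical sketch makes explicit; the input judgments are illustrative, not measured values.

```python
def must_address_expeditiously(integrity_compromised: bool,
                               can_cause_catastrophe: bool,
                               can_contribute_to_catastrophe: bool,
                               risk_is_high: bool) -> bool:
    # IF integrity is compromised AND (cause OR contribute) AND risk is high THEN act.
    return (integrity_compromised
            and (can_cause_catastrophe or can_contribute_to_catastrophe)
            and risk_is_high)

# E.g. a circulating-pump measurement with compromised integrity and high risk:
print(must_address_expeditiously(True, True, False, True))    # True
# With integrity intact, the rule does not fire:
print(must_address_expeditiously(False, True, True, True))    # False
```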

Commentaries to Joe’s Laws:

  1. Joe’s “laws” above, as currently worded, refer mostly to process measurements. In my view, however, they can also be extended to process signals (status and fault signals).
  2. Following Joe’s logic, the “laws” can be extended, with some clarification, to process intervening/operating devices as follows.

So I’m going to start “making laws”… 😀

To wit:

Joe Weiss’s Law 1, extended by GePé:

Integrity of process operation/intervention =
Authorization + Authentication + Accuracy

The integrity of process operation/intervention ensures that:

  • all changes are made only by those authorised to do so (Authorization),
  • the operating/intervention command comes from an authorised device (Authentication),
  • the operating/intervening device recognises and handles any abnormal operation/intervention, whether unintentional or malicious (Accuracy).

Joe Weiss’s Law 2, extended by GePé:

Even the right operating command from higher levels can have serious consequences
– material or even human –
if the integrity of the intervening/operating element or device is lost.

Joe Weiss’s Law 3, extended by GePé:

IF the integrity of the intervening/operating device is compromised
AND the intervening/operating device could cause
OR contribute to a catastrophic consequence
AND the risk of this is high,
THEN urgent intervention is required.

The issue is particularly relevant, for example, with the proliferation of digital substations. One of their defining features is the digital design of the process-level elements (signalling, measuring, actuating), which may leave them vulnerable and their integrity open to compromise.

One – if not the most important – cause of the problem Joe consistently ‘pokes’ at may be that the previously prevalent ‘dumb’ devices with fixed programs burned into ROM – including those close to the technology – are increasingly being designed to be programmable, and even equipped with remote access for operator ‘convenience’. These may even be seen by attackers as ‘factory-installed backdoors’.

A solution to the problem could be to bring back a good old practice, i.e.

burning programs into ROM in the affected processor-based devices.

The advantage of this solution is that the device’s program cannot be changed remotely, so the device cannot be forced to malfunction.

However, the undoubted operational disadvantage of this solution is that any modification – i.e. a program change – can only be done locally, by “physical intervention”, and thus requires considerable preparation. But this disadvantage can also be an advantage: given the stakes and the cost, any program modification will certainly be made more carefully.

Finally, let us not think that the above is some new and theoretical problem.

Well, no. In the Stuxnet attack of 2009-2010 – more than 10 years ago! – the attacker was able to change the speed of the uranium centrifuges – and thus cause them to fail – in such a way that the process control system constantly showed the dispatchers a nominal speed. True, in this case the attacker compromised not Level 0 but Level 1 or 2 devices, yet the following still held:

  • Joe’s and GePé’s 1st “law”, since the attack destroyed the integrity of the device controlling the speed of the uranium centrifuges,
  • Joe’s 2nd “law” for a Level 1 (or 2) device, since the Level 1 (or 2) device was “feeding garbage information” to the dispatchers,
  • finally, Joe’s and GePé’s 3rd “law”, since the integrity violation caused the destruction of hundreds of uranium centrifuges.

So keep your eyes peeled for Purdue levels 0, 1 (or 2) (too)!



Messages agreeing or disagreeing with the above are welcome. Responding in a new post, for example, could start a substantive professional exchange of views.

Translated by DeepL