Tom agrees with James Taylor about the importance of discipline around business rules, but objects to an interpretation of business rules that goes back to Taylor's namesake Frederick Winslow Taylor.
Tom distinguishes between decision-support (where the decision is made by a collaboration between an automated system and human judgement) and automated decision-making (where there is no space for human judgement). This distinction is not always clear-cut: if the human actants within a complex business process lack the information, intelligence, attention or confidence to overrule the computer's suggested answer, then a decision-support system becomes de facto a decision-making system.
We should also distinguish between simple rules (binary logic) and more complex rules (modal logic, probabilistic logic). As Tom points out, much of the rules industry appears to assume simple composition of simple binary rules, which are inadequate for most interesting business problems (three quarters of his "context space mapping" framework).
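The contrast between binary and probabilistic rules can be made concrete with a minimal sketch. The scenario and names here (`credit_score`, the 700 threshold) are hypothetical illustrations, not from Tom's post: a binary rule composes easily but forces a hard cut-off, while a probabilistic rule returns a degree of belief.

```python
import math

# Hedged, illustrative sketch: a crisp binary rule vs. a probabilistic rule.
# The credit-score scenario and the 700 threshold are invented for illustration.

def binary_rule(credit_score):
    # Simple binary logic: yes/no, easy to compose, but brittle at the boundary.
    return credit_score >= 700

def probabilistic_rule(credit_score):
    # Probabilistic logic: a degree of belief rather than a verdict,
    # here a logistic curve centred on the same threshold.
    return 1 / (1 + math.exp(-(credit_score - 700) / 25))

print(binary_rule(699), binary_rule(700))   # the binary rule flips abruptly
print(round(probabilistic_rule(699), 2))    # the probabilistic rule stays near 0.5
```

The point of the sketch is that two applicants one point apart get opposite answers from the binary rule, while the probabilistic rule correctly reports that the evidence barely distinguishes them.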
Tom makes three important points:
- We should not assume that the business rules are sufficient, invariant, accurate and complete, especially if they are derived from the people who run the existing processes. Therefore the identification and codification of business-rules generally leaves something to be desired. (One way of putting this is that the Real resists symbolization.)
- There needs to be a very strong emphasis on rule-maintenance, otherwise placing all the business-rules into an automated system will lead to a ‘fit and forget’ attitude. (One way of putting this is a demand for double-loop or deutero-learning.)
- The viability of using automation for decision-making is dependent on the context.
In his new book Obliquity, John Kay discusses the example of waiting for a bus. According to the timetable, a bus should come every ten minutes. There are two rules that should help you decide whether to wait for the bus or walk - except that these two rules contradict each other.
Rule One says the longer you wait for the bus, the more likely it is to arrive soon. So if you have waited for nine minutes, it is practically certain to arrive in the next minute.
Rule Two says that the longer you wait for the bus, the more likely it is that Rule One is incorrect. So if you have waited more than nine minutes for the bus, it is starting to look as if the bus will never come at all.
If you have waited more than half an hour for the bus, then common sense suggests that Rule Two is in force. But as Kay points out, many people (including those who drove banks into bankruptcy) appear incapable of shifting from Rule One to Rule Two.
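Kay's two rules can be read as a small Bayesian updating exercise. The model below is my own illustrative sketch, not Kay's: assume that if the service is running, a bus arrives exactly every `cycle` minutes and you arrive at a uniformly random point in the cycle, with a prior probability `prior` that the service is running at all. Rule One is the shrinking conditional wait; Rule Two is the falling posterior that the service is running.

```python
# Hedged sketch of Kay's bus example as Bayesian updating.
# All modelling assumptions (uniform arrival, exact 10-minute cycle,
# the 0.95 prior) are invented for illustration.

def p_still_running(waited, prior=0.95, cycle=10.0):
    """Rule Two: posterior probability the service is running,
    given `waited` minutes with no bus. Falls as you wait."""
    if waited >= cycle:
        return 0.0  # a running service would have produced a bus by now
    likelihood = (cycle - waited) / cycle        # P(no bus yet | running)
    evidence = prior * likelihood + (1 - prior)  # P(no bus yet)
    return prior * likelihood / evidence

def expected_wait_if_running(waited, cycle=10.0):
    """Rule One: conditional on the service running, the expected
    remaining wait shrinks as you wait."""
    return (cycle - waited) / 2

for t in [0, 5, 9, 9.9]:
    print(t, round(p_still_running(t), 3), expected_wait_if_running(t))
```

After nine minutes the conditional expected wait is under a minute (Rule One), yet the posterior probability that the bus is coming at all has fallen from 0.95 to about 0.66, and it collapses as the ten-minute mark passes. Shifting from Rule One to Rule Two is exactly the updating step Kay says many decision-makers fail to perform.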
In real business situations, there is always a balance between following a rule and questioning the rule. It is not just automated systems that may fail to strike this balance; many people with significant business responsibility lack common sense. This is an important aspect of the context for decision-making.
Update September 2020
Prompted by the current interest in RPA, I've been looking over my old blogposts on Business Rules.
I'm not sure I agree with Tom's point that humans are good at modal logic - there are whole schools of psychotherapy devoted to reframing "always" and "never" into "sometimes". But individuals may be better than machines (whether computer machines or bureaucratic machines).
James Taylor's post quotes Jim Sinur recommending a focus on high volatility rules - which would seem to tell against the "fit and forget" attitude Tom warns us about. But jumping forward ten years, we find RPA vendors recommending a focus on using bots for the low volatility rules. Meanwhile "mutant algorithms" try to tackle probabilistic decision-making. The issues raised in his post are clearly relevant to this new technology landscape.