“in some cases, the business benefit of event processing system is actually reducing the number of events, and focus the decision makers on the important events”
An event processing system generally involves some degree of event reduction and aggregation. There may be real-world events that are not represented as system events (for example, a continuous real-world process is reduced to a series of periodic instrument readings, and the intermediate points are not captured). And some events may be captured at the front end but subjected to filtering, so that not all of them are passed on to the business applications.
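Here is a minimal sketch of those two reduction steps, purely for illustration. The names and values (read_instrument, SAMPLE_PERIOD, the threshold) are assumptions of mine, not anything specified in the discussion above.

import random
import time

SAMPLE_PERIOD = 1.0    # seconds between instrument readings (illustrative value)
EVENT_THRESHOLD = 80.0 # only readings above this become business events (illustrative value)

def read_instrument() -> float:
    """Stand-in for a continuous real-world process, reduced to one periodic reading."""
    return random.uniform(60.0, 100.0)

def capture_events(max_readings: int = 10) -> list[float]:
    """Reduce a continuous process to periodic readings, then filter them."""
    business_events = []
    for _ in range(max_readings):
        reading = read_instrument()        # intermediate points between readings are never captured
        if reading > EVENT_THRESHOLD:      # front-end filtering
            business_events.append(reading)  # only these are passed on to the business applications
        time.sleep(SAMPLE_PERIOD)
    return business_events

Everything the business applications can ever see is determined by the two parameters, which is why their values matter so much to the behaviour of the whole system.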
In both cases, the total behaviour of the enterprise system may be affected by the quantity and variety of events that get through to the business applications. Too many events, and the decision-makers may be unable to focus on the important ones. Too few events, and the enterprise may lack requisite variety. A good cybernetic systems design would include feedback loops, so that the quantity and variety of events can be adjusted until the enterprise works at maximum efficiency and effectiveness.
So an event processing architecture needs some kind of variety regulator (“regulator” in the engineering sense, not the legal compliance sense): some kind of monitoring and control that can, for example, change the frequency of a measurement, or adjust the filtering or aggregation rules.
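One simple way such a regulator might work is to monitor how many events reached the decision-makers in the last period and nudge the filtering rule up or down to keep that count within a target band. The sketch below assumes the filter from the earlier example; the class name, parameters and adjustment factors are all hypothetical.

class VarietyRegulator:
    """Feedback-loop sketch: keep the event volume within a target band
    by adjusting the filter threshold (all parameters are illustrative)."""

    def __init__(self, threshold: float, target_low: int, target_high: int):
        self.threshold = threshold
        self.target_low = target_low
        self.target_high = target_high

    def adjust(self, events_passed_last_period: int) -> float:
        """Compare the observed event count against the target band
        and nudge the filtering rule accordingly."""
        if events_passed_last_period > self.target_high:
            self.threshold *= 1.1   # too many events: filter more aggressively
        elif events_passed_last_period < self.target_low:
            self.threshold *= 0.9   # too few events: let more variety through
        return self.threshold

# Example: 42 events got through last period, above the target band of 5-20,
# so the regulator raises the threshold used by the front-end filter.
regulator = VarietyRegulator(threshold=80.0, target_low=5, target_high=20)
new_threshold = regulator.adjust(events_passed_last_period=42)

The same pattern could be applied to the measurement frequency instead of (or as well as) the filtering threshold; the point is that the reduction rules are under feedback control rather than fixed at design time.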
In very simple event processing, the system designers might be clever enough to calculate the right measurement frequencies and event processing rules in advance, although I don’t know of any robust methodology for doing this. For complex event processing, the mathematics starts to get too difficult, and we need to rely on proper engineering rather than simple arithmetic.