
Thursday, December 08, 2011

Enterprise Architecture Tempo

#entarch Like many complex capabilities, enterprise architecture operates at several different tempi. (See my post on Enterprise Tempo.) A workshop at the SEI last year identified three tempi, which it called "Operating Modes".
  1. At the lowest level, enterprise architects operate in an urgent response mode, reacting to crises as they arise.
  2. Next, enterprise architects may operate in a continuous improvement mode, making incremental changes and generally avoiding crises.
  3. Finally, enterprise architects may operate in a transformative change mode, collaborating with business leaders to enable new business capabilities and new business models.

The SEI observes that, in practice, enterprise architects are nearly always operating in all three modes, but suggests that the most effective organizations will spend less effort in the urgent response and more in transformative change.

Most of the process frameworks for enterprise architecture concentrate on the longer-term operating modes - sometimes called tactical and strategic planning. Some frameworks distinguish between solution architecture (with a typical cycle time of 6-18 months) and enterprise architecture (with a typical cycle time of 3-7 years).

But surely it is the less effective organizations, where more effort is devoted to urgent response, that are in greater need of process guidance? As I said in my post on Enterprise Architecture as Viable System, the dilemma for EA is how to appear useful in a crisis without (a) throwing all the EA professional discipline out of the window or (b) demanding at least four months to carry out a proper study. So what good is a process framework that simply tells such organizations that they really ought to put the urgent stuff aside and concentrate on long-term transformation? Or a process framework that is incapable of Learning from Stories?


John Klein and Michael Gagliardi, A Workshop on Analysis and Evaluation of Enterprise Architectures (pdf) SEI, November 2010.

Elsewhere the term "operating modes" has been used for the EA archetypes identified by John Gøtze and his colleagues, but this is completely different. See my post on the Dimensions of EA maturity.

Wednesday, June 22, 2011

Fast data - speed over precision?

@bmichelson posts a piece on an HP website called Fast Data: Speed over precision for better decision-making (21 June 2011), summarizing a recent interview in MIT Sloan with Ali Riaz and Sid Probstein of software company Attivio https://sloanreview.mit.edu/article/why-companies-have-to-trade-perfect-data-for-fast-info/.


Riaz and Probstein are not claiming that better decision-making is just about speed: it also requires a continuous refinement loop, including rethinking one's premises. In other words, we need multiple feedback loops at different speeds, to handle different levels of complexity. As I pointed out in my piece on TIBCO's slogan Two Second Advantage, although simple decisions can be taken quickly, complex decisions need what I call "time for understanding".

One of the architectural challenges of organizational intelligence is to coordinate a complex array of sense-making and decision-making processes operating at different characteristic tempi, and to maintain a proper balance between the very fast (reactive) and the comparatively slow (reflective). Probstein identifies Amazon.com as a successful exemplar.
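To make the idea of multiple loops at different tempi a little more concrete, here is a toy sketch (all names and numbers invented for illustration): a fast reactive loop handles each event against a simple rule, while a slow reflective loop periodically revisits the accumulated observations and revises that rule.

```python
from collections import deque

class OrgIntelligenceLoop:
    """Toy sketch of two coupled loops at different tempi: a fast
    reactive loop applies a simple rule per event, while a slow
    reflective loop periodically rethinks the rule itself."""

    def __init__(self, reflect_every=100):
        self.observations = deque(maxlen=1000)
        self.threshold = 10          # rule used by the fast loop
        self.reflect_every = reflect_every
        self.seen = 0

    def react(self, reading):
        """Fast path: apply the current rule to a single event."""
        self.observations.append(reading)
        self.seen += 1
        alert = reading > self.threshold
        if self.seen % self.reflect_every == 0:
            self.reflect()
        return alert

    def reflect(self):
        """Slow path: rethink the premise - here, recalibrate the
        threshold against everything observed so far."""
        if self.observations:
            mean = sum(self.observations) / len(self.observations)
            self.threshold = mean * 1.5
```

The point of the sketch is that neither loop alone is enough: the fast loop without the slow one keeps applying a stale rule, while the slow loop without the fast one cannot respond in time.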

It is common for technologists to make a fetish of speed, but business effectiveness and organizational improvement need "time for understanding" as well. Riaz and Probstein appear to understand this, and I look forward to seeing how this understanding is supported by their software.

Friday, June 04, 2010

Ghetto Wifi 2

Following my post on Ghetto Wifi, I passed a pub yesterday with a sign "Free wifi for customers only".

Thought of another perspective on this problem. How long do you remain a customer after you have bought a drink? One hour? One day? Is there some kind of dwindling right to use the facilities as time passes? If you bought a coffee within the last half hour then you are a full-status customer with full rights; if you haven't bought a coffee in the last two hours, then maybe your customer rights are weaker.


Let's look at some more examples. If you have lunch in a restaurant, and then do some shopping, can you then go back to the restaurant to use the toilet before driving home? (If you left a reasonable tip, then maybe the aura of "customerhood" still lingers.) If you have lunch in a pub, can you leave your car in the "customers only" car park for the rest of the day? Some establishments have strict limits - for example, some supermarkets have free parking for two hours, and then start clamping or ticketing people who stay longer.

Obviously there will always be some people who try to cheat - to use the facilities without buying anything at all. The point here isn't whether this is possible, or the extent to which the establishment tries to make you feel welcome (in order to convert you into a future customer) or unwelcome (to reserve its facilities for genuine customers), but what moral rights you have in the first place.

Some people might think that "customers only" goes without saying - that you shouldn't need a notice - but presumably there are some people whose behaviour will be influenced by a reminder that these facilities are intended for genuine customers.

So it comes down to a question of what counts as a genuine customer, and for how long.

Friday, May 28, 2010

Two Second Advantage

There are two reasons I like TIBCO's new slogan "The Two-Second Advantage". (I have some reservations as well, but let's start with the good bits.)

advantage means something to business

The first reason is that it actually means something to the business, unlike the widely misunderstood technical term “real-time”. One company that is currently boasting “real-time” performance is SAP, which claims that its in-memory databases will result in analytics that are “really in real-time”. Larry Dignan quotes Hasso Plattner as follows: “In analytics, there’s theoretically no limitation on what you can analyze and at what level of detail. (In-memory databases) mean reports on a daily basis, hourly basis.” [ZDNet] @davidsprott calls this In-memory madness. Even if an hourly or daily report counted as “real-time” (which it doesn't), this kind of technical wizardry doesn't make any sense to the business.

On a LinkedIn discussion thread recently, I've seen vendors excusing their misuse of the term "real-time" (to describe software that isn't strictly, or sometimes even remotely, real-time) by claiming that the meanings of technical terms evolve over time. Oh yeah, very convenient. But we shouldn't allow vendors to fudge perfectly good technical terms for their own marketing purposes, any more than we should tolerate car manufacturers self-interestedly redefining the word "friction".

(In The 2 second advantage...the 2 culture disadvantage? Vinnie Mirchandani praises TIBCO executives for being able to talk business, and contrasts them with the IT analysts in the expo hall, with a special dig at Gartner.)

Unfortunately, TIBCO is not content with the "two second advantage" slogan, and we find TIBCO CEO Vivek Ranadivé over-egging the pudding by introducing some additional notions, including Enterprise 3.0. See the transcript of the HCL keynote by TIBCO CEO Vivek Ranadivé (April 2010). For @neilwd, "Enterprise 3.0 ... is the sound of a company trying too hard" (TIBCO, Enterprise 3.0 and the two-second advantage, May 2010). See also Tibco’s Hits and Misses (Ovum, May 2010).

advantage is relative

The second reason I like the phrase "Two Second Advantage" is that it focuses our attention on the business advantage - not of raw speed but of getting there first. If you are a speculator who judges that some asset is overvalued or undervalued, the way to make money from this judgement is to buy or sell and then wait for other speculators to arrive at the same judgement. Being just ahead of the pack is actually more profitable than being a long way ahead, because you don't have to wait so long.
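A back-of-the-envelope sketch (numbers invented for illustration) shows why being just ahead of the pack beats being a long way ahead: the profit from anticipating a correction is fixed, but your capital is tied up until the rest of the market catches up, so the rate of return falls as the lead time grows.

```python
def annualized_return(edge_pct, lead_time_days):
    """Toy illustration: a fixed edge (edge_pct) earned over a shorter
    lead time yields a much higher annualized rate of return."""
    return edge_pct * (365 / lead_time_days)

# Anticipating a 2% correction two days ahead of the pack
# beats anticipating the same correction sixty days ahead:
fast = annualized_return(2.0, 2)    # 365.0% annualized
slow = annualized_return(2.0, 60)   # roughly 12% annualized
```

This ignores risk, transaction costs and the chance that the pack never arrives at the same judgement - but it captures the basic asymmetry.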

And although simple decisions can be taken quickly, complex decisions need time for understanding. (See my presentation on Mastering Time.) With complex decision-making, it's about spending just enough time to process just enough information to make a good enough judgement.

Ranadivé also talks about two trends - the increasing volume of data and the diminishing half-life. (In physics, the concept of "half-life" suggests a long tail of residual value - just as a radioactive sample will always remain somewhat radioactive, so the value of data never reaches zero.)

But as @neilwd points out, these trends don't necessarily refer to the same kinds of data, especially if we measure data volumes in terms of physical storage, since these volumes are dominated by email attachments and rich media. Maybe we need to find a way of measuring data volumes in terms of information content ("a difference that makes a difference"): as the cost of data transmission and storage continues to get smaller, it is not the number of megabytes but the number of separate items (giving managers the experience of being overloaded with information) that really matters.

Even if we limit ourselves to traditional data, the relationship between data volumes and response speed is not as simple as all that. Let's look at a specific example.

If a retail store gives a hand-held scanning device to the customer and/or places electronic tags on all the goods, it can collect a much higher volume of data about the customer's behaviour - not merely the items that the customer takes to the check-out, but also the items that the customer returns to the shelf. As technology becomes cheaper, this enables a huge increase in the volume and granularity of the available data. And because the data is collected while the customer is still shopping, the retailer actually has more time to use it before the customer leaves the store.

For example, you might infer from the customer's browsing behaviour that she is looking for her favourite brand of pasta sauce. The shelf is empty, but there's a new box just being unloaded from a truck at the back of the store. Find a way of getting a jar to the customer before she reaches the checkout, and there's your two second advantage.
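A minimal sketch of that scenario (item names, stock figures and the event handler are all hypothetical): a shelf-scan event arrives while the customer is still shopping, so an empty-shelf lookup can trigger a restock alert before she reaches the checkout.

```python
# Hypothetical shelf and back-room stock levels, for illustration only.
STOCK = {"pasta_sauce_brand_x": 0, "pasta_sauce_brand_y": 12}  # on the shelf
BACKROOM = {"pasta_sauce_brand_x": 24}                          # just unloaded

def on_shelf_scan(customer_id, item):
    """React to a single scan event from the customer's hand-held device."""
    if STOCK.get(item, 0) > 0:
        return None  # item is on the shelf, nothing to do
    if BACKROOM.get(item, 0) > 0:
        # Empty shelf but stock in the back room: alert staff while
        # the customer is still in the store.
        return f"restock {item} for customer {customer_id}"
    return f"{item} unavailable"
```

The interesting design point is that the value of the event decays quickly: the alert is only worth raising if it can be acted on before the customer leaves.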

Monday, September 15, 2008

SOA Example - Real-Time Regulation

In a couple of recent posts on Turbulent Markets, I asked whether real-time profit and loss could tame turbulent markets (answer: No), and asked whether some other form of real-time event-driven system could perform a regulatory function (answer: Possibly).

Bloggers from some of the CEP vendors have been making similar suggestions for a while.

Back in March 2008, there was a flurry of interest in the strange fate of Eliot Spitzer, who was apparently exposed by the very regulatory technologies he himself had advocated.
Jesper Joergensen of BEA (BEA is now part of Oracle, Jesper has since joined Salesforce, and his BEA blog has disappeared) took the opportunity to put in a plug for BEA's event processing products. "If anyone working in a bank's anti-money laundering, compliance or fraud detection unit is reading this", he writes, "go check out [my company's products]. This is the technology you need to automate these compliance requirements."

There is obviously a need for systems to trap people like Eliot Spitzer. But I'm not convinced that simple compliance systems need to be real-time service-oriented event-driven systems. See my post on Real-Time Fraud Detection.

But a much stronger case can be made for real-time risk management. Chris Martins (Progress Apama) put the case for CEP and real-time risk in March 2008. More recently, Jeff Wotton (Aleri) has put the case for real-time risk consolidation, drawing on an interview with Nick Leeson. Meanwhile, Progress Actional has a product page on Real-Time Risk Profiling for Banks.

The point about real-time risk aggregation is that you need to produce a rapid and reliable picture of total risk, drawing on data from many heterogeneous sources. In a typical business environment, new types of risk and sources of data are constantly being added, and you want to be able to plug these into your risk consolidation straightaway. In this kind of scenario, it should be very easy to justify SOA.
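The plug-in argument can be sketched in a few lines (the interface and the source classes are invented for illustration, not any vendor's API): each heterogeneous source exposes the same narrow contract, so the consolidation logic never changes when a new risk type is added.

```python
from typing import Protocol

class RiskSource(Protocol):
    """The narrow contract every source must expose."""
    def exposure(self) -> float: ...

class TradingDesk:
    def __init__(self, positions):
        self.positions = positions
    def exposure(self):
        return sum(self.positions)

class LoanBook:
    def __init__(self, outstanding, default_rate):
        self.outstanding = outstanding
        self.default_rate = default_rate
    def exposure(self):
        return self.outstanding * self.default_rate

def total_risk(sources):
    """Consolidate exposure across all registered sources."""
    return sum(s.exposure() for s in sources)

sources = [TradingDesk([1.5, -0.5]), LoanBook(100.0, 0.02)]
# A new source of risk plugs in by appending another RiskSource -
# the aggregator itself is untouched.
```

That is the SOA justification in miniature: the contract is stable, the set of providers is open.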

Tuesday, January 22, 2008

Real-Time Events

Opher reports a car accident (unplanned events again) and concludes that people need to process events in real-time and not in batch.

Congratulations to Opher on his fast reactions, and commiserations on the slower reactions of the driver behind. Clearly there are some events you have to process in real time. But I hope he is not implying that all events must be processed in real-time. When the low-fuel indicator comes on, do you refuel immediately, or do you wait until you reach the next gas station? How often do you have the vehicle serviced?

I think the critical question for systems designers here is to determine which are the events that call for a real-time response, and which are the events where a batch response is more appropriate.

There are also events that call for a low-latency (extremely fast) response, but don't count as real-time. For example, in the financial markets, there may be short-lived arbitrage opportunities, which means you can make a lot of money if you can react within a few milliseconds. This is not a real-time requirement, because nobody expects you to pick up every single opportunity - just catch a reasonable number of them. (The reactions of the frog should allow it to catch just enough insects to fill its belly - but some insects escape to breed more insects. The insects get faster, and so do the reactions of the frogs.)
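The three categories can be sketched as a simple router (event names and response labels are illustrative, echoing the examples above): the designer's job is to classify each event type by the response tempo it genuinely requires.

```python
# Illustrative classification of event types by required response tempo.
HARD_REAL_TIME = {"collision_warning"}      # must never be missed
LOW_LATENCY    = {"arbitrage_opportunity"}  # catch enough of them, fast
BATCH          = {"fuel_low", "service_due"}

def route(event_type):
    """Decide how an incoming event should be handled."""
    if event_type in HARD_REAL_TIME:
        return "handle immediately, guaranteed deadline"
    if event_type in LOW_LATENCY:
        return "handle fast, best effort"
    if event_type in BATCH:
        return "queue for scheduled processing"
    return "log and review"
```

The classification itself is a design decision, not an infrastructure one - which is why the infrastructure needs to support all of the patterns.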

Surely the event infrastructure should be capable of handling any of these patterns.