Sunday, August 31, 2008

The Death of EBO?

General Mattis's guidance that USJFCOM cease the use of Effects Based Operations (EBO) concepts does not bode well for its future. Here are a few extracts I found interesting:

  • "The joint force must act in uncertainty and thrive in chaos, sensing opportunity therein and not retreating into a need for more information."

  • "EBO ... assumes an unachievable level of predictability"

  • "... any planning construct that mechanistically attempts to provide certainty and predictability in an inherently uncertain environment is fundamentally at odds with the nature of war."

  • "The use of 'effects' has confused what previously was a well-designed and straightforward process for determining 'ends.'"

  • "The underlying principles associated with EBO, ONA [Operational Net Assessment], and SoSA [System of Systems Analysis] are fundamentally flawed and must be removed from our lexicon, training, and operations. EBO thinking, as the Israelis found, is an intellectual 'Maginot Line' around which the enemy can maneuver. Effective immediately, USJFCOM will no longer use, sponsor, or export the terms and concepts related to EBO, ONA, and SoSA in our training, doctrine development, and support of JPME."

My limited understanding of EBO was acquired incidentally when I became interested in Network Centric Warfare a few years back. So, I don't know enough about the details of EBO to have strong opinions about it.

However, General Mattis's statement did prompt a few snap reactions:

  • EBO, like NCW, seems to have come to prominence in part because of the emergence of hyperconnectivity in communications and information technology. This revolution prompted visions of coordinating effects with unprecedented precision across a battlespace. As I've said before, I think the basic concepts of NCW (as depicted in Figures 5 and 11 of "The Implementation of Network Centric Warfare") are exactly right in their emphasis on the social and cognitive aspects of turning information into actions. However, I've also noted that the U.S. implementation of these concepts seems to have overemphasized information sharing, to the detriment of the social and cognitive domains. General Mattis's critique of EBO recognizes that robust information may actually degrade decision making if the cognitive and social domains are not also considered.

  • There seems to be a widespread feeling among John Boyd's followers that the intellectual leaders of NCW have promoted concepts that are flawed understandings of Boyd's ideas (e.g., the Observe, Orient, Decide, Act (OODA) loop). I may have misunderstood these critiques, or it may be that the NCW theorists were on a track parallel to Boyd (a common situation when a fundamental new idea is on the horizon). Regardless, I think it's true that NCW does not emphasize "staying inside the opponent's decision making loop" the way OODA does. General Mattis's critique seems to imply that EBO reflects this deficiency.

  • Users of a process must own it. According to Mattis, EBO is "staff, not command, led." All of us who live in large organizations understand that creating robust and mature processes is relatively easy (though expensive). The hard part is getting those processes into the heads of those who use them, and to do so in such a way that robust coherent decisions and actions are established and maintained.

  • There seems to have been a fundamental misunderstanding of the nature of war. This is why I think Cynefin is such a useful framework. It helps analytically oriented individuals and organizations understand the limitations of analysis and the need for agile probing in Complex contexts (which is what most battlespaces are).

  • Trying to attack a Complex problem using analysis, complicated organizations, and thick processes is a recipe for failure. What's needed are relatively simple organizations and processes that (a) provide "just enough" structure to maintain coherent action and (b) are able to run OODA loops fast enough to maintain relevance. Again, see Cynefin.

  • Finally, I can't help wondering if there's a bit of "emergence magic" mixed into EBO. As with NCW, I may be misunderstanding what I've seen, but there's an "information for free" mentality that seems to occasionally pop up in discussions of pervasive hyperconnectivity. As someone who has been entranced by a number of books describing complex adaptive systems and emergence, I understand the temptation to think that mixing the right ingredients with the right incantation might result in the emergence of an unexpected synergy. However, our ability to design emergent behavior to achieve a specific goal seems to be very limited at this point.

Anyway, if you're the least bit interested in asymmetric and irregular warfare, General Mattis's article is must reading.

Saturday, August 30, 2008

The Decontextualization of IT

Seems like every few weeks there's new swirling about SOA vs. WOA/REST/ROA.

I doubt I can add anything new to the discussion, but it occurs to me that a part of what we're seeing may be the gradual and ongoing decontextualization of IT. Here's the progression:
  1. Mainframe era - an entire enterprise of related contexts is modeled on a single box. User interaction is limited to block-mode terminals & printouts. My first job after leaving college was programming on such a box.
  2. Microcomputer era - similar to the mainframe, but for smaller organizations
  3. PC era - a small set of a specific user's or group's contexts is modeled. New tools allow users to do their own modeling (Excel, Access). IT spends a decade or more figuring out how to govern the resulting dis-integration.
  4. Web 1.0 - an entire enterprise of contexts is exposed via a standard presentation layer (i.e., a browser); in some ways this looks like the return of the mainframe, albeit a distributed one.
  5. Web 2.0 - coarse-grained business-oriented subsets of specific contexts are exposed. Although the integration of those services remains largely an IT job, the services carry much less context with them than the apps/systems they replace. As a result, the SOA style is promoted as enabling increased business agility and innovation.
  6. Resource-oriented web - decontextualized resources are exposed. IT builds apps, services, workflows, etc. that contextualize resources. Users string resources together to contextualize them "at the speed of need." (e.g., Ubiquity).
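The "stringing together" idea in step 6 can be sketched in a few lines. This is a toy illustration, not any particular product's API: the resource URIs, field names, and data below are invented. The point is that each resource is exposed through a uniform, context-free accessor, and the *user* supplies the context by composing resources at the moment of need.

```python
# Hypothetical in-memory "web" of decontextualized resources.
# URIs, fields, and values are invented for illustration.
RESOURCES = {
    "/customers/42": {"name": "Acme Corp", "region": "us-west"},
    "/weather/us-west": {"forecast": "rain"},
}

def get(uri):
    """Uniform, context-free accessor: every resource is fetched the same way."""
    return RESOURCES[uri]

# The user contextualizes "at the speed of need" by stringing resources together:
customer = get("/customers/42")
weather = get("/weather/" + customer["region"])
briefing = f"{customer['name']}: expect {weather['forecast']} in {customer['region']}"
```

Neither resource knows about the other; the "briefing" context exists only in the user's composition, which is what distinguishes this from a pre-integrated SOA service.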

If you emphasize the contextual aspect of knowledge (as I do), you may find this trend slightly puzzling. Ever since the birth of computing & IT, there's been a focus on increasing the amount of context that's automated. There was a concerted effort, now seen as largely failed, to create what became known as "strong AI." And, recently, there's been lots of speculation about a Web 3.0 or Semantic Web.

Even Dion Hinchcliffe, in an interesting discussion about how the Web is increasingly about user-driven contextualization of resources, has a diagram that speculates that the next step may be Semantic.

He may be right...but, if so, it will be a dramatic swing back toward contextualization. Since it's unclear how resources/services can become much more decontextualized, maybe it's time to move back a little. If so, a key issue will be how to do so while maintaining the adaptability that comes with decontextualization.

Regardless, increasingly decontextualized IT, along with standards/provisioning/tools that make it easy to recontextualize "at the speed of need", would seem to be exactly what's needed to support more adaptable and agile sensemaking and decision making.

If you're unaware of the SOA, WOA/ROA/REST, etc. swirling, here are a few recent items: Burton Group blogger, here, here, and here. This shows no sign of clearing up any time soon...which, given how fundamental it is, may be a healthy sign.

Thursday, August 28, 2008

Drawing Distinctions

Two recent posts got me to thinking about the drawing of distinctions.

Dave Snowden discussed various types of stories, and referenced a Patrick Lambe discussion of the dangers of simple, closed typologies. If you're unfamiliar with this topic, both are well worth reading.

Anyway, my reaction was that there are two extremes that folks tend to react against. At one end is the logical positivist dream of "one taxonomy to rule them all." At the other end is the postmodern "taxonomies are a tool used by the dominant culture to oppress those who are not of it." In this latter perspective, the only appropriate action is to deconstruct the taxonomy to uncover the underlying assumptions and power structures.

Both extremes highlight real concerns. Drawing distinctions is an inescapable activity in creating meaning and taking action. However, the distinctions drawn do reflect historical and environmental considerations that are often invisible to those using them. And, it seems likely that the creation and use of distinctions forms a sort of "strange loop" that is inextricably entwined with identity formation.

Practically speaking, Known/Knowable (Cynefin) domains tend to be Exploitation/Execution-oriented, and generally have relatively stable taxonomies/typologies that are relatively invariant across those who use them (though an expert's will be much richer than a novice's). If you want to push a Known/Knowable context into the Complex domain, deconstruct its taxonomies.

Complex domains, on the other hand, are likely to take a bite out of any taxonomy you bring to the table. This is one of the challenges of probing these domains...you're constantly shuffling a morphing deck of distinctions to keep the useful ones in play...not the sort of thing taught in the average engineering or business curriculum.

Teaching Programming To Accountants

When I was in college in the late 70's, I had the opportunity to attend the national meeting of DEC minicomputer users. At lunch one day I sat at the same table as the president of a small accounting software vendor. When I asked him about what kind of background he looked for in a developer, he said "I hire accountants; it's easier to teach programming to accountants than to teach accounting to programmers."

I was reminded of that while reading a Forrester report ("Complex Event Processing in a Quant World"). In it, Charles Brett interviews Robert Almgren, a pioneer in using Complex Event Processing to build algorithmic trading strategies. Regarding the skills he looked for, Almgren says "I wanted quants — not IT or technology people. This was because creating the algorithms determines the event processing. If you do not know what you want to do with the events available, no amount of CEP will help."

It seems like IT is splitting into two pieces: commoditized & standardized (but complex) data processing, and domain-specific business-focused capabilities that are becoming deeply woven into the non-IT fabric of the business.

It's not clear to me that IT-trained personnel will ultimately perform much of the work in the latter category...at least where the related business knowledge is largely Knowable (the domain of experts) or Complex (the domain of pattern management). Then again, maybe it's always been that way...

Saturday, August 16, 2008

Dis-Integrating E-Mail

As asynchronous electronic connectivity became commoditized in the 80's and 90's, two types of tools became pervasive: e-mail for one-to-one (1:1) or one-to-few (1:F) communications, and bulletin boards (then Internet forums) for many-to-many (M:M) communications. Since I'm mostly interested in group sensemaking, I'm going to ignore 1:M communications for this post.

Enterprise 2.0 has begun to explore how social media can be used on a scale that is much smaller and more formal than the Internet. That exploration is mostly of 1:1 & 1:F capabilities, since most group conversations inside an enterprise are on that scale.

Given that e-mail is basically snail mail in an electronic form, it would seem that e-mail might disintegrate into multiple capabilities that are crafted for the various types of small group activities.

Matt Moore (engineerswithoutfears.blogspot.com) has an interesting post & SlideShare of how he sees e-mail disintegrating...check it out.

Monday, August 11, 2008

Mashup Wiki Apps

I suppose that sounds like a bad translation from some foreign language, but it seems to be what MindTouch Deki enables. In this blog post, Dennis Howlett discusses the latest release.

I heard an interview over the weekend with the inventor of the wiki and thought it was interesting that he invented it to replace e-mail-based collaboration on a project. What little I know about wikis indicates that, within the enterprise, they still work best in the context of a group that needs to collaborate to complete a task.

I suspect it has to do with the fact that individual identity within an enterprise is usually tightly coupled to specific goals and tasks. As a result, most individuals are a bit at sea when asked to start or contribute to a wiki outside the context of some specific goal/task.

I'm not sure anyone can keep track of all the innovation going on in the social media arena, and I don't even try. So, there may be a number of products that look like MindTouch Deki. Regardless, it's an intriguing concept: combine the artifact-oriented wiki with the ability to mashup application inputs/outputs and you get something like a collaborative workflow exploration tool. Something about this sounds catalytic...

Saturday, August 9, 2008

Will IT Be Run by Social Scientists?

As a big fan of Karl Weick & Gary Klein, I'm sympathetic to Gartner researcher Tom Austin's answer of "yes" to this question in this interview.

However, I'd probably answer "no, but." I do think that systems engineers and architects are going to have to become much more literate about social and organizational behavior, and about how information technologies affect individual and group sensemaking and decision making.

And, managers at all levels are also probably going to traverse a similar learning curve.

IT might even have a few social scientists on staff who are techno-literate. But, I don't think you'll have to have a degree in the field to effectively use the required social science knowledge.

Conversation Types

This blog entry by Jennifer Leggio references a "Conversation Prism" as a tool in enterprise social media strategy formulation.

It's an interesting taxonomy, and it raises a question: when will the rate of change in the taxonomy/structure of social media begin to flatten out? I certainly don't have a clue, but I do have a couple of snap reactions:
  • The variety of social media tools will probably continue to increase for 5-10 years
  • So far there are very few of what I'd call "user governance" tools for managing your social net. I can't imagine social media being considered mature without mature governance capabilities across a range of contexts (e.g., individual, internal, external, ad hoc, group, enterprise, etc.).
  • Given that this ultimately will result in a much more complex mixture of Exploration and Exploitation across multiple organizations, along with the likelihood of more fluid individual and group identities, I lean toward 30+ years before this begins to shake out.

Friday, August 8, 2008

Fitting Organizations

I suppose we all struggle some with separating ends from means. But it is still frustrating to see individuals and groups pursue tools like process and organizational structures as an end in themselves.

To a certain degree, focusing on a tool is useful. Human-intensive tools are always evolving, and the environment in which a tool is being applied also evolves. So, there's a certain amount of tool focus that's required to engage in the ongoing task of molding the tool to the context.

However, it seems that we're more likely to shift our focus from a goal to a tool than vice-versa. And, finding a good balance between ends and means is often difficult.

This task is considered in an article by David Tucker (from Homeland Security Affairs). It's an interesting discussion of fitting organizational structure to the context of fighting terrorism.

Hyperconnectivity in the Enterprise

Posts by Andrew McAfee about micro-blogging and messaging in the enterprise ignited a small firestorm about whether a company should constrain how these tools are used.

Seems like this is classic Complex (Cynefin) territory, so boundaries and attractors make lots of sense.

Furthermore, enterprises have goals and engage in a mixture of Exploration and Exploitation activities to reach those goals, so integrating exploratory tools like micro-blogging and messaging into ongoing Exploitation activities makes a lot of sense (e.g., many web retailers now have chat integrated into their sales workflow).

Regardless, a related item may be more interesting in the long run. Mozilla has released a Firefox plugin called Snowl that aggregates messaging and micro-blogging. While Flock is more established, Snowl's focus on aggregating messages raises some interesting questions. The one I find most intriguing is how this tool may evolve to allow users to manage hundreds to thousands of message sources (and tap into the associated social network). I don't have an answer, but this may be the cutting edge of hyperconnectivity in the enterprise.

Tuesday, August 5, 2008

Complex Organizations

There's been a lot written about social media and modern organizations, but publicly available research in this fast-changing area is relatively scarce. Which makes this HBS Working Paper interesting.

The fact that Michael Tushman is one of the authors makes it interesting to me...he's one of the main popularizers of the Exploitation-Exploration contrast.

The paper focuses on coordination across the organization, and offers some interesting observations about the single large company studied. Here's one that caught my eye: "...the category spanners in the firm are women concentrated in the upper-middle management ranks and in a few functions, most notably sales, marketing, and general executive management."

Focus and Resource Scarcity

We tend to focus intently on key resources that are relatively scarce.

This truism seems so pervasive as to be trite...so why mention it? Where resource supply and demand change relatively slowly (or oscillate in a bounded range) I suspect there's no compelling reason to think much about this topic.

However, the world of IT continues to change quickly, and Moore's Law remains in force. One implication of this is being discussed by Joe McKendrick in his coverage of the emergence of a Service Science curriculum.

How do engineering and IT change when processing, communications, and storage become commoditized (i.e., cheap, pervasive, and standardized/interoperable)? Since the invention of the telegraph, an enormous amount of effort has been focused on dealing with resource scarcity in these areas. As that focus is freed up, it's beginning to shift to the business application of hyperconnected and commoditized IT.

It's a very different kind of problem demanding a different set of skills...and, IBM is recognizing this.

Finally, Dion Hinchcliffe has a nice summary of one current aspect of this transition...cloud computing.

Twitter in the Enterprise

SAP is rolling out a Twitter-like capability for the enterprise, discussed here by Oliver Marks.

I suspect no one really knows whether/where this might work. Seems like it might be useful for individuals and groups that are in a sensemaking/orientation role, especially if constant conversation is needed (e.g., marketing).

Like all social media, using metrics to track usage and infer value may be difficult since gaming and unintended consequences may become pervasive (i.e., just knowing something is being measured can turn it into a goal).

Andrew McAfee also has a couple of posts on the topic (here and here).

Sunday, August 3, 2008

Knowledge Binding

Few (if any) contexts are pure Known/Knowable. If a human is involved, there's at least a few Complex threads. Which means that there's usually a trade space involving when to bind Knowledge to Context.

I was reminded of this late last year when I heard an NPR piece on reducing catheter-related infections in the ICU. The traditional approach was to create a more sophisticated (and expensive) technology...that seemed to be more "idiot-proof." The non-traditional approach was to create a process and roles that made the existing technology less risky.

Here are a few observations:

Technology solution - antibiotic-coated catheter
  • Complicated point technology
  • Intended to reduce risk in a range of contexts
  • Expensive; one-size-fits-all "silver bullet"
  • Knowledge is statically bound to all potential contexts at the time the technology is designed and created
  • Unanticipated risks are not mitigated
  • Infection rate remains unacceptably high

Process/roles solution - checklist to control infection sources, non-traditional roles/responsibilities to increase organizational reliability

  • Simple process
  • Intended to ensure the Context is low-risk
  • Cheap; humans ensure fit between context and technology
  • Knowledge is dynamically bound to a specific context at the time of need
  • Unanticipated risks are addressed when the catheter is inserted
  • Infection rate drops to near-zero

As technology becomes more sophisticated, agile, and adaptable, the temptation to create sophisticated "silver bullets" increases. And, it gets easier for a designer to be seduced by the illusion that smarter technology can move the Complex into the Known/Knowable.

When you add in the fact that financial incentives tend to be biased toward Technology (i.e., a product or system that's easily monetized) and away from Process/People/Organization, it's not surprising that the doctor in the NPR story had a difficult time getting hospitals to adopt his very successful solution.

Bottom line: Complex threads require late knowledge binding (driven by Context)...which often means humans in/on the loop. And, short-term financial incentives often point away from the most effective solution.
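The static-vs-late binding contrast above can be sketched in code. This is a toy illustration using the software sense of "binding" as an analogy; the function names, checklist items, and numbers are invented, not taken from the NPR story.

```python
# Static binding: the mitigation is fixed at design time, applied the same
# way in every context, whether or not it matches the risks actually present.
def coated_catheter(base_risk):
    # One-size-fits-all reduction, baked in when the product was designed.
    return base_risk * 0.5

# Late binding: a simple checklist is evaluated against the *actual* context
# at the time of need; any unmet item blocks the procedure until addressed.
CHECKLIST = ["wash hands", "sterile drape", "clean insertion site"]

def ready_to_insert(steps_done, checklist=CHECKLIST):
    unmet = [step for step in checklist if step not in steps_done]
    return {"proceed": not unmet, "unmet": unmet}

halfway = ready_to_insert(["wash hands"])   # blocked: two items unmet
ready = ready_to_insert(CHECKLIST)          # all items met: proceed
```

The static version can only ever deliver the mitigation its designers anticipated; the checklist version lets a human bind the knowledge ("is this specific insertion low-risk?") to the specific context at the moment it matters.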

See also the original New Yorker article.