Saturday, March 29, 2008

Traditional SOA Governance

A recent Forrester report, "Scoping Strategic SOA Projects", has a nice discussion of what I would call centralized SOA governance. It's a good jumping-off point for the journey toward more agile business processes. However, there's very little Discovery flavor to this....I suspect this kind of framework will be considered hopeless dated five years from now. Nevertheless...a good departure point.

Power Users Hate AJAX Apps?

A recent Forrester report asserts that Power Users of client-based business apps will be disappointed in AJAX except for "mash-ups."

Whether that's true or not, my snap reaction was that AJAX is mostly focused on allowing a relatively small number of facets/dimensions of a context to be combined in unexpected ways. If Power Users are mostly doing sophisticated analysis, then I would expect them to find AJAX-based apps to be of limited usefulness.

The small number of facets in an AJAX app and the simple/sparse controls limit sophistication and constrain analysis. This is both a weakness and a strength...if you're doing Discovery/Exploration in a narrow context, then an AJAX app can be "just right."

Governing BitTorrent

Seems like I first became aware of BitTorrent 2-3 years ago. At the time, the thing I thought was really interesting was that it had metadata hooks that offered some real potential for discovering & organizing emergent communities of interest. Since then, it's moved to the center of the war between file sharers and copyright holders.

Until this week, Comcast had been blocking torrents since they were consuming large chunks of a finite resource. This week, they stopped blocking them....however, the price is that all users are subject to bandwidth restrictions. All this makes sense since it seems like bandwidth demand is likely to always exceed supply.

BitTorrent is a fascinating piece of technology that highlights the difficulty of decentralized governance. Although there is no single BitTorrent governance baseline/standard (and its governance capabilities continue to evolve), my impression is that BitTorrent governance is relatively complicated given that (a) it's a very narrow problem (sharing large files in a bandwidth-constrained environment), and (b) it only focuses on a small number of factors (e.g., how fast you can download, how much you share).
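
To make that "small number of factors" concrete, here's a minimal, hypothetical Python sketch of the kind of narrow, factor-based governance involved....it is NOT the real BitTorrent choking algorithm, just an illustration of ranking peers by how much they've shared and granting scarce download slots accordingly. The Peer fields and slot count are assumptions made up for the example.

```python
# Hypothetical illustration (not the actual BitTorrent protocol): govern a
# single, narrow factor -- peers that have uploaded the most to us get the
# scarce download slots ("unchoked") first.

from dataclasses import dataclass

@dataclass
class Peer:
    peer_id: str
    bytes_uploaded_to_us: int      # how much this peer has shared with us
    bytes_downloaded_from_us: int  # how much we've shared with it

def choose_unchoked_peers(peers, slots=4):
    """Grant the limited download slots to the peers that share the most."""
    ranked = sorted(peers, key=lambda p: p.bytes_uploaded_to_us, reverse=True)
    return [p.peer_id for p in ranked[:slots]]

peers = [
    Peer("A", 5_000_000, 1_000_000),
    Peer("B",   200_000, 4_000_000),
    Peer("C", 3_000_000, 3_000_000),
]
print(choose_unchoked_peers(peers, slots=2))  # -> ['A', 'C']
```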

It makes me wonder if the complexity of decentralized governance increases at least exponentially when compared to centralized governance....regardless, it seems that centralized just won't scale, so we're forced to explore the decentralized landscape...not something we're comfortable with or know much about since (a) centralized approaches have been effective until this point in time/complexity, and (b) we hate anything that's not easy to command/control.

I've followed various discussions about emergence (CAS, chaos, etc.) since stumbling across Gleick's "Chaos" shortly after it was published in 1987. I even slogged through Kauffman's "Origins of Order" in the late 90's. One theme that continually emerges is a search for "information for free". I'll stay out of the controversy about how successful this search has been...and simply note that, as far as I know, there's no mature theoretical foundation upon which emergent control capabilities can be built.

Bottom line: when it comes to order/information, I've not seen any emergent governance capability (that "works") that is anything other than narrowly focused on a small set of specific cause-effect relationships. Which makes me suspect that we'll ultimately wind up with IT/business governance that's more centralized than most folks anticipate today. That does NOT mean it will look like what we think of as "centralized." For example, DNA is a centralized governance mechanism in the sense that the same data is in every cell (except red blood cells). It's radically DEcentralized in how the information is translated into action. And, the more we discover, the more complex it gets...e.g., there are other information molecules (RNA, etc.). If BitTorrent & DNA are any indication, designing governance that's truly agile/adaptable is in the distant future....and it's likely to be a long, strange journey.

Managing Innovation

Forrester has a new report entitled "2008: Innovation Management Services Take Off". It asserts that, in a softening economy, CEOs will place greater emphasis on innovation, and will look to consultants like BAH and McKinsey to help them better manage innovation.

It has some good suggestions, but my initial reaction was that innovation (as an Exploratory activity) probably needs "management" that is a little different from the traditional plan, organize, control, direct, staff.

Finding the right mix of art and science is probably more art than science.

Semantic Concept Mapping

This product looks interesting...

Interesting Lighting Technology

If you're interested in LEDs, CFLs, etc., then you'll probably find this short video on ZDNet interesting: a 250-watt plasma light bulb from Luxim that puts out 140 lumens per watt...but after skimming the data sheets, I'm not clear on whether we'll ever see it in a relatively small package (1-10 watts) for portable lighting (e.g., flashlights).

And, the efficiency is not that far above where the best LEDs are today...though I'd guess that there's been a lot more R&D thrown at LEDs to this point.

Finally, I'm not seeing any cost figures...maybe too soon to tell how cost competitive it will be. This article (whose LED efficiency numbers are dated) seems to think they'll be a bit pricey for a while.

More "Execution-Exploration" Fodder

In other forums, I've discussed at some length the challenge of integrating the exploding Exploration-oriented IT capabilities (e.g., Web 2.0) with the traditional Execution-oriented IT capabilities (e.g., Oracle, SAP, etc.). For this blog, I'll just point out that similar contrasts have been discussed by others....Exploitation-Exploration (Tushman, HBS), Push Programs - Pull Platforms (Hagel and Seely Brown), Performance - Learning/Innovation (Singer and Edmondson), Volume vs. Complex Operations (Moore).

Don't know that there's anything new in Dennis Howlett's recent post, but it is another data point highlighting how the emerging tensions are approaching what he calls a "war."

I guess it was about 3 years ago that I was part of a small enterprise architecting team that observed that this war was coming and that it would be the users vs. IT...as I've mentioned before, it's similar to the PC/LAN conflict of the 80's...though I suspect that the emerging conflict is far more fundamental/significant and will therefore generate much more tension and last longer.

G4 - Why Governing IT is a Challenge

If IT is just another tool, why all the fuss about governance? I suppose there are probably entire books on the subject, but here's a few snap reactions:

  • Physics does not impose as many limits on IT capabilities as on non-IT capabilities. So, there's more room for morphable tools...leading to a need for more formal and complex governance.

  • When compared to non-IT tools, IT is more oriented toward supporting human decisions. Since human decisions are generally more complex than the tasks humans perform with non-IT tools, it's reasonable to expect that governance (control) is more likely to become an issue.

  • The electronic nature of IT (combined with its decision making orientation) means that multiple individuals often interact with IT across time and space. Coordination of these interactions becomes increasingly challenging as they grow in complexity.
  • Non-IT capabilities actually have lots of governance...it's just in the heads/cultures of the individuals/organizations that use those capabilities. Instantiating that same degree of governance in an automated capability is likely to be difficult.

Since our experience has been (until very recently) with non-IT tools with limited formal governance needs/capabilities, we have little understanding of the complex governance needs, principles, or frameworks associated with IT tools.

The most complex governance today is arguably found in the policies & procedures used in running a large modern organization. Not only is this largely manual (and therefore static), it is a relatively recent innovation since the need for it (and the ability to do it) was ultimately driven/enabled by the rise of electronic communications.

G3 - The Governance of Hammers

Governance implies a context where decisions need to be made about the control of a capability.

Decisions about the control of IT currently are mostly made in the pre-deployment phase. This is no different from non-IT capabilities, and reflects Tool factors, People factors, and Context factors, driven by a large chunk of pre-IT inertia.

Here's a few observations (about non-IT tools) related to why we have minimal formal Governance for physical tools like hammers:
  • Tools are dumb and people are smart

  • Tools are dumb because they have traditionally been built for a specific purpose with limited configurability

  • Tools are built with limited configurability for a specific capability since the physical world usually has a trade-off between effectiveness/efficiency (in a specific context) and adaptability (across a range of contexts).

  • In the physical world, it usually makes more sense to provide a user with a range of special purpose tools than a single morphable tool...the Leatherman-style tools being one notable exception.

  • Contexts are often unpredictable enough that a single morphable capability would be difficult to design and difficult to use.


Bottom line: In the physical world, we create simple single-purpose capabilities (tools) that users combine in an ad hoc manner to achieve a goal in a specific context.

G2 - SOA Governance From the Top

Seems like most of the discussions I see of SOA governance are written by techies for techies and therefore emphasize technology. To a lesser degree, they emphasize process, and mostly ignore topics like how people interact with IT and how roles/rights/responsibilities shape IT.

My general approach will be to start at the top of a People, Organization, Process, Technology (POPT) stack.

Why POPT? There are alternatives: PPT is often used in architectural discussions, and DOTMLPF is often seen in defense-centric processes.

Taxonomies reflect a drawing of distinctions with the purpose of highlighting contrasts among fundamental entities. These contrasts provide a framework for considering fundamental relationships...ultimately enabling more effective decisions & actions. This drawing of distinctions is the first act in transforming a formless pile of data into knowledge.

A taxonomy should be as simple as possible, and no simpler...for my purposes POPT seems about right...I first encountered it in a John Garstka presentation at NCW 2007.

The current SOA governance discussion seems to have not gotten much beyond generic statements about the importance of it being "driven by business needs."

G1 - Some Observations on Governance

I've been thinking about governance in general and how it applies to SOA. So, this starts a series of posts on the topic.

I'll use titles starting with "G# -" so that you can find your way through them.

The anticipated outline looks something like this:

  • The origins of governance

  • IT vs. non-IT governance

  • Governance of capability-oriented IT vs. service-oriented IT

  • People & Governance

  • Organizations & Governance

  • Process & Governance

  • Technology & Governance

  • Summary

Wednesday, March 26, 2008

KM, KS, and IBM

IBM has been deeply involved in Knowledge Management over the past couple of decades. In the KM community, there's an ongoing debate about the degree to which knowledge can be managed (planned, organized, controlled, directed, provisioned). Those who are skeptical about KM tend to emphasize decentralized, emergent approaches (collaboration, sharing, etc.).

This post is an interesting discussion of IBM's increasing emphasis on Knowledge Sharing (vs. traditional KM).

JBOWS & Governance

Another good post from Joe McKendrick on SOA governance. This prompted me to begin a series of posts on the nature of governance and the unique challenges of SOA governance.

Just a Bunch of Web Services provides a muddled capability of marginal utility. To move to the next level, people & process must be addressed. However, Joe does not clearly frame the problem in terms of goals, strategies, tactics, roles/responsibilities/rights, integration/transition vis-a-vis legacy capabilities, etc.

Governing Cash

This post by Phil Wainewright made me wonder if the solution to the SaaS monetization problem might also be a nice starting point for a SOA governance framework. I suspect there's a lot more motivation to solve the SaaS problem than the SOA one.

Monday, March 17, 2008

MS, HP, and clouds

An interesting discussion by Phil Wainewright on Ozzie's hints about MS & cloud computing.
And, an interesting discussion of HP's Adaptive Infrastructure as a Service...another XaaS acronym... :-)

SOA is stalling?

"The techies just can't sell SOA to the business."

An interesting post providing more evidence of how disruptive SOA really is. As I've tried to highlight in the past, numerous observers have emphasized that the real ROI for SOA will involve fundamental changes to business processes (both in execution and governance), organizational structures (including roles/responsibilities), and the rich fabric of fragments that individuals and groups use to navigate the world.

Bottom line: the technology is the fast & easy part. The hard part (people, process, and organizations) is barely even recognized as such at this point. Since this has a strong bottom-up flavor, it's not clear when/if we'll see a common understanding of the questions begin to emerge...much less some good answers.

Sowing SOA

Although the challenges of transitioning well-ordered systems to SOA began getting widespread attention 2-3 years ago, I'm not aware of any clear solutions at this point.

There's no good analogy since we're not just talking about software....we're also talking about processes, organizational structures, and (most intractable) the frames/models/narratives that people use to navigate.

However, here's one rough analogy....if your systems are built of Lego blocks, it's like tossing a big pile of Legos out on the floor and saying "Don't you feel empowered now that you can compose IT capabilities that are customized to your decision needs?" I suspect lots of folks will not react kindly.

Several intriguing products have appeared recently that try to address some aspects of this problem. One is IBM's QEDwiki.....another is Sprout. Check out the video on the link....governance is still basically absent, but I suspect that experiments of this kind are essential to figuring out the governance puzzle.

Perhaps most importantly, they allow non-IT folks to experiment with the composability meme.

NCW - 10th Anniversary

Although I did not get to attend the NCW 2008 conference in Washington this year, I was able to review some of the presentations.

John Garstka, one of the shapers of NCW theory, had an interesting discussion of NCW at the 10 year mark.

Few things frustrate me more than the fact that very few architects seem to know that there are a handful of key concept diagrams for NCW. Two of these are Figures 5 and 11 of "The Implementation of Network Centric Warfare."

John updated Figure 5 in his presentation. The changes are all in the Social and Cognitive domains and are worth discussing:

  • "New Processes" has been removed.

  • "Common Operating Picture", "Individual Situation Awareness", and "Shared Situation Awareness" have been added.

  • "Decision Making" has been added.

  • "Information Sharing" now spans the Information and Cognitive & Social domains.

  • "Self-Synchronization" now is partially in the Physical domain.


I had two snap reactions:

  • The initial diagram implied that Processes are how "top-down" is done, and Self-Synchronization is how "bottom-up" is done. This is clearly simplistic, if not the most dangerous kind of deception (an incomplete truth).

  • Decision Making is both Individual and Shared, and is "top-down" in the sense that it's not emergent in the way Self-Synchronization is. This reflects the emphasis that Individual and Shared Sensemaking received in Figure 11.


If you're an SE or SA and are not familiar with these two figures, I urge you to make them a part of your operational DNA...these patterns are fundamental and pervasive in a way that very few things are.

Unintended Consequences of Metrics (and Measuring)

Ed Yourdon has an interesting list of 13 common political problems associated with metrics initiatives.

He includes the old cliche about "you get what you measure", which is the one I thought of first.

My snap reaction was that SW development of any complexity has traditionally involved a roadmap in the form of a process and associated artifacts. The artifacts model the target "chunk of knowledge" being created (at increasingly lower levels of abstraction), and the process models the governance of the creation activity.

Monitoring those models inevitably involves metrics, but since these metrics are one or more steps away from the actual target "chunk of knowledge", they are always proxies and are subject to various interpretations by those who interact with them.

I suspect at least a part of the problem springs from a basic human desire for a "magic incantation" that can be mindlessly recited....we all know such a desire is foolish, but like many foolish desires, we're apt to occasionally give in....just look at the management best sellers of the past few decades.

Is SOA IT?

In a recent post, Joe McKendrick highlights a topic that's perhaps not always given the emphasis it deserves....SOA is mostly about business services, while IT is mostly about technology.

This distinction is critical since the approach, framework, processes, skills, organizational structure, etc. for governing business services will likely have significant differences when compared to the governing of IT.

To the degree that IT is cloud-centric (i.e., homogeneous infrastructure that is largely commoditized), the differences will be especially sharp since SOA ownership will likely be much more decentralized than IT ownership.

Virtualization and Governance

Since the history of IT is one of increasing levels of abstraction, hardware virtualization is not conceptually new.

However, we often see the promise of abstraction without considering the new governance issues it brings. See this post for a short discussion of the topic vis-a-vis virtualization.

Transitioning to the Semantic Web

Anyone who's examined the potential value of SOA quickly arrives at the ugly question of transition....specifically, how do you transition legacy systems to SOA? There are no simple answers since each situation is unique.

The transition issue is much less mature for the Semantic Web since the concept is much less mature. However, the things I've seen written about Calais indicate that such a concept might be part of the answer.

Twine

I've mentioned Twine before, but in a posting today, Nova Spivack responded to some criticisms of Twine (still in private beta) and mentioned that they are broadening their private beta community.

If you're into the Semantic Web, this seems to be a capability that's attracting a lot of attention.

Tuesday, March 11, 2008

SOA, WOA, and REST

I've skimmed a lot of material over the last few years on SOA and associated topics. One of the most informative observers/practitioners is Dion Hinchcliffe. Although he stopped blogging frequently quite a few months ago, he remains one of the clearest commentators on architectural issues associated with SOA and Web 2.0.

I just stumbled across his consulting blog, which has his latest discussion of what he calls Web-Oriented Architecture (which is URI/data-centric).....he contrasts it to the traditional SOA style (which is (obviously) service-centric).

I remember running across REST 2-3 years ago and thinking "now there's something profound....lots of important architectural implications, none of which are very clear or predictable." Dion's continuing exploration of REST/WOA is well worth following.

Compliance in a Hyperconnected World

The simple/complicated domains are chock full of compliance mechanisms. These reflect an assumption that the key aspects of a generic decision context can be measured & monitored.

Compliance metrics usually measure factors that, in the past, have been correlated with achieving a specific goal (a clear cause-effect relationship may or may not be present). While some of these factors are measured outputs, they are often measured inputs.

These aspects of compliance have been much discussed, and are not the topic of this post, which was triggered by a slashdot posting about a student who faces expulsion from college for organizing an online chemistry study group. Although it appears that, at the very least, what was going on was suspicious, it still raised a question for me:

"How do the ethics of collaboration change in a hyperconnected world?"

In a face-to-face collaboration about homework, interactions can range from copying (unethical) to help that accelerates learning and is ethical (as long as it's not prohibited by the instructor, as it was in this case).

However, hyperconnected collaboration involves IT, and therefore inevitably spans multiple context instances, since the content of the collaboration lives on and can be applied in future contexts.

Implication: a collaborative exchange between two students that is entirely ethical may be seen as unethical if it is published to the entire class.

This is not a new issue, and I'm not sure there's really a specific answer to my question, but it highlights the fact that underlying much of our communication is an assumption that it will not be used IN EXACTLY THE SAME FORM in a future context (i.e., a human will often mediate re-use).

Although this assumption is trending toward being the exception rather than the norm (as almost all contexts are being at least partially recorded for future consumption), I'm not sure I'm seeing any fundamental cultural shifts yet that reflect that trend.

As for the college, I hope there's a renewed appreciation for the fact that input-oriented compliance mechanisms (how students collaborate in doing homework) are not nearly as valuable/enforceable as output-oriented mechanisms (test scores).

Thursday, March 6, 2008

Music, Prediction, and Identity

I've been reading Daniel Levitin's "This is Your Brain on Music." His discussion of how interesting surprises (i.e., breaks in predicted/emerging patterns) in music are part of making it aesthetically pleasing reminded me of the centrality of predictive activities in cognition.

I know a real cognitive scientist, and I'm definitely not one. However, the one really interesting fact I took away from Jeff Hawkins's "On Intelligence" was that it appears that there's as much as 10 times as much information flowing from higher levels of the neo-cortex down as there is flowing up....implying that there's lots of predictive modeling going on.

This was all triggered by something Dave Snowden has discussed occasionally, most recently in a posting on the use of ritual. When a decision maker enters a familiar context, he engages in rituals that trigger the identity associated with that context. And, as that identity is activated, the frames/fragments that are associated with that identity are activated. As the decision maker acts within the context, there's an ongoing cycle of action > identity activation/reinforcement > frame/fragment activation > action > identity > frame/fragment > etc. where action, identity, and frames/fragments are reinforcing each other.

This is one reason why it's hard to remember who someone is if (a) you see them rarely, and (b) only in a specific context (e.g., a dentist). As you enter the dentist's office, a specific identity (patient) begins to emerge (along with associated roles/responsibility expectations) and associated frames/fragments are activated (specific past experiences, patterns of interactions, processes, etc.). These trigger new actions, and the cycle continues as you "boot" yourself into "dental patient" mode.

Application: when you're trying to ensure sensible interactions, don't just think about process. Think also about identity and ritual.....and design in a way that leverages them.

I've had Gary Klein's "Sources of Power" (discusses Recognition-Primed Decision Making) on my reading list for quite a while, but have not managed to start it. Based on what I've read of his papers, I suspect he has lots of interesting insights on this topic.

There are lots of interesting issues in this area, especially as hyperconnectivity increases our ability to establish and juggle multiple identities. Technology is ceasing to be a gating factor in this area....it's our brains that are maxing out.....and they're not on a Moore's Law curve.

Miscellaneous Items

A few items from the past week...starting with one from this week in 2007:

I've always liked concept maps, so I really liked Mindmeister, a collaborative mind mapping tool, when I first ran across it last year.

Seems like I'm seeing more news items about what might be called "mashable hardware." There's nothing new about the concept (e.g., Lego's Mindstorms has this flavor), but the combination of Moore's Law, standardized components that are more "external-facing" than "internal-facing" (I'm using ESB terms here as an analogy), and the spread of the "mashable" meme seems to be sparking some interesting experiments. Here's a couple: DARPA is pursuing something similar for satellites. The dynamic is very TRIZ-like, and reminds me of a basic innovation strategy.....continually re-evaluate how you've coupled/decoupled capabilities, and how you might reconfigure capabilities (at varying levels of granularity) to do something interesting. The Design Structure Matrix (I say "Design", not "Dependency" since Steven Eppinger at MIT has done so much to popularize this tool) doesn't have a TRIZ flavor to it, but has the same focus on coupling.
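
As an aside, here's a toy sketch of a Design Structure Matrix to show what that "focus on coupling" looks like in practice. The module names and dependencies are made up for illustration; the point is simply that laying dependencies out as a matrix makes coupling/decoupling decisions visible.

```python
# Toy Design Structure Matrix (DSM): dsm[i][j] == 1 means module i depends on
# module j. Module names and dependencies here are hypothetical.

modules = ["power", "sensor", "radio", "frame"]
dsm = [
    # power  sensor  radio  frame
    [0,      0,      0,     1],  # power depends on frame
    [1,      0,      1,     1],  # sensor depends on power, radio, frame
    [1,      0,      0,     1],  # radio depends on power, frame
    [0,      0,      0,     0],  # frame depends on nothing
]

def coupling_report(modules, dsm):
    """For each module, list what it uses and what uses it."""
    for i, name in enumerate(modules):
        uses = [modules[j] for j in range(len(modules)) if dsm[i][j]]
        used_by = [modules[j] for j in range(len(modules)) if dsm[j][i]]
        print(f"{name}: uses {uses or 'nothing'}, used by {used_by or 'nothing'}")

coupling_report(modules, dsm)
```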

The whole topic of "zzz-as-a-Service" is getting a bit tired....however, Phil Wainewright had a nice summary this week. If you're unfamiliar with the architectural revolution that's emerging from SaaS, SOA, composability, etc., his post is one of the best short summaries I've seen. It clearly highlights the fact that all this composability/mashability comes at a price.....and that the price may well exceed the benefits for certain capabilities/needs. Much of the current discussion/exploration is focused on what mixes of generic/specific capabilities provide a positive ROI (along with how they fit various types of contexts/needs).....a landscape that continually shifts as the technological & memetic landscapes shift.

Wednesday, March 5, 2008

Is Complex/Complicated/Simple Subjective?

I suspect the short answer is "to a certain degree." Most of Snowden's work I've seen tends to imply that the domains of Chaotic/Complex/Complicated(knowable)/Simple(known) are mostly objective "in-the-world" adjectives.

However, Dave's current guest blogger (Boudewijn Bertsch) has a long discussion of the difficulty of getting health care experts to effectively train diabetics to manage their own care. The life expectancy of a diabetic varies dramatically depending on how well this task is done.

My over-simplified summary of her post is that Expert (grounded in analysis of the Complicated) Doctors tend to toss Novice Patients (for whom diabetes management is, at least initially, more Complex than Complicated/Simple) off a cliff by giving them a set of Best Practices (the way a Simple context is addressed). Bottom line: widespread failure.

I posted the following comment (which translates the dynamic into an engineering context):

As an engineer whose primary interactions are with other engineers, I frequently see the same sort of "sink-or-swim" mentality.

The same kind of motivational issues arise when an "expert" engineer is working with a "novice" engineer in a context that is complicated to the expert, but is complex to the novice.

It is often difficult to convince an expert of the necessity of probing behavior to help the novice successfully move the context into the complicated/simple domains.

I suppose it's easier to tell a patient to "follow these instructions" (best practice), and leave him on his own to adapt them to his specific context via uninformed trial-and-error probing. It's not surprising that many patients don't have the reserves to successfully translate a set of generic best practices to specific behaviors that address the unique needs of their situation.

"Empirical" Agile?

While reviewing a colleague's discussion of high-speed collaboration in a planning-centric culture (Discovery/Exploration vs. Execution/Transaction), I noticed that he used the term "empirical" to describe the collaboration activity and cited an Agile Development book. Below is an edited version of my comments to him.

Your use of "empirical" caused me some confusion.....which made me think about why.....here's a few snap reactions:

1. It implies a taxonomy of "empirical" and "non-empirical." At the very least, planning-oriented folks will resist being characterized as "non-empirical".....and, the smarter & more aggressive types may directly attack the taxonomy & its implications...... "if anyone's empirical, we are."

2. I suspect that Agile folks use the term because they're often seen as using an ad-hoc process with few metrics. So, a term that emphasizes their ongoing use of data to continually course-correct is understandable.

3. Planning-oriented approaches are also empirical. The difference is that they assume that many (most? almost all?) significant aspects of a decision context can be represented by a relatively static model. They then create organizations, roles/responsibilities, processes, and metrics to allow them to act consistently and coherently across many instances of that decision context. To the degree that the model fits a specific instance, planning works. The "empirical" parts include (a) model-building, and (b) repeatable & measurable processes.

4. Probing-oriented approaches are (in some sense) less empirical in that they don't have static models of a decision context. Instead, they use Snowden's "narrative fragments", Klein's "frames", and high-speed collaboration to constantly manage the creation/testing/modifying/deletion of multiple working hypotheses.....framing, modifying, abandoning them as needed to best anticipate future needs. They are more empirical in the sense that they have a much broader perspective of the decision context and are much more likely to pick up weak signals that would be filtered out by standardized processes....in other words, their claim to "empiricism" is based primarily in a real-world flow of data that richly informs their activities, while planning-oriented claims to "empiricism" are based primarily in a "validated" model with pre-defined metrics (that, by implication, assert that "if it's not measured, it's not relevant").

5. It's easier for me to think about the differences using Cynefin's known/knowable (or simple/complicated) "ordered domains" vs. the complex/chaotic "unordered" domains. Specifically, if it's complex, then you can't model it, you're limited in your ability to plan, and you have to be able to smartly probe for patterns using attractors and limiters (instead of standardized processes and/or expert analysis).

6. However, in the almost 3 years I've been using Cynefin, I've found it difficult to explain....most people don't seem to have good narrative fragments (or frames/models) associated with consciously addressing complex contexts. You have to first make them aware of the complex vs. simple/complicated contrast, then discuss the need for different organizational structures, roles/responsibilities, and processes to address complex contexts. Unless they have a pressing complex context need, they're unlikely to invest the time needed to grasp the value of Cynefin. And, often, they simply impose Analysis & Best Practices on a complex area, leaving those on the edge to bend "the system" to adapt it to each specific complex instance.

Transformation at the State Department

Skimmed a recently released report of the Advisory Committee on Transformational Diplomacy. As expected, a strong emphasis on creating Discovery/Exploration capabilities.....though the language/framework feels mostly Execution/Transaction-oriented.

One metric I look at is how often the word "identity" appears in such a document. In this case, zero.....perhaps they deliberately viewed it as an emergent property that should not be addressed explicitly....however, I suspect they either don't really appreciate the centrality of the concept in the kind of transformation they're undertaking, or see it as too fuzzy to be addressed by the existing Execution/Transaction-centric culture.

Even more telling is that the word "trust" appears only once.....ouch!



Overall, though, it covers most of the significant issues well. The devil (as always) will be in the details of implementation.

Emergent IT and IEDs

Just got around to reading the article that Nick Carr referenced in Technology Review on TIGR....a DB that allows the decentralized compilation of important features of the battlespace.

Highly recommended.

Small Worlds & Innovation

My son brought an article on social network structure & innovation to my attention. A study found that "fully connected" groups did well on simple problems, but not nearly as well on complex problems.

And, a related article discussed research that found that complex/'wicked' problems are solved better by individuals than by internet groups.

Don't know if there's a real trend here, but it seems consistent with a couple of things I've observed:

  • As group size increases, so does the likelihood of a "lowest common denominator" solution

  • As problem complexity increases, group effectiveness decreases


Although how a group is structured and the processes they use are key factors, I suspect a major driver is the cost of coordinating meaning among group members (i.e., syncing of mental models & narrative fragments). As you move into the Complex domain and the focus shifts to probing (vs. analysis), effective exploration of the problem is only possible by individuals or very small groups with significant overlaps in problem domain expertise.
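
A back-of-the-envelope sketch (my own arithmetic, not from the cited studies): if every member must sync mental models with every other member, the number of pairwise coordination channels grows as n(n-1)/2, which gets expensive fast.

```python
# Pairwise coordination channels in a fully connected group of size n.
def coordination_channels(n: int) -> int:
    return n * (n - 1) // 2

for n in [2, 5, 10, 25, 50]:
    print(f"group of {n:3d} -> {coordination_channels(n):4d} pairwise channels")
```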

Bottom line: ensure there's a good match between the nature of the problem (simple, complicated, complex, chaotic), the makeup & structure of the group attacking the problem, and the processes being used.

TheEdge is Complex

Nick Carr has a posting about Web 2.0 on the edge. As I noted ~3 years ago, the decentralizing edge-oriented concepts and technology associated with Web 2.0 are so disruptive that they'll probably be deployed more "bottom-up" than "top-down."

Reminds me of Dave Snowden's description of the Complex Domain as being one where "agents constrain the system & the system constrains the agents"....a domain characterized by probing (vs. analysis) via "safe-fail" experiments with attractors and barriers.

Twitter is spam/troll-free?

Identity is a core issue for information assurance....which makes it a core issue for Web 2.0.

This blog entry and the related comments have an interesting discussion of whether/how Twitter resists spam/trolls.

I don't use Twitter, but I think it's an interesting experiment in context tracking/linking.

Why?

A few comments on the purpose this blog....

I'm interested in conversations about how individuals and organizations know what they know, how they know it, and how that knowledge shapes decisions (and vice versa). While this topic has traditionally been perhaps more the domain of specialists, the emergence of user-driven decentralizing technologies (e.g., SOA, Web 2.0) is making it important to anyone who uses information technology or is part of a group.

"Sensemaking" - how individuals and organization make sense of a context....includes things like the (usually fuzzy) taxonomies/ontologies being used, how they are created & evolve, the key cause-effect relationships, the narratives being exchanged. etc.; includes DNA-level aspects like trust, values, and goals.

"TheEdge" - where concepts intersect with operational decisions & actions; often an ambiguous and dynamic place where the future is constantly being created. Even though much has been written about the edge in the world of Network Centric Warfare &amp Transformation (e.g., 'Power to the Edge'), the defense industry seems to retain a strong focus on products, systems, and infrastructure (possibly since that's where most of the $ are).