Sunday, April 6, 2008

Send the Computing to the Data

Over the past few decades, it has become clear that technology creates an irresistible pull toward valuable new mission capabilities, and those capabilities tear down security barriers faster than we can put the needed risk mitigation and management capabilities in place.

A few years back, I extrapolated that the logical endpoint was to wrap data (and its associated metadata) in one or more layers of information assurance (IA) using some sort of PKI. If you know anything about IA, you'll recognize that this endpoint is not in the near future.
Whether or not we end up at that extreme (the current backlash against DRM would vote "no"), the idea does put the focus on the data rather than on the processing.
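To make that endpoint a little more concrete, here's a minimal sketch of what "wrapping data and metadata in IA layers" might look like: sign the bundle for integrity and provenance, then encrypt it for confidentiality. This is my own illustration, not a real design; it assumes Python's third-party cryptography package, and every name and value in it is a placeholder.

```python
# Sketch only: wrap a data payload and its metadata in two "IA layers"
# (a signature layer, then an encryption layer) using PKI-style primitives.
import json
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes
from cryptography.fernet import Fernet

# Producer's signing key pair. In a real PKI this would be backed by a
# CA-issued certificate, not generated on the fly.
signing_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

data = b"...climate readings..."  # placeholder payload
metadata = {"source": "station-42", "license": "research-only"}

# Layer 1 (integrity/provenance): bind data and metadata together, then sign.
bundle = json.dumps({"data": data.hex(), "metadata": metadata}).encode()
signature = signing_key.sign(
    bundle,
    padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                salt_length=padding.PSS.MAX_LENGTH),
    hashes.SHA256(),
)

# Layer 2 (confidentiality): encrypt the signed bundle. In practice the
# session key itself would be encrypted to each consumer's public key.
session_key = Fernet.generate_key()
envelope = Fernet(session_key).encrypt(bundle + b"||" + signature)
```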

This reflection was triggered by two articles that caught my eye today (both from Jon Udell's blog):
  • A discussion by John Montgomery asserting that mashups need data that is (a) simple to access programmatically, (b) interesting, and (c) available under terms that let users work with it... and that in practice you can pick only two of the three.

  • A discussion by Jon Udell about a Microsoft HPC initiative that wraps very large (multi-tera-, peta-, or even exabyte) data stores (e.g., climate data) with an HPC cluster.

Both articles have a bit of the "send the computing to the data" flavor.
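For what it's worth, here's a toy sketch of that contrast, using nothing beyond the Python standard library. The DataNode class is purely illustrative, standing in for a storage node in a cluster like the one Udell describes; the point is simply that shipping a small function to a big dataset beats shipping the dataset to the function.

```python
# Sketch only: move the (small) computation to the (huge) data,
# and move only the (small) result back over the network.
from typing import Callable

class DataNode:
    """Holds a notionally huge shard of data, too big to ship around."""
    def __init__(self, records: list[int]):
        self._records = records

    def run(self, fn: Callable[[list[int]], int]) -> int:
        # The computation executes here, next to the data;
        # only the result crosses the network.
        return fn(self._records)

# Three nodes, each holding a million records.
nodes = [DataNode(list(range(i, i + 1_000_000))) for i in (0, 1, 2)]

# Ship the function, not the data: each node returns a single number.
partial_sums = [node.run(sum) for node in nodes]
print(sum(partial_sums))
```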
