


While this makes perfect sense, it immediately raises several questions for me:

If we keep ALL the underlying data without filtering, does that not, in itself, create new problems - of storage, and more importantly of describing the data itself, since those descriptions have to change over time as the company shifts products, people, strategy, geography etc.?

Are companies prepared for this, and indeed, would they ever bother? (Whatever happened to that "data mining" buzz from years ago, anyway? ;)

And how do companies, especially those lacking the appropriate resources, apply their own logic and, most importantly, test that logic? The one thing the big packages tend to do (I assume) is test their standard logic "products", although thousands upon thousands of companies take that on trust for the most part.

It would seem, then, that the business logic might best be applied as a set of modules that are open to scrutiny and modification... like an open-source library for business, based on some underlying knowledge of how the data is stored (sounds like XML to me, but I'm no expert).

An open set of tools would allow different sets of logic and assumptions (sometimes very creative assumptions, in accounting in particular) to be applied. A company in one geography, with one regulatory environment, could then simply download the template for a company subject to other constraints, logic and assumptions and compare like-for-like. (Damn, I wish I had THAT power in business school, let alone in real business... would have saved days of spreadsheet agony ;)
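A minimal sketch of what I mean, assuming raw transactions kept as plain records and "templates" as interchangeable functions (the record fields and template names here are hypothetical, just for illustration):

```python
# Hypothetical sketch: raw, unfiltered transactions, with interchangeable
# "logic templates" applied at read time - no aggregated figures stored.

raw_transactions = [
    {"amount": 1000, "type": "sale", "asset_life": 5},
    {"amount": -600, "type": "equipment", "asset_life": 5},
]

def straight_line_template(txns):
    """One set of accounting assumptions: depreciate equipment evenly."""
    total = 0.0
    for t in txns:
        if t["type"] == "equipment":
            total += t["amount"] / t["asset_life"]  # first-year charge only
        else:
            total += t["amount"]
    return total

def expense_now_template(txns):
    """Another jurisdiction's assumption: expense equipment immediately."""
    return sum(t["amount"] for t in txns)

# Same raw data, two downloadable "templates", compared like-for-like:
print(straight_line_template(raw_transactions))  # 880.0
print(expense_now_template(raw_transactions))    # 400
```

The point being that swapping regulatory environments is just swapping the function, never touching the data.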



Yes, you're right, data volume would increase - but is that a problem? I think not, as storage is rather cheap. But it will stress CPU and database efficiency, as the data has to be kept in raw form and manipulated in real time if one wants to be true to the concept.
That would be an issue of software architecture, and solvable I'm sure.
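To make the "keep it raw, manipulate in real time" point concrete, a toy sketch (the log structure is my own assumption, not anything from a real package): every report is recomputed from the raw log on demand, so nothing summarised ever needs to be stored.

```python
# Toy sketch: no pre-aggregation; a report is a fresh pass over the raw log.
raw_log = [
    {"year": 2004, "account": "sales", "amount": 500},
    {"year": 2004, "account": "costs", "amount": -300},
    {"year": 2005, "account": "sales", "amount": 700},
]

def report(log, year):
    """Summaries exist only as query results, never as stored state."""
    return sum(t["amount"] for t in log if t["year"] == year)

# Any past period can be regenerated at will from the raw transactions:
print(report(raw_log, 2004))  # 200
print(report(raw_log, 2005))  # 700
```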

And new rules and regulations increasingly require that raw data be kept. Take the Basel II regulations for financial institutions - seven years of transactions have to be kept, not only the summaries and reports as today. The big system suppliers are thus racking up new income streams as they struggle to cope. Good post at Cardboard Spaceship about SAP here: http://hnewlands.typepad.com/cardboard_spaceship/2005/02/lawmakers_know_.html

That, I suspect, will strain the systems built on a keep-only-the-manipulated-data philosophy! Bloated gets obese.

Ahh, and the description of data! Why not a single ID? Should I suspect falling into the have-to-apply-logic trap here - as in using tree structures? An OODB fixes most of that issue, and tags of course. One has to apply the same principle to the raw data: no logic applied, all the way to the bottom, or else you will lose - as you rightly pointed out!
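For the single-ID-plus-tags point, a tiny sketch of what I take that to mean (flat records with no hierarchy baked in - all names hypothetical):

```python
# Flat, tag-based records: each item has one ID and a set of tags.
# No tree structure is committed to; groupings are queries, not storage.

records = {
    1: {"tags": {"2005", "france", "sale"}, "amount": 100},
    2: {"tags": {"2005", "norway", "sale"}, "amount": 250},
    3: {"tags": {"2004", "france", "cost"}, "amount": -40},
}

def select(recs, *tags):
    """Any 'hierarchy' (by year, by country...) is just a tag filter."""
    wanted = set(tags)
    return [r for r in recs.values() if wanted <= r["tags"]]

# The same raw records answer any grouping, with no logic stored in them:
print(sum(r["amount"] for r in select(records, "2005")))    # 350
print(sum(r["amount"] for r in select(records, "france")))  # 60
```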

And I do agree, heartily so, that building all the logic from the bottom up is a chore - but does not a corporation have logic of its own? Perhaps something closer to its true values and meanings and ideas and people than that delivered by outsiders (including templates)?
I think it will be more a question of finding those values and meanings, and distilling the assumptions that lie therein!
