



Along the same lines but a little bit more technical:



One of the issues with what you say is that if this is too transparent, then competitors will likely know what the others are holding, making the slow bank run an event of the past. For this reason, there is probably a need to embed some of the information you are talking about in the "object" being transacted, with near impossibility for most actors to find out that information. Regulators, on the other hand, would be able to decode this information in near real time and actually act as regulators.




The crux is the data model (an all too often forgotten fact), and that's where DEBK fails, and where most other methods today fail - they do not differentiate between representation and presentation, thus increasing complexity far beyond reality itself.

When presentation is something you create after the fact, using minimal and singular representations of reality, you are free to present anything - and in real time, if capture is real time - all of it or just snippets, depending on your policies and needs.

That said, transparency = trust, so if you want trust (the basis for all transactions between humans) you simply have to be transparent, going beyond those old-fashioned "knowledge is power" and "I have a secret, it'll cost you 10%" business models, I think :)

Interesting posts you have there, I noticed your interest in complexity as well - perhaps this old post of mine on the matter in relation to enterprise software (how we run organisations mostly) could be of interest: http://tinyurl.com/5dmtxb




I'm afraid that sounds very much like security through obscurity - which, unfortunately, is the worst kind of security, because you can't know or predict when it will fail.


Maybe the issue is not bookkeeping per se, but treating it like a science when it is really a work of (historically inspired) fiction. Fiction can be useful as long as we don't pretend it's absolute truth.

I'm not really clear on what your alternative concept to replace transactions is. Please elaborate.




Without going into too much detail, let me try to give an idea via an example:

Important - it's not about replacing transactions, it's about avoiding a placeholder to represent the transaction; finding a way where a single unique data object (a real-world representation) holds enough information that we can extract the "transaction" from one place only, if and when we need to use it in a report.

In reality: a widget is a widget, a physical object. One day you "sell" it - this is usually defined by local law as a "transfer of ownership" by way of a contract, a handshake or an oral agreement on the phone. This reality has one real object (the widget) and perhaps another representing an agreement (a sales contract).

Our solution (thingamy): The widget will be represented in the system as a single "widget type object" that has properties like serial number, colour etc. Then it has "relationships" with other objects - like "widget type object, is owned by, legal entity type object".
As thingamy actually runs all processes it will register every change, including who, when and what for both properties and relations.
Thus I can have a "report template" at the backend that when invoked (say "widget sales for December") will query all objects, find all "widgets" that "changed relationship owned by from our company to any customer" for the period of December - then add up all values - and voila you have a dynamically rendered sales report without bothering with invoices or anything like that.
The cool part, of course, is that you can have two templates with different rules and thus deliver US GAAP and IFRS in parallel from the same data without messing up.
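A minimal sketch of this pattern in Python (the class and function names are my own invention for illustration, not thingamy's actual model): the widget object carries its properties plus a full history of relationship changes, and the "report" is just a query over that history.

```python
from datetime import datetime

class Widget:
    """A 'widget type object' with properties and a relationship history."""
    def __init__(self, serial, colour, value):
        self.serial = serial
        self.colour = colour
        self.value = value
        # Every change is registered: when, which relation, from whom, to whom.
        self.relations = []

    def change_owner(self, when, old_owner, new_owner):
        self.relations.append((when, "owned_by", old_owner, new_owner))

def sales_report(widgets, seller, start, end):
    """Report template: sum the value of widgets whose 'owned_by' relation
    moved away from `seller` within the period - no invoice objects needed."""
    total = 0
    for w in widgets:
        for when, rel, old, new in w.relations:
            if rel == "owned_by" and old == seller and start <= when < end:
                total += w.value
    return total

widgets = [Widget("SN-1", "red", 100), Widget("SN-2", "blue", 250)]
widgets[0].change_owner(datetime(2008, 12, 5), "our_company", "customer_a")
widgets[1].change_owner(datetime(2008, 11, 20), "our_company", "customer_b")

# "Widget sales for December" rendered dynamically from the object history.
print(sales_report(widgets, "our_company",
                   datetime(2008, 12, 1), datetime(2009, 1, 1)))  # 100
```

A second template with different recognition rules (the GAAP vs. IFRS case) would simply be another query function over the same unchanged objects.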

Now, the even better part: that way, if changes happen further down the chain (like the economic status of a borrower in Louisiana), they would be instantly reflected at the top level for, say, a CDO, in real time. And another cool thing: the reports are not constrained to that "stupid" sales invoice - so I can create a report for, say, "December sales to customers that have an uncle who lives in Berlin and cycles on Sundays" - and get it :)
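The real-time propagation idea can be sketched like this (again my own hypothetical naming, not an actual thingamy API): the top-level product never stores its own value, it derives it from the underlying objects on every read.

```python
class Loan:
    """An underlying object, e.g. one borrower's mortgage."""
    def __init__(self, principal, performing=True):
        self.principal = principal
        self.performing = performing

class CDO:
    """Top-level product built dynamically from its underlying loans."""
    def __init__(self, loans):
        self.loans = loans

    @property
    def value(self):
        # Recomputed on every read, so any change in an underlying loan
        # is reflected immediately at the top level.
        return sum(l.principal for l in self.loans if l.performing)

loans = [Loan(100_000), Loan(250_000), Loan(75_000)]
cdo = CDO(loans)
print(cdo.value)             # 425000

loans[2].performing = False  # the Louisiana borrower defaults...
print(cdo.value)             # 350000 - instantly visible at the top
```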

Oops, the example did expand there... hopefully it made some kind of sense :)




I agree with you; however, I believe that one of the issues in the CDO game was/is the averaging risk calculation performed on the product. Even if the Louisiana owner is defaulting on his loan, nothing says that the combination of the weighting/averaging of the risk and the risk scale (AAA, AA...) can capture the potential default of the whole product when it is weighted with really good AAA financial products.


If this is an example of security through obscurity, then most cryptographic means also fall into that category. I realize, however, that I have not said much.

The idea is as follows, as I noted in my post:

"For instance, the regulators need to understand some information from many different actors without each actors having the possibility of reconstructing the full picture if they, by accident, have access to their competitors information."

In this case, the regulator is interested in a snapshot of an entity's financial situation (the model can be used for other types of transactions). The regulator emits a table of randomly generated numbers, against which the banks/entities under regulation produce sums of their own numbers weighted by the ones given by the regulator. The result is sent back to the regulator, who can then reconstruct the bank's numbers. If a competitor intercepts the numbers sent to the regulator, they cannot make out what the original numbers are. In fact, it looks like the problem is NP-hard. What is interesting about this scheme is that if the bank has 1,000,000 numbers to communicate, it would require only 1,000 actual numbers to be sent back to the regulator. Another interesting aspect of this formalism is that those 1,000 numbers could be readily mined for fraud detection without going through the pain of reconstructing the initial numbers.
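The measurement side of such a scheme can be sketched in a few lines of Python (a hedged illustration of the idea as described, not the commenter's actual protocol): the regulator issues a random table A, the bank returns the weighted sums y = A·x with far fewer numbers than it holds. The reconstruction step, which in compressed sensing relies on x being sparse and uses L1 minimization, is omitted here.

```python
import random

def make_measurement_matrix(m, n, seed):
    """The regulator's table of random weights (kept secret via the seed)."""
    rng = random.Random(seed)
    return [[rng.gauss(0, 1) for _ in range(n)] for _ in range(m)]

def measure(A, x):
    """The bank computes m weighted sums of its n private numbers."""
    return [sum(a_ij * x_j for a_ij, x_j in zip(row, x)) for row in A]

n, m = 1000, 10                 # n private figures -> only m numbers sent back
x = [0.0] * n                   # a sparse vector of bank figures (illustrative)
x[42] = 3.5
x[907] = -1.2

A = make_measurement_matrix(m, n, seed=2009)
y = measure(A, x)
print(len(y))                   # 10 numbers cross the wire instead of 1000
```

An interceptor who sees only y (ten weighted sums) learns essentially nothing about the thousand entries of x, while the regulator, who knows A, can attempt reconstruction or mine y directly for anomalies.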

FWIW, the formalism of compressed sensing is being used in another area by the likes of Google for extracting information from streams, i.e. amounts of information too large to be stored, yet on which one needs to perform some operations (like counting or evaluating statistics).

As one can see, this communication could assert trust between two parties, but it could also provide evidence of breach of trust between the same two parties.




Good point - but then again, one of the big issues when grouping investments/debt is avoiding correlation, and that's where it really f**** up this time on the subprime side.

The more "knowledge" those objects can hold - and here semantic relations to other objects come in handy - the easier it would be for all parties to keep an eye on developments and act accordingly.

But even more important: the top-level product, as seen in a report, would be built dynamically and in real time from the underlying objects, so that any change anywhere would immediately be visible at the top level.

That in itself would of course be after the fact, albeit milliseconds after, but it would make the pricing dynamic and reflect reality. Add to that this knowledge, and I would imagine the "rater", seller and buyer would look at and act towards the "financial product" differently.


Hi Sig,

Very interesting post. Not sure I agree. Satyam is a case of fraud similar to Enron; it didn't bring down the world markets - the world market problems expedited its fall.

I think the MIT Blackjack Team perspective is closer to the root cause: that our financial system is a giant, complicated Martingale system, and after so many years we ran out of money to do the next doubling-down. http://semyondukach.blogspot.com/2009/01/real-cause-of-financial-crisis.html

That's actually a reason why I am skeptical about the doubling-down stimulus package too.

If we push the crash out, it will be even worse in a couple of years.

All the best, Mark.


Hi Mark,

cannot disagree with you as such - when you're out rowing in a storm the rogue wave that capsizes your boat is indeed the "cause".

But it's based on many assumptions, among them that you should be out there in the first place, that you're in an open rowing boat and not in something more modern and seaworthy, and, for this case, that your navigation relies on a keen eye, gut feeling and an old compass in your left pocket.
If you had challenged those assumptions you might at least have been in possession of a proper GPS and a link to weather satellites for a real-time view of what's brewing beyond the horizon, so you could take evasive measures.

My point is that we cannot do much about reality - be it changes to consumer habits or rogue waves - which can be the "direct cause" of calamities; it's the lack of preparedness that is the "root cause".

That the suppliers of old compasses and binoculars are in "control" of the market and have a big problem shifting to GPS and weather satellite stations is of course not helping - is that the "ultimate root cause" perhaps?
Hehe, sorry, could not resist that last one! ;)
