Tuesday, August 23, 2011

Systemic or Societal?

Last night I was having one of those interesting Facebook intercommentarial episodes — things that aren't quite conversations or discussions but approximate them through roughly alternating comments on a post. The Thunderer's get was protagonist, so I suppose in a sense I was antagonist, although not in the modern sense.

The underlying issue was whether a certain system was crazy or not. Personally, I always go for etymology before thinking about significance. The word 'crazy' means 'fragmented', and it is in this sense that things are 'cracked', 'flawed' or otherwise 'broken'. It implies a lack of consistency at the very least and a lack of coherence or integrity at more subtle levels. Clearly, a system can be crazy.

But the overall issue seemed to be whether crazy people had made a crazy system or whether a crazy system had made crazy people. And here I paused to think.

The real conflict was that, in the kinds of systems we were discussing, people were integral components. If the people were nuts, the system (no matter how designed) would behave sub-optimally. But this, in the long term, is true of any system — it slides into increasing entropy. The Thunderer's get averred (perhaps with some sarcasm) that it was easy to say this, since the Second Law said as much.

My point was that, empirically, systems collapse not on the very long heat-death-of-the-universe timescale but within the much shorter span of generations. If a generation is 30 years, an evolving system like the American experiment in constitutional democracy has lasted fewer than ten generations. The first of my ancestors to leave eastern Eurasia for south-eastern Eurasia did so around that time; he lived from 1751 to 1801. The local mercantile/political experiment of my forefathers is thus just as old an experiment, and I have empirical knowledge of it.

In the much smaller system of my clan, we have diversified pretty broadly (although not as broadly as the American system, which has vastly more inputs and functions). But I am absolutely certain that none of us hews closely to the attitude of our patriarch. If he had drafted a constitution, it might have slowed the entropy and diversification; if we had spent more time planning (as if we had been a corporate institution) we might have kept the focus narrower. But neither of these possibilities is easy to evaluate.

The Thunderer's get also pointed out that certain systems obviously produced better (more rational, more useful, etc.) results than others. This is certainly true. But it is about as useful as the finding that all systems fail, because it too is a truism. No two systems can be alike, or they would not be two systems; they would be one. And if they are not alike, their outputs must differ in process or in product, and hence in their impact on implementors, environments and users.

Of course, in a purely mathematical system, this might not be true; you can indeed produce the same result by different process, and if a machine does it, there is little or no impact on human users. But that particular qualification — human — is the nub of the issue.

In any system, the tendency to entropy is greatest in the most complex element. The higher the level of development, the greater the distance from the baseline and the harder it is to sustain. This can be empirically and experimentally verified. In most of the systems under discussion, humans are involved, and they are the most complex elements of these systems.

This, then, was the process by which I reached the following point of contention: I said that all systems tend towards the mediocre, and the Thunderer's get disagreed, citing the fact that some systems were better than others. I think both of us were right in a very trivial sense (see above), but that taken together these two truths yield a higher truth: no matter how hard you strive to make something better, it will one day, in the not-so-distant future, be made less good by humans.

The challenge therefore is to overdesign systems so that humans can't mess them up. And of course, the more you overdesign a system, the more potential for entropy you create and the more effort you spend. You are taking away resources from somewhere else to do that work. You are fighting the inevitable, perhaps in a very laudable way, but mediocrity lurks nearby, always at your shoulder. It is the dark twin.

It is right to resist mediocrity. But this can only be done by changing the system every now and then. The other challenge is to know when to expend that effort and suffer that pain.

In all, one should conclude that the main point of human striving, systemically speaking, is to expend effort in setting up systems that resist entropy to some continuing and useful end. Society, on the other hand, cannot be that end in itself. So what is?
