Monday, October 31, 2011

Dauntless

On All Hallows' Eve, we remember the departed saints. It is a good alternative to Samhain.

The late and nearly-sainted Dauntless has long been a hero of mine; my great-grandfather was his student. But writing about him is difficult for that very reason. What can one write about one's heroes?

One duty of the historian is to tell the truth as far as it can be discovered. Another duty is to keep discerning how much truth there is in the truth, and how much remains under cover, still to be discovered.

The first is a narrative duty; the second is an investigative one. The historian is a journalist of the past, with the frequent disadvantage that the sources, human or otherwise, are all dead and gone.

And so a month after Adobe's commission, I am still mulling over whether the narrative I am telling about the Dauntless really is the truth — whether it is sufficient, whether it is adequate, whether it is correct, whether it is biased. Blah. And blah, blah, blah.

I want it to be true. Does that make me guilty of bias?

Labels: , ,

Sunday, October 30, 2011

21st Century Competencies, Skills, Whatever

In the New World, they have decided that the 21st century (10% of which has already passed us by) requires the development of three main clusters of skills:
  1. making use of data/information/knowledge;
  2. making decisions concerning individual and social life; and
  3. creating useful outputs by synthesis and/or collaboration.
In Atlantis, they have decided to describe these clusters as:
  1. civic literacy/global awareness/cross-cultural skills;
  2. critical and inventive thinking; and
  3. information and communication skills.
For some reason, these skillsets are being touted as the new educational must-haves, as if they were something like Athena sprung fully-formed from the head of Zeus. Come off it, educators. You dropped the ball and now you are renaming the game. These skills have always been needed. You could find them in the marketplaces of Atlantis centuries ago, in the agoras of Greece long gone, in lost and legendary cities as well as places like the souks of Cairo and of Byzantium, now Istanbul. Wherever humans have come together in civitas or polis, there too have these skills been needed.

They are NOT tools unique to the 21st century, no matter how catchy the propaganda sounds. They are the Apollonian sense-making instruments that oppose the Bacchic, Dionysian chaos and primal instincts of our passions and desires. They stand, have always stood, between us and the Dread Anarch.

Although these tools have lost in the past, they need to hold the line if we are not to enter darker times. To understand why they are more needed now than ever before, and thus why people are labelling them '21st Century' when they were as relevant at the dawn of civilisation, we need to note just two things: 1) there are now more people than ever before in the history of humanity; and 2) there is now more available data than ever before in that same history.

This means that 1) getting people to work together and solve problems and plan for the future is more important than ever before; and 2) making sense of the data-stream, like being forced to drink the ocean, is the new magic. And there has only ever been one coherent myth which attempted to resolve the difficulties while preserving the world of men.

I speak, of course, of the myths of the Norsemen. In those myths, Odin Allfather made it his quest, for which he suffered on the World-Tree and gave up his eye, to find the overarching wisdom that would save creation from Ragnarok, the Twilight of the Gods.

With Thought and Memory as his aides, he still lost. But he found the right solution: ameliorating the destruction while planting the seeds for rebirth. Meanwhile, his son Thor was busy trying to drink the ocean by brute force, and failing.

He who has ears to hear, let him hear.

Labels: , , ,

Saturday, October 29, 2011

Terra Nova (TV Review)

I've just watched almost two hours of nonsense masquerading as science fiction. Incompetent and wooden acting coupled with much wincing from the audience. Found myself rooting for dinosaurs against whiny teenagers, and authoritarians against unlikeable rebels. Great pain. Will not go there again.

This is my review. It was painful writing it too.

Labels: , ,

Friday, October 28, 2011

COPS

Sometimes, people ask me stuff like, "What is the model on which you base your ideas about curriculum planning and development?"

To be honest, I don't have one of those name-branded models. I've read too much stuff to attribute my ideas to specific researchers or specific models. What I think is a reasonable answer is this: In any endeavour targeted at humans, we begin with the human basis, and then we follow up with continuations that humans can accept as rational.

Today I was chatting with an officer from the Tower of the Book. She effectively asked me that very question. So I said, "It's all about COPS!" To her quizzical look, I continued, "Culture, Organisation, Process, Sense-Making."

That was an impromptu effort, actually. But it does cover the four basic ideas I have.

In any enterprise of the human sort, you should start out by examining the state of the art, the situation, the environment, the background or context of the matter. This is Culture.

Then you organise the material you have found in a way that is easy to see, to analyse, or whatever else you want to do with it. This is Organisation.

You then work out the necessary things you have to do with the material you've organised. This is Process.

Finally, you package it nicely and sell it to other people as the fruit of your consultancy. This is called Sense-Making (although some people call it Salesmanship, Story-Telling, or Scenario-Planning).

And there you have it. A new model, just fresh off the press.

I will close by noting that the meta-skill required to do this is itself the product of such thinking. It is either cyclical or recursive. I have yet to make sense of it, but it pays the bills.

Labels: , ,

Thursday, October 27, 2011

Burial Grounds

In some cultures, people are buried. In some, they are left in the cold, clear mountains to be carried away by ice, air and raptors. In others, they get funeral pyres or cremation. Sailors used to be buried at sea.

In many cultures, we honour dead ancestors in their absence. We remember them in stories and images, in spirit and in essence. After all, they are the people whose lives gave birth to ours. I don't know why we want to keep our cemeteries as if they were necropolises though, cities of the dead.

The dead have long outnumbered the living; the world's current population, vast as it is, remains a small fraction of all who have ever died. And in a twinkling of an eye, we living will be overwhelmed further still.

The physical spaces some cultures set aside for their dead are not sustainable. Soon, 'ashes to ashes and dust to dust' will have to become the truth, not just a mealy-mouthed platitude uttered at some funerals at which the corpse is to be interred with the hope that the body will last forever.

Some believe that bodies must be intact for the resurrection of the dead — certainly, this belief is as old as Ancient Egypt. I find it hard to believe, however, that a Power which can raise the dead cannot simply reconstitute the bodies from the dust of the cosmos.

And so, we should favour the living over the dead. If a road must pass through the last resting-place of my ancestors, I think they would be amused. No mere road would have served to baulk any intentions of theirs when they were alive.

Labels: , ,

Wednesday, October 26, 2011

Pragmatism is Not the Opposite of Idealism

As I've implied before, 'real' and 'ideal' are related but not necessarily opposed. The truth is that 'ideal' is what we reason or imagine the perfect to be, while 'real' is what we actually have.

In between 'real' and 'ideal' is 'practical'; it is what we can successfully work towards. Similarly, pragmatism is somewhere between realism (this is what is) and idealism (this is what it would be in a perfect world) — it is the philosophy not of settling for second-best, but the philosophy of doing something towards an ideal and thus shifting 'what is' to 'what ought to be'.

This is because 'practical' and 'pragmatic' come from the Greek praxis, which simply means 'doing' or 'acting' (or if you like, engaging in some process). When we perceive something as 'real' and do something to shift it towards what we perceive as 'ideal', we are being practical.

Being practical or pragmatic therefore cannot mean being inert or deciding to do nothing. Pragmatism is an active philosophy.

Neither is pragmatism a cowardly, cynical, or defeatist position. It implicitly acknowledges that things can indeed be done, and goes out to do them. Pragmatism is always closer to idealism than realism is.

Labels: , ,

Tuesday, October 25, 2011

Responses (Nov 2012) — Summary

The list of topics for November 2012 (with some of my responses) is summarised in this post.
  1. Can we have beliefs or knowledge which are independent of our culture?
  2. It is a capital mistake to theorise before one has data. Insensibly one begins to twist facts to suit theories, instead of theories to suit facts. — Arthur Conan Doyle. Consider the extent to which this statement may be true in two or more areas of knowledge.
  3. Imagination is more important than knowledge. For knowledge is limited to all we now know and understand, while imagination embraces the entire world, and all there ever will be to know and understand. — Albert Einstein. Do you agree?
  4. What counts as knowledge in the arts? Discuss by comparing to one other area of knowledge.
  5. Habit is stronger than reason. To what extent is this true in two areas of knowledge?
  6. The ultimate protection against research error and bias is supposed to come from the way scientists constantly re-test each other’s results. To what extent would you agree with this claim in the natural sciences and the human sciences?
For #1, I would attempt to define culture first, and then get ready for a philosophical argument.

For #2, I would consider the extent to which facts precede theories in different disciplines — and whether facts are required at all in the first place.

For #3, I would think about the concept of 'importance', and after working that out, consider the concepts of 'imagination' and 'knowledge' and what they imply.

For #4 and #5, I'd just look at some of my previous posts and work my dogged way through the obvious arguments.

For #6, I'd get a large sheet of paper ready and start to draw a large map of human knowledge...

Labels: ,

Monday, October 24, 2011

Responses 006 (Nov 2012)

The final question in the Nov 2012 list of topics is this: The ultimate protection against research error and bias is supposed to come from the way scientists constantly re-test each other’s results. To what extent would you agree with this claim in the natural sciences and the human sciences?

I like the way this question sets up little traps. 'Ultimate', 'supposed', 'the way', 'constantly' — these are all potential pitfalls. For a start, 'ultimate' ought to mean 'to the farthest extent of one's range'; 'supposed' implies that it isn't always so; 'the way' implies there is only one way; and 'constantly' implies this continues to happen all the time. You can create a little counter-argument involving each of these.

I'm not going to deal with those in detail, but I will add that if a homogeneous class (e.g. 'scientists' all using one 'way' to 'constantly' do something) does anything, there is necessarily a built-in bias, against which there is no defence. I will also add that to suppose anything requires someone to do the supposing. Who do you suppose does the supposing here? Scientists?

Hopefully, at this point, I have succeeded in convincing some of my readers that this topic requires great care and detailed planning. For those who have survived this, I will now add a note about 'natural sciences' and 'human sciences'.

These superclasses of disciplines are difficult to define for some people. I have given a quick summary of what they include towards the end of this earlier post. Let me now define them in slightly more detail.

The natural sciences are the spawn of natural history (i.e. the empirical observation and recording of natural phenomena in chronological order) and natural philosophy (i.e. the development of theory based on induction from natural phenomena or deduction from reasonable rules based on empirical observations). They include astronomy, geology, biology, chemistry and physics — disciplines which in general are considered to have objective content even in the absence of human activity or existence. Some of these disciplines may produce results that are difficult or impossible to re-test.

The human sciences deal with human affairs in terms of human activities. By analogy with the natural sciences, human sciences are the spawn of human history and human philosophy. These would concern the observation, recording, analysis and theory of matters social, political, economic, religious, and military. They thus include linguistics, sociology, political science, economics, anthropology, psychology, management and other such 'soft' sciences. These disciplines would be much deprived by an hypothetical absence of humanity. These disciplines tend to produce results which are often difficult to re-test.

Neither group would include many varieties of applied science, technology and engineering — these are not natural but are not generally considered human sciences either. The two groups would exclude mathematics, history, and philosophy because these are either tools or precursor disciplines; they would exclude the arts as well.

Well, you now have some of the basic elements of an answer. Give the topic a good try. This will be an educational experience.

Labels: ,

Sunday, October 23, 2011

Responses 005 (Nov 2012)

The penultimate topic (Question #5) in the list reads: "Habit is stronger than reason." To what extent is this true in two areas of knowledge?

It's a bit of a let-down compared to the other five topics, and that is why mainly two kinds of students will attempt it: the brilliant risk-takers and the unimaginative. What I'm a little irked about is the lack of attribution: if George Santayana did indeed say this, at least attribute it to him as you did for Einstein and Conan Doyle in the same set! That said, the quotation looks horribly boring on first sight.

No doubt, this kind of question is a staple of the 'Theory of Knowledge Diet', of the form [quotation by famous person] + [request to justify with respect to two (or more) areas of knowledge]. But what to make of the 'X > Y' format used also in Question 3 (which has a better quote from Uncle Albert)?

I think that 'habit' must refer to people doing things automatically (as in 'force of habit'), as opposed to thinking about it first (i.e. 'reason'). If that's the case, then how to link it to areas of knowledge? Surely it's a commonly known fact of human existence that people do things out of habit more easily than with the use of reason.

Perhaps a discussion of the topic can be made more substantial by discussing what exactly 'habit' is. The word 'habit', from the Latin, has the original meaning 'to have' or 'to live in a specific state' — hence 'inhabit'. The modern sense is one of being in a constant state of existence, doing the same things and responding the same way to things.

Habits are patterns of life, patterns of behaviour, that have developed from uncritical existence in specific environmental situations. Here, I have craftily inserted the word 'uncritical'. After all, if one were to be critical, one would probably not continue to live the same way all the time. But it provides us with a proper basis for argument — we can now say that reason is critical, and habit is not. (I used a similar trick when handling this older question.)

Now we have a proper epistemological argument. To what extent, in various areas of knowledge, is uncritical behaviour (not necessarily bad, perhaps instinctive or intuitive) more useful or more powerful than critical behaviour (which must involve reasoning and judgement)? This is where it would be good to draw boundaries between disciplines that require more critical thinking and those requiring less. I've discussed this elsewhere. Enjoy...

Labels: , , ,

Saturday, October 22, 2011

Responses 004 (Nov 2012)

The infamous list has thrown up an uneven field of obstacles as usual. Obstacle Question #4 asks: What counts as knowledge in the arts? Discuss by comparing to one other area of knowledge.

This is a hoary old chestnut, so to speak. The debate over what constitutes 'the arts', and hence by extension what counts as knowledge in the arts, is a very old one. The crafty Greeks got around it by using the word technē, which means 'art', 'craft' and 'skill' all at once. Their legacy is a range of modern words, including 'technology', 'technique', 'technical', and 'polytechnic', which are all supposed to have something to do with the arts but seem more to do with the sciences these days.

The picture is complicated by classical ideas of what constituted the arts. Here is a list of what the Greeks thought the arts were, in the personifications of the Nine Muses. Included were agriculture, astronomy, geometry and history.

Indeed, history and philosophy are the parents of the sciences — the natural sciences are the descendants of the disciplines which we used to call 'natural history' and 'natural philosophy'; the term 'natural science' in its present-day meaning is relatively new and was not often the term of choice till perhaps the late 19th century. [See Ngram here.]

This argues for a response that contrasts the arts with the sciences, using technology and engineering, architecture and design, as battlefields. It also means that whoever answers this question might have to draw a line of some sort between 'arts' and 'humanities' — not to mention the line between 'technology' and 'science'. Haha.

That said, how do we define knowledge in the arts themselves? I would argue that there are at least two kinds of knowledge: procedural knowledge (how it is done) and conceptual knowledge (what it is that is done) — for example, knowledge of how to paint vs knowledge of what painting is, or knowledge of how to write poetry vs knowledge of what poetry is. (We require most teachers of visual or performing arts to be practitioners to some extent; sadly, this is less true of the language arts.) How else can artistic (or 'aesthetic') knowledge be defined?

My instinct tells me that this essay topic is an easy one to handle. But it will require a disciplined approach that draws on personal experience and interpretation, coupled with the ability to examine disciplines critically.

Labels: , ,

Friday, October 21, 2011

Responses 003 (Nov 2012)

The short list for the year ahead has this as Question 3: "Imagination is more important than knowledge. For knowledge is limited to all we now know and understand, while imagination embraces the entire world, and all there ever will be to know and understand." — Albert Einstein. Do you agree?

I think this question is one of those trivial ones which can be made important by deliberately introducing extra material. So let's dispense with the trivial approach first.

Knowledge does not exist without information, and information does not exist without data. At each stage, the questions of validity, reliability, utility, generalisability, transferability (etc., etc.) intervene in an attempt to prove or disprove the various claims that arise (whether actual or potential). Hence the only way to claim knowledge is to subject it to tests of some kind of reality — something that contrasts or is contrasted with imagination. Without imagination, i.e. the capacity to create images/imagery or otherwise imagine, we cannot test knowledge. Knowledge therefore has no importance without imagination.

In fact, etymologically and historically, the image or imago is the ideal or fully-developed concept of something. The real is merely the sub-standard shadow in Plato's cave. We see this usage in the relationship between (for example) real and ideal gases.

At this point, we can concede completely that Einstein was right. However, as always, a case can be made for negotiating on the basis of 'important'. What does 'important' mean?

'Important', from Latin importare, is used to indicate something 'brought in', 'brought into the discussion'. This explains why 'import' can mean 'significance' as well as 'something shipped in'. When we say something is 'of import' or 'important', we are saying that it is a point to be noted, or that we should pay attention to it.

And that is why knowledge might be thought of as more important than imagination — knowledge is reality as far as we can confirm it, and is thus the basis of our action in this world. It introduces the key points to be considered in what we do, as opposed to imagination, which is what we might aspire to (or have realised that we can never attain).

The argument then, it seems to me, is whether the real is of greater value (practical or otherwise) than the ideal. It's a good debate to have. While the idea of an ideal gas allows us to conceptualise many things, no such gas exists — we have to solve problems in reality based on real gases.

Labels: , , ,

Thursday, October 20, 2011

Responses 002 (Nov 2012)

The Nov 2012 list really throws up some interesting problems. Here is Question 2 on the list: "It is a capital mistake to theorise before one has data. Insensibly one begins to twist facts to suit theories, instead of theories to suit facts." — Arthur Conan Doyle. Consider the extent to which this statement may be true in two or more areas of knowledge.

(Note: As usual, the context of this statement is not supposed to be important. For the more conscientious amongst us, this quote comes from Conan Doyle's 1891 Sherlock Holmes story, A Scandal In Bohemia, in which Holmes is bested by the lady adventurer Irene Adler.)

The context which is important, however, is our consideration of the statement in terms of what a theory is supposed to be in different areas of knowledge. Here are a few thoughts.

In most conceptualisations of the scientific method, we're supposed to build theory from empirical data or reasoning from basic principles. It's either induction or deduction, repeated and mixed up, which generates theory. It isn't considered scientific to generate theory without any data at all, since you cannot even generate a problem statement or question that is of scientific value without some initial data. Why? Because a scientific claim must be testable, and we only know what a reasonable test is if we have some data to work with.

On the other hand, let's consider the arts. We'll have to define the arts as disciplines in which something material (a text, a narrative, an artefact, etc.) is created in order to induce a response (almost always emotional) based on somebody's sensory perceptions. A theory in the arts can be scientific in nature, if it is an analytical theory. However, an artist's own theory of what art is can seem to be enacted spontaneously through what he does. It is when people respond to that enactment that data are generated. The artist may not have any obvious foundation of data on which his theory of 'this is art' is built. He might be using his emotions as a guide, for example, or his intuitions, or his faith in his own arbitrary principles.

Other disciplines fall somewhere in between. All disciplines work with data; the question is whether data must precede theory (and thus be its foundation, as in the 'grounded theory' approach beloved of some research in the human sciences) or whether theory can be constructed before any data is received. It might be a chicken-and-egg kind of problem, requiring much thought before the obvious 'chicken came first' conclusion arises.

In this statement, however, is a lot more material for debate. You would have to think about 'capital mistake' (which implies 'fatal error'), 'insensibly' (by which Conan Doyle would have meant 'subconsciously', rather than 'irrationally', I think), and 'twist' (as in apply torque to deform, but not in a literal sense).

I suspect this question will be attempted by Sherlock Holmes fans, amateur investigators, and people who just want a no-holds-barred dust-up of the old-fashioned kind. A lot of fun, a lot of risk. "Two or more areas of knowledge," forsooth.

Labels: , , , ,

Wednesday, October 19, 2011

Responses 001 (Nov 2012)

The next session of questions delves deep into many complex issues. Question 1 in that list reads: Can we have beliefs or knowledge which are independent of our culture?

My gut instinct leads me first to attempt to define culture — roughly perhaps as 'that which is cultivated as a pattern of behaviour in a group of humans over time, and the manifest fruits of that pattern'. This would presumably include manifestations like the arts as well as structural behaviours such as organisational ethos.

I think the next thing to look at is that naughty little word, 'independent'. These days, it implies separation or lack of relation between two entities. It can also mean that there is no clear relationship and hence we can imagine the two entities (in this case, culture vs beliefs/knowledge) to have little to do with each other.

To me the obvious argument is this: culture is a pattern (or the results of such a pattern) formed from human intellectual activity. Human intellectual activity tends to be based on beliefs or knowledge, which in turn are developed from information constructed out of data. Or you might say human intellectual activity constructs beliefs/knowledge from information constructed from data. This data is obtained from sensory perception and the interactions of various organs (such as the brain and various chemical factories like the adrenal glands).

None of this activity requires culture. However, once enough humans get together to develop culture(s), no matter how we define culture, it will act as a feedback input to human intellectual activity.

This is why I would argue that you can indeed have beliefs or knowledge independent of culture. However, a number of fairly bright people have claimed otherwise. There is a counter-case to be made here.

Most of the stuff a baby learns through language or other forms of experience is mediated heavily by environment. This means that if a well-defined culture is present, it will colour the learning experiences of young humans. This is true even of older humans exposed to a well-defined culture — we call this acculturation (humans 'converted' by culture) or cultural propagation (culture 'converting' humans).

If this is taken to the extreme, everything we believe or know is believed or known in terms that have been mediated by our cultural filters. Not many people have no cultural filters/lenses at all, or can function while effectively neutralising any filters they might have. Yet, I still think that such cultural influences must be in some sense known (i.e. they are knowledge, whether consciously or subconsciously attained) before they make your knowledge-construction or belief-construction processes dependent on them.

The usual outcome to such discussions is the infamous Gallic shrug, especially because many of these cultural philosopher types are Francophone Europeans. But those who have to write essays on such topics must suffer towards a fairly well-defined conclusion. Ah well. *shrug*

Labels: ,

Tuesday, October 18, 2011

Starbuck

A starbuck is a male deer with a white mark between or just above its eyes; such deer are generally supposed to be frisky. Me, I have something of the starbuck's with me. Little sachets, which the purist might scoff at, but which are the offspring of the bean anyway.

For we all serve 1,3,7-trimethylxanthine, whether we prefer mermaids or not. And the Colombian Roast, and the Italian Roast, and whatever blends and names describe the bean — these also serve the Atrament, the liquid faith, the alternate benediction. Although some are theobromic, and some are theophyllic, all acknowledge the gift of the Highest.

And I, I have a pot before me. It is full of the atramental essence of the bean. And I, I am happy; I, I am blessed. Makarios. It is my father's birthday today. Happy birthday, Dad!

Labels: , ,

Monday, October 17, 2011

Sabbatical

Every seventh year, a year of rest and regeneration. Speaking to academicians I know, this is no longer true. They don't give professors sabbaticals anymore. Time was, elite universities were the only ones that could afford the moral and intellectual high ground of such provision; time is, all universities have begun to think that it isn't worth it. World-class has become no-class; contract has slipped into indenture, and thence to slavery.

I am blessed because I was given an unexpected sabbatical, fourteen years into my career. And I've used it well, I think. I will always be thankful to those who gave it to me, for the gift that they gave me.

=====

Note: This is the 2500th surviving post of the Findings blog. Happy postday to us!

Labels: , ,

Sunday, October 16, 2011

Book Alert: The Dervish House

Ian McDonald's The Dervish House is rich and beautiful. It is a tale of nanotechnology, faith, the illusions of the world and their impact on humanity. It is a very human novel, firmly grounded in hüzün, the melancholic sense of decline which appears unique to Istanbul, once Constantinople, once Byzantium.

The novel begins with a suicide bomber on a crowded tram who kills nobody but herself, and unleashes angels of all kinds.

It's the story of Ayse, owner of a gallery of antiquities; Can, the almost permanently earplugged boy who controls nanobots; Georgios, an old radical 'experimental economist' who deals in information and probability; Adnan the rogue trader who is about to pull off the deal of the millennium with a little help from some friends; Necdet, who has begun to see the djinn and, more disturbingly, Hizir the Green Wanderer.

Most of all, it is a novel of Istanbul, mother of cities, daughter of its own rape.

It deserves its BSFA award. It is so much better than anything by William Gibson, without having to descend into jargon, and it is a proper novel too. It will appeal to people who study development and post-colonialism and all that. Most of all, it will appeal not only to hardcore techie SF readers, but also to those who love good writing with character development and a sense of atmosphere. Rating: at least 9/10.

Labels: , ,

Saturday, October 15, 2011

A Quick Look at Some Questions

Today I had a good morning discussion on the questions covered here. Of course, since the venerable organisation in charge has reminded us that some students seem to be getting too much assistance, I will only deliver some sparse (or sparing) thoughts over the next few days. As always, they are only my musings. They may or may not be amusing to you, gentle reader...

This bunch of questions seems more complicated than usual, though. All about subjectivity and error and horrible things like that.

Labels: ,

Friday, October 14, 2011

Questions (2012 November)

There is a list. It is where it should be. There is a newer list. It too is where it should be. There will be responses. They will be wherever they will be. There will be fun, Dwarf Fortress style. Haha.

Yes, I will continue to give pointers for TOK Question Topics in 2012 and 2013. But no, I won't allow them to exceed the bounds of general advice. Much.

Labels:

Thursday, October 13, 2011

Feathered Glass

Why is the glass feathered?

Am I seeing the light layered
light upon light
like the essence of the pearl?

Or is there tracery of line
like laughter's faint
footprints on a distant face?

I think hawk, but see mist.

The unicorn blinks knowingly
life upon life
in the corner of my eye.

This moment an airplane summer
exhausted clean,
shivered timbers, back to work.

=====

Note: I spent last night reading a whole year's entries in somebody else's blog all over again to capture the essence, but it eluded me. This is all I have for now. The rest is filtering through and I expect to collect more residuals, for which poetry may barely be sufficient, and my skill not at all.

Labels: , , ,

Wednesday, October 12, 2011

The Entropy Tango

This phrase was coined by Michael Moorcock about 30 years ago, as a title in his long-running Jerry Cornelius writings. To me, it is something stately and inexorable, the dance that must by necessity lead to the final heat death of the universe.

If everything flows and nothing remains still (as Heraclitus the Ephesian said), there is no final death. But entropy, the state of gradual loss which comes with this motion, ensures that all things come to an end.

Heraclitus was the ancient who first came up with the concept of ὁδὸς ἄνω κάτω, the 'path upwards and downwards'. The phrase, transliterated as hodos anō katō, or anodos/kathodos, is where electrochemists get their terms 'anode' and 'cathode' from; one is the path up and the other the path down for the current (and of course, elektrodos would mean 'amber path', or 'path of electrons'). But Heraclitus never factored in resistance and loss of power in his circuitous musings.

Which is what brings us back to the entropy tango. All things go round and round, up and down, but eventually, the ability to do anything productive wanes. Things equilibrate. At the end, all the lovely dancers fall down within their ring of roses.

No cell lasts for ever; no volta will bridge the dying electronic ga(s)p of the stricken system. But while the passionate embrace continues, life goes on. It is rather sobering, however, to note that the greater the energies expended, the more passionate the expression, the quicker the end. It is the choice of Achilles — to burn quickly and die a legend, or to burn slowly and die forgotten.

But either way, it is the entropy tango, danced quickly or slowly in the ballroom of time.

Labels: , , ,

Tuesday, October 11, 2011

Am I a Conservative Liberal or a Liberal Conservative?

For many years it's been difficult to articulate my political position (and indeed my position on many socially-related issues). That's because, as with most people, I have complicated positions; but I'm also picky about definitions, and it's hard for me to say I'm 'conservative' or 'liberal' without a whole bunch of caveats.

So here is some stuff about all that.

I'm politically a bit to what some people call 'the Left'. I believe that societies should make provision for their weaker elements, but I also believe that 'the poor you shall have with you always'. There will always be some freeloaders, and by the time that won't matter at all, we won't have political issues either. I don't like freeloaders, by which I mean people who can provide goods or services for remuneration sufficient to make themselves a living, but won't. However, I also believe that the 'better angels' of humanity need constraints in order to keep them better.

I'm socially a bit liberal in the sense that I believe that humans have inalienable rights and that they should be given freedom to express themselves within the meaning of those rights. However, I also believe that not all rights are equal and that some are downright spurious. I've written about such rights before. However, I am socially conservative in the sense that I don't think society needs to change without good reasons and sufficient infrastructural (both material and non-material) support; not all change is good, or good in practice even if good in theory.

I'm theologically conservative in the sense that given any particular body of religion, I'd examine the established canon of that religion before going on to speculative development of doctrine based on looser interpretation of that canon. If people have interpreted canon a specific way for a long time, I'd wonder why that would need to be changed if change were mooted.

Because of theological conservatism coupled with my other positions, some people think of me as morally and ethically conservative while others think I may secretly be a dangerous radical. I suspect I may be even more dangerous than that, but if so, I don't know it for sure. I just note, privately, that 'radical' comes from Latin radix, 'a root'. It doesn't quite mean what many people think it means: radicals are back-to-basics conservatives, who don't believe in preserving the existing complicated stuff-and-flummery structure of 'developed' society.

I think that all humans are essentially liberals at birth, but are made conservative later. At adolescence, some become badly-informed quasi-liberals. If they maintain the position into late middle-age, they often become pseudo-intellectual pseudo-liberals. It's only if they maintain their idealism into old age that they become true liberals again.

Not many humans let their offspring yell and scream without limit, throw their food around, remain unclothed, and set no limits on empirical education. This is what I mean by humans being 'made conservative'. Societies in general tend to have ideas about how children should be brought up, one way or another.

I've observed that universities in America are the most illiberal institutions with respect to what can or cannot be said, or their position on what is good, true and right for everyone else. British universities, by contrast, are hotbeds of dissenting opinion and subversive reasoning. They've honed the traditional British weapons of irony and satire well; nobody else is as good.

And here in Atlantis, the liberals are the ones who want more education for the masses, forgetting that education is the most conservatising force of all — unless of course you don't teach anything except how to reason without drawing conclusions and sticking to them and their consequences. How droll.

With all these opinions moving around in my head, and occasionally realigning differently, it's hard to say whether I am more liberal or conservative, or indeed which approach contributes more to my thinking. I am inclined to believe that my thinking contributes more to my perceived conservatism or liberalism. Politically, I'll vote for whoever makes more sense to me in a way that I like.

Labels: , , ,

Monday, October 10, 2011

The Paradoxes of Freedom

What most of my students don't get is this thing about Gibbs's free energy. Why is it that when ∆G is positive, things are non-spontaneous and when ∆G is negative, things are spontaneous? Using the equation ∆G = ∆H - T∆S tells us the mathematical story, but fails to tell us the human story. And so, some random thoughts cascaded through my mind...

In brief, Josiah Gibbs defined this free energy in terms of how much work you could get out of a certain well-defined amount of substance in a fixed state without a nett change of state. Call this a 'lump'. If the lump showed a positive ∆G, you'd need to ADD energy to it to get anything done. If the lump showed a negative ∆G, work would flow out from it spontaneously.
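
To put numbers on that sign convention, here is a minimal sketch in Python (the ∆H and ∆S values are purely illustrative, not measured data): it evaluates ∆G = ∆H − T∆S for a hypothetical change at a few temperatures and reports whether the 'lump' can do work spontaneously.

```python
def gibbs_free_energy(delta_h, delta_s, temperature):
    """Return delta-G in J/mol from delta-H (J/mol), delta-S (J/(mol.K)) and T (K)."""
    return delta_h - temperature * delta_s

# Illustrative values only: an endothermic change (+delta-H) with a positive delta-S.
delta_h = 40_000.0   # J/mol
delta_s = 120.0      # J/(mol.K)

for temperature in (250.0, 300.0, 400.0):
    dg = gibbs_free_energy(delta_h, delta_s, temperature)
    verdict = "spontaneous (work flows out)" if dg < 0 else "non-spontaneous (work must be added)"
    print(f"T = {temperature:5.1f} K: dG = {dg:+9.1f} J/mol -> {verdict}")
```

For this made-up pair of values the crossover sits at ∆H/∆S ≈ 333 K: below that temperature the lump needs energy added, above it work flows out spontaneously — the whole mathematical story in one line, even if the human story takes longer.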

This, with some other thermodynamics, leads us to the reason why ultimate freedom is a silly ideal for humanity. If everyone had perfect information, you wouldn't have a free market — rather, everything would equilibrate. If the equilibrium were dynamic (i.e. no dominant strategy existed) people would have to change their output all the time. If the equilibrium were static (i.e. a dominant strategy existed), then people would be bound by their knowledge to produce specific things, and it would be in the majority interest to maintain this equilibrium regardless of free will.

Also, information freedom requires abundant energy. You need energy to produce, transmit and receive information. The more energy, the better the signal. However, energy requires gradients. Somebody will have to suffer. Ah, but suffering might be negligible if the energy available were huge enough! Yep, at which point, one of two things might happen: a) you wouldn't need all that because you'd have enough energy to be godlike; or b) you would be envious of relatively small differences because of energy inflation, if you were still human enough. Or as Moorcock has hinted, having what feels like infinite energy would lead to what feels like infinite boredom.

Freedom functions best in a relative sense. One feels more free relative to others being less free. One feels free to do work for something, against something, with something, or in spite of something. If we were all free, we would all be enchained by that freedom. Or by each other.

Labels: , ,

Sunday, October 09, 2011

The Big Change

Some time ago, I noted an amusing change which I believe to be of great benefit to everyone concerned...
In recognition of the concerns raised by colleagues regarding [redacted] assessment, the prescribed titles will be split from November 2012 and a new set of 6 titles will be issued for each examination session. The appropriate set of titles for each session will be made available on [redacted] as follows:
  • 1 March for the year of the November session
  • 1 September for the following May session
Marking guidance for examiners has been severely restricted because of the use of one set of titles to service two examination sessions. This improvement will enable the senior examining team to compile detailed marking notes for each title thus providing greater support to the examining team.
What it does mean is a lot less work all round for those of us who work on such things. Wonderful!

Labels: , ,

Saturday, October 08, 2011

October the Eighth

This date in the calendar always reminds me that October is actually the eighth month. Ah well.

Gerrymandering the calendar has been covered elsewhere in this blog. The sad story of February is covered here, with a further extension here. Enjoy!

Labels: ,

Friday, October 07, 2011

Steve Jobs (1955-2011) — Coda to the 20th Century

Historian Eric Hobsbawm insisted that the period 1789-1914 be called the 'long 19th century', spanning the era between the French Revolution and the Great War. It's a Eurocentric perspective; I would have included the American Revolution and made it 1776-1914. Perhaps he felt a 40-year extension to be too much.

Hobsbawm then went on to compound his assertion by saying that the period 1914-1991 should be called the 'short 20th century', spanning the era between the start of the Great War and the dissolution of the Soviet Union. He thus cleverly made two centuries out of the period 1789-1991, evening things out a bit.

I think he was wrong, though. The 20th century perhaps indeed began with the machine gun and the mass slaughters of the Great War. It certainly could have begun with the death of Victoria, the Queen and Empress who had presided over the greatest empire of modern times and the greatest advances in human enterprise and technology in history at that time. Many of the antagonistic royals of the Great War were her grandchildren, by blood or by marriage (see this article).

But the breaking-up of the Union of Soviet Socialist Republics (how many still remember that dread acronym?) was a comparatively bloodless end-of-history that proved not to be any such end. Instead, the growing chaos just continued to grow. The one thing that held humanity together was increasing communication capacity, and the increasing ubiquity of widespread and speedy data transmission.

Steve Jobs was a prophet of this change. In a 1985 Playboy interview, he said, "The most compelling reason for most people to buy a computer for the home will be to link it into a nationwide communications network. We're just in the beginning stages of what will be a truly remarkable breakthrough for most people — as remarkable as the telephone."

He was right. And he spent his life ramming this vision artistically into the established wisdom of the world. Of all those who made and lost computer fortunes and established computer legacies, his was the most important: he was the Doctor Strangelove who taught us how to stop worrying and learn to love computers. He made them accessible, not just cheap plastic tools of high utility in the workplace, but companions and associates for the burgeoning chaos of the new millennium.

(I am quite certain that those who show their disdain for Apple's pretty machines as opposed to more robust and powerful systems are subconsciously working out their misplaced anger that so much power could be given to people who don't know anything about computer programming or have not 'earned' such power. These angry people might be Linux users, for example. Or those who assemble their own machines. The same people will probably vilify and/or disparage Jobs after his death.)

Steve Jobs's last 20 years, from 1991 to 2011, were the missing coda of Hobsbawm's 'short 20th century'. In those years, he spawned the new Apple way that forced competitors and colleagues to dance to his rhythm. For many, it was hard to decide if that rhythm was more like Chesterton's drumming "...in the hills half-heard, where only on a nameless throne a crownless prince has stirred" or the ominous thrill of Auden's drumming. But it was an unmistakable sound.

It was the sound of one aspect of the future. Not for us the flying cars and colonisation of space, but the future which meant tricorders and thinking machines. Jobs made us give names to our electronic aides, much as Asimov did for his positronic androids. Jobs did more than any other lord of the electronic realm to put power in our pockets and data in our hands.

That is why 2011 is a good year for the dying tail of the 20th century. Steven Paul Jobs is dead, and the way ahead transits with him into uncertainty.

Labels: ,

Thursday, October 06, 2011

Goodbye, Steve Jobs (1955-2011)

And also, goodbye to the 20th century that has finally passed on a decade after it was scheduled to go. But I'll get to that in a while.

Meanwhile, goodbye, Steven P Jobs. I was a personal friend of his computers; more accurately, a friend of his personal computers. I grew up in an age where the Apple icon was valued as trend-setting, even if sometimes mocked and sometimes treated like a soon-to-be-dead loony relative in the family of information technology. I never knew Mr Jobs as such; he was just the god behind the machine.

My school's computer room had some of those Apple machines. It also had NEC PCs and Windows-running clones. To me, the Apples (the ][, ][+ and ][ europlus) with their boxy monitors were more fascinating than the high-resolution NEC monsters.

It was the triumph of my youth when my first Apple finally came home, a ][e which I had cajoled my parents into buying. I cannot remember exactly what relatively lunatic arguments were marshalled before the duumvirate of my skeptical parents, but they carried the day with minor provisions and I (and my siblings) had the computer I (we) wanted.

And so began the cavalcade of Apple machines — Ancient Eric, Adam, Calvin, Percy, Judith, Cami, Mackie, Nemo... they all had names. As with cats, dogs or horses, you had to name your faithful servants and companions. It was a sign of affection, something you didn't show to most of your other tools.

I raise a first glass to the memory of Steven Paul Jobs who made us name our machines as friends. In doing this, he was a direct heir of Isaac Asimov — and Jobs had to do without the advantage of the android form.

As the first glass falls into the fireplace, its last fuel scattering into green flames, I raise a second glass. Mr Jobs made us believe in the personal computer, not just the PC. But he also made us believe in the computer as a consumer necessity (or at least, object-of-desire), not just a focus for geekish techno-shamanism.

My third glass is for practicality. My Apple machines last. Not one has failed me except by old age; all have lived at least five years. I type this on a PowerBook G4 which is about 8 years old, and still my main machine and desktop companion.

Mourning is not a Jobsian kind of phenomenon. This is an Irish wake for an Armenian boy who turned out to be genetically Syrian/German. And here, abruptly terminated, and with many things unsaid, I say goodbye to Steve Jobs and go on to the death of the 20th century.

The century just past was Steve's century as well as the century of much that defines our world. We split the atom, broke the speed of sound, reached the moon, made thinking machines and machine guns. It was the age of machines; the British built the first commercial mainframe computer (Ferranti Mark 1, 1951), the first commercial jet airliner (Comet, 1952), the first commercial nuclear power station (Calder Hall, 1956) and, with the French, the first commercial supersonic airliner (Concorde, 1976) — and the Americans and Russians raced ahead to get us into space and to the moon.

Along the way, we learnt the use of non-ferrous metals in construction and engineering — especially aluminium and titanium; we learnt to use plastics, those ubiquitous offshoots of the petrochemical industry; we mass-produced food and food packaging. Behind the rise of the information age were less obvious heroes. And one of those heroes was the driven and perceptive Steve Jobs. (How many glasses so far? Better start on the coffee soon. A man's gotta work.)

I sense that we are teetering now on the cusp of revolution, but with a lot less fire in the hole and passion in the heart. This is not the age of big projects, as financial pessimism damps the clamour of progress. The space shuttle has returned to Earth. Mr Jobs has died.

Life will go on. But the mood at present is a post-20th century one. We walk into a world quantised and bereft of meaning by that quantisation. How fortunate are we who are in the world but not of it!

Goodbye, Steven Paul Jobs. You left the world a much more colourful and interesting place. You didn't define your age, but you gave it a brighter and sharper definition. Rest in peace. May you find yourself in the Isle of Apples, where also lie Arthur and other symbols of human hope.

Labels: ,

Wednesday, October 05, 2011

The Five Percent Rule

It has long been heuristically suspected that in any population, only about 1 in 20 will actually achieve a performance that is at a level of 1 in 100 or better (i.e. better than 99% of the population's performances). This kind of supposition is not always true of all populations, but it is a good heuristic.

Look around you. Who is the best person in your class, group, set or other population for a given task? Is that person capable of beating 99% of all performances by your population? That person must be at the very worst a 1-in-20 phenomenon — in a group of 40, only 2 such are expected.

In a staff of, say, 300 people, this would mean that not more than 15 are capable of A-class performance. By this I mean using a measure of something like: A = top 1%, B = next 4-5%, C = next 10-15% and D = next 25% or so. Anyone below the top 45-50% should be considered a failure.
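
Here is a minimal sketch of that headcount arithmetic in Python (the band fractions are my own illustrative reading of the percentages above, not any official scale):

```python
def headcounts(population, fractions):
    """Convert fractional grade bands into whole-person headcounts."""
    return {grade: round(population * frac) for grade, frac in fractions.items()}

staff = 300

# The 1-in-20 heuristic: the pool capable of 1-in-100 (A-class) performance.
capable_of_a_class = round(staff * 0.05)   # -> 15

# Illustrative grade bands: A = top 1%, B = next 4%, C = next 15%, D = next 25%;
# everyone below roughly the top 45% is counted a failure.
bands = headcounts(staff, {"A": 0.01, "B": 0.04, "C": 0.15, "D": 0.25})
bands["fail"] = staff - sum(bands.values())

print(capable_of_a_class)   # 15
print(bands)                # {'A': 3, 'B': 12, 'C': 45, 'D': 75, 'fail': 165}
```

Note the distinction: only about 3 of the 300 occupy the A band at any one moment, but the heuristic says roughly 15 are capable of getting there.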

Why? Because in such populations, the people have already been selected for the job. If they can't do the job well, the question that should be asked is, "Why are they so bad?"

And this is one of the reasons that I believe all human resource executives and suchlike should be fired. They appear to serve no useful purpose at all and I am willing to think that basic filters plus luck-of-the-draw would do better.

Labels: , ,

Tuesday, October 04, 2011

Octobering

October is fire shot through with ash
Is faint and sudden rain, bright sun
And the sudden awareness of old age

Labels:

Monday, October 03, 2011

Anniversary of the Fall

A year has passed at the Citadel. A stub, this remains.

Labels:

Sunday, October 02, 2011

Knowledge Industrialisation

The future is rooted in the past; the present is that infinitesimally thin membrane between them. Thus it is that knowledge, that curious construction of what we think we know, is one of the threads that holds us secure within the tapestry of time.

There are jobs that have risen and fallen in prestige, careers that have teetered and tossed from pillar to post, from architecture to horticulture. There are professions that have split and joined, from barbering to surgery and pharmacy to witchcraft. Some disciplines have updated themselves, some need no updating; the ceramicist and the fashionista are in as high demand these days as they were in the time of Nebuchadnezzar.

But just as there is nothing new under the sun, so too there is nothing new about career guidance. No matter what ebbs and flows human endeavour brings, the man who knows whom to hire for what job, and the man who knows what must be known (and who knows it) — these are the ones you want to find or, failing that, want to be. Only the polymaths will make order from chaos across the burning universe, even though the specialists, those rare statistical points, will give them the bricks to do it with.

And that is why there is a need to teach people about knowledge definition, construction, management and utilisation. That is why people must be taught that data is not information unless it is given significance, and that information is not knowledge unless it has a relevant context. Those who teach these things and those who learn and use these things will never go out of style, will always have a job and a living, will always be able to get a life and keep it.

Labels: , ,

Saturday, October 01, 2011

The Usual Question List (Not Quite)

Every year, by this time, an interesting list of 'prescribed titles' is circulated, provoking much debate among the hard-pressed students of the worldwide International Baccalaureate. In keeping with the mutterings of colleagues equally worldwide, I shall withhold my thoughts for a few weeks to give said colleagues a chance to discuss those titles first before I corrupt young minds with my own. Haha!

More seriously, I quote:
"Some sources of support may be acting in good faith, but nevertheless are offering a level of support and guidance that is not permitted by [redacted]. As an example, any that offers assistance on current [redacted] prescribed titles must be avoided, or approached only with great caution. A key aspect of the [redacted] essay is that it is devised by the student: assessment criterion B [redacted] looks for evidence of independent thinking, shown in the way the essay has been shaped and in the use of varied examples. Therefore, any advice of a prescriptive nature is likely to disadvantage the student who uses it."

So, please don't structure your essay around my musings. Rather, treat them as amusing. Or multiple musings. And listen to your supervisor when that worthy individual tells you not to rip off this blogger's thoughts and paste them in your essay. :)

=====

The list of questions for this year is here:
  1. Can we have beliefs or knowledge which are independent of our culture?
  2. It is a capital mistake to theorise before one has data. Insensibly one begins to twist facts to suit theories, instead of theories to suit facts. — Arthur Conan Doyle. Consider the extent to which this statement may be true in two or more areas of knowledge.
  3. Imagination is more important than knowledge. For knowledge is limited to all we now know and understand, while imagination embraces the entire world, and all there ever will be to know and understand. — Albert Einstein. Do you agree?
  4. What counts as knowledge in the arts? Discuss by comparing to one other area of knowledge.
  5. Habit is stronger than reason. To what extent is this true in two areas of knowledge?
  6. The ultimate protection against research error and bias is supposed to come from the way scientists constantly re-test each other’s results. To what extent would you agree with this claim in the natural sciences and the human sciences?
Note that the list is shorter this year — this is the list for November 2012 only. A later note elucidates.

Labels: ,