Sunday, May 30, 2010

Poetry About Our Art

About 14 years ago, at the peak of the object-oriented programming craze, I came across a publication called On the Origin of Objects, randomly placed amidst the stream of software/OO engineering books at a very reliable computer science bookstore. Surely, I thought, this would get to the bottom of things: somebody had dissected the notion of objects and outlined all the fundamentals I would need to be a successful programmer, without the need to skim through endless pages describing ridiculous design processes, trivial principles, inapplicable rules of thumb and what not. It turned out the book had nothing to do with object-oriented programming.

On the Origin of Objects (amazon link) is about metaphysics, with computation as the starting point. The author is one of the deepest and most original thinkers I've come across - Brian Cantwell Smith (wikipedia link). Needless to say, I couldn't read it at the time; I wasn't ready for it. That came about 7 years later, and it was a memorable, mentally reinvigorating experience. Smith writes beautifully and dances around his insights with such grace and depth. He writes about the kind of computation we do day to day, the real stuff, and he puts the common conceptual problems we programmers face onto the center stage of philosophy, in a way that gives our work those extra dimensions that scientists seem to have always enjoyed: a fundamental, very real connection to the physical world, including, at an even deeper level, a connection with us intentional beings - a set of problems that arise naturally from the practice of our profession, yet quickly reach the most difficult metaphysics in a way that no other practice does.

There are a few (not many) articles by Prof. Smith that you can find on the internet, all of them worth reading. However, the purpose of this blurb is to bring to the attention of whoever comes across it his latest work. For the past years, I've been eagerly monitoring and awaiting the publication of Age of Significance, which is supposed to be in 7 volumes. The book website hadn't changed until just two months ago, when it was announced that individual chapters will be published monthly. So far, only the introduction has been posted, and I believe that an attentive read would make my seeming infatuation with this work understandable. Originally, I intended to write a summary of that introduction, highlighting the main points, most of which I'm already familiar with from previous writings (Smith's and others'), but I wouldn't want to butcher it. It is philosophy at its best. And it is about the foundation of computing, which we (should, to say the least) care about. I will just quote the conclusion for the hardcore philosophy skeptics:

Throughout the whole project, I have received countless communications from programmers and computer scientists - over coffee, in the midst of debugging sessions, in conferences, at bars and via email - hoping that it might be possible to say what they feel they know, the absence of which, in some cases, has led to almost existential frustration.

That is pretty much how I've felt more often than not as a programmer. And that is why, to me, Smith's work is pure poetry, as philosophy used to be seen at the time of Plato anyway.



  1. I hope I have time to read this one day, although the slow drip of AoS should make it easier. Here's hoping. But there is much else that I'm ashamed to say I have not read, including Wolfram's A New Kind of Science, anything by Judea Pearl, and anything much by Peirce, surely the foundation of these thoughts:
    "What is complex from one point of view (a stupefyingly intricate arrangement of organic molecules, say) may be simple from another (a single rose), and complex, but in an entirely different way, from a third (a suicide gesture). Similarly for computational examples: how can it take 28 million lines of code to implement 17 functions in a simple black-and-white copier? How can one study complexity without studying the ways in which something is or is not complex? And what kind of science would be a study of ways?"
    It should prove an interesting journey though.
    I like the way he is going to try to engage his audience.
    My brother-in-law has done this on the web site Pathways to Philosophy for the last twenty-plus years. Not focused on the philosophical underpinnings of computing, but philosophy more generally. Some people insist on being brilliant originals!

  2. Boris,
    Thank you for the interesting post. I went to the link you gave and read the introduction to "Age of Significance". It actually makes a very simple and striking point: computing is not limited to what computers do; it is what the world does at many levels, including chemistry and biology. It follows that the theory of computing as such must necessarily be the theory of everything, which is arguably the theory of nothing. The book's author then concludes that information science must converge with other sciences into a new paradigm, which he calls the Age of Significance. But the author does not provide any hints as to how this paradigm will emerge.
    The point itself is not new (IMHO). If you consider Plato's universal forms as the true origins of information science, and philosophy in general as an all-encompassing scientific paradigm, then the Age of Significance has been coming ever since - or may not be coming for another few millennia, or never. This leads me to question the significance of the Age of Significance. Perhaps it is yet another reminder to technocrats about the significance of philosophy. I doubt, though, that it will be heard, especially at a time when computer technology is triumphant and philosophy is as divided as never before. What our Age is calling for is not another mental paradigm, but new technology to augment the mind. But that is a completely different subject (see "The End of Theory" by Anderson).

  3. Hi Len,

    Well, Smith says that much himself. To quote: "We will never have a theory of computing because there is nothing there to have a theory of"... because computers are not special enough. But this is not an obvious conclusion, and it doesn't come cheaply. It would be a shocking statement to many. And one must first understand what really makes them so seemingly special before being able to argue the opposite. And what makes them seemingly special is that they manage to bring issues of meaning, intentionality, significance, semantics, whatever you want to call it, to the forefront of engineering - hence the "dialectical interplay between meaning and mechanism". The main point is that computing is not only about mechanism, but really as much about meaning, about "aboutness". And this is not to say that computers merely have the potential to be meaningful (when true AI is built one day) - they already are intentional, because they (i.e. their internal, purely "formal" representations) have consequence: if your bank's software doesn't think you have a healthy balance, then you don't.

    Anyway, I hope more people will read it. Or that the ideas will spread from the ones who do. I think notions such as semantics, intentionality, and significance are among the most elusive, yet most common in arguments and discussions, and such a fresh and deep look at them should be welcome, IMHO.


  4. Boris,
    After a second and careful reading of the introduction (it took me some two hours, but it was worth it, at least as an exercise - yes, the brain needs it too) I see in exquisite detail the same basic argument made by the book's author: it boils down to what most people would interpret as dismissing much of computer science as pseudoscience. The analysis is indeed very deep - so much so that many people will not give it the well-deserved two hours of hard thinking (and this is just an introduction). But in the end, most of it is a negative polemic against the mechanistic views prevalent in engineering disciplines. These attitudes are unlikely to be affected by such critique, as it is not new. Yes, computers are not just machines; they are extending our minds and increasingly replacing our mental faculties. Correspondingly, computing is not merely calculation, but some form of cognition, much like what robots have is some form of perception. These are not very fresh insights. I was looking very hard for a grain of positive results, and this is what I found: "[analysis] gestures towards an understanding of computation as a system of normatively-governed causal transitions". This same thought (differently formulated) has occurred to me many times over the last 6 years at least. Does it qualify as a "paradigm shift"? Unfortunately, even if it occurred much earlier to much smarter people than me (which the Age's author and you undoubtedly are), it does not, until the actual shift takes place. And I am afraid it is not going to happen by virtue of such excellent polemic, as in the present historical context polemic is simply not as effective a tool as it was... say, between the two world wars (and even then it took much more to make scientific and engineering paradigms shift). If it is to happen this time around, someone should beat Google and Microsoft at their own game. Want to try?
    PS. I have tried to contact you using your email and sent you a LinkedIn invitation, but sadly got no response.


  5. Hi Len,

    I'm on vacation currently, so not much time to respond and not much internet access. Will get back to you in a week.