In this blog post, computer scientist Boaz Barak makes a distinction between the speakable and the unspeakable. He writes:
In fact, Computer Science has about as much to do with computers as English has to do with printing technology. The point of computer science is not about mechanizing some calculation but about finding a novel way to express a concept we couldn’t grasp before. Finding such a way enriches us whether or not it ends up being programmed into a computer, since it lets us access what was heretofore unspeakable.
Now, what is here referred to as the unspeakable is so because its perfect calculation would take so much time and so many resources that it is rendered practically unspeakable. However, if one gives up on perfection and attempts an inexact estimate, one may arrive at a conceptual insight into something that was considered unspeakable.
Boaz Barak further claims that the source of his inspiration is Voltaire's adage that "the perfect is the enemy of the good". If we accept this 'cult of the imperfect', we will rediscover the utility of a sensible notion of the 'good'. We are reminded of Aristotle's adage about the good being related to the notion of the mean, and also of his insistence that the good is not some abstract notion removed from the actuality of practice, but that its worth is reaffirmed through its resonance with the most commonplace beliefs. The demand for a perfection so ideal as to be unreal can be debilitating.
Computer science extracts a quantitative insight from a moral precept. While computation allows for greater precision in the solution of many problems, the search for precision does not succumb to the demand for perfection but works instead through notions of approximation. These notions of approximation reveal the rich structures along the road to perfection. Perfection is mere bait for developing new concepts.
The foundations of computer science as laid out by Turing were indeed meant to define what it means for something to be perfectly computable. Deploying the ideal of perfection endowed notions of computability with a fundamental clarity. However, there emerged baffling classes of problems that resisted computation. Motivated by this negative thesis, the holy grail is to define problems that are perfectly unsolvable and yet to keep approaching these black holes through imperfect approximation. In other words, only because we value the approximate and the imperfect do we remain wise enough to continuously revise what we deem the perfect.
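This interplay of unattainable perfection and useful approximation can be made concrete with a textbook example (not one from Barak's post): finding a minimum vertex cover of a graph is NP-hard, so a perfect, efficient solution is unknown, yet a simple greedy procedure is provably within a factor of two of the perfect answer. A minimal sketch, with a hypothetical edge list chosen only for illustration:

```python
def approx_vertex_cover(edges):
    """Return a vertex cover at most twice the size of a minimum one."""
    cover = set()
    for u, v in edges:
        # Skip edges already covered by an earlier choice.
        if u in cover or v in cover:
            continue
        # Take both endpoints: at least one of them must appear in any
        # optimal cover, which yields the factor-of-two guarantee.
        cover.add(u)
        cover.add(v)
    return cover

# Example: a path graph 0-1-2-3, whose minimum cover {1, 2} has size 2.
edges = [(0, 1), (1, 2), (2, 3)]
print(approx_vertex_cover(edges))
```

The imperfect answer here may be twice as large as the perfect one, but it is computed in a moment, whereas the perfect answer is, in general, out of practical reach.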
The perfect must be the necessary crony of the good, not its enemy!
Henri-Louis Bergson (1859-1941), the French philosopher, would have had much to say about the territorial disputes between science and philosophy (and, within science, between the physical and the natural sciences), disputes that became unbridgeable with the postmodern condition. A conversation with Bergson in the aftermath of the Sokal affair of 1996 would have, at first, entailed hearty laughter at the calibrated attempt by the two physicists to ridicule the twentieth century's celebrity philosophers and their presumably obscure jargon. Perhaps it would then have moved in a direction where Bergson would highlight the asymmetry that science and philosophy have come to imbibe with regard to the nature of reality. Bergson also grappled with the different orientations that the physical and natural sciences had acquired in his time, which made it difficult to look at the organic totality (cosmological and cosmogonic) that Bergson later recognised as the 'absolute' in his own works. The meta-stable universe handed to us by Sir Isaac Newton, who defined space and time as the inert theatre in which the planets, the sun, and the earth revolved in a perfect manner suggestive of a celestial symphony, was taken as the static reference system for any kind of measurement. Space was absolute and infinite, and so was time, in this all-embracing universe. This all-embracing concept of infinite space and time rendered any question about creativity, about a life force, unintelligible, because this stable universe was perceived as incapable of change.
The sublunary world, on the other hand, was buzzing with change. Somewhere in the middle of the nineteenth century, the time came to break away from the Newtonian mentality and to speculate about evolution, about change and the process of becoming. A new understanding of time emerged, divorced from the physicist's point of view. Bergson arose to declare 'duration', as opposed to the homogeneous time that boringly catered to an unborn future; he declared an openness to the future. Against the stated arguments of fixed and frozen moments that can be easily plotted on a Euclidean scale, Bergson proposed the possibility of thinking beyond the human condition. He countered Immanuel Kant's view that science (specifically Newtonian mechanism) had defined the limits of metaphysics, making all knowledge relative to the faculties. He contended that the physical sciences were based on conventional ways of measurement rooted in the human intellect, and so led to 'artificial' rules for defining the 'real' movement of reality. Bergson suggested a move away from ordinary language to understand the abstract nature of thinking. Against the Kantian possibilities, where the mind is determined by external stimuli, things are determined by the mind itself, and somewhere in between the two there lies an agreement, Bergson said that over a period of time the agreement remains as unexplainable as ever, while the other two take up a common form, so much so that it becomes difficult to distinguish mind from matter and vice versa. And so Bergson made it his personal task to bridge the philosophy of knowledge with the philosophy of life. It was his belief that the philosophy of life gradually becomes visible when the frames of knowledge become enlarged and attain harmony with each other. The homogeneous time and space which Kant viewed as transcendental forms presuppose 'duration', which extends and provides coexisting multiplicity to every successive moment.
The debate then shifts from the binary of noumena and phenomena to the partial knowledge of the real and the organic totality.
The disciplinary quarrels are, more or less, on the same lines. Each discipline supposes a relatively absolute knowledge of the real world without taking the others into consideration. If only these disciplines could abandon their fixed states and participate in the process of becoming, Bergson would say, then the doctrine of real movement would emerge.
The evolution of disciplines under the influence of digitization is a matter that demands urgent scrutiny. Both the sciences and the humanities are confronted with 'data' poised for manipulation. While digitization can enrich the tools and methods available for investigation, it also rekindles dreams of a singular methodology universal enough to consolidate an unwieldy multiplicity of disciplines. But this promise of digitization must echo with the noise of the 'science wars' of the nineties. That flashpoint was a sure betrayal of the territorial instincts that define academic departments, thereby obscuring the real history of disciplines as they individuate out of the ferment of interacting discourses. As the Sokal affair saw the publication of a hoax article in a leading cultural studies journal, the parodic appropriation of scientific content was an occasion to ask whether editorial ideology could ever qualify as a 'method'. All in all, digitization provides a moment to reconsider both the future and the history of disciplines today.
Indeed, it was the Sokal affair that also precipitated a growing interest in the philosophy of science, evidenced by the absorption of Thomas Kuhn and Paul Feyerabend into discussions of postmodernism. Postmodernism itself is a condition mediated by the emergence of digital media. The pressure of this seemingly ubiquitous technological context leads one to question the very idea of disciplinarity in a digital age. As digitization creates the ground for computation, does it necessarily propagate a view of science that is essentially about control and prediction? Or does the formalism heralded by digitization ask us to reexamine the role of formalization in science as such?
While science aggressively sets standards of intelligibility for the humanities, earlier attempts at formalizing the humanities had to negotiate institutional obstacles of another kind. Consider the curious contradictions forced into the thought of Vladimir Propp, one of those Formalists suspected of not being Marxist enough in Soviet Russia. Beyond disciplinary boundaries, there have also been attempts to prop up grand programs that seek to unify all knowledge or to systematize existing groups of disciplines. Auguste Comte placed sociology at the top of the hierarchy of sciences because, unlike the physical sciences, it can ill afford to isolate its subject matter from its context. Cybernetics and complexity studies, in turn, drew inspiration from this dependence of sociology on context.
Hence we have ever more reason to persist in not reading the scientism of the humanities as a mere symptom of disciplinary insecurity. Disciplines aspire to become scientific not merely to pass Karl Popper's problem of demarcation (they often do not). Rather, disciplines participate in and are constituted by a discursive totality that transcends their respective boundaries, even if it is never all-encompassing. Superficially, mathematization implies that economics, unlike literature, is characterized by strong disciplinarity. But if we mark the scientific attitude by a tendency towards formalization, the question is to ask what varieties of formalism prevail and what purposes they serve. As Michel Foucault cautions us, even mathematics, the epitome of formalism, is often loosely characterized as a progress from one naive formalism to another, less naive one. Likewise, one may ask at which stage of formalization a sub-discipline like theoretical linguistics is poised.
Within the field of economics, formalization is often suspected of being an excuse for "saying less" when one could say more. On the other hand, the digital humanities have been defended as an opportunity to expand upon the possibilities allowed by traditional techniques of literary criticism. Such was also the intent of the sociologist Gabriel Tarde when he criticized the economists not for being reductively calculative, but for not calculating enough. And now that the digital age allows for virtually limitless calculation, big data may allow for a quantification in sociology that could stretch the possibilities of economics itself. Digitization opens a Pandora's box not only for the quantification it has accelerated but for formalization in general as it spreads across disciplines. The conference seeks to nurture this spirit of inquiry into the question of disciplinarity in the humanities and social sciences and its entanglement with the history and practice of the sciences.
Questions that the conference seeks to address include but are not limited to:
- Role of explanation in the social sciences in the context of ‘big data’
- Import of concepts like ‘entropy’ from natural sciences
- With the systematization of data, questions about the telos of inquiries in humanities and social sciences
- The role of data in the emergence of experimental philosophy
- How digitization affords the possibility of simulation and the philosophical issues therein
- The relevance of econometrics to disciplines beyond economics
- The relationship between digitization and formalization