In this blog post, computer scientist Boaz Barak makes a distinction between the speakable and the unspeakable. He writes:
In fact, Computer Science has about as much to do with computers as English has to do with printing technology. The point of computer science is not about mechanizing some calculation but about finding a novel way to express a concept we couldn’t grasp before. Finding such a way enriches us whether or not it ends up being programmed into a computer, since it lets us access what was heretofore unspeakable.
Now, what is here referred to as the unspeakable is so because its perfect calculation would take so much time and so many resources that it is rendered practically unspeakable. However, if one gives up on perfection and attempts to gain an estimate that is inexact, it may lead to a conceptual insight into something that was considered unspeakable.
Boaz Barak further claims that the source of his inspiration is Voltaire’s adage that “the perfect is the enemy of the good”. If we accept this ‘cult of the imperfect’, we will rediscover the utility of a sensible notion of the ‘good’. We are reminded of Aristotle’s adage about the good being related to the notion of the mean, and also of his insistence that the good is not some abstract notion removed from the actuality of practice, but that its worth is reaffirmed through its resonance with the most commonplace beliefs. The demand for a perfection so ideal as to be unreal can be debilitating.
Computer science extracts a quantitative insight from a moral precept. While computation allows for greater precision in the solution of many problems, the search for precision does not succumb to the demand for perfection but works instead through notions of approximation. These notions of approximation reveal the rich structures of the road to perfection. Perfection is mere bait for developing new concepts.
The foundations of computer science as laid out by Turing were indeed meant to define what it means for something to be perfectly computable. Deploying the ideal of perfection endowed the notion of computability with a fundamental clarity. However, there emerged bamboozling classes of problems that resisted computation. Motivated by this negative thesis, the holy grail is to identify problems that are perfectly unsolvable and yet keep approaching these black holes through imperfect approximation. In other words, it is only because we value the approximate and the imperfect that we remain wise enough to continuously revise what we deem perfect.
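This trade of perfection for tractability can be made concrete with a standard example not drawn from the original text: finding a *minimum* vertex cover of a graph is NP-hard, yet a simple greedy procedure is guaranteed to return a cover at most twice the optimal size. The sketch below is illustrative only; the graph and names are hypothetical.

```python
def approx_vertex_cover(edges):
    """Greedy 2-approximation for vertex cover.

    Repeatedly pick an uncovered edge and add both of its
    endpoints; the result covers every edge and is provably
    at most twice the size of a minimum cover.
    """
    cover = set()
    for u, v in edges:
        if u not in cover and v not in cover:
            cover.update((u, v))  # take both endpoints of an uncovered edge
    return cover

# A small example graph (hypothetical); a minimum cover here is {0, 3}.
edges = [(0, 1), (0, 2), (1, 3), (2, 3), (3, 4)]
cover = approx_vertex_cover(edges)
assert all(u in cover or v in cover for u, v in edges)  # every edge covered
```

The point mirrors the paragraph above: the exact answer is out of reach in general, but giving up on perfection yields an imperfect answer with a precisely quantified distance from the perfect one.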
The perfect must be the necessary crony of the good, not its enemy!
The evolution of disciplines under the influence of digitization is a matter that demands urgent scrutiny. Both the sciences and the humanities are confronted with ‘data’ poised for manipulation. While digitization can enrich the tools and methods available for investigation, it also rekindles dreams of a singular methodology universal enough to consolidate an unwieldy multiplicity of disciplines. But this promise of digitization must echo with the noise of the ‘science wars’ of the nineties. That flashpoint laid bare the territorial instincts that define academic departments, thereby obscuring the real history of disciplines as they individuate out of the ferment of interacting discourses. As the Sokal affair saw the publication of a hoax article in a leading cultural studies journal, the parodic appropriation of scientific content was an occasion to ask whether editorial ideology could ever qualify as a ‘method’. All in all, digitization provides a moment to reconsider both the future and the history of disciplines today.
Indeed, it was the Sokal affair that also precipitated a growing interest in the philosophy of science, reflected in the absorption of Thomas Kuhn and Paul Feyerabend into discussions of postmodernism. Postmodernism itself is a condition mediated by the emergence of digital media. The pressure of this seemingly ubiquitous technological context leads one to question the very idea of disciplinarity in a digital age. As digitization creates the ground for computation, does it necessarily propagate a view of science that is essentially about control and prediction? Or does the formalism heralded by digitization ask us to reexamine the role of formalization in science as such?
While science aggressively sets standards of intelligibility for the humanities, earlier attempts at formalizing the humanities had to negotiate institutional obstacles of another kind. Consider the curious contradictions forced into the thought of Vladimir Propp, one of those Formalists suspected of not being Marxist enough in Soviet Russia. External to disciplinary boundaries, there are also attempts to prop up grand programs that seek to unify all knowledge or to systematize existing groups of disciplines. Auguste Comte placed sociology at the top of the hierarchy of sciences because, unlike the physical sciences, it can ill afford to isolate its subject matter from its context. In turn, cybernetics and complexity studies drew inspiration from this dependence of sociology on context.
Hence we have ever more reason to persist in not reading the scientism of the humanities as a mere symptom of disciplinary insecurity. Disciplines aspire to become scientific not merely to pass Karl Popper’s problem of demarcation (they often do not). Rather, disciplines participate in and are constituted by a discursive totality that transcends their respective boundaries, even if it is never all-encompassing. Superficially, it is clear that mathematization gives economics a strong disciplinarity that literature lacks. But if we mark the scientific attitude by a tendency towards formalization, the question to ask is what varieties of formalism prevail and what purposes they serve. As Michel Foucault cautions us, even mathematics, the epitome of formalism, is often loosely characterized as a progression from one naive formalism to another less naive. Likewise, one may ask at which stage of formalism a sub-discipline like theoretical linguistics is poised.
Within the field of economics, formalization is often suspected of being an excuse for “saying less” when one could say more. On the other hand, the digital humanities have been defended as an opportunity to expand upon the possibilities allowed by traditional techniques of literary criticism. Such was also the intent of the sociologist Gabriel Tarde when he criticized the economists not for being reductively calculative, but for not calculating enough. And now that the digital age allows for virtually limitless calculation, big data may allow for a quantification in sociology that could stretch the possibilities of economics itself. Digitization opens a Pandora’s box not only for the quantification it accelerates but also for formalization in general as it spreads across disciplines. The conference seeks to nurture this spirit of inquiry into the question of disciplinarity in the humanities and social sciences and its entanglement with the history and practice of the sciences.
Questions that the conference seeks to address include but are not limited to:
- Role of explanation in the social sciences in the context of ‘big data’
- Import of concepts like ‘entropy’ from natural sciences
- With the systematization of data, questions about the telos of inquiries in humanities and social sciences
- The role of data in the emergence of experimental philosophy
- How digitization affords the possibility of simulation and the philosophical issues therein
- The relevance of econometrics to disciplines beyond economics
- The relationship between digitization and formalization