On giving propagandists a platform

I have always had mixed feelings about debates on invitations to controversial speakers. Every case is different, I guess, and should be discussed individually. At the same time, I think that inviting someone to speak at a university or public institution should be justified in light of the fact that such a forum provides the speaker with an authoritative platform. Some even believe that such an invitation produces epistemological evidence in favour of the invitee’s position.* In any case, my feelings were mixed but, I thought, fairly balanced. You can always see pros and cons, and try to listen carefully to the other side, or so I thought. In this post, I want to do two things: I want to protest against the invitation of Paul Cliteur to Groningen; and I want to talk about something that I completely underestimated: the ambiguous weight of stating the obvious.

When I noticed that Paul Cliteur had been invited to Groningen’s annual night of philosophy to give a lecture on “Theoterrorism and the Cowardice of the West”, I was not only shocked by the fact itself but also surprised by the vehemence of my own reaction. I feel that, unless I note my disagreement, I am complicit in endowing the speaker with extra authority, simply by being part of Groningen University. Arguably, we should note disagreement not only on behalf of those targeted by propaganda, but also in solidarity with those who feel too intimidated to do so publicly. (Not long ago, a number of colleagues from Amsterdam received death threats after politely protesting against a lecture by Jordan Peterson.) Often protest or disagreement is construed as an attack on free speech. (“Nowadays we can’t say that anymore”, you hear them say all the time, while they say whatever they want.) But the opposite is the case: the very idea of free speech must comprise the right to disagree with or protest against speech. Cliteur is an active politician and a professor of jurisprudence, who has written quite a number of texts with all the ingredients of what I’d call right-wing attitudes: claiming a conspiracy of “Cultural Marxism”; nationalism; anti-Islamism, you name it. I don’t want to categorise him too readily, but he strikes me as a Dutch version of Jordan Peterson in Canada or of Thilo Sarrazin in Germany. – But what was I actually reacting to? There are a great number of claims that I find objectionable. But often the problem of propagandistic tales is not that they contain explicitly objectionable things; rather, it’s how they recontextualise “obvious” observations.

A problem with people like Cliteur is that they make outrageous claims, while sounding perfectly reasonable. Here is an example: Cliteur clearly and sensibly distinguishes between Islam (the religion) and Islamism (a political ideology based on religious doctrines). So he does not say that religion entails terrorism or that religious people are potential terrorists. But then Cliteur introduces the term “theoterrorism” to label terrorists who motivate their acts by reference to their religion. Indeed, one of his main claims is that he is almost alone in taking terrorists’ reliance on their religion seriously. He portrays others as reverting to misguided explanations and himself as seeing what their true motivation is:

“Many people are reluctant to engage in this kind of research. They are concerned with something quite different: protecting religious minorities from discrimination and the “stereotyping of their religion.” Or they have the ambition to explain why the essence of Judaism, Christianity or Islam is averse to violence. I fully recognize the importance of that type of commentary from a believers perspective. But it is not the kind of approach that makes it possible to understand the theoterrorist challenge. I fear these well-meaning people are dangerously mistaken. The greatest contribution you can make to the peaceful coexistence of people of good will is to make a fair assessment of the role religion plays in contemporary terrorism, and not to suppress or censor people who dare to address this issue.”

What’s going on here? While he pretends to be looking for an alternative explanation of terrorist acts, he does in fact claim a link between religion and terrorist acts. Religious beliefs, then, are taken as the proper reasons (if not the causes) for people to commit terrorist acts. This way, the difference between Islam and Islamism, while maintained verbally, is in fact nullified. Thus, Cliteur can evade the charge of hate speech against religious people, but he might be said to celebrate his way of linking terrorism and Islamic beliefs as a scientific discovery.

Linking religion to terrorism in this general way is bad for all sorts of reasons. Believe it or not, many people are religious without ever entertaining so much as a trace of a terrorist inclination. But two further aspects are striking about Cliteur’s claim: Firstly, no one ever denied that the terrorists he cites referred to religious attitudes. There is nothing spectacular about this. Secondly, Cliteur makes no move to invoke any solid evidence for this claim. But if his point were supposed to have the status of a proper explanation, then he would need to rule out alternatives. Compare: I could tell you that I go shoplifting on a regular basis because Father Christmas told me to. Now people might speculate about my motives. But you could just tell everyone: “People, Martin’s reasons have been staring us in the face all along. Father Christmas told him so!” While no one would deny that I said so, the reference to Father Christmas might not in fact be the best explanation of my actions. Cliteur’s point amounts to no more. He links (Islamic) religion to terrorism; he presents this claim as new while at the same time giving himself the air of stating the obvious; and he provides no evidence or ways of ruling out alternative explanations for the phenomena he picks out. It is obvious that certain terrorists invoked religious beliefs; it is far from obvious that the invoked beliefs or the religion in question explain their acts.

Although this is bad enough, it does get worse. In his little essay on theoterrorism, Cliteur asks what “the West” should do. He sees Dutch values and free speech and just about everything threatened. At the same time, he claims that all the available strategies in the West have failed. Again, without providing evidence. It is obvious that terrorism hasn’t gone away; it is far from obvious that the available strategies were not effective (e.g., against cases we don’t know about). Now what do you actually do if you claim that people are threatened by terrorism but that none of the attempted solutions work? The party Cliteur supports has a well-known list of answers, consisting of the now common right-wing ideas rampant in Europe and the US. In conjunction with the politics Cliteur supports, the brand of nationalism that recommends itself as the answer is not too difficult to guess.

While he is careful enough not to call a spade a spade, his pamphlet on theoterrorism might be read as a legitimisation of both legal and illegal means to overcome what he calls the “cowardice of the West”. The claim that Western measures fail seems to call for new measures.

“But does the west’s defense do the trick? … So as long as the western countries persist in their assault on Islamic sacred symbols, Muslims are not only mandated but religiously and morally obligated to take revenge in the name of Allah, so the theoterrorists contend.”

By building up his case as a threat to the Abendland (the ‘Occident’), by suggesting that “Muslims are … obligated to take revenge”, Cliteur eventually alludes to ‘obvious’ measures without stating them explicitly. It is this unspoken call to arms that is the most dangerous part of such political pamphlets. Inciting strong reactions without explicitly stating them immunises such propaganda against any critique that relies on explicit statements. “Oh, I didn’t say that”, is a common phrase of such people. They are all quite misunderstood.

Giving a platform to such incitements strengthens them. Yet, de-platforming might turn their protagonists into martyrs. Thus, rescinding an invitation might be just as problematic as making it to begin with. That said, what should worry us perhaps even more are the voices of those who were not invited in the first place. There are many more interesting and pertinent speakers for a night of philosophy.

_____

* Clarification in response to some misrepresentations on social media and in the news: I’m not saying that “providing a university platform for controversial figures is tantamount to endorsing (or supporting) their positions”. Rather, I claim that it lends some authority to their position. A student newspaper misrepresented my position earlier. Unfortunately, that text was then shared widely. (Added on 27 March 2019)

Since the misrepresentations are continuously repeated, I devoted another blog post to them.


Against history of philosophy: shunning vs ignoring history in the analytic traditions

Does history matter to philosophy? Some time ago I claimed that, since certain facts about concepts are historical, all philosophy involves history to some degree (see here and here). But this kind of view has been, and still is, attacked by many. The relation to history is a kind of philosophical Gretchenfrage, a question that reveals where one’s true allegiances lie. If you think that philosophy is a historical endeavour, you’ll be counted among the so-called continental philosophers. If you think that philosophy can be done independently of (its) history, you’ll be counted among the analytic philosophers. Today, I’ll focus on the latter, that is, on analytic philosophy. What is rarely noted is that the reasons given against history are rather different and to some extent even contradictory. Roughly put, some think that history is irrelevant, while others think that it is so influential that it should be shunned. In keeping with this distinction, I would like to argue that the former group tends to ignore history, while the latter group tends to shun history. I believe that ignoring history is a relatively recent trend, while shunning history is foundational for what we call analytic philosophy. But how do these trends relate? Let’s begin with the current ignorance.

A few years ago, Mogens Laerke told me that he once encountered a philosopher who claimed that it wasn’t really worth going back any further in history than “to the early Ted Sider”. Indeed, it is quite common among current analytic philosophers to claim that the history of philosophy is wholly irrelevant for doing philosophy. Some educational exposure might count as good for preventing us from reinventing the wheel or finding the odd interesting argument, but on the whole the real philosophical action takes place today. Various reasons are given for this attitude. Some claim that philosophy aims at finding the truth and that truth is non-historical. Others claim that you don’t need any historical understanding to do, say, biology or mathematics, and that, since philosophy is a similar endeavour, it’s equally exempt from its history. I’ll look at these arguments some other day. But they have to rely on the separability of historical factors from what is called philosophy. As a result, this position denies any substantial impact of history on philosophy. Whatever the merit of this denial, it has enormous political consequences. While the reasons given are often dressed up as a-political, they have serious repercussions on the shape of philosophy in academic institutions. In Germany, for instance, you’ll hardly find a department that has a unit or chair devoted to the history of philosophy. Given the success of analytic practitioners through journal capture etc., history is a marginalised and merely instrumental part of philosophy.

Yet, despite the supposed irrelevance of history, many analytic philosophers do see themselves as continuous with a tradition that is taken to begin with Frege or Russell. To portray contemporary philosophical work as relevant, it is apparently not enough to trust in the truth-conduciveness of the current philosophical tools on display. Justifying current endeavours has to rely on some bits and bobs of history. For some colleagues, grant agencies and students it’s not sufficient to point to the early Ted Sider to highlight the relevance of a project. While pointing to early analytic philosophy is certainly not enough, at least some continuity in terminology, arguments and claims is required. But do early analytic philosophers share the current understanding of history? As I said in the beginning, I think that many early figures in that tradition endorse a rather different view. As late as 1947, Ryle writes in a review of Popper in Mind, the top journal of analytic philosophy:

“Nor is it news to philosophers that Nazi, Fascist and Communist doctrines are descendants of the Hegelian gospel. … Dr Popper is clearly right in saying that even if philosophers are at long last immunized, historians, sociologists, political propagandists and voters are still unconscious victims of this virus …”*

Let me single out two claims from this passage: (1) Hegelian philosophy shaped pervasive political ideologies. (2) Philosophy has become immune against such ideologies. The first claim endorses the idea that historical positions of the past are not only influential for adherent philosophers, but shape political ideologies. This is quite different from the assumption that history is irrelevant. But what about the second claim? The immunity claim seems to deny the influence of history. So on the face of it, the second claim seems to be similar to the idea that history is irrelevant. This would render the two statements incongruent. But there is another reading: Only a certain kind of philosophy is immune from the philosophical past and the related ideologies. And this is non-Hegelian philosophy. The idea is, then, not that history is irrelevant, but, to the contrary, that history is so relevant that certain portions of the past should be shunned. Analytic philosophy is construed as the safe haven, exempt from historical influences that still haunt other disciplines.

Ryle is not entirely clear about the factors that would allow for such immunity. But if claim (2) is to be coherent with (1), then this might mean that we are to focus on certain aspects of philosophy and that we should see ourselves in the tradition of past philosophers working on these aspects. If this is correct, Ryle is not claiming that philosophy is separate from history and politics, but that it can be exempt from certain kinds of history and politics. As Akehurst argues**, this tradition was adamant about shunning German and British idealism as well as many figures who seemed to run counter to certain ideas. Whatever these precise ideas are, the assumption that (early) analytic philosophy is simply a-historical or a-political is a myth.

Whatever one thinks of Ryle’s claims, they are certainly expressive of a core belief in the tradition. At its heart we see a process of shunning with the goal of reshaping the canon. The idea of being selective about what one considers as the canon is of course no prerogative of analytic philosophy. However, what seems to stand out is the assumption of immunity. While the attempt to immunise oneself or to counter one’s biases is a process that includes the idea that one might be in the grip of ideologies, the idea that one is already immune seems to be an ideology itself.

Now how does this shunning relate to what I called today’s ignorance? For better or worse, I doubt that these stances are easily compatible. At the same time, it seems likely that the professed ignorance is an unreflective outcome of the shunning in earlier times. If this is correct, then the idea of non-historicity has been canonised. In any case, it is time to reconsider the relation between analytic philosophy and the history of philosophy.***

____

* Thanks to Richard Creek, Nick Denyer, Stefan Hessbrüggen, Michael Kremer, and Eric Schliesser for some amusing online discussion of this passage.

** See T. Akehurst, The Cultural Politics of Analytic Philosophy: Britishness and the Spectre of Europe, London: Continuum 2010, esp. 58-60. I am grateful to Catarina Dutilh-Novaes for bringing this book to my attention. See also his brief blog post focussing on Russell.

*** Currently, Laura Georgescu and I are preparing a special issue on the Uses and Abuses of History in Analytic Philosophy for JHAP. Please contact us if you are interested in contributing!

Networks and friendships in academia

Recently, I came across an unwelcome reminder of my time as a graduate student and my early-career days. It had the shape of a conference announcement that carries all the signs of a performative contradiction: it invites you by exclusion. What can we learn from such contradictions?

The announcement invites early-career people to attend seminars that run alongside a conference whose line-up is already fixed and seems to consist mainly of a circle of quite established philosophers who have been collaborating closely for a long time. Since the invitation is not presented as a “call”, it’s hard to feel invited in the first place. Worse still, you’re not asked to present at the actual conference but to attend “seminars” that are designed “to motivate students and young scholars from all over the world to do research in the field of medieval philosophy and to help them learn new scientific methodology and develop communication skills.” If you’re still interested in attending, you’ll look in vain for time slots dedicated to such seminars. Instead, there is a round table on the last day, scheduled for the same time the organising body holds their annual meeting, thus probably without the established scholars.* You might say there is a sufficient number of events, so just go somewhere else. But something like the work on the “Dionysian Traditions” is rarely presented. In fact, medieval philosophy is often treated as a niche unto itself, so the choice is not as vast as for, say, analytic metaphysics.

If you think this is problematic, I’ll have to disappoint you. There is no scandal lurking here. Alongside all the great efforts within an increasingly inclusive infrastructure of early-career support, things like that happen all the time, and since becoming a professor I have myself been accused of organising events that at least sound “clubby”. Of course, I’m not saying that the actual event announced is clubby like that; it’s just that part of the description triggers old memories. When I was a graduate student, in the years before 2000, at least the academic culture in Germany seemed to be structured in a clubby fashion. By “structured” I mean that academic philosophy often seemed to function as a simple two-class system, established and not-established, and the not-established people had the status of onlookers. They were, it seemed, invited to kind of watch the bigger figures and learn by exposure to greatness. But make no mistake; this culture did not (or not immediately) come across as exclusionary. The onlookers could feel privileged for being around. Firstly, even if this didn’t feel like proper participation, it still felt like the result of a meritocratic selection. Secondly, the onlookers could feel elated, for there was an invisible third class, i.e. the class of all those who either were not selected or didn’t care to watch. The upshot is that part of the attraction of academia worked by exclusion. As an early-career person, you felt like you might belong, but you were not yet ready to participate properly.

Although this might come across as a bit negative, it is not meant that way. Academia never was a utopian place outside the structures that apply in the rest of the world. More to the point, the whole idea of what is now called “research-led teaching” grew out of the assumption that certain skills cannot be taught explicitly but have to be picked up by watching others, preferably advanced professionals, at work. Now my point is not to call out traditions of instructing scholars. Rather, this memory triggers a question that keeps coming back to me when advising graduate students. I doubt that research-led teaching requires the old class system. These days, we have a rich infrastructure that, at least on the surface, seems to counter exclusion. But have we overcome this two-class system, and if not, what lesson could it teach us?

Early-career people are constantly advised to advance their networking skills and their network. On the whole, I think this is good advice. However, I also fear that one can spend a quarter of a lifetime on proper networking without realising that a network as such does not help. Networks are part of a professionalised academic environment. But while they might help in exchanging ideas and even offer frameworks for collaborative projects, they are not functional as such. They need some sort of glue that keeps them together. Some people believe that networks are selective by being meritocratic. But while merit or at least prestige might often belong to the necessary conditions of getting in, it’s probably not sufficient. My hunch is that this glue comes in the shape of friendship. By that I don’t necessarily mean deeply personal friendships but “academic friendships”: people like and trust each other to some degree, and build on that professionally. If true, this might be an unwelcome fact because it runs counter to our policies of inclusion and diversity. But people need to trust each other and thus also need something stronger than policies.

Therefore, the lesson is twofold: On the one hand, people need to see that sustainable networks require trust. On the other hand, we need functional institutional structures both to sustain such networks and to counterbalance the threat of nepotism that might come with friendship. We have, or should have, such structures in the shape of laws, universities, academic societies and a growing practice of mentoring. To be sure, saying that networks are not meritocratic does not mean that there is no such thing as merit. Thus, such institutions need to ensure that processes of reviewing are transparent and in keeping with commitments to democratic values as well as to the support of those still underrepresented, no matter whether this concerns written work, conferences or hiring. But the idea that networks as such are meritocratic makes their reliance on friendships invisible.

Now while friendships cannot be forced, they can be cultivated. If we wish to counter the pernicious class system and stabilise institutional remedies against it, we should advise people to extend (academic) friendships rather than competition. Competition fosters the false idea that getting into a network depends on merit. The idea of extending and cultivating academic friendship rests on the idea that merit in philosophy is a collective effort to begin with and that it needs all the people interested to keep weaving the web of knowledge. It is in this way, if at all, that meritocratic practices can be promoted; not by exclusion. You might object that we are operating with limited resources, but if demand is on the rise, we have to demand more resources rather than competing for less and less. That said, cultivating academic friendships needs to be counterbalanced by transparency. Yet even while we continue to fall short of that, friendships are not only the glue of networks, but might be what keeps you sane when academia seems to fall apart.

Postscriptum I: So what about the conference referred to above? The event is a follow-up to a conference in 1999, and quite a few of the former participants are present again. If it was, as it seems, based on academic friendships, isn’t that a reason to praise it? As I said and wish to emphasise again, academic friendships without institutional control do not foster the kinds of inclusive environments we should want. For neither can there be meritocratic procedures without the inclusion of underrepresented groups, nor can a two-class separation of established and not-established scholars lead to the desired extension of academic friendships. In addition to the memories triggered, one might note other issues. Given that there are comparatively many women working in this field, it is surprising that only three women are among the invited speakers. That said, the gendered conference campaign has of course identified understandable reasons for such imbalances. A further point is the fact that early-career people wishing to attend have roughly two weeks after the announcement to register and apply. There is no reimbursement of costs, but one can apply for financial support after committing oneself to participate. – In view of these critical remarks, it should be noted again that this conference represents the status quo rather than the exception. The idea is not to criticise the fact that academic friendships lead to such events, but rather to stress the need to rethink how these can be joined with institutional mechanisms that counterbalance the downsides of tightening our networks.

___

* Postscriptum II (14 March 2019): Yes. Before writing this post, I sent a mail to S.I.E.P.M. inquiring about the nature of the seminars for early-career people. I asked:

(1) Are there any time slots reserved for this or are the seminars held parallel to the colloquium?
(2) What is the “new scientific methodology” referred to in the call?
(3) And is there any sort of application procedure?

The mail was forwarded to the local organisers and prompted the following reply:

“Thank you for interest in the colloquium on the Dionysian Traditions!

The time for the seminars is Friday morning. The papers should not be longer than 20 minutes. You should send us a list with titles, and preferably – with abstracts too. We have a strict time limit and not everyone may have the opportunity to present. Travel and accommodation costs are to be covered by the participants.

The new scientific methodology is the methodology you deem commensurate with the current knowledge about the Corpus.”

Apart from the fact that the event runs from a Monday to a Wednesday, the main question about the integration and audience of these seminars remains unanswered. Assuming that “Friday” means Wednesday, the seminars coincide with the announced round table, to be held at the same time at which the bureau of S.I.E.P.M. holds their meeting. (This was confirmed by a further exchange of mails.) But unlike the announcement itself, the mail now speaks of “papers” that the attendees may present.

Fake news, faith, and the know-it-all

When working on Ockham’s discussion of the distinction between faith and reason, I encountered an interesting kind of sentence, the so-called “neutral proposition” (propositio neutra). A common example of such a sentence is “the number of stars is even.” It is neutral in that we have no grounds for either assenting or dissenting. We grasp what it means, but we are neither compelled to believe it nor to disbelieve it. (Please note: “neutral” doesn’t necessarily mean that the proposition is neither true nor false; it just means that we currently have no way of figuring out whether it’s true or false.)* In fact, many important things we believe seem to have that status, at least at the time of learning about them. We believe that we were born in a certain year, that the earth is round, and so on. Most of us learn such things through the testimony of others without ever checking them. Although the context of the discussion in Ockham is theological, his ideas generalise: there are many things we do and need to take on faith. I think that this fact is crucial but underrated in the discussion of fake news.** A very widespread response to the phenomenon of fake news is to recommend fact-checking. I think this is one-sided and thus problematic. When we suspect that some news item is fake news, we are often in a position where we cannot (immediately) assess the information. In other words, the news is much of the time a collection of neutral propositions for us. In what follows, I’d like to suggest that we need to consider the role of faith or trust, as well as the related role of (intellectual) humility, if we want to tackle this issue.

We don’t only learn things through others; we also learn early on that it is vital to trust others and trust what they say. Trust is the glue that holds our societies and our lives together. It’s not surprising, then, that we have a tendency to believe everything we perceive and read. Yes, every now and then we might step back and look again, but our default mode is to believe.*** So even if, strictly speaking, a neutral proposition comes our way, we will embrace it. Read the following sentence: “The majority of people now living in Prenzlauer Berg (in Berlin) have migrated there from Southern Germany.” Do you believe it? Of course, the current context of discussion might make you doubtful, but you’d probably read on without hesitation if this were a newspaper article on urban life in Berlin. Your response would not be to fact-check but to believe, unless something triggers a doubt.

This psychological fact, the “bias to believe”, has a number of consequences. We are inclined to believe things. If this is the glue of our lives, then any dysfunction of that glue will hurt us. We will be hurt if our trust is exploited. More importantly perhaps, our pride will be hurt if we are found out to have assented to a lie or even passed on a piece of false information. We will be called naïve, and people will reduce the degree of trust in us. Do you like to be called naïve? – I don’t. So what do I do? That depends on my emotional and other resources. Was it a one-off? Were you just told that Father Christmas doesn’t have a beard? That’s fine. But what if your whole belief system is branded as a result of naivety? You certainly will feel excluded, to put it mildly.

Let’s shift the focus for a second: how will you feel if you are a religious person who is told, again and again, that there is no God, that atheism is the way to go and that religion is anti-science? It is often said that matters of religion are a private issue. Psychologically speaking, this cannot be right. If trusting and believing are the glue of society, then attacks on our beliefs will hurt and upset individuals and, by extension, our society. Of course, many people have come to live with that. For many, it’s part of the package, I guess. We can be pluralists. But the direct confrontation might still hurt. And if we can choose our company, we might be inclined to stick with those who respect our beliefs and perhaps harbour a quiet resentment towards those who feel justified in attacking us.

The point I want to return to now is that criticism of our beliefs often not only concerns individual convictions but also targets the trust we have in others, the trust on the basis of which we came to embrace certain beliefs. Religion is just one of many possible examples. Most of our beliefs are deeply entrenched in our daily actions and partly shared conventions: be they religious, political, aesthetic etc. But the example of religion is a helpful one, since there is hardly any field in which people seem to feel so justified in self-righteously criticising others, and this despite the fact that most beliefs in this area are not attacked because they could be shown to be false. Most beliefs in this realm are a matter of faith. They are what I introduced as neutral propositions, to which we are neither compelled to assent nor to dissent. There is a huge difference between the agnostic claim that we do not know about these matters and the more invested claim that certain beliefs are false. In some cases, such a stance might be justified; in other cases, we might just act like a know-it-all. My hunch is that the latter stance is fairly widespread and causes much more controversy than is justified by the evidence the participants in disagreements can invoke.

If we are criticised for holding certain beliefs, this might of course be justified. There is nothing wrong with that. What I am concerned with is beliefs that are based on neutral propositions. Of course you might argue that one should only believe what one has evidence for. Good luck with that! – If we are dealing with information that we can’t assess, we have three options: we can embrace it (which is what we are inclined to do); we can (try to) reject it; or we can acknowledge – hold your breath, drumroll: we can acknowledge that we do not know whether it’s true or not. The virtue I am referring to is known as (intellectual) humility. Of course, we can do what we like if we are by ourselves, scrolling through the web or listening quietly. But if we are in a discussion, our choice matters. Do we want to criticise? By all means, if it is justified. But more often than not our own means are limited: we have stored whole systems of beliefs, without ever checking whether they are true. If we are not sure, it might be advisable to just acknowledge that. Criticising others for their beliefs is probably going to hurt them, more or less. The point is not to stop being critical; the point is to figure out what we are critical of. Instead of saying, “you are mistaken”, we can also say, “I don’t know whether that’s right or not.” You can then establish whether and how that can be checked.

Now of course this does not mean that we should try and check all the beliefs we hold. Luckily, we have a division of labour. My parents know my birthday; so I don’t need to work it out by going to archives. There are a number of authorities we rely on. “Relying on authorities” sounds naïve perhaps, but that’s what we do when we trust others. If we have disagreements with others about politics or religion, this is often owing to the fact that we rely on different authorities or that we prioritise different authorities. Authorities come in various shapes. Often we don’t even notice them, because they have the form of deeply entrenched ideologies, promoting misogyny, racism and other forms of dehumanisation. Equally often they might concern ideas about how the world works, about what is valuable, what is useful etc. Beliefs about all such matters can be spread by everyone, with quite different epistemic status. In some matters, we trust our friends more than others, even if they might lack epistemic credentials. Criticising others often involves criticising their authorities. Again, that’s fine and often vital, but it’s equally important to be aware that we are doing it, because it concerns the glue of trust that potentially holds us together or keeps us apart if we disagree.

Calling out “fake news” is a way of criticising such authorities. Now what should we do in cases of disagreement? Criticism is of course important. But it is equally important to see that we are interacting with others whose beliefs are at stake. Even if we suspect that the politicians or the news venues in question are merely bullshitting, the believers are inclined to take their words for granted; they trust their authorities. Now the point is not to be nice to people who believe bullshit; the point is to acknowledge that they have reasons to believe that bullshit. Calling believers (of bullshit or whatever) stupid will only deepen the rupture of trust. What’s crucial to see is that they will see our criticism in the same way as we see their beliefs. What we should establish in such disagreements, then, is whether we might perhaps be dealing with a neutral proposition. That might actually reveal a commonality between us and our interlocutor. We might both be in a position in which we don’t know for sure what’s going on. If we can establish that, we might gain more ground by scratching our heads than by insisting we’re on the right side.

___

* This wasn’t really clear in the original post. Thanks to CJ Sheu for the fruitful discussion.

** Part of my reflections have been triggered by an excellent new book by Romy Jaster and David Lanius. Get it, if you have some German.

*** See for instance Eric Mandelbaum, Thinking is Believing.

The competition fallacy

“We asked for workers. We got people instead.” Max Frisch

 

Imagine that you want to buy an album by the composer and singer Caroline Shaw, but they sell you one by Luciano Pavarotti instead, arguing that Pavarotti is clearly the more successful and better singer. Well, philosophers often make similar moves. They will say things like “Lewis was a better philosopher than Arendt” and even run polls to see how the majority sees the matter. Perhaps you agree with me in thinking that something has gone severely wrong in such cases. But what exactly is it? In the following I’d like to suggest that competitive rankings are not applicable when we compare individuals in certain respects. This should have serious repercussions for thinking about the job market in academia.

Ranking two individual philosophers who work in fairly different fields and contexts strikes me as pointless. Of course, you can compare them, see differences and agreements, ask about their respective popularity and so forth. But what would Lewis have said about the banality of evil? Or Arendt about modal realism? – While you might have preferences for one kind of philosophy over another, you would have a hard time explaining who the “better” or more “important” philosopher is (irrespective of said preferences). There are at least three reasons for this: Firstly, Arendt and Lewis have very few points of contact, i.e. little straightforward common ground on which to plot a comparison of their philosophies. Secondly, even if they had more commonalities or overlaps, the respective understandings of what philosophy is and what good philosophy should accomplish can be fairly different. Thirdly and perhaps most importantly, philosophies are always individual and unique accomplishments. Unique creations are not something one can have a competition about. If we assume that there is a philosophical theory T1, T1 is not the kind of thing that you can compete about being better at. Of course, you can refine T1, but then you’ve created a refined theory T2. Now you might want to claim that T2 can be called better than T1. But what would T2 be, were it not for T1? Relatedly, philosophers are unique. The assumption that what one philosopher does can be done better or equally well by another philosopher is an illusion fostered by professionalised environments. People are always unique individuals and their ideas cannot be exchanged salva veritate.*

Now since there are open job searches (sometimes even without a specification of the area of specialisation) you could imagine a philosophy department in 2019 having to decide whether they hire Lewis or Arendt. I can picture the discussions among the committee members quite vividly. But in doing such a search they are doing the same thing as the shop assistant who ends up arguing for Pavarotti over Shaw. Then words like “quality”, “output”, “grant potential”, “teaching evaluations”, “fit” … oh, and “diversity” will be uttered. “Arendt will pull more students!” – “Yeah, but what about her publication record? I don’t see any top journals!” – “Well, she is a woman.” In a good world both of them would be hired, but we live in a world where many departments might rather hire two David Lewises. So what’s going on?

It’s important to note that the competition is not about their philosophies: despite the word “quality”, for the three reasons given above, the committee members cannot have them compete as philosophers. Rather, the department has certain “needs” that the competition is about.** The competition is about functions in the department, not about philosophy. As I see it, this point generalises: competitions are never about philosophy but always about work and functions in a department.*** Now, the pernicious thing is that departments, search committees and even candidates often pretend that the search is about the quality of the candidates’ philosophy. But in the majority of cases that cannot be true, simply because the precise shape, task and ends of philosophy are a matter of dispute. What carries weight is functions, not philosophy.

Arguably, there can be no competition between philosophers qua philosophers. Neither between Arendt and Lewis, nor between Arendt and Butler, nor between Lewis and Kripke. Philosophers can discuss and disagree but they cannot compete. What should they compete about? If they compete about jobs, it’s the functions in departments that are at stake. (That is also the reason why we allow for prestige as a quality indicator.) If they take themselves to be competing about who is the better philosopher, they mistake what they are doing. Of course, one philosopher might be preferred over another, but this is subject to change and chance, and owing to the notion of philosophy held by the dominant committee member. The idea that there can be genuinely philosophical competition is a fallacy.

Does it follow, then, that there is no such thing as good or better philosophy? Although this seems to follow, it doesn’t. In a given context and group, things will count as good or better philosophy. But here is another confusion lurking. “Good” philosophy is not the property of an individual person. Rather, it is a feature of a discussion or of interacting texts. Philosophy is good if the discussion “works well”. It takes good interlocutors on all sides. If I stammer out aphorisms or treatises, they are neither good nor bad. What turns them into something worthwhile are those listening, understanding and responding. To my mind, quality is the resourcefulness of a conversation. The more notions and styles of philosophy a conversation can integrate, the more resources it has to tackle what is at stake. In philosophy, there is no competition, just conversation.

Therefore, departments and candidates should stop assuming that the competition is about the quality of philosophy. Moreover, we should stop claiming that competitiveness is an indicator of being a good philosopher.**** Have you ever convinced an interlocutor by shouting that you’re better or more excellent than them?

___

* At the end of the day, philosophies are either disparate or they are in dialogue. In the former case, rivalry would be pointless; in the latter case, the rivalry is not competitive but a form of (disagreeing or agreeing) refinement. If philosophers take themselves to be competing about something like the better argument, they are actually not competing but discussing and thus depend on one another.

** This does not mean that these needs or their potential fulfilment ultimately decide the outcome of the competition. Often there is disagreement or ignorance about what these needs are or how they are to be prioritised. With regard to committees, I find this article quite interesting.

*** In a recent blog post, Ian James Kidd distinguishes between being good at philosophy vs being good at academic philosophy. It’s a great post. (My only disagreement would be that being good at philosophy is ultimately a feature of groups and discussions, not individuals.) Eric Schliesser made similar points in an older more gloomy post.

**** On FB, Evelina Miteva suggests that “we need fair trade philosophy, like the fair trade coffee. Fair trade coffee is not necessarily of a better taste or quality, it ‘only’ makes sure that the producers will get something out of their work.” – I think this is exactly right: On some levels, this already seems to be happening, for instance, in the open access movement. Something similar could be applied to recruiting and employment conditions in academia. In fact, something like this seems to be happening, in that some universities are awarded for being family friendly or being forthcoming in other ways (good hiring practice e.g.). – My idea is that we could remedy many problems (the so-called mental health crisis etc.) if we were to stop incentivising competitiveness on the wrong levels and promote measures of solidarity instead. – The message should be that the community no longer tolerates certain forms of wrong competition and exploitation.

Relatedly, this also makes for an argument in favour of affirmative action against discrimination of underrepresented groups: People who believe in meritocracy often say that affirmative action threatens quality. But affirmative action is not about replacing “good” with “underrepresented” philosophers. Why? Because the quality of philosophy is not an issue in competitive hiring in the first place.

Diversity and Vulnerability (Part II)

My parents were both refugees. In 1945, when my mother was five and my father thirteen years old, they both left their homes in what were villages near Kaliningrad and Wachlin, and started moving west. Eventually they both ended up in a small town near Düsseldorf, where they met in the sixties. I’m told it was not nearly as bad as it is for refugees now, but they never felt very welcome. Thinking about their lives, it dawned on me rather late that one of the crucial driving factors in their conduct was the constant attempt to avoid attracting attention at all costs. “What will the neighbours say” was a repeated phrase. While the phrase is rather common, I guess the intensity of the shame behind it will vary. For better or worse, it didn’t stick with me too much. But the issue of shame and hiding oneself is another one that keeps coming back when I think about diversity and what blocks it. – In what follows, I don’t want to speak out against diversity. Quite the contrary. But I want to reflect on what needs to change if we really want at least some of it.

So what has shame to do with diversity? – One of the assets of a fairly diverse team is that, at least after a while, people take fewer things for granted, ask more questions, and get to see things they hadn’t expected to see. In a word: multiple perspectives. But even if the members of a team are diverse (in whatever ways), this is hard to achieve. The reason is that, for this to occur, people have to be open. People have to make themselves visible. Being open, not as in calling a spade a spade, but as in showing your perspectives with all their possible shortcomings – that kind of being open is hard even among friends. Showing yourself like that creates great potential but leaves you vulnerable.*

Obviously, the vulnerability of people from “diverse backgrounds” or “underrepresented groups” (of whatever kind) is infinitely greater than that of those who conform to perceived majorities. That shouldn’t be surprising, because the very fact that someone stands out with ‘diversity markers’ puts them on the spot. Now in professional contexts, we are trained to conform as much as possible. Despite all the talk about fresh ideas, people will call you “weird” (rather than “inspiring”) before you finish your sentence. Of course, it’s fine to be weird, but only if you’ve got tenure. So you’ll rather do and talk as everyone else does. And if you try otherwise, see what happens. Perhaps you get to be the token weirdo and people put you in a nice bowl and keep you on an extra shelf in the department, but it’s more likely that you just created more space for the next conformist. The reason is quite simple: academia is a competitive environment; the expectation is that you excel in common features, not in something no one else does. Accordingly, hiring committees look for “fit”, not for fun or something they need to look at or think about twice.

Now what’s going on here? I think that employers should stop pretending to be looking for diversity. What employers actually do when they pretend to look for candidates from “underrepresented groups” is attempt to circumvent common discriminatory behaviour. That is not a particularly noble end but a political necessity. In such cases, people are not sought out because one wants to diversify teams, but because discrimination is unjust and unlawful (in some places at least). Of course, discrimination is still a thing, and it should end. But ending discrimination is not the same as implementing diversity.

So how do we implement diversity? While discrimination can be countered by amending formal procedures such as hiring processes, diversity is not, or not necessarily, owing to the fact that someone is from an underrepresented group. Of course, the two can go together, but they don’t need to. A diversity of ideas, approaches or methods is something that anyone might have for whatever reason. The crucial step to enable such diversity is a climate in which people trust each other – trust each other sufficiently to be open and make themselves visible in their vulnerability. That would be a climate in which even strangers would feel welcome to share their views. That is something we need to work on all together. For the shame and the pressure to conform affect all of us. The more we let go of them, the better for everyone involved, not least for those from “underrepresented groups”. It’s a matter of solidarity, not a policy.

____

* On vulnerability, I learned a lot through Brené Brown’s talks (here’s a ted talk) as well as the work by my former colleague Christine Straehle (here’s a collected volume).

Some observations concerning diversity (Part I)

(1) This morning, I took my daughter Hannah to a nice playground in Berlin. She is just over two years old and particularly enjoys climbing. As we approached the climbing frame, one of the boys on top shouted that they wouldn’t allow girls on the climbing frame. The girls around him smiled …

(2) Two years ago, I joined an initiative designed to keep a far-right party (AfD) out of parliament. The initiative was founded by a group of social entrepreneurs, and the first workshop was attended mainly by social entrepreneurs, most of them in their late twenties or early thirties. At the beginning, we were asked whether we actually knew anyone who would vote for the AfD. This was affirmed only by one man, somewhat older than most. He told us that he knew a few people in the village he came from. Later in the day, we had to form smaller groups. Guess who had difficulty finding a group …

(3) Quite a number of years ago, I was on a hiring committee while still being a postdoc. For the first round of going through the applications, we were advised to focus on CVs. Discussing the candidates, I listened to a number of meritocratic arguments: they mostly, but not solely, concerned numbers of “good” publications (“good” meaning in prestigious venues, in case you were wondering). Then someone pointed out that we should perhaps take more time for reading. Someone else pointed out that we should scan cover letters for the reasons behind interruptions in the CVs, such as time spent on child care. Guess what didn’t carry any weight when we drew up the list …

I could go on. These are minor observations; anyone could have made them. Nevertheless, in all their banality, these stories of exclusion form the daily fabric of our failures. But where exactly can we locate the failures? The people involved in these situations were not bad people, not at all: lively children on a playground, ambitious young entrepreneurs fighting for a good cause, a hiring committee focussed on selecting the best candidate. The events didn’t result in catastrophes: Hannah climbed up anyway, the initiative went nowhere, someone certainly deserving got a job. – The point is that these situations are so ordinary that hardly anyone would consider causing, or be willing to cause, much of a fuss. Or so it seems. The harm might be done nonetheless: The next little girl might be discouraged from going to the playground; the next lonely man from the village will perhaps find better company among the far right; a number of candidates, who would have been equally deserving even by the committee’s standards, were not even considered. Even if no one tells them, the stories of exclusion go on. But what can we do?

The first thing to note is that, in all these situations, there was a possibility of making a difference. Contrary to a widespread belief, we are not powerless minions. The fact that we cannot save the whole world doesn’t mean that we cannot change our ways here and there, or listen for a change. But there are two things that make me doubtful:

  • Despite all the fashionable talk of diversity, such stories suggest that many of us don’t really want diversity. What these situations reveal is that the exclusive attitudes seem to be very deeply ingrained.
  • Even if such attitudes can be overcome, I sense a sort of paranoia amongst many people in positions of public leadership. If you look at more and less well-known people in politics, such as presidents, prime ministers, chancellors etc., many of them display a fear of everything that is different. But while it’s easy to call them names for it, we shouldn’t forget that they get elected by majorities.

It’s complicated. The exclusions we see are not ad hoc, but fostered by long-standing structures. There’s much of that I dislike or even hate, while I continue to be a beneficiary in some way or another. That is probably true of many of us. Thus, the charge of hypocrisy always seems justified. But if that’s true, the charge of hypocrisy shouldn’t cause too much worry either. The worry of being called a hypocrite might be paralysing. What shouldn’t be forgotten is that hypocrisy is structural, too. It can’t be remedied by targeting individuals. Everyone who is trying to intervene and make a difference can be called a hypocrite.

Why is the Wall still dividing us? The persistence of the intellectual cold war

Discussing some ideas relating to my piece on the divide between Eastern and Western Europe, I was struck by the following question: “How is it possible that the Wall separating East and West is still so successful?” It’s true! While I thought that 1989 marks a historical turning point that initiated an integrative process, this turns out to be a cherished illusion of mine. As my interlocutor pointed out, the Wall is still successful. It continues to keep the people apart. OK, I wanted to object, there is still an economic divide and that takes time of course. “No”, she retorted, “it’s not only economic, it’s also intellectual.” So there the point was being made again: the cold war attitude continues, also intellectually. What does that mean?

To keep things simple, I’ll focus on Germany. Let’s begin with my illusion. Having grown up in a small town near Düsseldorf, I felt the Wall was far away. Apart from a few visits from relatives, I had no contacts and not much of an idea of the former GDR. Although I was quite moved by the fall of the Wall, it didn’t seem to affect my life directly. That changed when I moved to Budapest for one and a half years in 1994. I realised that my education had failed me. I knew hardly anything about the history of Hungary or Eastern Europe, not even about East Germany. Nevertheless, I was convinced that a process of integration was now under way. Europe would grow together. Slowly but surely; it had to. Did it?

East Germany is still a thing. That is strange in itself. It shouldn’t be. After all, the Wall has now been down slightly longer than it was up. It was built in 1961 and came down in 1989. Since then, 30 years have passed. But it must be liberating to think that neo-nazism, for instance, is not a problem in Germany but in East Germany. The East is a perfect scapegoat in several domains. The narrative is dominated by the economic imbalance and the idea that people from the (former) East lack democratic education. The upshot of the narrative seems to be this: East Germans have a different mentality. The former generations were communists or dominated by communists. Their offspring are neo-nazis or indoctrinated by neo-nazis. You get the idea…

Of course, many biographies have been upset one way or another. Of course, the situation is complex, but after looking at some figures the crucial problems don’t strike me as owing primarily to a difference in mentality. This doesn’t mean that there is no problem of neo-nazism. But how did things start moving in the East? The former GDR seems to have been “colonised”, as some put it as early as 1997. What were the effects in academia? Today, I read an article with the headline that “men lead three quarters of German universities”. Yes, I can imagine that. What the headline didn’t state, but what I found among the figures, is that “no rector or president” originates from East Germany. OK, not every academic biography culminates in leading a university. But still: reading this, I was reminded of stories about “liquidation”. So I looked up some more figures: “By 1994, over 13,000 positions at universities in east Germany had been liquidated, and 20,000 more people (including 5,000 professors) had lost their jobs. Many teachers and academics were fired ‘despite the fact that they had proven professional qualifications.’” It’s hard to believe that these measures did anything to facilitate integration, let alone intellectual integration.

At this point, it doesn’t look like the divisive East-West narratives will lose their grip anytime soon. But why not? I guess that the colonisation as part of the unification process will continue to be justified. It needs to be justified to cover up the crimes committed. But how can such things be justified? How can you justify having sacked a huge number of people and taken over many of their positions? Again, it will continue to be blamed on the mentality: If they complain about the past, you can blame it on their mentality. The lack of democratic education left them unfit for such responsibilities, and now they are being nostalgic.* If they complain about their situation now, you can still blame it on the mentality. Which mentality? Well, they are still lacking democratic education and cannot understand that the state doesn’t have the duty to coddle them in their unemployment. Whichever way you see it, they just turn out to be incapable. So taking over responsibilities was and is justified.

It goes without saying that the foregoing verges on a black-and-white picture. The idea is not to explain such complex political developments in a blog post. The idea is to gesture towards an answer to the question of why the Wall continues to be so effective, even though it came down 30 years ago. What is the ideology that is so persistent? I truly don’t know. Perhaps the ideology of those who erected the Wall is still, or again, at work. Or perhaps the very idea of a Wall, of a divide, is more convenient than we want to believe. For it can serve to justify differences and injustices even after it has been torn down. In any case, the reference to “mentality” strikes me as a highly problematic form of moral exclusion.

So perhaps the Wall continues to serve as a justificational loop. How does it work? The people blaming each other will most likely not assume that they are just pointing to a supposed mentality in order to justify an injustice. They will take their beliefs (about communism, nazism, capitalism etc.) to be true of those accused. But they will likely continue to do so as long as the justificational mechanism of the Wall keeps them apart.

____

* Thinking about (n)ostalgia, there is also the more sinister explanation that the cherished memories do not simply reach back to the time after 1961, but further back to the times after 1933. In other words, if you want to explain the far-right movement by reference to mentality, you might look to the mentality that spans across East and West Germany, nourished well before the Wall was erected.

History of contemporary thought and the silence in Europe. A response to Eric Schliesser

What should go into a history of contemporary ideas or philosophy? Of course this is a question that is tricky to answer for all sorts of reasons. What makes it difficult is that we tend to think of mostly canonical figures and begin to wonder which of those will be remembered in a hundred years. I think we can put an interesting spin on that question if we approach it in a more historical way. How did our current thoughts evolve? Who are the people who really influenced us? These will not only be people whose work we happen to read, but also those who interact or have interacted with us directly: our teachers, fellow students, friends and opponents. You might not think of them as geniuses, but we should drop that category anyway. These are likely the people who really made a difference to the way you think. So let’s scratch our heads a bit and wonder who gave us ideas directly. In any case, they should figure in the history of our thought.

You might object that these figures would not necessarily be recognised as influential at large. However, I doubt that this is a good criterion: our history is not chiefly determined by who we take to be generally influential, but more often than not by those people we speak to. If not, why would historians bother to figure out real interlocutors in letters etc.? This means that encounters between a few people might make quite a difference. You might also object that a history of contemporary philosophy is not about you. But why not? Why should it not include you at least? What I like about this approach is that it also serves as a helpful corrective to outworn assumptions about who is canonical. Because even if certain figures are canonical, our interpretations of canonical figures are strongly shaped by our direct interlocutors.

Thinking about my own ideas in this way is a humbling experience. There are quite a number of people inside and outside my department to whom I owe many of my ideas. But this approach also reveals some of the conditions, political and otherwise, that allow for such influence. One such condition I am painfully reminded of when observing the current political changes in Europe. No, I do not mean Brexit! Although I find these developments very sad and threatening indeed, most of the work done by friends and colleagues in Britain will reach me independently of those developments.

But Central and Eastern Europe is a different case. As it happens, the work that affected my own research most in the recent years is on the history of natural philosophy. It’s more than a guess when I say that I am not alone in this. Amongst other things, it made me rethink our current and historical ideas of the self. Given that quite a number of researchers who work on this happen to come from Central and Eastern Europe, much of this work probably wouldn’t have reached me, had it not been for the revolutions in 1989. This means that my thinking (and most likely that of others, too) would have been entirely different in many respects, had we not seen the Wall come down and communist regimes overthrown.

Why do I bring this up now? A brief exchange following up on an interesting post by Eric Schliesser* made it obvious that many Western Europeans, by and large, seem to assume that the revolutions of 1989 have had no influence on their thought. As he puts it, “the intellectual class kind of was conceptually unaffected” by them. And indeed, if we look at the way we cite and acknowledge the work of others, we regularly forget to credit many, if not most, of our interlocutors from less prestigious places. In this sense, people in what we call the Western world might be inclined to think that 1989 was not of significance in the history of thought. I think this is a mistake, a mistake arising from the canonical way of thinking about the work that influences us. Instead of acknowledging the work of the individuals who actually influence us, we continue citing the next big shot whom we take to be influential in general. By excluding the direct impact of our actual interlocutors, we make real impact largely invisible. Intellectually, the West behaves as if it were still living in Cold War times. But the fact that we continue to ignore or shun the larger patterns of real impact since 1989 does not entail that they are not there. Any claim to the contrary would, without further evidence at least, amount to an argument from ignorance.

The point I want to make is simple: we depend on other people for our thought. We need to acknowledge this if we want to understand how we come to think what we think. The fact that universities are currently set up like businesses might make us believe that the work people do can (almost) equally be done by other people. But this is simply not true. People influencing our thought are always particular people; they cannot be exchanged salva veritate. If we care about what we think, we should naturally care about the origin of our thought. We owe it to particular people, even if we sometimes forget the particular conversations in which our ideas were triggered, encouraged or refuted.

Now if this is correct, then it's all the more surprising that we have let go of the conditions enabling much of this exchange in Europe so easily. How is it possible, for instance, that most European academics remain quiet in the face of recent developments in Hungary? We have witnessed the CEU being forced to move to Vienna in an unprecedented manner, and now the Hungarian Academy of Sciences is targeted.**

While the international press reports every single remark (no matter how silly) that is made in relation to Brexit, and while I see many academics comment on this or that aspect (often for very good reasons), the silence after the recent events in Hungary is almost deafening. Of course, Hungary is not alone in this. Academic freedom is now targeted in many places inside and outside Europe. If we continue to let it happen, the academic community in Europe and elsewhere will disintegrate very soon. But of course we can continue to praise our entrepreneurial spirit in the business park of academia and believe that people’s work is interchangeable salva veritate; we can continue talking to ourselves, listen diligently to our echoes, and make soliloquies a great genre again.

____

* See also this earlier and very pertinent post by Eric Schliesser.

** See also this article. And this call for support.

Should contemporary philosophers read Ockham? Or: what did history ever do for us?

If you are a historian of philosophy, you've probably encountered the question of whether the stuff you're working on is of any interest today. It's the kind of question that awakens all the different souls in your breast at once. Your more enthusiastic self might think, "yes, totally", while your methodological soul might shout, "anachronism ahead!" And your humbler part might think, "I don't even understand it myself." When exposed to this question, I often want to say many things at once, and out comes something garbled. But now I'd like to suggest that there is only one true reply to the question in the title: "No, that's the wrong kind of question to ask!" – But of course that's not all there is to it. So please hear me out.

I'm currently revisiting an intriguing medieval debate between William of Ockham, Gregory of Rimini and Pierre d'Ailly on the question of how thought is structured. While I find the issue quite exciting in itself, it's particularly interesting to see how they treat their different sources, Aristotle and Augustine. While they clearly all know the texts invoked, they emphasise different aspects. Thinking about thought, William harks back to Aristotle and clearly thinks that it's structure that matters. By contrast, Gregory goes along with Augustine and emphasises thought as a mental action. – For these authors it was clear that their sources were highly relevant, both as inspiration and as authorities. At the same time, they had no qualms about appropriating them for their own uses. – In some respects we make similar moves today, when we call ourselves Humeans or Aristotelians. But since we also have professional historians of philosophy and can look back on traditions of critical scholarship, both historians and philosophers are more cautious when it comes to the question of whether some particular author would be "relevant today".

In view of this question, historians are trained to exhibit all kinds of (often dismissive) gut reactions, while philosophers working on contemporary themes don't really have time for our long-winded answers. And so we have started talking past each other, happily ever after. That's not a good thing. So here is why I think the question of whether any particular author could inform or improve current debates is wrongheaded.

Of course everyone is free to read Ockham. But I wouldn't recommend doing so if you're hoping to enrich current debates. Yes, Ockham says a lot of interesting things. But you'd need a long time to translate them into contemporary terminology, and still more time to find an argument that looks like an outright improvement of a current argument.* – My point is not that Ockham is not an interesting philosopher. My point is that Ockham (like many other past philosophers) doesn't straightforwardly speak to any current concerns.

However … Yes, of course there was going to be a “however”! However, while we don’t need to ask whether any particular author is relevant today, we should be asking a different question. That Ockham doesn’t speak to current concerns doesn’t mean that historians of philosophy (studying Ockham or others) have nothing to say about current concerns. So it’s not that Ockham should be twisted to speak to current concerns; rather historians and philosophers should be talking to each other! So the right question to ask is: how can historians speak to current issues?

The point is that historians study, amongst other things, debates on philosophical issues. "You say tomahto, I say tomato", that sort of thing. Debates happen now as they happened then. What I find crucial is that studying debates reveals features that can be informative for understanding current debates. There are certain conditions that have to be met for a debate to arise. We're not just moving through the space of reasons. Debates occur or decline because of various factors. What we find conceptually salient can be driven by available texts, literary preferences, other things we hold dear, theological concerns, technological inventions (just think of various computer models), arising political pressures (you know what I mean), linguistic factors (what would happen if most philosophers were to write in Dutch?), and many other factors, be they environmental, sociological, or what have you. Although we like to think that the pursuit of truth is central, it is far from the only reason why debates arise and certain concepts are coined and stick around, while others are forgotten. Although contingent, such factors are recurrent. And this is something that affects our understanding of current as well as past debates. The historian can approach current debates as a historian in the same way that she can approach past debates. And this is, I submit, where historians can truly speak to current concerns.

Coming back to the debate I mentioned earlier, there is another issue (besides the treatment of sources) that I find striking. In their emphasis on Augustine, Gregory and Pierre show a transition from a representational to an action model of thought. Why does this transition occur? Why do they find it important to emphasise action over representation against William? – Many reasons are possible. I won't go into them now. But this might be an interesting point of comparison to the current debates over enactivism versus certain sorts of representationalism. Why do we have that debate now? Is it owing to influences like Ryle and Gibson? Certainly, they are points of (sometimes implicit) reference. Are there other factors? Again, while I think that these are intriguing philosophical developments, our understanding of such transitions and debates remains impoverished if we don't look for other factors. Studying past transitions can reveal recurrent factors in contemporary debates. One factor might be that construing thoughts as acts rather than mere representations discloses their normative dimensions: acts are something for which we might be held responsible. There is a lot more to be said. For now, suffice it to say that it is in comparing debates and uncovering their conditions that historians of philosophy qua historians can really contribute.

At the same time, historians might also benefit from paying more attention to current concerns, not only to stay up to date, but also to sharpen their understanding of historical debates.** As we all know, historical facts don't just pop up. They have to be seen. But this seeing is of course a kind of seeing-as. Thus, if we don't just want to repeat the historiographical paradigms of, say, the eighties, it certainly doesn't hurt if our seeing is trained in conversation with current philosophers.

____

* That said, this is of course an open question. So I’m happy to be shown a counterexample.

** More on the exchange between philosophers and historians of philosophy can be found in my inaugural lecture.