“Heidegger was a Nazi.” What now?

“B was a bigot” is a phrase that raises various questions. We can say it of many figures, both dead and alive. And the phrase is used for different purposes. In what follows, I’d like to consider some implications of this phrase and its cognates. – Let me begin with what might seem a bit of a detour. Growing up in Germany, I learned that we are still carrying responsibility for the atrocities committed under the Nazi regime. Although some prominent figures declared otherwise even in the Eighties, I think this is true. Of course, one might think that one cannot have done things before one was born, but that does not mean that one is cut off from one’s past. Thinking historically means, amongst other things, thinking of yourself as determined by continuities that run right through you from the past into the options that make up your future horizon. The upshot is: we don’t start from scratch. It is with such thoughts that I look at the debates revolving around Heidegger and other bigots. Is their thought tainted by their views? Should we study and teach them? These are important questions that will continue to be asked and answered. Adding to numerous discussions, I’d like to offer three and a half considerations.*

(1) The question whether someone’s philosophical thought is tainted or even pervaded by their political views should be treated as an open question. There is no a priori consideration in favour of one answer. That said, “someone’s thought” is ambiguous. If we ask whether Heidegger’s or Frege’s (yes, Frege’s!) thought was pervaded by their anti-semitism, the notion is ambiguous between “thought” taken as an item in psychological relations and “thought” taken as an item in logical relations. The psychological aspects that explain why I reason the way I do often do not show up in the way a thought is presented or received. – Someone’s bigotry might motivate their thinking and yet remain hidden. But even if something remains hidden, that does not mean it carries no systematic weight. There is an old idea, pervasive in the analytic tradition, that logical and political questions are distinct. But the idea that logic and politics are distinct realms is itself a political idea. All such issues have to be studied philosophically and historically for each individual thinker. How, for instance, can Spinoza say what he says about humans and then say what he says about women? This seems glaringly inconsistent and deserves study rather than brushing off. However, careful study should involve historically crucial ties beyond the question of someone’s thought. There are social, political and institutional continuities (and discontinuities) that stabilise certain views while disqualifying others.

(2) Should we study bigots? If the foregoing is acceptable, then it follows that we shouldn’t discourage the study of bigots. Quite the contrary! This doesn’t mean that I recommend the study of bigots in particular; there are enough understudied figures that you might turn to instead. It just means that their bigotry doesn’t disqualify them as topics of study and that, if you’re wondering whether you should, that might in itself be a good reason to get started. This point is of course somewhat delicate, since the history of philosophy is studied not only by disinterested antiquarians, but also in order to justify why we endorse certain views or because we hope to find good or true accounts of phenomena. – Do we endorse someone’s political views by showing continuities between their thoughts and ours? Again, that depends and should be treated as an open question. But I don’t think that shunning the past is a helpful strategy. After all, the past provides the premises we work from, whether we like it or not. Rather, we should look carefully at possible implications. But the fact that we appropriate certain ideas does not entail that we are committed to such implications. As I said in my last post, we can adopt thoughts while changing and improving them. The fact that Heidegger was a Nazi does not turn his students or later exegetes into Nazis. However, once we know about the bigotry, we should acknowledge as much in research and teaching.

(3) What about ourselves? Part of the reason for making the second remark was that I sometimes hear people say: “A was a bigot; so we shouldn’t teach A. Let’s rather teach B.” While I agree that there are huge numbers of understudied figures that might be taught instead of the same old classics, I don’t think that this line of argument helps. As I see it, it often comes out of the problematic idea that, ideally, we should study and teach only those figures we consider morally pure. This is a doubtful demand not only because we might end up with very little material. It is also problematic because it suggests that we can change our past at will. Therefore, attempts at diversifying our teaching should not be supported by arguments from supposedly different moral status; rather, we should see that globalisation requires us to eventually acknowledge the impact of various histories and their entanglements. – We don’t teach Heidegger because we choose to ignore his moral status. We teach his and other works because our own thought is related to these works. This has an important consequence for our own moral status. Having the histories we do, our own moral status is tainted. In keeping with my introductory musings, I’d like to say that we are responsible for our past. The historical continuities that we like and wish to embrace are as much our responsibility as those that we wish to disown. Structurally oppressive features of the past are not disrupted just because we change our teaching schedule.

I guess the general idea behind these considerations is this: the assumption that one can cut oneself off from one’s (philosophical) past is an illusion. As philosophers in institutional contexts, we cannot deny that we might be both beneficiaries of a dubious heritage and bearers of burdens passed down. In other words, some of the bigotry will carry over. Again, this doesn’t mean that we are helpless continuants of past determinants, but it does mean that it is better to study our past and our involvements with it carefully rather than deny them and pretend to be starting from scratch.

___

* See especially the pieces by Peter Adamson and Eric Schliesser.

I don’t know what I think. A plea for unclarity and prophecy

Would you begin a research project if there were just one more day left to work on it? I guess I wouldn’t. Why? Well, my assumption is that the point of a research project is that we improve our understanding of a phenomenon. Improvement seems to be inherently future-directed, meaning that we understand x a bit better tomorrow than today. Therefore, I am inclined to think that we would not begin to do research, had we not the hope that it might lead to more knowledge of x in the future. I think this is not only true of research but of much thinking and writing in general. We wouldn’t think, talk or write certain things, had we not the hope that this leads to an improved understanding in the future. You might find this point trivial. But a while ago it began to dawn on me that the inherent future-directedness of (some) thinking and writing has a number of important consequences. One of them is that we are not the (sole) authors of our thoughts. If this is correct, it is time to rethink our ways of evaluating thoughts and their modes of expression. Let me explain.

So why am I not the (sole) author of my thoughts? Well, I hope you all know variations of the following situation: You try to express an idea. Your interlocutor frowns and points out that she doesn’t really understand what you’re saying. You try again. The frowning continues, but this time she offers a different formulation. “Exactly”, you shout, “this is exactly what I meant to say!” Now, who is the author of that thought? I guess it depends. Did she give a good paraphrase or did she also bring out an implication or a consequence? Did she use an illustration that highlights a new aspect? Did she perhaps even rephrase it in such a way that it circumvents a possible objection? And what about you? Did you mean just that? Or do you understand the idea even better than before? Perhaps you are now aware of an important implication. So whose idea is it now? Hers or yours? Perhaps you both should be seen as authors. In any case, the boundaries are not clear.

In this sense, many of my thoughts are not (solely) authored by me. We often try to acknowledge as much in forewords and footnotes. But some consequences of this fact might be more serious. Let me name three: (1) There is an obvious problem for the charge of anachronism in history of philosophy (see my inaugural lecture). If future explications of thoughts can be seen as improvements of these very thoughts, then anachronistic interpretations should perhaps not merely be tolerated but encouraged. Are Descartes’ Meditations complete without the Objections and Replies? Can Aristotle be understood without the commentary traditions? Think about it! (2) Another issue concerns the identity of thoughts. If you are a semantic holist of sorts, you might assume that a thought is individuated by numerous inferential relations. Is your thought that p really what it is without it entailing q? Is your thought that p really intelligible without seeing that it entails q? You might think so, but the referees of your latest paper might think that p doesn’t merit publication without considering q. (3) This leads to the issue of acceptability. Whose inferences or paraphrases count? You might say that p, but perhaps p is not accepted in your own formulation, while the expression of p in your supervisor’s form of words is greeted with great enthusiasm. In a similar spirit, Tim Crane has recently called for a reconsideration of peer review. Even if some of these points are controversial, they should at least suggest that authorship has rather unclear boundaries.

Now the fact that thoughts are often future-directed and have multiple authors has, in turn, a number of further consequences. I’d like to highlight two of them by way of calling for some reconsiderations: a due reconsideration of unclarity and what Eric Schliesser calls “philosophic prophecy”.*

  • A plea for reconsidering unclarity. Philosophers in the analytic tradition pride themselves on clarity. But apart from the fact that the recognition of clarity is necessarily context-dependent, clarity ought to be seen as the result of a process rather than a feature of the thought or its present expression. Most texts that are considered original or important, not least in the analytic tradition, are hopelessly unclear when read without guidance. Try Russell’s “On Denoting” or Frege’s “On Sense and Reference” and you know what I mean. Or try some other classics like Aristotle’s “De anima” or Hume’s “Treatise”. Oh, your own papers are exempt from this problem? Of course! Anyway, we all know this: we begin with a glimpse of an idea. And it’s the frowning of others that either makes us commit it to oblivion or try an improvement. But if this is remotely true, there is no principled reason to see unclarity as a downside. Rather it should be seen as a typical if perhaps early stage of an idea that wants to grow.
  • A plea for coining concepts or philosophic prophecy. Simplifying an idea by Eric Schliesser, we should see both philosophy and history of philosophy as involved in the business of coining concepts that “disclose the near or distant past and create a shared horizon for our philosophical future.”* As is well known, some authors (such as Leibniz, Kant or Nietzsche) have at times written decidedly for future rather than present audiences, trying to pave conceptual paths for future dialogues between religions, metaphysicians or Übermenschen. For historians of philosophy in particular this means that history is never just an antiquarian enterprise. By offering ‘translations’ and explanations we can introduce philosophical traditions to the future or silence them. In this sense, I’d like to stress, for example, that Ryle’s famous critique of Descartes, however flawed historically, should be seen as part of Descartes’ thought. In the same vein, Albert the Great or Hilary Putnam might be said to bring out certain versions of Aristotle. This doesn’t mean that they didn’t have any thoughts of their own. But their particular thoughts might not have been possible without Aristotle, who in turn might not be intelligible (to us) without the later developments. In this sense, much if not all philosophy is a prophetic enterprise.

If my thoughts are future-directed and multi-authored in such ways, this also means that I often couldn’t know at all what I actually think, if it were not for your improvements or refinements. This is of course one of the lessons learned from Wittgenstein’s so-called private language argument. But it does not only concern the possibility of understanding and knowing. A fortiori it also concerns understanding our own public language and thought. As I said earlier, I take it to be a rationality constraint that I must agree to some degree with others in order to understand myself. This means that I need others to see the point I am trying to make. If this generalises, you cannot know thyself without listening to others.

___

* See Eric Schliesser, “Philosophic Prophecy”, in Philosophy and Its History, 209.


Do ideas matter to philosophy? How obsession with recognition blocks diversity

When suffering from writer’s block, I spent much of my time in the library browsing through books that were shelved beside the ones I had originally looked for. Often these were books that didn’t show any traces of use: no one, it seemed, had read them, nor had anyone cited them. The names of the authors were often unfamiliar, and a search confirmed that they were sometimes no longer in academia. Funnily enough, these books often contained the most refreshing and original ideas. Their approach to topics or texts was often unfamiliar to me, but the effort of figuring out what they were arguing was time well spent. Nevertheless, my attempts to bring them up in discussions weren’t picked up on. People continued to cite the more familiar names. Why are we letting this happen?

Most of you probably know the following phenomenon: during a discussion someone proposes an idea; the discussion moves on. Then an established person offers almost a repetition of the proposed idea and everyone goes: “oh, interesting.” Put as a rule of thumb: prestige gets you attention; interesting ideas as such, not so much. There is a gendered version of this phenomenon, too: if you want people to listen to an interesting idea authored by a woman, better have a man repeat it. Now, an important aspect of this phenomenon is that it seems to incentivise relating our philosophical work to that of prestigious figures. In other words, we will make sure that what we say picks up on what established figures say. As Kieran Healy has shown, citation patterns confirm this. Cite David Lewis and you might join the winning in-group. We hope to get recognition by citing established people. Now you might just shrug this off as an all too human trait. But what I’d like to argue is that this behaviour crucially affects how we evaluate ideas.

I think Healy’s citation patterns show that we are inclined to value ideas that are either closely related (in content) to those of established figures or presented in a similar manner or method. Put simply: you’re more likely to get recognition if you imitate some “big shot” in content or method. Conversely, if you don’t imitate “big shots”, your work won’t be valued. Why is this important? My hunch is that this practice minimises diversity of content and method. Philosophers often like to present themselves as competitors for the best ideas. But if we track value through recognition, there is no real competition between ideas.

Now if this is the case, why don’t we see it? My answer is that we don’t recognise it because there are competing big shots. And the competition between big shots makes us believe that there is diversity. Admittedly, my own evidence is anecdotal. But how could it not be? When I started out as a medievalist, the thing to do to get recognition was to prepare a critical edition of an obscure text. So I learned a number of strange names and techniques in this field. However, outside of my small world this counted for, say, not much. And when the German Research Foundation (DFG) stopped funding such projects, a lot of people were out of a job. Moving on to other departments, I quickly learned that there was a different mainstream, and that mainstream didn’t favour editions or work on obscure texts. Instead, you could make a move by writing on a canonical figure whose works were already edited. Just join some debate. Still further outside of that context, you might realise that people don’t value history of philosophy anyway. But rather than seeing such different approaches as complementary, we are incentivised to compete for getting through with one of these approaches.

However, while competition might nourish the illusion of diversity, the competition for financial resources ultimately blocks diversity because it tends towards a single winner. And the works and books that don’t follow the patterns established in such competitions seem to fall through the cracks. There is more evidence, of course, once we take an international perspective: there are people who write whole PhD dissertations that will never be recognised outside of their home countries. So they have to move to richer countries and write a second PhD to have any chance on the international market. In theory, we should expect such people to be the best-trained philosophers around: they often have to familiarise themselves with different approaches and conventions, often speak several languages, and are used to different institutional cultures. But how will we evaluate their ideas? Will they have to write a bit like David Lewis, or at least cite him sufficiently, in order to get recognition?

Now you might want to object that I’m conflating cause and effect. While I say that we assign value because of prestige, you might argue that things are the other way round: we assign prestige because of value. – If this were the case, I would want to see some effort to at least assess the ideas of those who don’t align their work with prestigious figures. But where do we find such ideas? For reasons stated above, my guess is that we don’t find them in the prestigious departments and journals. So where should we look for them?

My hunch is that we’ll find true and worthwhile diversity in the lesser-known departments and journals. So please begin: listen to the students who don’t speak up confidently, read the journals and books from publishers whose names you don’t recognise. Listen to people whose native language isn’t English. And stop looking for ideas that sound familiar.

Abstract cruelty. On dismissive attitudes

Do you know the story about the PhD student whose supervisor overslept and refused to come to the defence, saying he had no interest in such nonsense? – No? I don’t know it either, by which I mean: I don’t know exactly what happened. However, some recurrent rumours have it that on the day of the PhD student’s defence, the supervisor didn’t turn up and was called by the secretary. After admitting that he overslept, he must indeed have said that he didn’t want to come because he wasn’t convinced that the thesis was any good. Someone else took over the supervisor’s role in the defence, and the PhD was ultimately conferred. I don’t know the details of the story but I have a vivid imagination. There are many aspects to this story that deserve attention, but in the following I want to concentrate on the dismissive attitude of the supervisor.

Let’s face it, we all might oversleep. But what on earth brings someone to say that they are not coming to the event because the thesis isn’t any good? The case is certainly outrageous. And I keep wondering why an institution like a university lets a professor get away with such behaviour. As far as I know the supervisor was never reprimanded, while the candidate increasingly went to bars rather than the library. I guess many people can tell similar stories, and we all know about the notorious discussions around powerful people in philosophy. Many of those discussions focus on institutional and personal failures or power imbalances. But while such points are doubtlessly worth addressing, I would like to focus on something else: What is it that enables such dismissive attitudes?

Although such and other kinds of unprofessional behaviour are certainly sanctioned too rarely, we have measures against them in principle. Oversleeping and refusing to fulfil one’s duties can be reprimanded effectively, but what can we do about the most damning part of it: the dismissive attitude according to which the thesis was just no good? Of course, using it as a reason to circumvent duties can be called out, but the problem is the attitude itself. I guess that all of us think every now and then that something is so bad that, at least in principle, it isn’t worth getting up for. What is more, there is in principle nothing wrong with finding something bad. Quite the contrary, we have every reason to be sincere interlocutors and call a spade a spade, and sometimes this involves severe criticism.

However, some cases do not merely constitute criticism but amount to acts of cruelty. But how can we distinguish between the two? I have to admit that I am not entirely sure about this, but genuine criticism strikes me as an invitation to respond, while in the case under discussion the remark about the quality of the thesis was given as a reason to end the conversation.* Ending a conversation or dismissing a view like that is cruel. It leaves the recipient of the critique with no means to answer or account for their position. Of course, sometimes we might have good reasons for ending a conversation like that. I can imagine political contexts in which I see no other way than turning my back on people. But apart from the fact that a doctoral defence shouldn’t be such an occasion, I find it suspicious if philosophers end conversations like that. What is at stake here?

First of all, we should note that this kind of cruelty is much more common than meets the eye. Sure, we rarely witness a supervisor refusing to turn up for a defence. But anyone sitting in on seminars, faculty talks or lectures will have occasion to see that sometimes criticism is offered not as an invitation for response, but as a dismissal that is only thinly disguised as an objection. How can we recognise such a dismissal? The difference is that an opinion is not merely criticised but declared a waste of time. This and other such slogans effectively end a conversation. Rather than addressing what one might find wanting, the opponent’s view is belittled and portrayed as not worth taking seriously. As I see it, such speech acts are acts of cruelty because they are always (even if tacitly) ad hominem. The conjunction of critical remarks and ending the conversation shows that it is not merely the opinion that is rejected but that there is no expectation that the argument could be improved by continuing the conversation. In this sense, ending a conversation like that betrays a severe lack of charity, ultimately dismissing the opponent as incapable or even irrational.

You would think that such behaviour gets called out quickly, at least among philosophers. But the problem is that this kind of intellectual bullying is actually rather widespread: whenever we say that an opinion isn’t worth listening to, when we say, for instance, that analytic or continental philosophy is just completely wrongheaded or something of the kind, we are at least in danger of engaging in it.** Often this goes unnoticed because we move within circles that legitimise such statements. Within such circles we enjoy privilege and status; outside them, our positions are belittled as a waste of time. And the transition from calling something bad to calling it a waste of time is rather smooth if no one challenges such a speech act.

Having said as much, you might think I am rather pessimistic about the profession. But I am not. In fact I think there is a straightforward remedy. Decouple criticisms from ending conversations! But now you might respond that sometimes a conversation cannot continue because we really do not share standards of scholarship or argument. And we certainly shouldn’t give up our standards easily. – I totally agree, but I think that rather than being dismissive we might admit that we have a clash of intuitions. Generally speaking, we might distinguish between two kinds of critical opposition: disagreements and clashes of intuition. While disagreements are opposing views that can be plotted on a common ground, clashes of intuition mark the lack of relevant common ground. In other words, we might distinguish between internal and external criticism, the latter rejecting the entire way of framing an issue. I think that it is entirely legitimate to utter external criticism and signal such a clash. It is another way of saying that one doesn’t share sufficient philosophical ground. But it also signals that the opposing view might still deserve to be taken seriously, provided one accepts different premises or priorities.*** Rather than bluntly dismissing a view because one feels safeguarded by the standards of one’s own community, diagnosing a clash respects that the opponent might have good reasons and ultimately engages in the same kind of enterprise.

The behaviour of the supervisor who overslept is certainly beyond good and evil. Why do I find this anecdote so striking? Because it’s so easy to call out the obvious failure on the part of the supervisor. It’s much harder to see how we or certain groups are complicit in legitimising the dismissive attitude behind it. While we might be quick to call out such brutality, the damning dismissive attitude is more widespread than meets the eye. Yet it could be remedied by admitting to a clash of intuitions, though that requires some careful consideration of the nature of the clash and perhaps the decency of getting out of bed on time.

_____

* This post by Regina Rini must have been at the back of my mind when I thought about conversation-enders; not entirely the same issue, but a great read anyway.

** A related instance is calling a contemporary or a historical view “weird”. See my post on relevance and othering.

*** Examples of rather respectable clashes are dualism vs. monism or representationalism vs. inferentialism. The point is that the debates run into a stalemate, and picking a side is a matter of decision rather than argument.

History without narratives? A response to Alex Rosenberg

Recently, Martin Kusch gave an intriguing keynote lecture on the development of the sociology of knowledge. I was particularly interested in the role of Steinthal, whose name I recognised from my studies in linguistics and its history. But what was striking was that the lecture combined several levels of explanation. In addition to reconstructing philosophical arguments, Martin Kusch gave detailed insights into the institutional and political events that shaped the development. In other words, the lecture provided a nuanced combination of what is sometimes called historical and rational reconstruction. During the discussion I asked whether he thought that there was one particular level which decided the course of events. “Where do you think the real action took place, in politics or philosophy?” The answer was a succinct lesson in historical methodology: the quest for one decisive level of explanation is deceptive in itself. It suggests mono-causality. In fact, all the different factors have to be seen in conjunction. Real action takes place at every level. (By the way, I think this line of argument offers one of the best reasons why philosophy is inseparable from history.) A few days ago, I was reminded of this idea when reading an interview with Alex Rosenberg, who thinks that certain levels of explanation should be discarded and argues for a history without narratives, because “narrative history is always, always wrong.”

According to Rosenberg, narratives are ways of making sense of events by referring to people’s beliefs and desires. “Had she not wanted x, she would not have done y. Erroneously, she believed that y would help her in getting x.” We engage in this sort of reasoning all the time. It presupposes a certain amount of folk psychology: ascribing beliefs and desires seems to require that these items really figure in a proper chain of events. But do they even exist, one might ask. – Now we also help ourselves to such explanations in history. Stuff happens. Explaining it sometimes requires us to assume minds, especially when humans are involved. Let’s call this approach folk history. (Note that Rosenberg is targeting “theory of mind” approaches in particular, but for the application to history the specifics of these approaches don’t matter.) Now Rosenberg gave an interview detailing why we should do away with folk history:

“The problem is, these historical narratives seduce you into thinking you really understand what’s going on and why things happened, but most of it is guessing people’s motives and their inner thoughts. […] [P]eople use narratives because of their tremendous emotional impact to drive human actions, movements, political parties, religions, ideologies. And many movements, like nationalism and intolerant religions, are driven by narrative and are harmful and dangerous for humanity. […] If narrative history gets things wrong because it relies on projection and things we can’t know for sure, how should we be trying to understand history? – There are a lot of powerful explanations in history and social sciences that don’t involve narrative. They involve models and hypotheses that are familiar in structure to the kind that convey explanation in the natural sciences. For example, take Guns, Germs, and Steel, which gives you an explanation of a huge chunk of human history, and that explanation does not rely on theory of mind at all.”

Alex Rosenberg makes a number of good points: (1) Relying on inner states is guesswork. (2) We use it to feed (bad) ideologies. (3) There are other means of writing history, not involving folk history. (4) Given the choice, we should confine ourselves to the latter approach. Let’s call this latter approach naturalistic history. I think there is a lot that speaks in favour of such an approach. If you read some Spinoza, Hume, Nietzsche or Freud, you’ll find similar ideas. We assume our thinking follows all these noble patterns of inference when in fact we are driven by motives and associations unknown to us. That said, the way Alex Rosenberg presents this naturalistic approach raises a number of concerns, two of which I would like to address now.

  • The first worry concerns (4), i.e. the conclusion that folk history and naturalistic history should be played off against one another. Just as we need the “intentional stance” in the philosophy of mind, we also need it in history. But that’s not the whole story. Our reference to beliefs and desires does not only figure in historical explanations. It is also the very stuff we are interested in qua being human amongst other humans, and thus it shapes the events we want to explain. I take part in causing events because I ascribe mental states to others: I don’t sing in the library because I assume that it would annoy my fellow readers. Of course you can explain many of my actions by reference to biological and other factors. But at some point such explanations would have to invoke my ascriptions. Doing away with that level would mean doing away with a crucial part of the explanans. Playing off these levels against one another is like thinking that there is ultimately just one relevant explanatory level.
  • The second worry concerns (2), i.e. the tenet that narratives are the stuff of ideologies (and thus erroneous and to be avoided). While it is true that ideologies are fed by certain narratives, I know of no way to refer to (historical) data without a narrative. The naturalistic approach is not avoiding narratives tout court; it merely avoids a certain kind of narrative. It replaces the folk historical approach with a naturalistic narrative. Pretending that this is tantamount to avoiding all narratives is to suggest that the raw data of history are just there, waiting to be picked up by the disenchanted historian. In other words, I think that Rosenberg’s suggestion falls prey to a variant of the myth of the given. To say that narratives are “always wrong”, then, seems to be a category mistake. As I see it, narratives as such are neither right nor wrong. Rather, they provide frameworks that enable us to call individual statements right or wrong.

Since I have not read the book advertised in the interview, I don’t yet know whether this is the whole story. But then, who am I to try and tell that story by referring to beliefs and other mental states expressed in the book by Alex Rosenberg?

On relevance and othering

Do you remember talking about music during your school days? There was always someone declaring that they would only listen to the latest hits. Talking to philosophers, I occasionally feel transported back to these days: especially when someone tells me that they have no time for history and will only read the latest papers on a topic. “What do I care what Brentano said about intentionality! I’m interested in current discussions.” Let’s call this view “currentism”. I sometimes experience versions of this currentist attitude in exams. A student might present an intriguing reconstruction of a medieval theory of matter only to be met with the question: “Why would anyone care about that today?” I have to admit that I sometimes find this attitude genuinely puzzling. In what follows I’d like to explain my puzzlement and raise a few worries.

Why only “sometimes”? I say “sometimes” because there is a version of this attitude that I fully understand. Roughly speaking, there is a descriptive and a normative version of that sentiment. I have no worries about the descriptive version: some people just mean to say what they focus on or indicate a preference. They are immersed in a current debate. Given the constraints of time, they can’t read or write much else. That’s fine and wholly understandable. In that case, the question of why one would care might well be genuine and certainly deserves an answer. – The normative version is different: people endorsing the normative attitude mean to say that history of philosophy is a waste of time and should be abolished, except perhaps in first-year survey courses. Now you might say: “Why are you puzzled? Some people are just more enthusiastic in promoting their preferences.” To this I reply that the puzzlement and worries are genuine because I find the normative attitude (1) unintelligible and (2) politically harmful. Here is why:

(1) My first set of worries concerns the intelligibility of this attitude. Why would anyone think that the best philosophy is being produced during our particular time slice? I guess that the main reason for (normatively) restricting the temporal scope of philosophy to the last twenty or fifty years is the idea that the most recent work is indeed the best philosophy. Now why would anyone think that? I see two possible reasons. One might think so because one believes that philosophy is tied to science and that the latest science is the best science. Well, that might be, but progress in science does not automatically carry over to philosophy. The fact that I write in the presence of good science doesn’t make me a good philosopher.

So if there is something to that idea, people will ultimately endorse it for another reason: because there might be progress in philosophy itself. Now the question whether there really is progress in philosophy is of course hotly debated. I certainly don’t want to deny that there have been improvements, and I continue to hope for more of them. But especially if we assume that progress is an argument in favour of doing contemporary philosophy (and what else should we do, even if we do history!), how can someone not informed about history assess this progress? If I have no clue about the history of a certain issue, how would I know that real advancements have been made? In other words, the very notion of progress is inherently historical and requires at least some version of (whig) history. So unless someone simply holds the belief that recent developments are always better, one needs historical knowledge to make that point.

Irrespective of questions concerning progress, one might still endorse current over historical philosophy because it is relevant to current concerns. So yes, why bother with medieval theories of justice when we can have theories that invoke current issues? Well, I don’t doubt that we should have philosophers focussing on current issues. But I wonder whether current issues are intelligible without references to the past. Firstly, there is the fact that our current understanding of justice or whatever is not a mere given. Rather, it is the latest stage of a development over time. Arguably, understanding that development is part of understanding the current issues. Now you might object that we should then confine ourselves to writing genealogies of stuff that is relevant today but not of remote issues (such as medieval theories of, say, matter). To this I reply that we cannot decide what does and doesn’t pertain to a certain genealogy in advance of historical studies. A priori exclusion is impossible, at least in history. Moreover, we cannot know whether what we find irrelevant today will still be irrelevant tomorrow. In other words, our judgments concerning relevance are subject to change and cannot be used to exclude possible fields of interest. To sum up, ideas of progress and relevance are inherently historical and require historical study.

(2) However, the historicity of relevance doesn’t prevent its being abused in polemical and political ways. Besides worries about intelligibility, then, I want to raise political and moral worries against the normative attitude of currentism. Short of sound arguments from progress or relevance, the anti-historical stance reduces to a form of othering. Just as some people suffer exclusion and are labelled “weird” for reasons regarding stereotypes of race or gender, people are excluded for reasons of historical difference. But we should think twice before calling a historically remote discussion less rational or relevant or whatever. Of course, there is a use of “weird” that is simply shorthand for “I don’t understand the view”. That’s fine. What I find problematic is the unreflective dismissal of views that don’t fit into one’s preferences. But the fact that someone holds a view that does not coincide with today’s ideas about relevance deserves study rather than name-calling. As I see it, we have moral reasons to refrain from such forms of abuse.

If we don’t have reasons showing that a historical view has disadvantages over a current one, why do we call it “weird” or “irrelevant”? Here is my hunch: it’s a simple fight over resources. Divide et impera! But in the long run, it’s a lose-lose situation for all of us. Yet if you’re a politician and you manage to play off different sub-disciplines in philosophy or the humanities against one another, you can simply stand by until they’ve delegitimised each other so much that you can call all camps a farce and close down their departments.

What is philosophy? A response to Laura Georgescu

Some years ago, I began to make a habit of telling students what I think philosophy is. Not in order to tell them the one and only Truth, but in order to clarify what I look out for in papers and interactions. So I will say something like the following: “Philosophy is a huge and ongoing conversation. Thus, philosophy is not directly about phenomena; rather, it deals with claims about phenomena. So you won’t ask ‘What is thinking or justice?’ Rather, you will deal with what you or other people say about thinking or justice, etc.” This is normally followed up by some explanation of how to identify claims and arguments (or other kinds of evidence) in relation to such claims. Finally, I try to explain that evaluating a position does not mean saying that one is for or against it, but spelling out in what way and how convincingly the arguments support the claim.

Recently, I’ve grown a bit wary of saying such things. Why? Of course, I like to think about philosophy that way because it highlights the fact that it’s a conversational practice where certain rules of discourse apply. And sometimes it also stops people from doing their head in about the phenomena themselves. If you have to write a paper it’s easier to think about the things people say than to ask yourself what consciousness really is. But on the other hand it sends the message that philosophy is all about making claims. Now what’s wrong with that? In a way, not much. But then again it seems to have things the wrong way round. One might even think that we are mainly trained to question others rather than our own beliefs. But in fact a claim is something you might come to after a lot of thinking, talking and doubting yourself. A claim is not one’s starting point, or is it?

I couldn’t quite put my finger on it before reading Laura Georgescu’s recent blog post “Against Confidence in Opinions”. Go and read it! I’ll just give you the main take-home message it had for me: like much of the world, academic philosophy is infected with the idea that it’s a good thing to have confidence. Especially in the rightness of one’s opinions. So it’s all about defending one’s claims. But what is the outcome of that? Something most of us would claim not to find desirable: dogmatism. So there’s a cultivation of confidence and it leads to dogmatism. This nailed it for me. I realised that what I found problematic was that people were too often invested in defending their positions rather than questioning them. If you look at philosophers, you might want to distinguish two types: the one who is self-undermining and asking questions all the time as opposed to the confident one, tossing out one claim after another. The latter type seems to abound. – But as much as I like to believe in such a distinction, I doubt that it holds. So what now?

I recently said that advising someone to work on their confidence is cold comfort. Nor do I think that we can just ignore this culture. So let’s think about what precisely might be undesirable about it. When I remember my student days, I remember myself admiring many of my fellow students for their confidence. They were speaking up, eloquently so, while I was running through possible formulations of an idea and remained doubtful whether I had a point at all. That feeling remained for a very long time. After the PhD and Habilitation it got better, but whenever I went out of my scholarly comfort zone, I felt I had no points to make. There is a kind of confidence that depends on having a feeling of legitimacy, and I often think getting a permanent job helps a lot with that feeling. – So now that I feel confident enough to write blog posts about what philosophy is, I should start preaching that confidence is a bad thing? That doesn’t sound convincing to me. So what precisely is wrong with it?

First of all, there is a lot right with it. It helps in getting through the day in all sorts of ways. But as Laura Georgescu emphasises, it’s confidence in opinions that is troublesome. How then can we prevent dogmatism without giving up on being confident entirely?

I think it might help to come back to the idea of philosophy as a conversational practice and to distinguish two kinds of conversation: an internal conversation that one has with oneself (Plato called this “thinking”) and the external conversation that one has with others. When we see external conversations happening between people, we often hear someone asking a question and someone else responding with a claim. Further questions ensue and the claim gets defended. What we observe are two types: the questioner and the one defending claims. This is what we often witness in philosophy talks, and our paper structures imitate that practice, mostly with the author as the one making claims. The upshot is that, in papers and talks, we often play just one of two or more possible roles. That might be undesirable.

However, if we focus on internal conversations, we find that we in fact do both. The claims we pin down come after a lot of self-undermining back and forth. And the confidence we can muster might be the last resort to cover the relentless doubting that goes on behind our foreheads. In our internal conversations, I guess, most of us are far from any kind of dogmatism.

I suppose, then, if we see reason to change the practice of coming across as dogmatic, a good start might be to bring some of that internal conversation to bear on external conversations. Rather than playing sceptic versus dogmatist, we might every now and then remember that, most of the time, we just do what Plato called thinking. Having a dialogue in which we take on all sorts of roles and propositional attitudes. Bring it on! But I guess it takes some confidence.