Some observations concerning diversity (Part I)

(1) This morning, I took my daughter Hannah to a nice playground in Berlin. She is just over two years old and particularly enjoys climbing. As we approached the climbing frame, one of the boys on top shouted that they wouldn’t allow girls on the climbing frame. The girls around him smiled …

(2) Two years ago, I joined an initiative designed to keep a far-right party (AFD) out of parliament. The initiative was founded by a group of social entrepreneurs, and the first workshop was attended mainly by social entrepreneurs, most of them in their late twenties or early thirties. At the beginning, we were asked whether we actually knew anyone who would vote for the AFD. Only one man, somewhat older than most, said yes. He told us that he knew a few people in the village he came from. Later in the day, we had to form smaller groups. Guess who had difficulty finding a group …

(3) Quite a number of years ago, while still a postdoc, I was on a hiring committee. For the first round of going through the applications, we were advised to focus on CVs. Discussing the candidates, I listened to a number of meritocratic arguments: they mostly, but not solely, concerned numbers of “good” publications (“good” meaning in prestigious venues, in case you were wondering). Then someone pointed out that we should perhaps take more time for reading. Someone else suggested that we scan the cover letters for explanations of interruptions in the CVs, such as time spent on child care. Guess what didn’t carry any weight when we drew up the list …

I could go on. These are minor observations; anyone could have made them. Nevertheless, in all their banality, these stories of exclusion form the daily fabric of our failures. But where exactly can we locate the failures? The people involved in these situations were not bad people, not at all: lively children on a playground, ambitious young entrepreneurs fighting for a good cause, a hiring committee focussed on selecting the best candidate. The events didn’t result in catastrophes: Hannah climbed up anyway, the initiative went nowhere, someone certainly deserving got a job. – The point is that these situations are so ordinary that hardly anyone would consider causing, or be willing to cause, much of a fuss. Or so it seems. Yet the harm might be done: the next little girl might be discouraged from going to the playground; the next lonely man from the village will perhaps find better company among the far right; a number of candidates, who would have been equally deserving even by the committee’s standards, were not even considered. Even if no one tells these stories, the exclusion goes on. But what can we do?

The first thing to note is that in all these situations there was a possibility of making a difference. Contrary to a widespread belief, we are not powerless minions. The fact that we cannot save the whole world doesn’t mean that we cannot change our ways here and there, or listen for a change. But there are two things that make me doubtful:

  • Despite all the fashionable talk of diversity, such stories suggest that many of us don’t really want diversity. What these situations reveal is that exclusionary attitudes are very deeply ingrained.
  • Even if such attitudes can be overcome, I sense a sort of paranoia amongst many people in positions of public leadership. If you look at more or less well-known figures in politics, such as presidents, prime ministers, chancellors etc., many of them display a fear of everything that is different. But while it’s easy to call them names for it, we shouldn’t forget that they get elected by majorities.

It’s complicated. The exclusions we see are not ad hoc, but fostered by long-standing structures. There’s much in this that I dislike or even hate, while I continue to be a beneficiary in some way or another. That is probably true of many of us. Thus, the charge of hypocrisy always seems justified. But if that’s true, the charge shouldn’t cause too much worry either. The fear of being called a hypocrite might be paralysing. What shouldn’t be forgotten is that hypocrisy is structural, too. It can’t be amended by targeting individuals. Everyone who is trying to interfere and make a difference can be called a hypocrite.

Why is the Wall still dividing us? The persistence of the intellectual cold war

Discussing some ideas relating to my piece on the divide between Eastern and Western Europe, I was struck by the following question: “How is it possible that the Wall separating East and West is still so successful?” It’s true! While I thought that 1989 marked a historical turning point that initiated an integrative process, this turns out to be a cherished illusion of mine. As my interlocutor pointed out, the Wall is still successful. It continues to keep people apart. OK, I wanted to object, there is still an economic divide, and that takes time to close, of course. “No”, she retorted, “it’s not only economic, it’s also intellectual.” So there it was again: the cold war attitude continues, also intellectually. What does that mean?

To keep things simple, I’ll focus on Germany. Let’s begin with my illusion. Having grown up in a small town near Düsseldorf, I felt the Wall was far away. Apart from a very few visits from relatives, I had no contacts and not much of an idea of the former GDR. Although I was quite moved by the fall of the Wall, it didn’t seem to affect my life directly. That changed when I moved to Budapest for one and a half years in 1994. I realised that my education had failed me. I knew hardly anything about the history of Hungary or Eastern Europe, not even about East Germany. Nevertheless, I was convinced that a process of integration was now under way. Europe would grow together. Slowly but surely; it had to. Did it?

East Germany is still a thing. That is strange in itself. It shouldn’t be. After all, the Wall has now been down slightly longer than it was up. It was built in 1961 and came down in 1989. Since then, 30 years have passed. But it must be liberating to think that neo-nazism, for instance, is not a problem in Germany but in East Germany. The East is a perfect scapegoat in several domains. The narrative is dominated by the economic imbalance and the idea that people from the (former) East lack democratic education. The upshot of the narrative seems to be this: East Germans have a different mentality. The former generations were communists or dominated by communists. Their offspring are neo-nazis or indoctrinated by neo-nazis. You get the idea…

Of course, many biographies have been upset one way or another. Of course, the situation is complex, but after looking at some figures, the crucial problems don’t strike me as owing primarily to a difference in mentality. This doesn’t mean that there is no problem of neo-nazism. But how did things start moving in the East? The former GDR seems to have been “colonised”, as some were putting it as early as 1997. What were the effects in academia? Today, I read an article with the headline that “men lead three quarters of German universities”. Yes, I can imagine that. What the headline didn’t state, but what I found among the figures, is that “no rector or president” originates from East Germany. OK, not every academic biography culminates in leading a university. But still: reading this, I was reminded of stories about “liquidation”. So I looked up some more figures: “By 1994, over 13,000 positions at universities in east Germany had been liquidated, and 20,000 more people (including 5,000 professors) had lost their jobs. Many teachers and academics were fired ‘despite the fact that they had proven professional qualifications.’” It’s hard to believe that these measures did anything to facilitate integration, let alone intellectual integration.

At this point, it doesn’t look like the divisive East-West narratives will lose their grip anytime soon. But why not? I guess that the colonisation carried out as part of the unification process will continue to be justified. It needs to be justified to cover up the crimes committed. But how can such things be justified? How can you justify having sacked a huge number of people and taken over many of their positions? Again, it will continue to be blamed on the mentality: If they complain about the past, you can blame it on their mentality. The lack of democratic education left them unfit for such responsibilities, and now they are being nostalgic.* If they complain about their situation now, you can still blame it on the mentality. Which mentality? Well, they are still lacking democratic education and cannot understand that the state doesn’t have the duty to coddle them in their unemployment. Whichever way you see it, they just turn out to be incapable. So taking over responsibilities was and is justified.

It goes without saying that the foregoing verges on a black-and-white picture. The idea is not to explain such complex political developments in a blog post. The idea is to gesture towards an answer to the question of why the Wall continues to be so effective, even though it came down 30 years ago. What is the ideology that is so persistent? I truly don’t know. Perhaps the ideology of those erecting the Wall is still or again at work. Or the very idea of a Wall, of a divide, is more convenient than we want to believe. For it can serve to justify differences and injustices even after it has been torn down. In any case, the reference to “mentality” strikes me as a highly problematic form of moral exclusion.

So perhaps the Wall continues to serve as a justificational loop. How does it work? The people blaming each other will most likely not assume that they are just pointing to a supposed mentality in order to justify an injustice. They will take their beliefs (about communism, nazism, capitalism etc.) to be true of those accused. But they can plausibly do so only as long as the justificational mechanism of the Wall keeps them apart.

____

* Thinking about (n)ostalgia, one might also consider the more sinister explanation that the cherished memories do not simply reach back to the time after 1961, but further back to the times after 1933. In other words, if you want to explain the far-right movement by reference to mentality, you might look to a mentality that spans East and West Germany, nourished well before the Wall was erected.

History of contemporary thought and the silence in Europe. A response to Eric Schliesser

What should go into a history of contemporary ideas or philosophy? Of course, this is a question that is tricky to answer for all sorts of reasons. What makes it difficult is that we tend to think mostly of canonical figures and begin to wonder which of those will be remembered in a hundred years. I think we can put an interesting spin on that question if we approach it in a more historical way. How did our current thoughts evolve? Who are the people who really influenced us? There will not only be people whose work we happen to read, but also those who interact or have interacted with us directly: our teachers, fellow students, friends and opponents. You might not think of them as geniuses, but we should drop that category anyway. These are likely the people who really made a difference to the way you think. So let’s scratch our heads a bit and wonder who gave us ideas directly. In any case, they should figure in the history of our thought.

You might object that these figures would not necessarily be recognised as influential at large. However, I doubt that this is a good criterion: our history is not chiefly determined by those we take to be generally influential, but more often than not by the people we actually speak to. If not, why would historians bother to identify the real interlocutors in letters and the like? This means that encounters between a few people might make quite a difference. You might also object that a history of contemporary philosophy is not about you. But why not? Why should it not include you at least? What I like about this approach is that it also serves as a helpful corrective to outworn assumptions about who is canonical. Because even if certain figures are canonical, our interpretations of canonical figures are strongly shaped by our direct interlocutors.

Thinking about my own ideas in this way is a humbling experience. There are quite a number of people inside and outside my department to whom I owe many of my ideas. But this approach also reveals some of the conditions, political and other, that allow for such influence. Of one such condition I am painfully reminded when observing the current political changes in Europe. No, I do not mean Brexit! Although I find these developments very sad and threatening indeed, most of the work done by friends and colleagues in Britain will reach me independently of those developments.

But Central and Eastern Europe is a different case. As it happens, the work that has affected my own research most in recent years is on the history of natural philosophy. It’s more than a guess when I say that I am not alone in this. Amongst other things, it made me rethink our current and historical ideas of the self. Given that quite a number of researchers who work on this happen to come from Central and Eastern Europe, much of this work probably wouldn’t have reached me, had it not been for the revolutions of 1989. This means that my thinking (and most likely that of others, too) would have been entirely different in many respects, had we not seen the Wall come down and communist regimes overthrown.

Why do I bring this up now? A brief exchange following up on an interesting post by Eric Schliesser* made it obvious that many Western Europeans, by and large, seem to assume that the revolutions of 1989 have had no influence on their thought. As he puts it, “the intellectual class kind of was conceptually unaffected” by them. And indeed, if we look at the way we cite and acknowledge the work of others, we regularly forget to credit many, if not most, of our interlocutors from less prestigious places. In this sense, people in what we call the Western world might be inclined to think that 1989 was of no significance in the history of thought. I think this is a mistake, one arising from the canonical way of thinking about the work that influences us. Instead of acknowledging the work of the individuals who actually influence us, we continue citing the next big shot whom we take to be influential in general. By leaving out the direct impact of our actual interlocutors, we make real impact largely invisible. Intellectually, the West behaves as if it were still living in Cold War times. But the fact that we continue to ignore or shun the larger patterns of real impact since 1989 does not entail that this impact is not there. Any claim to the contrary would, without further evidence at least, amount to an argument from ignorance.

The point I want to make is simple: we depend on other people for our thought. We need to acknowledge this if we want to understand how we come to think what we think. The fact that universities are currently set up like businesses might make us believe that the work people do can (almost) equally be done by other people. But this is simply not true. People influencing our thought are always particular people; they cannot be exchanged salva veritate. If we care about what we think, we should naturally care about the origin of our thought. We owe it to particular people, even if we sometimes forget the particular conversations in which our ideas were triggered, encouraged or refuted.

Now if this is correct, then it’s all the more surprising that we so easily let go of the conditions enabling much of this exchange in Europe. How is it possible, for instance, that most European academics remain quiet in the face of recent developments in Hungary? We have witnessed the CEU being forced to move to Vienna in an unprecedented manner, and now the Hungarian Academy of Sciences is being targeted.**

While the international press reports every single remark (no matter how silly) that is made in relation to Brexit, and while I see many academics comment on this or that aspect (often for very good reasons), the silence after the recent events in Hungary is almost deafening. Of course, Hungary is not alone in this. Academic freedom is now under attack in many places inside and outside Europe. If we continue to let it happen, the academic community in Europe and elsewhere will disintegrate very soon. But of course we can continue to praise our entrepreneurial spirit in the business park of academia and believe that people’s work is interchangeable salva veritate; we can continue talking to ourselves, listening diligently to our echoes, and making soliloquies a great genre again.

____

* See also this earlier and very pertinent post by Eric Schliesser.

** See also this article. And this call for support.

How we unlearn to read

Having been busy with grading again, I noticed a strange double standard in our reading practice and posted the following remark on Facebook and Twitter:

A question for scholars. – How can we spend a lifetime on a chapter in Aristotle and think we’re done with a student essay in two hours? Both can be equally enigmatic.

Although it was initially meant as a joke of sorts, it got others and me thinking about various issues. Some people rightly pointed out that we mainly set essay tasks for the limited purpose of training people to write; others noted that they are expected to take even less than two hours (some take as little as 10 minutes per paper). Why do we go along with such expectations? Although our goals in assigning essays might be limited, the contrast to our critical and historical engagement with past or current texts of philosophers should give us pause. Let me list two reasons.

Firstly, we might overlook great ideas in contributions by students. I am often amazed at how some students manage to come up with all the crucial objections and replies to certain claims within 20 minutes, while these considerations took perhaps 20 years to evolve in the historical setting. Have them read, say, Putnam’s twin earth thought experiment and listen to all the major objections passing by in less than an hour. If they can do that, it’s quite likely that their work contains real contributions, too. But we’ll only notice those if we take our time and dissect sometimes clumsy formulations to uncover the ideas behind them. I’m proud to have witnessed quite a number of graduate students who have developed highly original interpretations and advanced discussions in ways that I didn’t dream of.

Secondly, by taking comparatively little time we send a certain message, both to our students and to ourselves. On the one hand, such a practice might suggest that their work doesn’t really matter. If that message is conveyed, then the efforts on the part of the students might be equally low. Some students have to write so many essays that they don’t have time to read. And let’s face it, grading essays without proper feedback is equally a waste of time. If we don’t pay attention to detail, we are ultimately undermining the purpose of philosophical education. Students write more and more papers, while we have less and less time to read them properly. Like a machine running blindly, mimicking educational activity. On the other hand, this way of interacting with and about texts will affect our overall reading practice. Instead of trying to appreciate ideas and think them through, we just look for cues of familiarity or failure. Peer review is overburdening many of us in similar ways. Hence, we need our writing to be appropriately formulaic. If we don’t stick to certain patterns, we risk that our peers miss the cues and think badly of our work. We increasingly write for people who have no time to read, undermining engagement with ideas. David Labaree even claims that it’s often enough to produce work that “looks and feels” like a proper dissertation or paper.

The extreme result is an increasing mechanisation of mindless writing and reading. It’s not surprising that hoaxes involving automated or merely clichéd writing get through peer review. Of course, this is not true across the board. People still write well and read diligently. But the current trend threatens to undermine educational and philosophical purposes. An obvious remedy would be to improve the student-teacher ratio by employing more staff. In any case, students and staff should write less, leaving more time to read carefully.

___

Speaking of reading, I’d like to thank all of you who continue reading or even writing for this blog. I hope you enjoy the upcoming holidays, wish you a very happy new year, and look forward to conversing with you again soon.

Philosophical genres. A response to Peter Adamson

Would you say that the novel is a more proper literary genre than poetry? Or would you say that the pop song is less of a musical genre than the sonata? To me these questions make no sense. Both poems and novels form literary genres; both pop songs and sonatas form musical genres. And while you might have a personal preference for one over the other, I can’t see a justification for privileging one over the other in principle. The same is of course true of philosophical genres: A commentary on a philosophical text is no less of a philosophical genre than the typical essay or paper.* Wait! What?

Looking at current trends that show up in publication lists, hiring practices, student assignments etc., articles (preferably in peer-reviewed journals) are the leading genre. While books still count as important contributions in various fields, my feeling is that the paper culture is beginning to dominate everything else. But what about commentaries on texts, annotated editions and translations, or reviews? Although people in the profession still recognise that these genres involve work and (increasingly rare) expertise, they usually don’t count as important contributions, even in the history of philosophy. I think this trend is highly problematic for various reasons. But most of all, it really impoverishes the philosophical landscape. Not only will it lead to a monoculture in publishing; our teaching of philosophy, too, increasingly focuses on paper production. But what does this trend mean? Why don’t we hold other genres in at least equally high esteem?

What seemingly unites commentaries on texts, annotated editions and translations, and reviews is that they focus on the presentation of the ideas of others. Thus, my hunch is that we think more highly of people presenting their own ideas than of those presenting the ideas of others. In a recent blog post, Peter Adamson notes the following:

“Nowadays we respect the original, innovative thinker more than the careful interpreter. That is rather an anomaly, though. […]

[I]t was understood that commenting is itself a creative activity, which might involve giving improved arguments for a school’s positions, or subtle, previously overlooked readings of the text being commented upon.”

Looking at ancient, medieval and even early modern traditions, the obsession with what counts as originality is an anomaly indeed. I say “obsession” because this trend is quite harmful. Not only does it impoverish our philosophical knowledge and skills, it also destroys a necessary division of labour. Why on earth should every one of us toss out “original claims” by the minute? Why not think hard about what other people wrote for a change? Why not train your philosophical chops by doing a translation? Of course, the idea that originality consists in expressing one’s own ideas is fallacious anyway, since thinking is dialogical. If we stop trying to understand and uncover texts outside of our paper culture, our thinking will become more and more self-referential and turn into a freely spinning wheel… I’m exaggerating, of course, but perhaps only a bit. We don’t even need the medieval commentary traditions to remind ourselves. Just remember that it was, amongst other things, Chomsky’s review of Skinner that changed the field of linguistics. Today, writing reviews or working on editions and translations doesn’t get you a grant, let alone a job. While we desperately need new editions, translations and materials for research and teaching, these works are esteemed more as a pastime or retirement hobby.**

Of course, many if not most of us know that this monoculture is problematic. I just don’t know how we got here so quickly. When I began to study, the work on editions and translations still seemed to flourish, at least in Germany. But it quickly died out, history of philosophy was abandoned or ‘integrated’ into positions in theoretical or practical philosophy, and many people who worked very hard on the texts that are now available in shiny editions are without a job.

If we go on like this, we’ll soon find that no one is able to read or work on past texts. We should then teach our students that real philosophy didn’t begin to evolve before 1970 anyway. Until it gets that bad, I would plead for reintroducing a sensible division of labour, both in research and in teaching. When you plan your assignments next time, don’t just ask your students to write an essay. Why not have them choose between an annotated translation, a careful commentary on a difficult passage, or a review? Oh, of course, they may write an essay, too. But it’s just one of many philosophical genres, many more than I have listed here.

____

* In view of the teaching practice that follows from the focus on essay writing, I’d adjust the opening analogy as follows: Imagine the music performed by a jazz combo consisting solely of soloists, with no rhythm section. And imagine that all music instruction were from now on geared towards soloing only… (Of course, this analogy would capture the skills rather than the genre.)

** See Eric Schliesser’s intriguing reply to this idea.

Why would we want to call people “great thinkers” and cite harassers? A response to Julian Baggini

If you have ever been at a rock or pop concert, you might recognise the following phenomenon: The band on stage begins playing an intro. Pulsing synths and roaring drums build up to an as yet unrecognisable tune. Then the band breaks into the well-known chorus of their greatest hit, and the audience applauds frenetically. People become enthusiastic if they recognise something. Thus, part of the “greatness” is owing to the act of recognising it. There is nothing wrong with that. It’s just that people celebrate their own recognition at least as much as the tune performed. I think much the same is true of our talk of “great thinkers”. We applaud recognised patterns. But only applauding the right kinds of patterns and thinkers secures our belonging to the ingroup. Since academic applause signals and regulates who belongs to a group, such applause has a moral dimension, especially in educational institutions. Yes, you guessed right: I want to argue that we need to rethink whom and what we call great.

When we admire someone’s smartness or argument, an enormous part of our admiration is owing to our recognition of preferred patterns. This is why calling someone a “great thinker” is to a large extent self-congratulatory. It signals and reinforces canonical status. What’s important is that this works in three directions: it affirms the status of the figure, it affirms it for me, and it signals this affirmation to others. Thus, it signals where I (want to) belong and demonstrates which nuances of style and content are of the right sort. The more power I have, the more I might be able to reinforce such status. People speaking with the backing of an educational institution can help build canonical continuity. Now, the word “great” is conveniently vague. But should we applaud bigots?

“Admiring the great thinkers of the past has become morally hazardous.” Thus opens Julian Baggini’s piece on “Why sexist and racist philosophers might still be admirable”. Baggini’s essay is quite thoughtful and I advise you to read it. That said, I fear it contains a rather problematic inconsistency. Arguing in favour of excusing Hume for his racism, Baggini makes an important point: “Our thinking is shaped by our environment in profound ways that we often aren’t even aware of. Those who refuse to accept that they are as much limited by these forces as anyone else have delusions of intellectual grandeur.” – I agree that our thinking is indeed very much shaped by our (social) surroundings. But while Baggini makes this point to exculpate Hume,* he clearly forgets all about it when he returns to calling Hume one of the “greatest minds”. If Hume’s racism can be excused by his embeddedness in a racist social environment, then surely much of his philosophical “genius” cannot be exempt from being explained through this embeddedness either. In other words, if Hume is not (wholly) responsible for his racism, then he cannot be (wholly) responsible for his philosophy either. So why call only him the “great mind”?

Now Baggini has a second argument for leaving Hume’s grandeur untouched. Moral outrage is wasted on the dead because, unlike the living, they can neither “face justice” nor “show remorse”. While it’s true that the dead cannot face justice, it doesn’t automatically follow that we should not “blame individuals for things they did in less enlightened times using the standards of today”. I guess we do the latter all the time. Even some court systems punish past crimes. Past Nazi crimes are still put on trial, even if the system under which they were committed had different standards and is a thing of the past (or so we hope). Moreover, even if the dead cannot face justice themselves, it does make a difference how we remember and relate to the dead. Let me make two observations that I find crucial in this respect:

(1) Sometimes we uncover “unduly neglected” figures. Thomas Hobbes, for instance, was long pushed to the side as an atheist. Margaret Cavendish is another case of a thinker whose work has been unduly neglected. When we start reading such figures again and begin to affirm their status, we declare that we see them as part of our ingroup and ancestry. Accordingly, we try to amend an intellectual injustice. Someone has been wronged by not having been recognised. And although we cannot literally change the past, in reclaiming such figures we change our intellectual past, insofar as we change the patterns that our ingroup is willing to recognise. Now, if we can decide to help change our past in that way, moral concerns apply. It seems we have a duty to recognise figures that have been shunned, unduly by our standards.**

(2) Conversely, if we do not acknowledge what we find wrong in past thinkers, we are in danger of becoming complicit in endorsing and amplifying the impact of certain wrongs or ideologies. But we have the choice of changing our past in these cases, too. This becomes even more pressing in cases where there is an institutional continuity between us and the bigots of the past. As Markus Wild points out in his post, Heidegger’s influence continues to haunt us if those exposing his Nazism are attacked. Leaving this unacknowledged in the context of university teaching might mean becoming complicit in amplifying the pertinent ideology. That said, the fact that we do research on such figures or discuss their doctrines does not automatically mean that we endorse their views. As Charlotte Knowles makes clear, it is important how we relate to or appropriate the doctrines of others. It’s one thing to appropriate someone’s ideas; it’s another thing to call that person “great” or a “genius”.

Now, how do these considerations fare with regard to current authors? Should we adjust, for instance, our citation practices in the light of cases of harassment or crimes? – I find this question rather difficult and think we should be open to all sorts of considerations.*** However, I want to make two points:

Firstly, if someone’s work has shaped a certain field, it would be both intellectually and morally wrong to lie about this fact. But the crucial question, in this case, is not whether we should shun someone’s work. The question we have to ask is rather why our community recurrently endorses people who abuse their power. If Baggini has a point, then the moral wrongs committed in our academic culture are most likely not just the wrongs of individual scapegoats who happen to be found out. So if we want to change that, it’s not sufficient to change our citation practice. I guess the place to start is to stop endowing individuals with the status of “great thinkers” and begin to acknowledge that thinking is embedded in social practices and requires many kinds of recognition.

Secondly, trying to take the perspective of a victim, I would feel betrayed if representatives of educational institutions simply continued to endorse such voices and thus enlarged the impact of perpetrators who have harmed others in that institution. And victimhood doesn’t just mean “victim of overt harassment”. As I said earlier, there are also intellectual victims: voices shunned by trends or systems for various reasons, and only slowly recovered by later generations who wish to amend the canon and change their past accordingly.

So the question to ask is not only whether we should change our citation practices. Rather we should wonder how many thinkers have not yet been heard because our ingroup keeps applauding one and the same “great mind”.

___

* Please note, however, that Hume’s racism was already criticised by Adam Smith and James Beattie, as Eric Schliesser notes in his intriguing discussion of Baggini’s historicism (from 26 November 2018).

** Barnaby Hutchins provides a more elaborate discussion of this issue: “The point is that a neutral approach to doing history of philosophy doesn’t seem to be a possibility, at least not if we care about, e.g., historical accuracy or innovation. Our approaches need to be responsive to the structural biases that pervade our practices; they need to be responsive to the constant threat of falling into this chauvinism. So it’s risky, at best, to take an indiscriminately positive approach towards canonical and non-canonical alike. We have an ethical duty (broadly construed) to apply a corrective generosity to the interpretation of non-canonical figures. And we also have an ethical duty to apply a corrective scepticism to the canon. Precisely because the structures of philosophy are always implicitly pulling us in favour of canonical philosophers, we need to be, at least to some extent, deliberately antagonistic towards them.”

In the light of these considerations, I now doubt my earlier conclusion that “attempts at diversifying our teaching should not be supported by arguments from supposedly different moral status”.

*** See Peter Furlong’s post for some recent discussion.

In facts we trust? More on the war against education

Last week we learned that the Central European University is being forced out of Hungary. While a lot of other bad things have happened since then, this event confused me more than others. Why do people let this happen? And why are acts such as the abolition of Gender Studies met with so little resistance by the scientific community? As far as I’m concerned, universities are global institutions. Expelling a university or abolishing a discipline should worry every democratic citizen in the world. But before I get onto my moral high ground again, I want to pause and understand what it actually is that I find so shocking. Perhaps you all find this trivial, but it’s only beginning to dawn on me that it is trust we’re lacking. So I think the reason that we (or too many of us) let these things happen is that we have lost trust in institutions like universities. Let me explain.

In my last piece, I made some allusions to how this war against education exploits misguided beliefs about the fact-value distinction and postmodernism. I still think so, but I couldn’t really pin down what held my ideas together. I’m not sure I can now, but in focussing on the notion of trust I have more of a suspicion than last week. Loss of trust is widespread: We distrust banks, insurance companies, many politicians, and perhaps even our neighbours. That seems at least sometimes reasonable, because it often turns out that such institutions or their representatives do not act in our interest. And if they say they do, they might be lying. This distrust has probably grown so much that nobody even assumes that certain institutions would act in anyone’s interest but their own.

In this respect, I guess universities and the disciplines taught there are perceived as a mixed bag. On the one hand, the commitments in the sciences and humanities seem to foster a certain trust and respectability. On the other hand, not really or not anymore. I don’t know whether this is owing to the fact that universities are increasingly run like businesses, but the effect is that it’s kind of ok to distrust academics. They might be lobbying like anyone else. They might have an agenda over and above the noble pursuit of truth.

Once you embrace this idea, you immediately see why the bashing of “experts” worked so well during the Brexit campaign. You also understand why conspiracy theories begin to be seen as on a par with scientifically established results. If everyone might just be lobbying, then the very idea of academic expertise is undermined. Just today, I read a comment on Facebook claiming that “we don’t need any higher education, now that we’ve got free access to information via the internet”. You might be shaking your head like me, but once the trust is gone, why should anyone be more respectable than anyone else?

Now, all of this is not yet disheartening enough. And yes, it gets worse. I guess most of us have noticed something to this effect: when we write grant applications or even papers, we actually do engage in lobbying. We say: look, my evidence is better, my thought more thorough, my method more reliable! Once we are used to such tropes, it’s a small step towards saying that STEM fields are better because they deal with facts rather than ideologies, like Gender Studies. This simplistic opposition between facts and ideologies or values is the straw that some scientists clutch at in order to promote their own trustworthiness. But while some might be thinking that they are acting in the name of Truth, they will ultimately be perceived as just another person doing business and praising their goods.

You might want to object that it has always been (a bit) like this. But, I reply, back in the day universities were not perceived as businesses, nor academics as entrepreneurs. The upshot is that the competition for resources in academia makes us claim not only that we are advancing a project; it’s a fine line from there to saying that we are engaged in a different and better and ultimately more trustworthy kind of project than the colleague next door. In saying so, we cross the line from advancement to difference, thereby undermining the common ground on which we stand. We insinuate that we are more trustworthy than our colleagues from different disciplines, and in doing so we undermine the trust that all academics need collectively in order for a university to remain a trustworthy institution.

Our competition and lobbying threaten to undermine the trust in institutions of higher education. Not surprising, then, that politicians who claim just that find so much approval. If academics don’t treat one another as trustworthy, why should universities be trustworthy institutions? Why should politicians even pretend to trust experts, and why should people outside academia trust them? Yes, I know there are answers. But if we want to counter anti-democratic anti-intellectualism, we need to maintain or rather restore trust in universities. And this requires two things: we must stop undermining each other’s reputation; and we must counter the belief that a university is a business just like any other.