You don’t get what you deserve. Part II: diversity versus meritocracy?

“I’m all for diversity. That said, I don’t want to lower the bar.” – If you have been part of a hiring committee, you will probably have heard some version of that phrase. The first sentence expresses a commitment to diversity. The second sentence qualifies it: diversity shouldn’t get in the way of merit. Interestingly, the same phrase can be heard in opposing ways. A staunch defender of meritocracy will find the second sentence (about not lowering the bar) disingenuous. He will argue that, if you’re committed to diversity, you might be disinclined to hire the “best candidate”. By contrast, a defender of diversity will find the first sentence disingenuous. If you’re going in for meritocratic principles, you will just follow your biases and ultimately take the properties of “white” and “male” as a proxy for merit. – This kind of discussion often runs into a stalemate. As I see it, the problem lies in treating diversity and meritocracy as an opposition. I will suggest that the discussion can be more fruitful if we see that diversity is not a property of job candidates but of teams, and thus not to be seen in opposition to meritocratic principles.

Let’s begin with a clarification. I assume that it’s false and harmful to believe that we live in a meritocracy. But that doesn’t mean that meritocratic ideas themselves are bad. If meritocracy is simply taken as the idea that one gets a job based on one’s pertinent qualifications, then I am all for meritocratic principles. However, a great problem in applying such principles is that, arguably, the structure of hiring processes makes it difficult to discern qualifications. Why? Because qualifications are often taken to be indicated by other factors, such as prestige. But prestige, in turn, might be said to correlate with race, gender, class or whatever, rather than with qualifications. At the end of the day, an adherent of diversity can accuse adherents of meritocracy of the same vices that she finds herself accused of. So when merit and diversity are taken as being in opposition, we tend to end up in the following tangle:

  • Adherents of diversity think that meritocracy is ultimately non-meritocratic, racist, sexist, classist etc.
  • Adherents of meritocracy think that diversity is non-meritocratic, racist, sexist, classist etc.*

What can we do in such a stalemate? How can the discussion be decided? Something that typically gets pointed out is homogeneity. The adherent of diversity will point to the homogeneity of people. Most departments in my profession, for instance, are populated with white men. The homogeneity points to a lack of diversity. Whether this correlates with a homogeneity of merit is certainly questionable. Therefore, the next step in the discussion is typically an epistemological one: How can we know whether the candidates are qualified? More importantly, can we discern quality independently of features such as race, gender or class? – In this situation, adherents of diversity typically refer to studies that reveal implicit biases. Identical CVs, for instance, have been shown to be treated more or less favourably depending on the name on the CV. Meritocratists, by contrast, will typically insist that they can discern quality objectively or correct for biases. Again, both sides seem to have a point. We might be subject to biases, but if we leave decisions not to individuals but to, say, committees, then we can perhaps correct for them. At least if these committees are sufficiently diverse, one might add. – However, I think the stalemate will simply be passed on to different levels indefinitely, as long as we treat merit and diversity as an opposition. So how can we move forward?

We try to correct for biases, for instance, by making a committee diverse. While this is a helpful step, it also reveals a crucial feature of diversity that is typically ignored in such discussions. Diversity is a feature of a team or group, not of an individual. The merit or qualification of a candidate is something pertaining to that candidate. If we look for a Latinist, for instance, knowledge of Latin will be a meritorious qualification. Diversity, by contrast, is not a feature to be found in the candidate. Rather, it is a feature of the group that the candidate will be part of. Adding a woman to an all-male team will make the team more diverse, but that is not a feature of the candidate. Therefore, accusing adherents of diversity of sexism or racism is fallacious. Trying to build a more diverse team rather than favouring one category strikes me as a means to counter such phenomena.

Now if we accept that there is such a thing as qualification (or merit), it makes sense to say that in choosing a candidate for a job we will take qualifications into account as a necessary condition. But one rarely merely hires a candidate; one builds a team, and thus further considerations apply. One might end up with a number of highly qualified candidates. But then other questions arise, and it seems apt to consider the composition of the team one is trying to build. None of this means that merit and diversity are opposed to one another.

Nevertheless, prioritising considerations about the team over considerations about the candidates is often met with suspicion. “She only got the job because …” Such an allegation is indeed sexist, because it construes a diversity consideration applicable to a team as the reason for hiring, as if it were the qualification of an individual. But no matter how suspicious one is, qualification and diversity are not on a par, nor can they be opposing features.

Compare: A singer might complain that the choir hired a soprano rather than him, a tenor. But the choir wasn’t merely looking for a singer but for a soprano. Now that doesn’t make the soprano a better singer than the tenor, nor does it make the tenor better than the soprano. Hiring a soprano is relevant to the quality of the group; it doesn’t reflect the quality of the individual.

____

* However, in making such a claim, an adherent of meritocracy will probably rely on the assumption that there is such a thing as “inverted racism or sexism”. In the light of our historical situation, this strikes me as very difficult to argue, at least with regard to institutional structures. It seems like saying that certain doctrines and practices of the Catholic Church are not sexist, simply because there are movements aiming at reform.

You don’t get what you deserve. Part I: Meritocracy in education vs employment relations

The belief that we live in a meritocracy is the idea that people get what they deserve. At school you don’t get a good grade because of your skin colour or because you have a nice smile, but because you demonstrate the required skills. When I was young, the idea helped me to gain a lot of confidence. Being what is now called a first-generation student, I thought I owed my opportunity to study to a meritocratic society. I had this wonderfully confident belief that, if you’re good or trying hard enough, you’ll get where you want. Today, I think that there is so much wrong with this idea that I don’t really know where to start. Meritocratic beliefs are mostly false and harmful. In the light of our sociological knowledge, still believing that people get what they deserve strikes me as on a par with being a flat earther or a climate change denialist. At the same time, beliefs in meritocratic principles are enormously widespread and deep-rooted, even among those who should and do know better. In what follows, I attempt nothing more than a beginning in looking at that pernicious idea and why it has so much currency.

Perhaps one of the greatest problems of meritocratic ideas is that they create a normative link between possibly unrelated things: There is no intrinsic relation between displaying certain qualities, on the one hand, and getting a job, on the other hand. Of course, they might be related; in fact, displaying certain qualities might be one of the necessary conditions for getting the job. But the justification structure suggested by meritocratic beliefs clearly obscures countless other factors, such as being in the right place at the right time etc. Here are two variants of how this plays out:

  • “I’m not good enough.” – This is a common conclusion drawn by most people. That is, by those who don’t get the job or grant or promotion they have applied for. If there is one job and a hundred applicants, you can guess that a large number of people will think they were not good enough. Of course, that’s nonsense for many reasons. But if the belief is that people get what they deserve, then those not getting anything might conclude that they are undeserving. A recent piece by a lecturer leaving academia, for instance, contends that part of the problem is that one always has to show that one is “better than the rest”, insinuating that people showing just that might get the job in the end. But apart from the fact that the imbalance between available jobs and applicants pushes such demands to absurd heights, the question arises whether any employer could be sufficiently good to be able to recognise the enormously refined qualities of the applicants.
  • “My qualities are not recognised.” – The more confident applicants among us might thus draw quite another conclusion, namely that they are good enough, but that their qualities are simply not seen. The counterfactual behind this reasoning seems to be the following: Had my prospective employer seen how good I am, she would have hired me. As I see it, both kinds of reasoning are fallacious in that they construe the relation between performance and getting the job / grant etc. too tightly. Of course, most people know that. But this knowledge does not prevent one from going along with the fallacious reasoning. Why is that? Well, my hunch is that meritocratic beliefs are deeply ingrained in our educational system and spill over to other contexts, such as employment relations. Let me explain.

Most education systems hold a simple promise: If you work hard enough, you’ll get a good grade. While this is a problematic belief in itself, it is a feasible idea in principle. The real problem begins with the transition from education to employment relations in academia. If you have a well-performing course, you can give all of your thirty students a high grade. But you can’t give thirty applicants for the same position the job you’ve advertised, even if all the applicants are equally brilliant. Now the problem in higher education is that the transition from educational rewards to employment rewards is often rather subtle. Accordingly, someone not getting a job might draw the same conclusion as someone not getting a good grade.

It is here that we are prone to fallacious reasoning, and it is here that academic employers especially need to behave more responsibly: Telling people that “the best candidate” will get the job might too easily come across as telling your first-year students that the best people will get a top grade. But the job market is a zero-sum game, while studying is not. (It might be that there is more than just one best candidate, or it might be impossible for the employer to determine who the best candidate is.) So a competition among students is of a completely different kind from a competition between job candidates. But this fact is often obscured. An obvious indicator of this is that for PhD candidates it is often unclear whether they are employees or students. Yet, it strikes me as a category mistake to speak about (not) “deserving” a job in the same way as about deserving a certain grade or diploma. So while, at least in an ideal world, a bad grade is a reflection of the work you’ve done, not getting a job is not a reflection of the work you’ve done. There is no intrinsic relation between the latter two things. Now that doesn’t mean that (the prospect of doing) good work is not a condition for getting a job; it just means that there is no relation of being deserving or undeserving.

Or to put the same point somewhat differently, while not every performance deserves a good grade, everyone deserves a job.

The impotence of hierarchy

Want to know a secret? There is a recurrent fear that many people in leadership positions have told me about: “Now that I am in this position, I fear that people around me won’t speak their mind anymore, that they won’t dare criticise me. For all I know, they might think I am a fool but never tell me.” I think the first to tell me was my PhD supervisor, who added that this was also the worst fear of his own supervisor. So there is a chain of fear passed on down the line. If I ask my students to be frank, I could also add that my supervisor … It’s a bit of a sad story, because we know how it goes. Back in the day, I wasn’t very open with my supervisor, and the times I tried, I often regretted it. – These fears are woven into the fabric of our hierarchies. Understandable as they might be, they are dangerous. They can preclude open discussion and correction. Given that I’m spending much of my time in universities, I am struck by how often I encounter this. In what follows, I’d like to look at a few instances and ask whether there are any remedies.

Before walking through some examples, let’s begin by looking at the phenomenon. Power imbalance is often portrayed as a unidirectional state. The boss or supervisor has power; the employees or students who depend on the boss fear the boss. But as I see it, the fear has a reciprocal structure: You are afraid to criticise your boss because he or she might reproach you for doing so. Knowing your fear, the boss is afraid that you will hide your criticisms. This might spiral into a number of refined and uncomfortable assumptions. “I’d rather tell him something nice about himself.” – “She only said that because she wants to divert attention from her criticism.” – “He doesn’t take me seriously.” – “She doesn’t take me seriously.” Mutual mistrust might follow.* If this kind of setting is large enough, the mistrust might translate into full-blown conspiracy theories. But I think the problem, at root, is not the hierarchy itself. The problem is that we often treat a hierarchical position as a personal rather than an institutional feature. But your boss is not your boss because he or she is a better whatever, but because the design of our institutions requires this function.** In this sense, hierarchy is owing to ways of dividing labour. However, while some contexts might require a hierarchical division of labour, certain processes cannot function in the presence of hierarchy. Collective deliberation, for instance, is not possible if someone in the collective intervenes qua greater power. If my thoughts are taken to carry more weight because I’m a professor rather than a student, then we do not need any discussion. Or do we? Let’s look at some instances then:

  • Deliberation in science. – It’s often noted that the current corona crisis makes our shortcomings obvious. So let’s begin with policy design in the corona crisis. Given the complexity of the problems in this crisis, you would expect decision makers to listen to a number of voices. But in the Netherlands, for instance, the opposite seems to be true: “There is no discussion … Because there is a crisis, it is not allowed to have a discussion.” These are the words of the microbiologist Alex Friedrich. Rather than following the guidelines of the government, he caused quite a stir by speaking up against the Dutch strategy and partly changed the course of action by demanding more testing in the north. His point is that scientific advice is too hierarchical and focused on too few voices. Instead, it should be run like a “jam session” in which everyone speaks up. I guess you don’t have to be a jazz lover to appreciate that you are more likely to hit on crucial facts when you listen to as many people and disciplines as possible. But the example shows that collective deliberation is still obstructed rather than enhanced (see also here).
  • Transitions in the university. – Borrowing a quote from a British colleague, the president of our university recently noted that implementing change in universities is like ‘reorganising a graveyard: “You don’t get much support from the inside”.’ The idea seems to be that changes like the current transition to online teaching require an “external shock”. While a shock might indeed be an occasion for change, I think that the comparison to the graveyard has clear limitations. I doubt that a successful transition works without calling on the employees who actually do the implementing. So when we plan the details of this transition, I am sure our success will depend on whether we listen carefully to the experiences and insights “the inside” has to offer. Indeed, the digital infrastructure that we now rely on increasingly provides great tools to implement change with the necessary transparency and participation of everyone involved. Sometimes this comes with comic relief: At the mercy of advanced technology, hierarchies of seniority are quickly turned upside down.
  • Hierarchy in teaching. – As I have noted earlier, my status as a professor should not enhance the status of what I have to say. And yet we all know that when I enter the lecture hall, the institutional powers put me in a special position, whether we like it or not. The fact that I am grading the very students who might criticise me seems to settle the intellectual hierarchy, too. Can this be evaded? Should it be? As I see it, the hierarchical power of professors over students is limited to the educational tasks they set within the institutional setting they share. I can give you a task and then tell you to what extent you solved it well, whether you drew on all relevant resources etc. But an educational task, to be dealt with in an exam or essay, is different from the complex problems that confront us. Once we go beyond the confinement of exercises, students are fellow philosophers and citizens, and here hierarchy should no longer apply. For the reasons noted above, the hierarchy might still be effective. But it is to our detriment if we allow it to happen.

Hierarchy, taken as a personal trait, then, obstructs true deliberation, diversity and learning. In an ideal setting, archangels could openly learn from the devil’s criticism. That said, it’s hard to figure out how we can evade the traps and fears that hierarchies foster. But we should be wary whenever discussion is closed with reference to hierarchical position. It harms all sides, those with more and those with less power. But of course it’s hard to bypass something so deeply ingrained in our system. Yet, if someone politely asks you to shut up and listen, it might be best to go along and listen. In the same vein, those with more power should seek, not shun, advice from everyone. Acquiring knowledge and finding solutions, even if governed by good methods, is an accidental and collective process. You might have no idea what you’re missing. So keep asking around and encourage others. It’s always an institution, not you, that grants the power over people. The more power you exercise over them, the more likely it is that people will refrain from telling you uncomfortable truths.

____

* A perfect illustration is Paul Watzlawick’s “Story of the Hammer”.

** However, one might doubt whether hierarchies really obtain because of a functional division of labour. The economist Stephen Marglin famously argues that “the capitalist organization of work came into existence not because of superior efficiency but in consequence of the rent-seeking activities of the capitalist.” (I wish to thank Daniel Moure for pointing me to the work of Marglin, especially to the seminal paper “What do bosses do?”)

What’s it like to be (with) a superspreader? A note on the ethics of contagion

We’re used to the trope that our personal actions don’t make much of a difference. Arguably, in tackling climate change it’s not my choice to take an individual flight that makes things better or worse. In the current pandemic, however, nothing could be further from the truth. If I happen to be infectious, taking a flight these days might turn me into a superspreader, setting off a chain of infections that might harm a great number of people. While we normally have to adapt to the world, the potential of spreading a virus like that has the uncanny effect that the (social) world, suffering infection, ‘adapts to us’, the ones spreading it. Of course, there are good reasons to avoid labelling individual people as superspreaders, but the fact remains that my individual behaviour might contribute to large-scale infections. The possibility of spreading the virus casts doubt on a number of very common habits and raises several moral questions. If I am contagious, then I should take precautions so as not to harm others. It is therefore not surprising that we find ourselves confronted with the recurrent advice to wash our hands and stay at home. However, even if the precautions to be taken are individual actions, they require a supportive social setting and compliance. If my employer, for instance, coerces me to work without taking precautions, the blame should be placed accordingly. Thus, new kinds of responsibilities emerge. In what follows, I’d like to consider some aspects of such responsibilities.

Being harmful. – In spreading the coronavirus, we cause harm. The idea of being alerted to the fact that one was responsible for such a spreading is enormously unpleasant, to say the least. While we might not want to attribute moral responsibility to a spreader, we will deem it epidemiologically important to track such a patient. So while such a spreading might not count as a (voluntary) action because it is not intended, it requires us to see ourselves as a cause of harm. That said, being involved in such an event might count as a case of (bad) moral luck and can hardly be dissociated from moral considerations.* Now the assumption that we are merely involuntary causes in such events no longer holds once we know that we are in a pandemic. In this case, I ought to take precautions. A failure to do so would strike me as morally blameworthy. So if I neglect hygiene measures (and end up spreading the virus), I am behaving irresponsibly and blameworthily. However, and this is the point I want to highlight, my fellow citizens and those responsible for living and working conditions in particular also have moral duties. Collectively, we might be said to have the duty to mutually enable each other to take necessary precautions. Now what does this amount to?

The moral status of spreading and spreading advice. – Advice such as “wash your hands!” and “stay at home (whenever possible)” is certainly helpful and ought to be followed in our current pandemic. Yet, it is a double-edged sword. On the one hand, it promotes risk aversion. If people comply, they might indeed prevent spreading and thus create a safe environment. On the other hand, it can be stigmatising. Given that staying at home, at least, comes at quite a price for some, we must bear in mind that calling out noncompliance might stigmatise and harm others, too. With the lifting of the lockdowns, we not only see people prematurely hastening ‘back to normal’; we also see a growing divide between those complying and those not complying with the restrictions. This divide is not helpful for either side. As I see it, people can only comply successfully with restrictions in a supportive environment. While it is true that individual actions can make a lot of difference, individuals must have a chance to balance their compliance with the costs that arise. For a tenured professor like me, for instance, it’s easy to stay at home. But that is worlds apart from asking compliance of a shop assistant, who might be sacked if she fails to expose herself to hordes of potentially infectious customers, frowning at her for not wearing her mask correctly. So while everyone needs to consider themselves as a potential cause of harm to others, we need to create an environment that enables such considerations. Calling out others will more likely provide an incentive to shift the blame.

What is a supportive environment? – No matter what strategy (if any) your government is following, we need to comply with certain restrictions if we want to prevent harming people by spreading the coronavirus. A supportive environment is one in which we can comply without harming ourselves. If we ignore the needs of children for a moment, one of the greatest factors making compliance with lockdown restrictions difficult is the fact that we have to work and rely on other people’s work (to provide for our needs). Thus, we need to make sure that our working conditions allow for compliance. It is here that I see a great number of difficulties. Let’s briefly highlight two issues:

  • The right to protect yourself from harm. – Compliance can only be demanded insofar as people can comply without harming themselves, be it economically or with regard to their health. Now, from the very beginning of the corona crisis it was obvious that a number of people seemed to have no effective means to protect themselves. If it is true that being indoors with other people is one of the greatest risks of infection, then care workers are exposed to an enormous danger of infection. Indeed, recent data shows that most COVID-19 deaths are occurring in this profession. If confined spaces are problematic, then what about schools, shops, public transport and the like? (While it might be ok to go shopping, it’s quite another question whether shop assistants are really well protected.) Perhaps with the exception of the medical sector, this seems to be largely in keeping with ongoing class issues. As I see it, our societies, and the respective employers in particular, are often not providing a safe working environment. But how can we expect compliance if we allow for such an amount of disregard within social settings, firms and institutions?
  • The duty to prevent spreading. – Is there a moral duty to prevent spreading? I guess there is, if we can do it without harming ourselves. This is why many people (myself included) consider the lifting of many lockdown measures premature, especially in that they seem to incentivise outright blameworthy behaviour, such as crowds gathering to protest for individual liberties or whatever. But besides such aberrations there are more subtle cases. In mid-March, for instance, a colleague returning from a high-risk area was reproached for potentially causing “panic” among students by asking them to keep a safe distance. While I think that causing panic would be bad, I also think that universities ought to prioritise providing a safe environment. Thus, it is vital that universities and indeed other institutions follow policies that enable individuals to act in compliance with preventive measures. While most universities have now moved their education online, they are partly pushing for moving back on campus asap. While eventually returning to in-person education seems sensible and desirable, such moves might also fuel an unhealthy competition, incentivising premature steps.

So in many cases we might witness (or at least have witnessed) people not complying with preventive measures. However, when judging the morality of harmful behaviour, we must ask whether people are acting under conditions that allow for compliance in the first place. As much as I am upset to see people behaving in ways that might harm others or themselves, when passing moral judgement on their individual actions, we must bear in mind that the responsibility for enabling preventive behaviour is a collective sort of responsibility. Although I could cause enormous harm by spreading the virus, my actions to prevent such harm have to rely on collective support. In a nutshell, it’s probably more appropriate to blame certain groups, firms and institutions than individuals. Both taking precautions and easing restrictions should be implemented such that these actions allow for mutual support.

––––

* Which reminds me of an intriguing discussion of Adam Smith on the so-called piacular by Eric Schliesser.

PS. Many thanks to Justin Weinberg for suggesting an important revision in my phrasing.

Precarity and Privilege. And why you should join a union, today

Reflecting on the personal impact of the corona crisis, a close friend remarked that things didn’t change all that much, rather they became obvious. I then began to hear variations of that idea repeatedly. If you live in a complicated relationship, that might very well show right now. If you have made difficult decisions, their consequences might be more palpable now. If you live in a precarious situation, you will feel that clearly now. On the other hand, there might be good stuff that you perhaps hardly noticed, but if it’s there, it will carry you now. On social media, I sense a lot of (positive) nostalgia. People remember things, show what mattered then and now. Things become obvious. The crisis works like a magnifying glass.

This effect also shows how well we are prepared. As an adolescent, I used to smile at my parents for storing lots of food cans in their basement. Of course, most of us also laugh at people rushing to hoard toilet paper, but how well prepared are we for what is coming? Perhaps you think that if we’re lacking things and certain habits now, this is owing to individual failures or laziness. But if we experience precariousness, hardly any of that is an individual fault. Habits need collective stabilisation and consolidation to persist. That said, I’m not going to focus on the state of your basement or hygiene measures. Rather, I’m worried about the question of how well we are politically prepared. Many people around me are facing really dire situations. And our political preparation (or lack thereof) leaves us with very few means to address them properly. So what can be done? I’ll begin with some general considerations and try to finish with some practical advice.

If we look around, we see that a lot can be done. Slowing down the economy like that without immediate chaos ensuing is a huge success. But very soon, people will start telling each other that certain things “cannot” be done, because they are “too difficult”, “too expensive” or “against the rules”. While a lot of good is happening, the bargaining and gaslighting has already begun. Being a highly competitive culture, academia has a notorious problem with collective action (role models in the UK who have been on strike for enormous amounts of time notwithstanding). But this crisis requires collective measures, both in terms of hygiene and in terms of politics.

What’s the problem? Precarious employment (not only) in academia has been growing for a long time. As I see it, this jeopardizes not only political but also academic goals, because it leads to an unwelcome dissociation of teaching and research. But at the present moment, this precarity might turn into something much worse. We already see furloughs and dismissals, especially of people on fixed-term contracts, along with flimsy justifications, rolling in on a fairly large scale. At the same time, we witness what we have already seen in the medical sector: we lack transnational policies, and thus people are being treated very differently, depending on where they happen to work and what sort of contract they have. Add to this that many ad hoc measures, such as online teaching, are now used as a pretext to introduce lasting changes that may be detrimental to both employment conditions and educational standards. So precarity might worsen, and educational standards decline, to a tipping point where education might become largely disposable. Indeed, mass education is of course disposable already, unless you have democratic tendencies.

What can be done? The first thing I find striking is that, while people continuously talk about online teaching and other means of continuing work, hardly anyone addresses the question of precarious employment. Given the current firings and hiring freezes, we know that the job market is going to be brutal. If you are, say, an international postdoc or teaching fellow whose contract runs out after the summer, it will be very difficult to find or even seek employment. While I see people readily exchanging advice on zooming, I’ve seen hardly anything so far on how to address this problem. The exceptions to this rule are labour unions and some employee organisations, some of which are currently collecting data and pushing for measures. (You know of more exceptions? Please spread the news widely!)* Now let me ask you: Are you a member of a union? No? You’re no exception. In the various places I worked during and after my PhD, I was never encouraged to join a union. It’s almost as if there were no awareness that there is such a thing as the representation of employees’ interests. In fact, I guess it’s worse, and it’s something I’ve noticed not only in academia but also in much of the growing freelance and start-up culture. Judging from my own experience, I’d say that people always have been and still are (more or less subtly) discouraged from joining such organisations. So when employees encounter difficulties in their employment, they will typically be portrayed as not being tough enough for the job. You are overworked? Well, if you don’t blame yourself already, you’ll likely be shamed into avoiding publicity. Being overworked is still portrayed as a personal lack of stamina, to be addressed not by collective industrial action but by courses on time management or mindfulness. This way, failing to secure (permanent) employment can still be blamed on the individual rather than on the way higher education is run.

The individualisation of such problems not only affects people’s (mental) health, it also keeps people from engaging in collective action. In turn, this means that unions etc. will remain weak, because they can easily be portrayed as not representing anyone. If people keep blaming themselves, the unions have no case for building an argument in favour of better employment conditions. I see this as one of the main reasons why we are not well prepared politically for addressing economic problems in this crisis. So what should we do now?

Trying to collect ideas, I have written to a number of friends and colleagues who kindly provided me with suggestions. Let me briefly summarise some crucial points:

  • Generally, permanent / tenured people should take it upon themselves to make a move. We should be aware that people on fixed-term contracts are vulnerable and should not be expected to lobby for their interests alone.
  • Try to see who is in or is likely to get into trouble and talk about the situation. Bring it up with your colleagues and managers whenever the opportunity arises. If you have department meetings or exchanges with funding agencies such as the ERC, ask what can be or is done to ameliorate the situation.
  • Join a union and encourage others to do so, too. In the Netherlands, the unions are taking it upon themselves to make a case for employees in precarious positions.
  • As I see it, it would be good for universities to invest in staff rather than reduce numbers. Wherever possible, contracts should be extended, as is in fact done by various funding bodies.
  • If there are no financial resources for staff, measures should be taken to reallocate budgets, especially travel and overhead funding, towards the extension of contracts or new hires.
  • Universities in Austria and Switzerland have created hardship funds for employees facing precarious situations. This should be done proactively, as people in vulnerable positions might feel discouraged from coming forward.

These are just some ideas. I’d be grateful to hear more. But to my mind, the most important point is that we need to pursue such steps in a collective effort. Right now, these steps should be taken because we are in an emergency. Ensuring stability is what is required for providing a safe working environment.

Ultimately, taking measures of solidarity is also about helping academia to survive beyond this crisis. Whenever recession hits, education is often considered disposable. If we were to allow for the reduction of staff without resistance, it would just signal that academia could do with even fewer people and resources. Dictatorships would certainly welcome this. The way we treat our colleagues and students will contribute to determining the political system that we’ll find ourselves in after the crisis.

____

* Of course, there have been initiatives addressing the adjunct crisis. But I haven’t noticed that precarity has been an issue of great public concern in this crisis, even less so among tenured academics, as a recent piece by Emma Pettit notes:

“While tenured professors have typically stood by silently as their nontenured colleagues advocated for themselves on the national stage, they have watched their own kind dwindle. Positions are remaining unfilled. Tenure lines are getting pruned. There’s still the same service work to do, but fewer people to do it, and those who remain shoulder the burden.

And today, as a global pandemic has devastated budgets and led college leaders to freeze hiring and furlough even tenured professors, the cause seems especially urgent.

The structural changes that preceded the pandemic helped set the stage for those austerity measures, and manufactured a growing — if uneven, slow, some would say glacial — recognition among the tenured that relying on contingent labor hurts everyone, activists and higher-education researchers say. …
How much tenured professors have cared, historically, about their contingent colleagues, is difficult to measure. Everyone knows the caricature: the older, typically white, typically male full professor whose non-tenure-track colleagues escape his vision, who still believes merit rises to the top and those who fail to land tenure-track jobs lack work ethic, intelligence, or both. …
Even if tenured professors might not pay attention to the adjuncts who walked their hallways, they couldn’t help but notice the fates of their graduate students, who were being sent into a bottlenecked academic-jobs market to compete for slimmer pickings. They started to connect the dots.”

Will the future be like the past? Making sense of experiences in and of the corona crisis

The world is a different place now. But what does that mean? In keeping with my previous posts, I want to think about the way we experience this situation. Binge-scrolling through expert advice, curves and numbers is important for assessing the situation and deliberating about forms of collective action. But at the same time, it is essential to understand one another and ourselves within this situation, to return from the third-person talk to the second- and first-person perspective. Thus, a crucial part of our thinking should be devoted to the various meanings of our experience. I speak of “meanings” in the plural for two reasons. On the one hand, I think our experiences of the situation vary quite a lot, such that the events we undergo mean different things for different people. So your social and economic situation, for instance, matters greatly in how you will feel and how your expectations take shape. Would I feel as balanced as I do if I worked, say, in a bar? Or as a postdoc facing a contract that runs out soonish? Even if we’re likely facing an enormous global recession, the current stability still affects my being. On the other hand, and this is perhaps surprising, I have noticed that my very own experiences have different meanings even to me. Let me explain: I have now been staying mostly inside (with family) for a bit more than three weeks. Given that I often suffer from anxieties, I would have expected the growing corona crisis to make me feel bad. But while I have clearly lost a sense of normality, this doesn’t exactly trouble me. I feel ok, perhaps even slightly more balanced than in the months before. For a while, I found that quite surprising. But then I realised that this is true of a number of people. In fact, this morning I read an article according to which some psychologists report that a significant number of patients with depression or anxiety disorders find that their situation has, paradoxically, improved.
How can we make sense of such experiences? Is there a way of explaining the eerily positive attitude some of us have in this crisis? I’m no psychologist. But as a historian of philosophy I know something about the ways in which we relate to our histories and biographies. My hunch is that this kind of experience is partly determined by our beliefs about how much the future will resemble the past. While trying to explain this hunch a bit more, I’ll say how this might help in assessing conflicts between people with different ways of experiencing the crisis. Will the future resemble the past then? As we will see, this is not a question of (future) facts but of values.

Speaking to various people about the corona crisis, it seems that most conversation partners fall into one of two categories: (1) those who believe that we’ll be “going back to normal” at some point and (2) those who believe that the future will be fairly different from the past. Let’s call them continuists and discontinuists respectively. Continuists think that the future resembles the past, even after this crisis. Accordingly, they will try and prepare for the time after the crisis in much the same way they have pursued their goals before. By contrast, discontinuists assume that the future is not only uncertain but likely different from the status quo of the past. Accordingly, they cannot prepare by pursuing the same goals by the same means. They will expect to have to adjust their means or even their goals.
The question whether historical events are continuous with past events or mark a disruptive change is hotly debated, because whether you see continuity or change depends on what criteria you focus on. But for now I’m less interested in the theoretical issue. Rather, I’m wondering how our pertinent beliefs affect our experience. A wise friend of mine once said that our beliefs about the future shape the present, for instance, in that such beliefs guide our current actions. If that’s correct, then continuists and discontinuists will be preparing for different future scenarios. Of course, the question which future scenario is more likely is a rather pressing one. What (else) will this virus do to us? Will the economy break down completely? Will we have civil unrest, wars over resources? Like you, I’m interested in these things, but lacking relevant knowledge I have nothing to say about them. What I want to address here is how being a continuist or discontinuist relates to your experience of the current situation.

Now how does having one or the other attitude affect your experience? As a continuist who retains your goals, you will likely want to stick to your strategies and go back to normal if possible. The current restrictions (contact restrictions or lockdowns) will probably feel rather disruptive. By contrast, a discontinuist might welcome the disruption as a way of preparing for an uncertain future. So my guess is that there is a correlation between being a discontinuist and having a more positive attitude towards the disruptive measures. Let’s illustrate this idea with an example. A controversial issue that arises for many people around me is productivity. While some people readily give tips on how to remain productive while working from home and quickly switch to things like online teaching, others see these outbursts of productivity as a problematic distraction from more pressing issues. They worry, for instance, that the switch to online teaching will worsen the standing of academic teaching or the exploitation on the job market.
My idea is that we can pair up the conflicting approaches towards productivity with attitudes about (dis)continuity. While a continuist will remain productive, a discontinuist will be suspicious of such productivity as it seems likely to be jeopardised by the changes ahead. This doesn’t mean that the discontinuist will stop being productive tout court. It just means that the discontinuist will likely want to prepare for adjusting the means or even the goals, rather than keep going as before.

As this example shows, there is not only a difference but also a conflict between continuists and discontinuists. If you currently google the keywords “coronavirus” and “productivity” and look at the headlines, you’re clearly listening in on a fierce dispute. Should you work on improving your productivity? Or should you redirect your focus to different priorities? Continuists often seem to experience the restrictions as if their lives had been put on hold. The crisis might be very disruptive, but by and large the goals remain intact. This might also be mirrored in different attitudes of students: If you are an ambitious student and a continuist, your priority might still be to pass your exams well and quickly. If your university cancels the regular classes and exams (rather than running them online), you will likely be annoyed or worried. By contrast, discontinuists seem to experience the restrictions as the onset of a new situation; they will likely try to adjust their goals in line with hopes or guesses about the outcome. If you are an ambitious student and a discontinuist, your priority might be to understand and prepare for the new situation. Your focus or interests might change, and you might appreciate a pertinent adjustment of teaching rather than the pursuit of former goals.

As I see it, this kind of conflict is often misrepresented. It tends to be presented as a quest for the one right way of responding to the crisis. Thus, depending on the predominant attitude around you, you might see your own response as a failure. Surrounded by continuists, the discontinuist will feel insufficiently productive. Surrounded by discontinuists, the continuist will feel like they are failing to adapt to the new situation that will arise. However, as I see it, the conflict between these two stances is not about the facts of the crisis or the predictable future but about values. Let me explain.

As I see it, the question whether there is a continuity after the crisis is not one that could be established by looking at current or estimated future facts. It would be fallacious to think that there is a definite cut-off point that distinguishes continuity from discontinuity. In other words, whether a crisis like this allows for going “back to normal” or is a pervasive disruption is not an empirical question. If the crisis has very dire consequences, you can still claim that we’re going back to a “very impoverished normal”. If the crisis is not too disruptive, you can still claim the world is altered, if mainly by the prospect of the crisis returning. So it is the other way round: first you claim that there is a continuity or discontinuity, and then you quote empirical facts for support.

If this is correct, what is it then that makes the difference between continuists and discontinuists? As I said, it’s a question of values. If you largely accept the norms of the status quo before the crisis, you will evaluate the predicted situation as a deviation from these norms and find points of impoverished continuity. The discontinuist, however, will see the norms of the former status quo as undermined. In fact, this is what allows for seeing discontinuity. So the future scenarios discontinuists see are ones in which new norms are established. They will be what we often call a “new normal”, for better or worse. Such a new normal might include, for instance, the restrictions that we anticipated in view of anthropogenic climate change and the Paris Agreement. Seen in this light, the current measures taken against the corona crisis might appear to be in line with new norms to be consolidated.

What does this mean for the eerily positive attitude that some of us experience? Once you recognise that the belief in discontinuity is a matter of value, it’s plausible to assume that what empowers (some) people is the necessitated change of norms during lockdown. So while it might be right that the positive attitude correlates with former states of anxiety or depression, it would be dangerous to confine this to a psychological question about individuals. We shouldn’t overlook the societal values going hand in hand with such empowerment. Viewed in the light of such values, the disruption of the status quo is not merely destructive. It holds the possibility of establishing norms more in line with what many of us might desire in light of the challenges we face, for instance, with regard to climate change. This doesn’t mean that this possibility will come true. But as long as we’re not hit by total disaster, there is hope.

Nothing to lose? Other voices in the corona crisis

Many people, myself partly included, seem to see the current crisis as a loss of normality. That makes sense, because a lot of processes just came to a halt. But what if you didn’t feel at home in the world as it used to be some months ago? What if you felt like you didn’t belong or wouldn’t get anywhere in that world? If you felt guilty, day in day out, for not getting done what you ought to get done, you now wake up in a world in which hardly anyone gets anything done. The “new normal” that lockdowns create, then, might be a rather comforting kind of normal, at least more comforting than the old competitive world that we had to leave behind for the time being. – Writing my “search for a conversation”, I focused on the current situation as a disconnection from the recent past. Since then, I have had a number of conversations that made me see quite a different perspective: The uncertainties and losses we partly experience might actually make us more equal, such that at least some aspects of the current situation might create more familiarity with one another than the former status quo did. In what follows, I still don’t have much to say myself; rather, I’d just like to give voice to this idea – an idea that I began to see thanks to the conversations I have had so far.

Before I get into details, let me be clear about one thing: there is no cynicism or disregard for the current suffering in what follows. Quite the contrary. We need to distinguish between two facts: On the one hand, there is the spread of the virus and the disease it causes. On the other hand, there are our social responses and their impact on our lives. By “social responses” I mean measures such as the stay-at-home policy, the prohibition of public gatherings and events etc. In order to appreciate the perspective I’m trying to describe, it might help to imagine that the social responses were put in place independently of any threats to our health. So when I say that the crisis might be comforting for people, I mean that the social measures themselves (and not the disease) can afford comfort.

But how, you might ask, does being locked up (more or less) afford comfort? The social measures have a number of effects. In my last post, I suggested they disconnect me from my recent past and all the norms that guided me. Of course, this might be disorienting. But think again! What if many of these norms are not helpful or even harmful? Take the general competition within various job markets. Now we’re asked to support each other rather than compete. Is that a bad thing? Take the priorities of academia. Is it so bad that we cannot churn out paper after paper, host workshop after workshop right now? Take the effects on the environment. Isn’t it a good thing that we can swiftly adapt to acting in accordance with measures that will play into ameliorating climate change?

If we focus on certain social, psychological and environmental effects, we can quickly see that there is a lot to be said in favour of our response to the crisis. But still, you might say, it is hard to let go of cherished conventions that guided our interactions. Aren’t they worth keeping? The disconnection from our recent past might feel like a loss. But again, what if you didn’t feel at home in these conventions? What if much of your life looks like a failure, or abnormal, in the light of the status quo we had to abandon? There will be far more interesting examples, but I guess that my own experience helps me in taking this perspective. Before I got a permanent job at Groningen in 2012, it became increasingly likely that I would end up with, well, not much. Remembering how it felt to live through various existential worries allows me to imagine myself in quite a different state. Unemployed, ignored, full of on-going self-doubts, would I have thought that I was losing much by the social distancing measures? Or would I perhaps have thought that everyone is now a bit more on the same page as I am?

I don’t know. But Jon, a reader of my last blog post, fleshed out pertinent thoughts about our response to the crisis in some detail. He kindly gave permission to quote his message to me:

After a short burst of anxiety I realised that for me life continues much the same. I’ve been probably unhappy and somewhat anxious for the last year and a half. Why? There was no obvious reason, but a sense of foreboding, that one was not prepared for some thing, that a house could likely never be owned, that I was too ordinary for my growing children to want to spend time with me as they got older, and that I was shaping up to be a net taker, rather than a giver. Confronted with a global event like this is in many ways is a relief, I think once a cancer patient knows what they have is terminal, what really is there to fear outside of the illness.? So, this has happened and it is what I’ve always feared, yet it is hardly that unpleasant. If one can keep ones home and have enough food, it is potentially entirely satisfying. Better still, to be free of the conflict a person can have when immersed in the guilt, the kind of guilt that tells us we should be working when we are not, we should be outside when we are in, we should be engaged in some activity that is full filling when really we can’t be bothered. I find this pandemic peaceful and an opportunity to gaze into the hearth of life, free of self criticism. I enjoy the small things I’ve not seen in people before, both good and bad, and shopping is a surprisingly pleasant experience where people are cordial and aware of you, in the supermarket itself there is space to move and a quiet, calm. On the farm where I live, the M25 that drones as if an ocean is just past the tree line, is quiet now. The ancient farmland seems to be remembering itself, walking around it in the evening it is once more in the depth of the countryside and not a faux village as it is in the modern era. I will miss this when it is over. While the pandemic itself is ruinous to a small number of people for the vast numbers of younger, healthier population, this is far safer an environment that the risks taken driving to work. 
It is astonishing to me the level of reaction, it is one we can not repeat, there will not be another time in which any sane government will consider suspending all economic activity for everyone on full pay – and on a global scale too! It is an extraordinary experience. I’ve been thinking the virus could be the narrative vehicle in a Pinter drama, the virus itself is relatively harmless, but the story is far more about a collection of disparate personalities confined in a small space. I tell people to savour these times, because scary or not it will be an unforgettable chapter in our lives and we will mourn it when it is over.

Jon’s account brings out aspects and possibilities that I wasn’t aware of when beginning to think about this situation. Taking them into account, I realise what an enormous privilege it is to think of the time before as something lost. Yes, I lived to some degree in accordance with the status quo, and I was and still am a beneficiary of the system that is now under threat. But as I noted earlier, I see my alignment with this status quo as a lucky accident at the time. Things could have gone differently, quite differently. And then it would have been likely that I would have felt largely out of touch with what happened to become, to some degree, a sort of normality for me.

But whichever perspective is more in line with your biography, no matter whether you feel more at home in the abandoned status quo or the interim that we’re living in now, all of us have to face the time after, the time after “suspending all economic activity”. As Eric Schliesser points out today in a rather dark piece, we’ll likely find ourselves in “political turmoil”. As I see it, we might be facing a kind of post-war situation without having had a war. Economically speaking, things look pretty worrisome already. Socially and politically speaking, I fear we’ll be confronted with various myths that are already in the making. It remains to be seen which of these two perspectives will be a better preparation for the time to come. In any case, I find it vital to learn as much as possible about our various takes and hopes generated by the crisis, and look forward to many more exchanges.

Where are we now? In search of a conversation, beyond graphs and statistics

 “That the sun will not rise tomorrow is no less intelligible a proposition, and implies no more contradiction, than the affirmation, that it will rise.” Hume

I’ve been trying to begin this post for a number of days now. Everything I write feels like a failure. Why is that? Everything seems out of place. I don’t have any interesting opinions to share, and I’ve even lost my own conventions for writing. Before March 2020, when I began to write something, I often felt a background against which I was thinking: What I wanted to say felt “worth noting”, “exaggerated”, “inappropriate”, “dull” or whatever, because there was a normality that afforded orientation, a canvas to which I could add my scribbles. Now it feels as if the canvas were gone. With everything in motion, all eyes fixed on the graphs showing the spread of the coronavirus, there is no normality. Of course, there are people trying to maintain business as usual, but funnily enough it is they who stand out a bit awkwardly now, because most of us live in uncertainty. (Here, I’m not talking about all the admirable professionals who work ceaselessly to limit the damage, but about ordinary people tucked away in their homes.) We don’t know what’s coming next, whether curves go up or down, whether our neighbours or loved ones or we ourselves will be affected more drastically, whether we can return to the normal lives that we had to put on hold. What can we say, if we want to talk and say something beyond repeating or questioning statistics or predictions? It seems as if there were currently no proper place for a conversation between ordinary people. But we need to talk. At least I do.

It feels important to speak, not because I have anything to say, but simply because I think I need a continued conversation, I need to connect. I’m writing in the first person deliberately now. Firstly, because I’m not sure I can speak for anyone else. Secondly, because I begin to feel sick of the third-person perspective. The talk about our lives seems largely determined by graphs and pandemic plans, by talk about symptoms, spreaders, medical, social and economic challenges. Don’t get me wrong: I think this is vitally important. But beyond that, there doesn’t seem to be much in the way of ordinary exchange that acknowledges the current situation. The lack of normality makes most of what drove our interactions before March 2020 seem rather pointless. In a manner of speaking, I’m inclined to say that I no longer take it for granted that the sun will rise tomorrow. It’s custom and convention that made me expect so, but much of that custom and convention is shattered for the time being. Currently, my friends and colleagues are not friends and colleagues but potential spreaders. So am I. Thank God, then, we have these virtual tools now, but they do have limits that I need to get used to. What’s perhaps more, we need a mode of speaking that neither relies on the past status quo nor merely echoes the current medicalisation of our lives.

So how can we connect? How can we have such a conversation? I’m asking because there is so much exchange and advice on how to get through. This is good. But it concerns an uncertain future. I want to talk about the present. I want to understand, not just cope with, what is happening now. No, not in numbers. I want to understand the kind of life we’re living now. We keep acknowledging that these are “strange times”. But they are our times now. We are disconnected from our recent past and from the predictions that guided us then, up till February or March 2020. The catch is: I cannot understand anything on my own. Understanding is a joint project. It works through mutual acknowledgement, occasional disagreement, and refinement, creating a shared familiarity, fostering hopes and hangovers. – But while I have not much to say myself, I have found some really good pieces that helped me shape my thoughts. (While there are many other good pieces, I’d like to focus on those acknowledging the situation and addressing to some degree concerns of our ordinary lives.) So why not simply list them (in order of appearance):

  • The first is “Academics, lead by example” by Boudewijn de Bruin. Amidst numerous (laudable) attempts to maintain business as usual, this piece felt as if someone had opened a window and let in some fresh air. A crucial idea is the acknowledgement that this is not a normal situation and that we need different resources to respond to challenges: “This is not the time to be competitive. This is not the time to tell everyone how productive you are, working 24/7. … So should we give up? Not at all. But we shouldn’t pretend that it’s just business as usual, only online. Seize the opportunity and lead by example and share your wisdom and humanity with your students and colleagues.”
    What I particularly liked about Boudewijn’s account is that he has a clear sense of how he arrived at this acknowledgement. It came through listening to others. Accordingly, he writes: “Listening to people with different views is more important now than ever. Think about all your students and colleagues of different nationalities, from different cultures, different religions. They will bring their own ways of dealing with uncertainty. They have their own views about life and death. They may have widely different expectations about the responsibilities of the state – or they may trust the state much less than you do, or more. Imagine what you would want to do if you were in Italy or Iran right now. You would want to go home. Many of our non-Dutch students and colleagues are in that situation. But many borders are closed. They need our care more than ever.”
  • The second is “Covid-19 and online teaching: mind the slope” by Andrea Sangiacomo. He stresses that, while it’s fine to try and remain functional, we must do so by respecting the context in which we function: “If the whole university system happens to run quite smoothly online, if in the end we end up enjoying this (and this might even happen, who knows?), another risk is forgetting why everything is moving online. Reminder: we’re amidst a world pandemic that so far killed almost 15.000 people worldwide (as for today, Monday 23rd March 2020). This is the context within which our online teaching is happening. Remembering this context is important in order not to loose perspective on the meaning of the events, including online teaching. Yes, education needs to keep going, not everything can now become explicitly about the pandemic. And yet, this pandemic is now the broader context within which people are teaching and learning about any subject. We’re teaching online because students (and everybody else) must remain home, trying to limit as much as possible social contacts, practicing social distancing and trying to slow down the spreading of the virus.
    There are pragmatic and existential downsides in forgetting about this context. From a pragmatic point of view, if one looses this context one might forget why staying home and avoid socializing is so vital for everybody at this time. Online teaching becomes just another way of getting distracted, trying to find something in which one can become absorbed, in order not to think to what is happening (about the absorption syndrome see here). But in situations of emergency like this (which will likely endure for some time), trying not to think is precisely the worse one could do. We need all our thinking capacity at this point in order to face whatever will happen.”
  • The last one I want to list for now is “It’s all just beginning” by Justin E.H. Smith. It’s a very rich and multifaceted piece. What struck me most is that it gave voice to a feeling I had earlier struggled in vain to express: the strange disconnection from our own and shared pasts: “In spite of it all, we are free now. Any fashion, sensibility, ideology, set of priorities, worldview or hobby that you acquired prior to March 2020, and that may have by then started to seem to you cumbersome, dull, inauthentic, a drag: you are no longer beholden to it. You can cast it off entirely and no one will care; likely, no one will notice. Were you doing something out of mere habit, conceiving your life in a way that seemed false to you? You can stop doing that now. We have little idea what the world is going to look like when we get through to the other side of this, but it is already perfectly clear that the “discourses” of our society, such as they had developed up to about March 8 or 9, 2020, in all their frivolity and distractiousness, have been decisively curtailed, like the CO2 emissions from the closed factories and the vacated highways. …
    We created a small phenomenal world for ourselves, with our memes and streams and conference calls. And now—the unfathomable irony—that phenomenal world is turning out to be the last desperate repair of the human, within a vastly greater and truer natural world that the human had nearly, but not quite, succeeded in screening out.”

In sum, it’s the call to listen to others, to respect our context of emergency, and the insight into our disconnection from many habits and values that began to help me locate my own thoughts. What I begin to see is that my habits and my past don’t provide orientation in the current context. But it’s the current context I live in and need to understand. In order to live, we need customs and habits. If past habits don’t help us, we need to stabilise new ones by building and sharing them with one another. Medical and other expert advice is crucial, but it only goes so far. That’s why I want to search for voices speaking to the ordinary experience we likely share. For now, we need to establish something from scratch. – So I’d be grateful for any pointers to attempts to make sense of the present situation.

Governmental gaslighting? Communication in the corona crisis

On 7 March 2020, a group of 900 students returned to Groningen from a mass skiing vacation in Piedmont, an area declared high-risk for the coronavirus. Unsurprisingly, citizens were concerned, only to be met with reassuring phrases. Although China had documented cases of the coronavirus being spread by asymptomatic patients some weeks earlier, a missive by the Groningen branch of the Dutch health authorities (GGD) still explicitly declared that people without symptoms cannot spread the virus. The document is unchanged to this day. Moreover, it seems to be part of a larger pattern. This kind of conduct strikes me as disturbing in two respects: On the one hand, it downplays potential threats and creates a false sense of safety. On the other hand, coming from a health authority, such (false) statements help frame concerns and critical questions as overreactions. Questioning institutions and individuals who align their conduct with official guidance will make you stand out as “unduly critical” and “panicky”. No matter how many papers and news articles you have critically engaged with, once you are frowned upon by adherents of what governmental institutions declare to be the status quaestionis, you might soon wonder whether you should question your own sanity. Beginning to wonder exactly that, I was reminded of the phenomenon called gaslighting, which is defined as psychological manipulation to the effect that people begin to question their own sanity, memory, judgment etc. While this phenomenon is most often recognised in abusive relationships, there are obviously variants of political gaslighting. In what follows, I’d like to suggest that gaslighting might be an important feature of crisis communication. Recognising this is important not only for restoring one’s sanity, but also for preparing to cope in the aftermath of the crisis.

The claim that there is no or little reason to be concerned is of course a trope in crisis communication, especially in the beginning. I vividly remember such claims around the time of Chernobyl. We are used to such claims, and perhaps most of us see them for what they are. There are good reasons to avoid overreactions or panic. However, things are different when more information becomes available. A claim, made unwaveringly against a background of contrary information, does not calm me down; rather, it provides additional reasons for worrying. Why would authorities try to reassure me in the light of credible sources raising doubts? At this point, I normally begin to wonder whether the sources I consult might be limited in problematic ways. Am I overlooking something? Is there a serious debate? Of course, newly established facts will be questioned. What gave me pause in this incident was not the claim as such, but the fact that it was declared with such certainty, permitting no room for doubt. Given the great uncertainty, it was the confidence itself that made the claim seem questionable and indeed politically motivated, in the sense that the intention came across as downplaying rather than informing. This assumption was confirmed by observations that Naomi O’Leary and others shared on Twitter, suggesting that there might have been a series of attempts by health authorities and news outlets to downplay the whole issue.

Finding confirmation (or contrary information) is an important part of assessing your own position. So finding papers and news articles that confirmed my perspective was important in many ways, especially as this crisis is still emerging. (As a side note: I’m immensely grateful for the fact that we have social media such as Facebook and Twitter. For all the wrongs they are known for, social media seem crucial now for sharing information, raising doubts, and not least for social bonding in times of physical distancing.) But however reassuring it was, it didn’t explain why the health authorities issued false claims. Surely, they must have known that the public wouldn’t be (completely) reassured by false claims whose refutation might be just a click away. Since I can’t read the minds of those issuing false claims, it might be better to focus on the effects of such gaslighting. Two effects strike me as particularly relevant. First, as we know from Trump’s and other famous cases, gaslighters might wish to divert attention from other facts. In the Dutch context, it might be construed as a means to divert attention from the idea of creating “herd immunity”, although, lo and behold, this is now explicitly denied. However, I’m not sure that this diversion was the underlying reason in the present case. So at this stage it might be more useful to focus on a second aspect: Arguably, a general effect of political gaslighting is to nudge people into adherence to a status quo. Panic is certainly recognised as a social problem. To cut a long story short, it will generally be seen as socially desirable to maintain calm (rather than panic). If you manage to portray critics as creating panic, you effectively depict them as having undesirable traits. If this is correct, people might be expected to silence their critical tendencies in order to appear socially desirable. That’s (political) coolness, in a nutshell.

Apart from the fact that making false claims is morally dubious and might incentivise counterproductive forms of conduct, the effects of such governmental gaslighting strike me as immensely problematic for two further reasons. First, they generally polarise and thus might even reinforce nationalist tendencies, at least when the supposed status quo is construed as the achievement of a particular country. In the Netherlands, some crisis experts even went as far as claiming that Italy’s strategy of a lockdown was “incredibly stupid”, since it would damage the economy, while the Dutch way was the “only right one”. While I hope that this assessment remains an outlier, there is a second reason that, in my view, renders such gaslighting particularly pernicious: the creation of political myths that serve to polarise after the crisis. Let me explain.

In the last few days, I was often reminded of stories about the time after WW II, after the revolutions of 1989 and after the financial crisis of 2008. Of course, the corona crisis is vastly different in many respects. But what crises of such scale have in common is that they allow for fundamentally different ideas about how to go on afterwards. The fact that large parts of our social and economic customs break down is both devastating and ground for reconsideration. Do you remember Occupy Wall Street, to pick just one example? It’s no exaggeration to say that this movement has been portrayed by and large as a failure. Whatever the merits or downsides of the movement, such a portrayal had been prepared long before. Not intentionally, of course. But by depicting criticism of neoliberalism as a failure, a meme that is still with us, the grounds were prepared to portray countermovements as countering a supposedly desirable status quo. – I am reminded of this and other stories whenever I listen to all these wonderful ideas about how the corona crisis might also inspire new forms of interaction, forms that are more in line with general political goals such as responding to the climate crisis – be they new forms of online teaching, or larger ideas about environmental or economic questions. My worry is that downplaying the impact of the corona crisis today will serve as a force to retain a status quo from which to counter and suppress important movements after the crisis.

What does this mean for crisis communication? Of course, not every attempt to avoid panic is a form of gaslighting. But I generally think that governmental institutions might underestimate their audience. Here is a positive counter-example: The virologist Christian Drosten currently runs a regular podcast for the public (in German). In one episode, he openly explained how reading one single paper (by a historian on the Spanish Flu) made him change his mind about recommending the closure of schools. What impressed me was the general public response. In a moment of crisis, an expert calmly communicates the fragility and uncertainty involved in establishing scientific facts and policies. Of course, I’m relying largely on anecdata, but my impression was that this attitude was received as rather reassuring. This stands in stark contrast to the archetypical assertion that everything is under control, which often prompts worry or cynicism.

I’ll stop here. These are just a few confused thoughts and associations that struck me while trying to come to terms with the new situation that is now affecting us in a number of very different ways. Let me end by saying that I wish you all the best.

Saying what is unsayable. Or in what sense there is a private language

If I try to tell you what’s on my mind, I’ll fail. This is partly because not everything can be said. When I tell you how I feel, you won’t feel how I feel. When I tell you what I see, you won’t see what I see. You’ll see the sun, too, but it doesn’t have the same colours and it doesn’t mean to you what it means to me. At the same time, I know that you know this, too. You know that you would fail in much the same way that I would. Although our experiences might be worlds apart, I know for certain that you will understand me when I say what I just said. Although I probably know hardly anything about you, I know that we are very much alike. At least in this, and probably in quite a few more respects, too. In my last post, I tried to say why our thinking in language and images might not be suited to thinking about ourselves. However, I think that this attempt remains incomplete without pondering the opposite, that is, the attempt to express ourselves as best we can.

What shall we do with this insight? The insight that there are things we might wish to express but cannot say? The attempt to tackle this has been with me for a long time. It drove a desire to say what was unsayable, and it drove a desire to see whether it really is unsayable, and it drove a desire to understand why it might be unsayable. This desire seems to be everywhere. You, ordinary people, philosophers, poets, musicians, all kinds of artists and scientists have looked into this at some point. “You don’t understand me.” This simple sentence testifies to the paradox. Yes, it is true, there is always something lacking. And yet, we all understand that very sentence. The phenomenon has a long history, ranging at least from the ancient saying that “the individual is ineffable” to Wittgenstein’s famous argument against the possibility of a private language. The individual cannot be captured, at least not by concepts. – However, the idea that we should resign ourselves to this insight sparks fury. It strikes me as unacceptable.

Why unacceptable? At least among philosophers I sense that we have indeed resigned ourselves to this insight. After telling first-year students that language cannot be private, or providing some other version of this insight, most of us move on and expect people to come to terms with it. Subjectivity is a loser! (Liam Bright has a nice post showing that the arguments we standardly advance against it are not even very good. And I agree.) – But this, I submit, is a mistake. While it is true that there are a number of good ideas and arguments in favour of this insight, it strikes me as a mistake to stop pushing against it. Why? Even if you believe that the unsayable really is unsayable, there remain at least two unsettled issues: first, there remains the mystery of why exactly it should be so; second, there is the desire to express the unsayable anyway. Let’s briefly look at these two issues:

  • Looking at the history of ideas, there are a number of very different reasons for the claim that there is something unsayable. Some believe it has to do with the contingent nature of the individual; others think that nothing is conceptually accessible as a simple given; others think it is the nature of language that defies such access; others still think the individual is a fictitious construct or reification anyway. At the same time, there have always been ways of retaining something of what counts as inexpressible, for instance by referring to God, the individual par excellence, or by introducing non-conceptual content, or by advocating varieties of relativism. Simply resigning ourselves to the insight, taking it as a fact, strikes me as accepting that this is a matter of yes or no. But I don’t see a principled reason for seeing this as a categorical matter. Neither do I think it can be reduced to one single insight. If this is correct, there is no reason to give up on it, whatever ‘it’ may be.
  • Even if you believe that it might be illusory to think that saying the unsayable is possible, it would be a mistake to deny the presence of this illusion. As already noted, the problem does not only inspire philosophers to write refutations; it also gives rise to artistic and other attempts to express the inexpressible. Compare freedom: Many philosophers believe that libertarian freedom, understood in terms of alternative possibilities, is an illusory idea. But that doesn’t disqualify the topic. In the same way, resigning ourselves to the insight strikes me as a mistake. Yes, Wittgenstein wrote that we should remain silent whereof we cannot speak. But there are good reasons to disagree.

Ok, you might say, fine! But how do we move on? What can be gained, and how? And what are you talking about anyway? – Right then, I can only try, and this is a first attempt. So here goes: As I said in my previous post, I think that thinking about ourselves works according to the money model. This means that the crucial point in thinking (and speaking) lies in interacting with others. (So, like Wittgenstein in the Philosophical Investigations, I assume that language is a social institution from the get-go.) Again, the point of crying, for instance, is normally not to express a very particular kind of pain, but to interact with others. Crying is a communicative act; and it receives an apt response if the person in question is consoled or something like that. In the same vein, language works as a form of interaction, not as a form of decoding what has been encoded. Now, I think the same is true when we want to express something that we deem inexpressible. When I attempt to express my innermost experience, or something I deem totally private or personal, what changes is not the kind of content (from something public to something private). What changes is the (imagined) interlocutor. So what makes the difference between a private and a public speech act is the relation I have to my interlocutor. Thus, communicating something supposedly unsayable works by speaking to an interlocutor whom you trust very much or know very well. Ranging from a conversation, say, with your employer to one with your lover, your closest friend, your parents or even yourself – the levels of privacy can differ accordingly.

So, yes, I think there is a private language, not in the sense that you can encode your private experiences but in the sense that you can speak to someone in a very personal mode. It is the kind of interaction that creates the degree of privacy. To illustrate this, let me suggest a small experiment: Take the sentence “I love you.” We all know it’s a very common kind of sentence. Now imagine that sentence spoken to you by different people. My guess is that your experience and understanding of that sentence will vary considerably with the imagined interlocutor. So what makes the difference is the person talking to you, not the content of what is said as such.

To return to the more general point, then, I think that when we think about private language we often might be looking in the wrong place. Privacy is not constituted by a kind of content only knowable by you. That would indeed be self-defeating. Privacy is constituted by the intimacy of the interlocutor. Of course, you can only use a public language, even when speaking to yourself. But the conversation can become more intimate, richer and more private in associations and in its relation to concrete experience when we are sufficiently close to our interlocutor.

So if we look for modes of privacy, it might be good to look at different relations between interlocutors or people more generally. These don’t always have to be people close to us in an emphatic sense. If we look at ritualised forms of interaction, we might find other pertinent forms of privacy. So in addition to thinking about personal interlocutors, we might want to think about the relation between language and music, linguistic-musical rituals (such as prayer, meditation and interaction between musicians or other artists), the development of language use in children etc. – As I said, this is just a beginning. But it strikes me as important not to give up on the quest for modes of expressing what we deem unsayable.