On self-censorship

For a few years during the 80s, Modern Talking was one of the best-known pop bands in Germany. But although their first single “You’re my heart, you’re my soul” sold over eight million copies, no one admitted to having bought it. Luckily, my dislike of their music was authentic, so I never had to suffer that particular embarrassment. Yet imagine all these people alone in their homes, listening to their favourite tune but never daring to acknowledge it openly. Enjoying kitsch of any sort brings the whole drama of self-censorship to the fore. You might be moved deeply, but the loss of face is more unbearable than remaining in hiding. What’s going on here? Depending on what precisely is at stake, people feel very differently about this phenomenon. Some will say that self-censorship just maintains an acceptable level of decency or tact; others will say that it reflects political oppression or, ahem, correctness. At some point, however, you might let go of all shame. Perhaps you’ve got tenure and start blogging or something like that … While some people think it’s a feature of the current “cancel culture”, left or right, I think it’s more important to see the different kinds of reasons behind self-censorship. In some cases, there really is oppression at work; in other cases, it’s peer pressure. Neither is fun. In any case, it’s in the nature of this phenomenon that it is hard to track in a methodologically sound way. So rather than draw a general conclusion, it might be better to go through some very different stories.

Bad thoughts. – Do you remember how you, as a child, entertained the idea that your thoughts might have horrible consequences? My memory is faint, but I still remember assuming that thinking of swear words might entail my parents having an accident. So I felt guilty for thinking these words, and tried to break the curse by uttering them to my parents. But somehow I failed to convince them of the actual function of my utterance, and so they thought I was just calling them names. Today, I know that this is something that commonly occurs in children, sometimes even in a pathologically strong form, known as “intrusive thoughts” within an “obsessive-compulsive disorder”. Whatever the psychological assessment, my experience was that of “forbidden” thoughts and, simultaneously, the inability to explain myself properly. Luckily, it didn’t haunt me, but I can imagine it becoming problematic.

One emergence of the free speech debate. – When I was between 7 and 10 years old (thus in the 1970s), I sometimes visited a lonely elderly woman. She was an acquaintance of my mother, well in her 70s and happy to receive some help. When no one else was around she often explained her political views to me. She was a great admirer of Franz Josef Strauß, whom she described to me as a “small Hitler – something that Germany really needs again”. She hastened to explain that, of course, the real Hitler would be too much, but a “small” one would be quite alright. She then praised how, back in the day, women could still go for walks after dark etc. Listening to other people of that generation, I got the impression that many people in Germany shared these ideas. In 2007, the news presenter Eva Herman explicitly praised the family values of Nazi Germany and was dismissed from her position. The current rise of fascism in Germany strikes me as continuous with the sentiments I found around me early on. And if I’m not mistaken, these sentiments date back at least to the 1930s and 1940s. In my experience, Nazism was never just an abstract political view. Early on, I realised that otherwise seemingly “decent” people could be taken by it. But this concrete personal dimension made the sweaty and simplistic attitude to other people all the more repulsive. In any case, I personally found that people in the vicinity of that ideology are among the most vocal in portraying themselves as “victims” of censorship, though they are certainly not censoring themselves. (When it comes to questions of free speech, I am always surprised that whistleblowers such as Snowden are not mentioned.)

Peer pressure and classism. – I recently hosted a guest post on being a first-generation student that really made me want to write about this issue myself. But often when I think about this topic, I still feel uncomfortable writing about it. In some ways, it’s all quite undramatic in that the transition to academia was made very easy by my friends. For it shouldn’t be forgotten that it’s not only your parents and teachers who educate you. In my case at least, I tacitly picked up many of the relevant habits from my friends and glided into being a new persona. Although I hold no personal grudges, I know that “clothes make people”, or “the man”, as the title of Gottfried Keller’s story is sometimes translated. What I noticed most is that people from other backgrounds often have a different kind of confidence around academics. Whether that is an advantage across the board I don’t know. What I do know is that I took great care to keep my own background hidden from most colleagues, at least before getting a tenured job.

Opportunism and tenure. – Personally, I believe that I wouldn’t dare to publish this very post, or indeed any of my posts, had I not obtained a tenured position. In saying this, I don’t want to impart advice. All I want to say is that getting this kind of job is what personally freed me to speak openly about certain things. But the existential weight of this fact makes me think that the greatest problem with self-censorship lies in the different socio-economic situations that people find themselves in. This is just my experience, but perhaps it’s worth sharing. So what is it about, you might wonder? There is no particular truth that I would not have told before but would tell now. It’s not a matter of any particular opinion, be it left or right. Rather, it affects just about everything I say. The fact that I feel free to talk about my tastes, about the kitsch I adore, about the music I dislike, about the artworks I find dull, alongside the political inclinations I have – talking about all of this openly, not just politics, is affected by the fact that I cannot be fired at will and that I do not have to impress anyone I don’t want to impress. It is this freedom, I think, that not only allows us to speak but also requires us to speak up when others will remain silent out of fear.

The myth of authenticity. – The fact that many of us feel we have to withhold something creates the idea that there might be a vast amount of unspoken truths under the surface. “Yes”, you might be inclined to ask, “but what do you really think?” This reminds me of the assumption that, in our hearts, we speak a private language that we cannot make intelligible to others. Or of the questions immigrants get to hear when people inquire where they really come from. It doesn’t really make sense. While it is likely that many people do not say what they would say if their situation were different, I don’t think it’s right to construe this as a situation of hidden truths or lies. (Some people construe the fact that we might conceal our opinions as lying. But I doubt that’s a pertinent description.) For better or worse, the world we live in is all we have when it comes to questions of authenticity. If you choose to remain silent, there is no hidden truth left unspoken. It just is what it is: you’re not speaking up and you might be in agony about that. You might conceal what you think. But then it is the concealing that shapes the world and yourself, not the stuff left unspoken. Put differently, there are no truths, no hidden selves, authentic or not, that persist without some relation to interlocutors.

***

Speaking of which, I want to finish this post with a word of thanks. It’s now two years ago that I started this blog. By now I have written 118 posts. If I include the guest posts, it adds up to 131. Besides having the pleasure of hosting great guest authors, I feel enormously privileged to write for you openly. On the one hand, this is enabled by the relatively comfortable situation that I am in. On the other hand, none of this would add up to anything if it weren’t for you, dear interlocutors.

Why using quotation marks doesn’t cancel racism or sexism. With a brief response to Agnes Callard

Would you show an ISIS video, depicting a brutal killing of hostages, to a survivor of their attacks? Or, if you prefer a linguistic medium: would you read Breivik’s Manifesto to a survivor of his massacre? – In asking these questions, I’m assuming that none of you would be inclined to endorse these items. That’s not the point. The question is why you would not present such items to a survivor, or perhaps indeed to anyone. My hunch is that you would not want to hurt or harm your audience. Am I right? Well, if this is even remotely correct, why do so many people insist on continuing to present racist, sexist or otherwise dehumanising expressions, such as the n-word, to others? And why do we decry the take-down of past authors for their racism and sexism? Under the label of free speech, of all things? I shall suggest that this kind of insistence relies on what I call the quotation illusion, and I hope to show that the underlying distinction doesn’t really work for this purpose.

Many people assume that there is a clear distinction between use and mention. When I say that “stop” has four letters, I’m not using the expression (to stop or alert you). Rather, I am merely mentioning the word to talk about it. Similarly, embedding a video or passages from a text into a context in which I talk about these items is not a straightforward use of them. I’m not endorsing what these things supposedly intend to express or achieve. Rather, I am embedding them in a context in which I might, for instance, talk about the effects of propaganda. It is often assumed that this kind of “going meta” or mentioning is categorically different from using expressions or endorsing statements. As I noted in an earlier post, if I use an insult or sincerely threaten people by verbal means, I act and cause harm. But if I consider a counterfactual possibility or quote someone’s words, my expressions seem clearly detached from action. However, the relation to possible action is what contributes to making language meaningful in the first place. Even if I merely quote an insult, you still understand that quotation in virtue of understanding real insults. In other words, understanding such embeddings or mentions rides piggy-back on understanding straightforward uses.

If this is correct, then the difference between use and mention is not a categorical one but one of degree. Thus, the idea that quotations are completely detached from what they express strikes me as illusory. Of course, we can and should study all kinds of expressions, including expressions of violence. But their mention or embedding should never be casual or justified by mere convention or tradition. If you considered showing that ISIS video, you would probably preface your act with a warning. – No? You’re against trigger warnings? So would you explain to your audience that you were just quoting, or ask them to stop shunning our history? And would you perhaps preface your admonitions with a defense of free speech? – As I see it, embedded mentions of dehumanising expressions do carry some of the demeaning attitudes. So exposing others to them merely to make a point about free speech strikes me as verbal bullying. However, this doesn’t mean that we should stop quoting or mentioning problematic texts (or videos). It just means that prefacing such quotations with pertinent warnings is an act of basic courtesy, not coddling.

The upshot is that we cannot simply rely on a clear distinction between quotation and endorsement, or mention and use. But if this is correct, then what about reading racist or sexist classics? As I have noted earlier, the point would not be to simply shun Aristotle or others for their bigotry. Rather, we should note their moral shortcomings as much as we should look into ours. For since we live in some continuity with our canon, we are to some degree complicit in their racism and sexism.

Yet instead of acknowledging our own involvement in our history, the treatment of problematic authors is often justified by claiming that we are able to detach ourselves from their involvement, usually by helping ourselves to the use-mention distinction. A recent and intriguing response to this challenge comes from Agnes Callard, who claims that we can treat someone like Aristotle as if he were an “alien”. We can detach ourselves, she claims, by interpreting his language “literally”, i.e. as a vehicle “purely for the contents of his belief” and as opposed to “messaging”, “situated within some kind of power struggle”. Taken this way, we can grasp his beliefs “without hostility”, and the benefits of reading come “without costs”. This isn’t exactly the use-mention distinction. Rather, it is the idea that we can entertain or consider ideas without involvement, force or attitude. In this sense, it is a variant of the quotation illusion: Even if I believe that your claims are false or unintelligible, I can quote you – without adding my own view. I can say that you said “it’s raining” without believing it. Of course I can also use an indirect quote or a paraphrase, a translation and so on. Based on this convenient feature of language, historians of philosophy (often including myself) fall prey to the illusion that they can present past ideas without imparting judgment. Does this work?

Personally, I doubt that the literal reading Callard suggests really works. Let me be clear: I don’t doubt that Callard is an enormously good scholar. Quite the contrary. But I’m not convinced that she does justice to the study that she and others are involved in when specifying it as a literal reading. Firstly, we don’t really hear Aristotle literally but mediated through various traditions, including quite modern ones, that partly even use his works to justify their bigoted views. Secondly, even if we could switch off Aristotle’s political attitudes and grasp his pure thoughts, without his hostility, I doubt that we could set aside our own attitudes. Again, could you read Breivik’s Manifesto, ignoring Breivik’s actions, and merely grasp his thoughts? Of course, Aristotle is not Breivik. But if literal reading is possible for one, then why not for the other?

The upshot is: once I understand that a way of speaking is racist or sexist, I cannot unlearn this. If I know that ways of speaking hurt or harm others, I should refrain from speaking this way. If I have scholarly or other good reasons to quote such speech, I shouldn’t do so without a pertinent comment. But I agree with Callard’s conclusion: we shouldn’t simply “cancel” such speech or indeed its authors. Rather, we should engage with it and try to contextualise it properly. And also try to see the extent of our own involvement and complicity. The world is a messy place. So are language and history.

“We don’t need no …” On linguistic inequality

Deviations from so-called standard forms of language (such as the double negative) make you stand out immediately. Try using double negatives consistently in your university courses or at your next job interview and see how people react. Even if people won’t correct you explicitly, many will do so tacitly. Such features of language function as social markers and evoke pertinent gut reactions. Arguably, this is not only true of grammatical or lexical features, but also of broader stylistic features in writing, speech and even non-linguistic conduct. Some ways of phrasing may sound like heavy boots. Depending on our upbringing, we are familiar with quite different linguistic features. While none of this might be news, it raises crucial questions about teaching that I rarely see addressed. How do we respond to linguistic and stylistic diversity? When we say that certain students “are struggling”, we often mean that they deviate from our stylistic expectations. A common reaction is to impart techniques that help them conform to such expectations. But should we perhaps respond by trying to understand the “deviant” style?

Reading the double negative “We don’t need no …”, you might see quite different things: (1) a grammatically incorrect phrase in English; (2) a grammatically correct phrase in English; (3) part of a famous song by Pink Floyd. Assuming that many of us recognise these things, some will hasten to add that (2) contradicts (1). A seemingly obvious way to resolve this is to say that reading (1) applies to what is called the standard dialect of English (British English), while (2) applies to some dialects of English (e.g. African-American Vernacular English). This solution prioritises one standard over other “deviant” forms that are deemed incorrect or informal etc. It is obvious that this hierarchy goes hand in hand with social tensions. At German schools and universities, for instance, you can find numerous students and lecturers who hide their dialects or accents. In linguistics, the disadvantages faced by regional dialect speakers have long been acknowledged. But even if the prescriptive approach has long been challenged, it still drives much of the implicit culture in education.

But the distinction between standard and deviant forms of language ignores the fact that the latter often come with long-standing rules of their own. Adjusting to the style of your teacher might then require you to deviate from the language of your parents. Thus, another solution is to say that there are different English languages. Accordingly, we can acknowledge reading (2) and call African-American Vernacular English (AAVE) a language. Its precise status and genealogy are a matter of linguistic controversy. However, the social and political repercussions of this solution come most clearly into view when we consider the public debate about teaching what is called “Ebonics” at school in the 90s (here is a very instructive video about this debate). If we acknowledge reading (2), it means, mutatis mutandis, that many English speakers raised with AAVE can be considered bilingual. Educators realised that teaching standard forms of English can be aided greatly by using AAVE as the language of instruction. Yet, trying to implement this as a policy at school soon resulted in a debate about a “political correctness exemplar gone out of control” and the abandoning of the “language of Shakespeare”. The bottom line is: non-hierarchical acknowledgement of different standards quickly spirals into defences of the supposed status quo by the dominant social group.

Supposed standards and deviations readily extend to styles of writing and conduct in academic philosophy. We all have a rough idea what a typical lecture looks like, how a discussion goes and how a paper should be structured. Accordingly, attempts at diversification are met with suspicion. Will they be as good as our standards? Won’t they undermine the clarity we have achieved in our styles of reasoning? A more traditional division is that between so-called analytic and continental philosophy. Given the social gut reactions to diversifying linguistic standards, it might not come as a surprise that we find similar responses among philosophers: shortly before the University of Cambridge awarded an honorary degree to Derrida in 1992, a group of philosophers published an open letter protesting that “Derrida’s work does not meet accepted standards of clarity and rigour.” (Eric Schliesser has a succinct analysis of the letter.) Rather than acknowledging that there might be various standards emerging from different traditions, the supposedly dominant standard of clarity is often defended like an eternal Platonic idea.

While it is easy to see and criticise this, it is much more difficult to find a way of dealing with it in the messy real world. My historically minded self has had, and still has, the luxury of engaging with a variety of styles without having to pass judgment, at least not explicitly. More importantly, when teaching students I have to strike a balance between acknowledging variety and preparing them for situations in which such acknowledgement won’t be welcome. In other words, I try to teach “the standard”, while trying to show its limits within an array of alternatives. My goal in teaching, then, would not be to drive out “deviant” stylistic features, but to point to the various resources required in different contexts. History (of philosophy) clearly helps with that. But the real resources are provided by the students themselves. Ultimately, I would hope not to teach them how to write, but to help them find their own voices within their various backgrounds and learn to gear these towards different purposes.

But to do so, I have to learn, to some degree, the idioms of my students and try to understand the deep structure of their ways of expression. Not as superior, not as inferior, but as resourceful within contexts yet unknown to me. On the other hand, I cannot but also lay open my own reactions and those of the traditions I am part of. – Returning to the fact that language comes with social markers, perhaps one of the most important aspects of teaching is to convey a variety of means to understand and express oneself through language. Our gut reactions run very deep, and perceived linguistic ‘shortcomings’ will move people, one way or another. But there is a double truth: although we often cannot but go along with our standards, they will very soon be out of date. New standards and styles will emerge. And we, or I should say “I”, will just sound old-fashioned at best. Memento mori.