Seeing that I don’t write about things or topics but about what people say about things was one of the most important lessons I learned. I’ve said this a number of times, here and here, but a recent chat with a friend made me realise that it is perhaps worth highlighting again.
So, when you’re writing about stuff like justice, language, the supreme good or whatever, you don’t write about these things or phenomena, as it were. Rather you write about what people say about these phenomena. Or about what you yourself say (or think) about these phenomena. The point I’m trying to make is that what you’re targeting when you write is a piece of language: you’ll be writing about a claim or a passage, a specific argument, an example or a specific question.
Why is this worth noting? – Let’s begin with a pragmatic reason: As long as you think that you write about, say, freedom and necessity, you will be paralysed by the vast amount of things you could look at. Things provide no focus. A string of sentences, by contrast, gives you focus. Sentences pick out something; they leave open something else; and they deny something, at least implicitly. In this way, they give you a dialectical field of positions and neglect. You can begin at once by picking on a word or phrase and asking what precisely it means. So instead of fretting over where to begin, you can start immediately by thinking about the phrases: about what they evoke, what they miss, and how you feel about them.
What you enter. – Once you realise that you’re not embarking on a boat tossed across the vast ocean of being, you will see that the idea of philosophy as a conversation is quite literally true. You are always dealing with someone’s (or your own) formulation. You will want to understand and thus ask for clarification, offer alternatives or counterexamples. The point is that the kind of skill you first and foremost need is the skill of zooming in on the language.
Play with words. – Now of course this doesn’t mean that you can skip informing yourself about things. It just means that, in beginning to write (or talk) about these things, you will always target a formulation. You can begin with your own way of phrasing something and take it apart, word by word, or with someone else’s and ask them about it. The skills that you can train for this are reading, reformulating (in other words, other terminologies, in other genres or examples, or in formal language), translating, and, generally, playing with words. When you sit at your desk or in a talk wondering what is going on, don’t focus on the things, issues or phenomena. Rather focus on the words. That’s where you’ll enter.
So it begins. – So when you begin to plan and write your text or talk, I’d advise you to begin by quoting the paragraph or claim you want to focus on. And if it’s not someone else’s point you want to focus on, then offer your best formulation. Write it down and begin to wander around it.
You think that this whole idea is odd? Perhaps I am just an old Kantian who thinks that the Ding an sich is not available to us.
By the way, this month this blog is three years old. Thanks for bearing with me.
Over the weekend I posted a piece of news according to which one of the last representations of academic psychoanalysis in Germany is under threat. What I found particularly interesting were the somewhat heated discussions that ensued on various social media. While some regretted the prospect of seeing psychoanalysis pushed out of academia, others saw it as an instance of scientific advancement. More than once it was claimed that, after all, we wouldn’t have chairs in astrology either.* Lacking expertise in psychology, I am not the right person to make a case for the current role of psychoanalytic research, but I was struck by the frequent and ready dismissal in favour of the current status quo. Yet, what this insistence on the status quo obscures is the likelihood that future historians will see many of our current ideas as similarly outdated. Our most recent neuroscience will become tomorrow’s astrology. In this post, then, I’d like to ask you, dear reader, to imagine that our current theories and even our own beliefs will be deemed outdated. The idea behind such an embedded history** is to historicise the present and pave the way for seeing our very own ideas like a historian of thought, that is: seeing our beliefs in their contingent relations to our (social) world rather than as items in the space of reasons.
Condemning ideas. – What makes people condemn ideas or approaches? Our study of the mind has a long and complicated history. Many ideas are now outdated. Although Aristotle is held in high esteem, no one will want to maintain his views on the heart-brain system. However, controversial ideas present a different case. Disciplines like psychoanalysis are still evolving and are held in high esteem by many, but their precise status in the academic landscape has become dubious. The reasons for advancing doubts are varied: they might be internal to the discipline but also of a political or moral nature. Despite substantial criticisms, however, certain ideas not least from psychoanalysis pervade much of our current culture and are known not only to experts but to the public at large. What’s interesting about ideas that are both common and controversial is that they present us with normative questions: They are held, yes, but should they be held (in the future)? Now the normative attitude according to which, for instance, psychoanalysis should be condemned to the past can itself be historicised. This is what an embedded historian would do. Rather than taking a side for or against a particular view, the embedded historian would try and historicise the controversy. For the embedded historian, discussions invoking perceived progress, then, would shed some light on our current normative historical attitudes, that is, attitudes about things that we begin to see as belonging to the past and that we (or some of us) think should no longer be present.
But how can we turn into embedded historians? – Peter Adamson once suggested seeing our current philosophy just as the latest stage of the history of philosophy. Naturally, I agree. As I see it, this approach not only helps us achieve a better understanding of the current philosophical landscape, it also shifts our attitudes in intriguing ways: Being convinced by an argument is quite different from explaining how someone like you (in your day and age) would encounter and be compelled by a certain argument in a certain context and style. This is what Bernard Williams called “making the familiar strange”. But how is it done? Having ideas is one thing. Rejecting ideas as belonging to the past is quite another thing; it carries the force of condemnation. But what if you find yourself on the other side? What I’d like you to imagine is that you hold ideas that future historians will think of as outdated. This, I submit, is how you can become an embedded historian about your own ideas. You can do this in two steps: first, you study a theory that is considered outdated, try to embrace it by looking at the best arguments for it, and then you look at the refutations. Second, you take the most forceful refutations and try to have them carry over such that they attack your own convictions. (The second move is of course much harder, but if you want to see it in action it might help to consider how Wittgenstein attacks some of his own ideas in the Philosophical Investigations.)
How can you attack your own convictions? – Somehow attacking your own convictions seems paradoxical, because they are your convictions. But are they (still) your convictions, if you can attack them? Here is a start: Think of the latest good idea that convinced you and try to give a reason for holding it. But now try to do this, not when you’re clear-headed, but rather when you get up at six in the morning, straight away. What I’m after is the difference between what we say on the fly as opposed to what we think we should be saying (i.e. our best version of our argument). This is the way many historians approach, not their own convictions, but the material they study: they take the explicit (badly formed) reasons, and then say what their author should have said but didn’t. (Historians shunning anachronism will then often go with the explicit badly formed reasons, while others opt for the best reasons because they apply the principle of charity.) Now just allow yourself the (bad) reasons you invoked on the fly. You can then imagine how a future historian will dissect your account easily.
Why should we do it? – Now that you have a beginning, you might still ask why such a thing is worth your time. Well, attacking your own convictions is the only way to create headspace for ideas that seem to be in opposition to your own. There are so many ideas that are out of touch with the current status quo that it would seem ridiculous to believe that we – we of all people – would have the best ideas and the best methods of approaching them or putting them to use. Rather than dismissing ideas quickly in the name of progress (= status quo), we should be triangulating for objectivity.*** And this we can do only by attempting to understand those whom we consider controversial, outdated or opposed to what we believe. That said, there is yet another reason: Studying the ideas that we reject might uncover the reasons for rejection which, in turn, might uncover ideas that tacitly underpin our beliefs. After all, condemned ideas might become repressed ideas. But that’s for another day.
* While David Livingstone Smith, for instance, presents substantial criticism against most psychoanalytic traditions, at least a quick browse through the research done at Frankfurt leaves me with the impression that abolishing this kind of work would mean a severe impoverishment of academic psychology.
** The term “embedded history” is reclaimed from the term “embedded journalism” which, though a problematic practice in itself, captures intriguing aspects of the way we are involved when doing history and thinking about ourselves and others.
*** I use “triangulating” as a term of art from Davidson. Here is a lucid passage from his “Rational Animals” (also quoted in Jeff Malpas’ great introduction to the term): “If I were bolted to the earth, I would have no way of determining the distance from me of many objects. I would only know that they were on some line drawn from me towards them. I might interact successfully with objects, but I could have no way of giving content to the question where they were. Not being bolted down, I am free to triangulate. Our sense of objectivity is the consequence of another sort of triangulation, one that requires two creatures. Each interacts with an object, but what gives each the concept of the way things are objectively is the base line formed between the creatures by language. The fact that they share a concept of truth alone makes sense of the claim that they have beliefs, that they are able to assign objects a place in the public world.”
This is the fifth installment of my series Philosophical Chats. In this episode, I have a conversation with Anna Tropia who is an assistant professor of philosophy at the University of Prague. Following up on some earlier musings, we focus on issues of writing (philosophy) as they figure in my blogging. Here is a rough table of contents:
Introduction and the focus of “Handling Ideas” 0:00
How can and why should we avoid the delete button? 2:17
Dare to say something wrong! A general tip on writing 6:53
“What is the seal of attained liberty? To be no longer ashamed of oneself.” – Friedrich Nietzsche
Like many fellow students around me, I learned writing by imitating others. How do I know about the others? Well, because there were no courses on learning how to write. So everyone was left to their own devices. Don’t get me wrong: there were and are many good guides on what desirable academic prose should look like. But these guides do not focus on the process of writing: on the despair, boredom, shame, and love that go into it. Actually, it was the lack of reflection on the process and its more doubtful stages that initially motivated me to start this blog. Speaking about these emotions is not meant as a form of venting or ranting about hardships (although these should have their place, too), but rather as a way of showing how these emotions can guide and inform our writing. In what follows, I want to say a bit more about this. I’ll start by looking at the way (emotional) experience figures in academic interaction and writing, and then zoom in on different forms of expressing thoughts.
Let’s begin with shame, though. – If you want to see how shame figures in guiding academic interactions, just start a course by asking what people did not understand in a set text. Most people will remain silent; the more experienced ones will point out passages that fail to be clear enough to be understood, passing the blame onto the text. – If you’re the odd one out who is willing to go for it, you’ll know that it takes courage to begin by admitting that you yourself do not understand. Shame is the fear of being seen or exposed in doing something undesirable (like making a mistake). When we speak or write, shame will drive us to avoid making mistakes. One way of doing that is remaining silent; another way is to pass the blame and criticise others rather than taking the blame. In writing or conversation, we can counter shame by developing technical skills, that is, by learning chops that make it look flawless, elegant, and professional. So we introduce technical jargon, demonstrating our analytical skills and what have you. While technical versatility is often equated with a sober or even neutral style, this asset might owe less to sobriety than to shame.
What’s love got to do with it? – Iris Murdoch wrote somewhere that love is, amongst other things, the ability to see someone else as real. (See Fleur Jongepier’s great piece on Murdoch and love.) One way of taking this is that love is an ability, the ability to understand, not yourself and your desires, but the other. How do you do that? My hunch is that understanding others begins with trying to understand their experience. If you are able to express someone’s experience, the other might feel seen. In writing, this can be done in at least two ways. You can try to say what (you think) someone experiences or you can try to create an experience for the reader. Now you might think that this factor is totally absent from academic writing, but that isn’t true. Philosophers typically try to tap into experience by using examples or crafting thought experiments. What is rarely acknowledged is that these items do much more work than meets the eye. Strong examples and thought experiments often live on much longer than the arguments they’re supposed to back up. They are far more than mere illustrations of a point. Ideally, they allow the reader to experience a conceptual constraint on an almost physical level. Knowing a norm, for example, is one thing; being exposed (or imagining yourself) as having transgressed it is quite another.
How does this take on love as understanding the other play out in reading and writing? Returning to the example of asking people what they didn’t understand in a given text, it would be an act of love, in the sense explained, to acknowledge what you do not understand about the text. For if love is seeing the other as real, acknowledging the other’s reality would begin by acknowledging that there is something different, something you do not understand etc. In this sense, acknowledging the other (in the text) begins by admitting a weakness in yourself, the weakness of not understanding wholly. However, ultimately the point is not just to point out limitations but also to explore what constitutes these limits. This means that you also need to see what precisely blocks your understanding of the other (or the text). Seeing how factors in your personality, style, context and history enable or disable your understanding requires you to understand yourself. To use a radical example, if you have never been confronted with an optical illusion, examples of this sort of illustration wouldn’t work for you. Generally, if you never had access to certain kinds of experiences, these will constitute limits of understanding. Likewise, factors such as gender, race and class will inform the way a text speaks (or doesn’t speak) to us and limit the experiential resources available to draw on experience in writing. – It’s important to see that, in this sense, shame and love are in conflict. While love aims at seeing the other and involves the other (and thus ourselves, too) as being seen, shame drives us to disguise ourselves (at least in what we find undesirable) and perhaps even to blame the other for failing to be intelligible to us. In philosophical conversation, then, shame would make us avoid being seen (at least in undesirable aspects), while love would require us to lay bare our weakness of not understanding the other. 
As a result of this, shame and love play out in how we relate to (personal) experience. Arguably, shame blocks resorting to (personal) experience, while love as an approach to what constitutes borders between ourselves and others requires resorting to experience.
Expressing thoughts and experience. – If the foregoing makes some sense, we might say that shame and love inspire different attitudes in philosophical conversation: shame makes us shun (expressing thoughts by) personal experience, while love requires us to explore experience. Going from shame and love as two guiding emotions, then, we can easily discern two styles of reading and writing. Driven by shame, we find ourselves in a culture that often shuns resorting to experience and relies on techniques that correct for supposedly subjective factors. It is no surprise, then, that philosophers often highlight skills of so-called “critical thinking” as an asset of the discipline. More often than not these skills boil down to learning labels of fallacies that we can tag on texts. Looking at my student days, I often found myself indulging in technicalities to shun the fear of being seen for what I was: someone understanding very little. That said, such skills can be developed into a real art of analysis. Paired with patience, the careful study of arguments can yield great results. Then, it is no longer merely a way of avoiding shame but itself a set of tools for understanding. – Conversely, inspired by what I introduced as love, experience is crucial for understanding what sets us apart from others and the rest of the world. As I said earlier, this approach requires taking into account factors such as personalities, context and history. Crucially, such an approach cannot rely on the skillset of the writer or reader alone. It requires a dialogical readiness that might always undermine one’s own steps of understanding by what remains different. Perhaps it is not surprising that this approach is found mostly in areas that have traditionally enjoyed less acclaim, such as certain approaches in history, standpoint theory or experimental philosophy. – However, while it is important to tell such driving forces and styles apart, they are hardly ever distinct.
As I said in an earlier post, if you open any of the so-called classics, you’ll find representations of both forms. Descartes’ Meditations offer you meditative exercises that you can try at home alongside a battery of arguments engaging with rival theories. Wittgenstein’s Tractatus closes with the mystical and the advice to shut up about the things that matter most after opening with a rather technical account of how language relates to the world. Yet, while both kinds are present in many philosophical works, it’s mostly the argumentative kind that gets recognition in professional academic philosophy. If this is correct, it means that experience doesn’t figure much in our considerations of reading and writing.
Can we teach failure? – Trying to pin down what characterises this sort of love as an approach in reading and writing, it ultimately seems to be a process of failure. Trying to understand others fails in that success is simply unthinkable. There is no exhaustive understanding of the other, a text, a person, a thing, whatever. Love, in this or perhaps in any sense, has nothing to do with success, but everything to do with dialogical trying and undermining. Of course, this can be taught. But it has no place in learning outcomes. As teachers of reading and writing, though, it might be helpful to point out that “analysing”, “reconstructing”, “discussing”, “contextualising”, “arguing” and the like are not success verbs. Showing how we fail in these attempts might go a long way towards understanding and overcoming shame.
This is the fourth installment of my still fairly new series Philosophical Chats. In this episode, I have a conversation with Andrea Sangiacomo, who is an associate professor of philosophy at Groningen University. We focus on meditation both as part of philosophical traditions and as a resource that might inform (academic) philosophy, teaching and academic culture. While Cartesian and Buddhist ideas* form a continuous resource in the background of our discussion, here is a list of themes in case you are looking for something specific:
Meditation and Descartes’ Meditations 2:20
The notion of experience – and objections against experience as a basis in philosophy 9:00
Meditation in teaching 21:14
Why aren’t we already using these insights in education? 37:00
How can we teach and learn effectively? 44:36
How can we guide and assess? 52:50
Where is this approach leading, also in terms of academic culture? 1:03:00
* The opening quotation is from Andrea’s blogpost What can we learn today from Descartes’ Meditations? Here is the passage: “Since last year, I appreciated the text of the Meditations as real meditation, namely, as a way of practicing a meditative kind of philosophy (for lack of better term), a philosophy more concerned with what it means to experience reality in this way or that way, rather than with what a certain set of propositions means.”
Everything we take to be history is, in fact, present right now. Otherwise we wouldn’t think about it.
When I was little, I often perceived the world as an outcome of historical progress. I didn’t exactly use the word “historical progress” when talking to myself, but I thought I was lucky to grow up in the 20th century rather than, say, the Middle Ages. Why? Well, the most obvious examples were advances in technology. We have electricity; they didn’t. That doesn’t change everything, but still a lot. Thinking about supposedly distant times, then, my childhood mind conjured up an image of someone dragging themselves through a puddle of medieval mud, preferably while I was placed on the sofa in a cozy living-room with the light switched on and the fridge humming in the adjacent kitchen. It took a while for me to realise that this cozy contrast between now and then is not really an appreciation of the present, but a prejudice about history, more precisely about what separates us from the past. For what my living room fantasy obscures is that this medieval mud is what a lot of people are dragging themselves through today. It would have taken a mere stroll through town to see how many homeless or other people do not live in the same world that I identified as my present world. Indeed, most things that we call “medieval” are present in our current world. Listening to certain people today, I realise that talk of the Enlightenment, the Light of Reason and Rationality is portrayed in much the same way as my living-room fantasy. But as with the fruits of technology, I think the praise of Enlightenment is not an appreciation of the present, but a prejudice about what separates us from the past. One reaction to this prejudice would be to chide the prejudiced minds (and my former self); another reaction is to try and look more closely at our encounters with these prejudices when doing history. 
That means trying to see them as encounters with ourselves, with the ideologies often tacitly drummed into us, and trying to understand how these prejudices form our expectations when reading old texts. Approaching texts in this way means reading them both as historical philosophical documents and as encounters with ourselves. It is this latter approach I want to suggest as a way of reading and teaching what could be called outdated philosophy. According to at least some of my students’ verdicts about last term, this might be worth pursuing.
Let’s begin with the way that especially medieval philosophy is often introduced. While it’s often called “difficult” and “mainly about religion”, it’s also said to require so much linguistic and other erudition that anyone will wonder why on earth they should devote much time to it. One of the main take-away messages this suggests is an enormous gap between being served some catchy chunks of, you know, Aquinas, on the one hand, and the independent or professional study of medieval texts, on the other hand. Quite unlike in ethics or social philosophy, hardly any student will see themselves as moving from the intro course to doing some real research on a given topic in this field. While many medievalists and other historians work on developing new syllabi and approaches, we might not spend enough time on articulating what the point or pay-off of historical research might be. – I don’t profess to know what the point of it all is. But why would anyone buy into spending years on learning Latin or Arabic, palaeography or advanced logic, accepting the dearth of the academic job market, a philosophical community dismissing much of their history? For the sake of, yes, what exactly? Running the next edition of Aquinas or growing old over trying to get your paper on Hildegard of Bingen published in a top journal? I’m not saying that there is no fun involved in studying these texts and doing the work it takes; I’m wondering whether we make sufficiently explicit why this might be fun. Given the public image of history (of philosophy), we are studying what the world was like before there was electricity and how they then almost invented it but didn’t.
Trying to understand what always fascinated me about historical studies, I realised it was the fact that one learns as much about oneself as about the past. Studying seemingly outdated texts helped me understand how this little boy in the living room was raised into ideologies that made him (yes, me) cherish his world with the fridge in the adjacent kitchen, and think of history as a linear progress towards the present. In this sense, that is in correcting such assumptions, studying history is about me and you. But, you ask, even if this is true, how can we make it palpable in teaching? – My general advice is: Try to connect to your student-self, don’t focus on the supposed object of study, but on what it revealed about you. Often this isn’t obvious, because there is no obvious connection. Rather, there is disparity and alienation. It is an alienation that might be similar to moving to a different town or country. So, try to capture explicitly what’s going on in the subject of study, too, in terms of experience, resources and methods available. With such thoughts in mind, I designed a course on the Condemnation of 1277 and announced it as follows:
Condemned Philosophy? Reason and faith in medieval and contemporary thought
Why are certain statements condemned? Why are certain topics shunned? According to a widespread understanding of medieval cultures, especially medieval philosophy was driven and constrained by theological and religious concerns. Based on a close reading of the famous condemnation of 1277, we will explore the relation between faith and reason in the medieval context. In a second step we will look at contemporary constraints on philosophy and the role of religion in assessing such constraints. Here, our knowledge of the medieval context might help questioning current standards and prejudices. In a third step we will attempt to reconsider the role of faith and belief in medieval and contemporary contexts.
The course was aimed at BA students in their 3rd year. What I had tried to convey in the description is that the course should explore not only medieval ideas but also the prejudices through which they are approached. During the round of introductions many students admitted that they were particularly interested in this twofold focus on the object and the subject of study. I then explained to them that most things I talk about can be read about somewhere else. What can’t be done somewhere else is to have these ideas come alive by talking them through. I added that “most of the texts we discuss are a thousand years old. Despite that fact, these texts have never been exposed to you. That confrontation is what makes things interesting.” In my view, the most important tool to bring out this confrontation lies in having students prepare and discuss structured questions about something that is hard to understand in the text. (See here for an extensive discussion.) The reason is that questions, while targeting something in the text, reveal the expectations of the person asking. Why does the question arise? Because there is something lacking that I would expect to be present in the text. Most struggles with texts are struggles with our own expectations that the text doesn’t meet. Of course, there might be a term we don’t know or a piece of information lacking, but this is easily settled with an internet search these days. The more pervasive struggles often reveal that we encounter something unfamiliar in the sense that it runs counter to what we expect the text to say. This, then, is where a meeting of the current students and historical figures takes place, making explicit our and their assumptions.
During the seminar discussions, I noticed that students, unlike in other courses, dared to target really tricky propositions that they couldn’t account for on the fly. Instead of trying to appear as being on top of the material, they delineated problems to be addressed and raised genealogical questions of how concepts might have developed between 1277 and 2020. Interestingly, the assumption was often not that we were more advanced. Rather, they were interested in giving reasons why someone would find a given idea worth defending. So my first impression after this course was that the twofold focus on the object and subject of study made the students’ approach more historical, in that they didn’t take their own assumptions as a yardstick for assessing ideas. Another outcome was that students criticised seeing our text as a mere “object of study”. In fact, I recall one student saying that “texts are hardly ever mere objects”. Rather, we should ultimately see ourselves as engaging in dialogue with other subjects, revealing their prejudices as much as our own.
The children in the living room were not chided. They were recognised in what they had taken over from their elders. Now they could be seen as continuing to learn – making, shunning and studying history.
Are all human beings equal? – Of course, that’s why we call them human. – But how do we know? – Well, it’s not a matter of empirical discovery, it’s our premise. – I see. And so everything else follows?
The opposition between empiricism and rationalism is often introduced as an epistemological dispute, concerning primarily the ways knowledge is acquired, warranted and limited. This is what I learned as a student and what is still taught today. If you’ve studied philosophy for a bit, you will also have heard that this opposition is problematic and coarse-grained when taken as a historical category. But in my view the problem is not that this opposition is too coarse-grained (all categories of that kind are). Rather, the problem lies with introducing it as a mere epistemological dispute. As I see it,* the opposition casts a much wider conceptual net and is rooted in metaphysical and even political ideas. Thus, the opposition is to be seen in relation to a set of disagreements in both theoretical and practical philosophy. In what follows, I don’t want to present a historical or conceptual account, but merely suggest means of recognising this wide-ranging set of ideas and show how the distinction helps us see the metaphysical implications and political choices related to our epistemological leanings.
Let me begin with a simple question: Do you think there is, ultimately, only one true description of the world? If your answer is ‘yes’, I’d be inclined to think that you are likely to have rationalist commitments. Why? Well, because an empiricist would likely reject that assumption for the reason that we might not be able to assess whether we lack important knowledge. Thus, we might miss out on crucial insights required to answer that question in the first place. This epistemological caution bears on metaphysical questions: Might the world be a highly contingent place, subject to sudden or constant change? If this is affirmed, it might not make sense to say that there is one true description of the world. How does this play out in political or moral terms? Rephrasing the disagreement a bit, we might say that rationalists are committed to the idea that the world is ordered in a certain way, while empiricists will remain open as to whether such an order is available to us at all. Once we see explanatory order in relation to world order, it becomes clear that certain commitments might follow for what we are and, thus, for what is good for us. If you believe that we can attain the one true description of the world, you might also entertain the idea that this standard should inform our sciences and our conduct at large. – Of course, this is quite a caricature of what I have in mind. All I want to suggest is that it might be rewarding to look at whether certain epistemological leanings go hand in hand with metaphysical and practical commitments. So let’s zoom in on the different levels in a bit more detail.
(1) Epistemology: As I have already noted, the opposition is commonly introduced as concerning the origin, justification and limits of knowledge. Are certain ideas or principles innate or acquired through the senses? Where do we have to look in order to justify our assumptions? Can we know everything there is to be known, at least in principle, or are there realms that we cannot even sensibly hope to enter? – If we focus on the question of origin, we can already see how the opposition between empiricism and rationalism affects the pervasive nature-nurture debates: Are certain concepts and the related abilities owing to learning within a certain (social) environment or are the crucial elements given to us from the get-go? Now, let’s assume you’re a rationalist and think that our conceptual activity is mostly determined from the outset. Doesn’t it follow from this that you also assume that we are equal in our conceptual capacities? And doesn’t it also follow that rules of reasoning and standards of rationality are the same for all (rather than owing, say, to cultural contexts)? – While the answers are never straightforward, I would assume at least certain leanings in one direction or another. But while such leanings might already inform political choices, it is equally important to see how they relate to other areas of philosophy.
(2) Metaphysics: If you are an empiricist and assume that the main sources of our knowledge are our (limited) senses, this often goes hand in hand with epistemic humility and the idea that we cannot explain everything. Pressed why you think so, you might find yourself inclined to say that the limits of our knowledge have a metaphysical footing. After all, if we cannot say whether an event is fully explicable, might this not be due to the fact that the world is contingent? Couldn’t everything have been otherwise, for instance because God interferes in events here and there? In other words, if you don’t assume there to be a sufficient reason for everything, this might be because you accept brute facts. Accordingly, the world is a chancy place and what our sciences track might be good enough to get by, but never provide the certainty that is promised by our understanding of natural laws. Conversely, the belief in such lawful necessity often goes hand in hand, depending on the historical period, with more or less explicit forms of essentialism. The lawful necessities in nature might be taken to relate to the way things are. Now essences are not only taken to determine what things are, but also how they ought to be. – Once you enter the territory of essentialism, then, it is only a small step to leanings regarding norms of being (together), of rationality, and of goodness.
(3) Theology / Sources of Normativity: If you allow for an essentialist determination of how things are and ought to be, this immediately raises the question of the sources of such essences and norms. Traditionally, we often find this question addressed in the opposition between theological intellectualism (or rationalism) and voluntarism: Intellectualists assume that norms of being and acting are prior to what God wills. So even God is bound by an order prior to his will. God acts out of reasons that are at least partly determined by the way natural things and processes are set up. By contrast, voluntarists assume that something is rational or right because God wills it, not vice versa. It is clear how this opposition rhymes with that of rationalism and empiricism: The rationalist assumes one order that even binds God. The empiricist remains epistemically humble, because she believes that rationality is fallible. Perhaps she believes this because she assumes that the world is a chancy place, which in turn might be owing to the idea that the omnipotent God can intervene anytime. It is equally clear how this opposition might translate into (lacking) justifications of moral norms or political power. – Contrary to what is often assumed in the wake of Blumenberg and others, this doesn’t mean that voluntarism or empiricism straightforwardly translates into political absolutism. It is hardly ever a particular political idea that is endorsed as a result of empiricist or rationalist leanings. Nevertheless, we will likely find elements that play out in the justification of different systems.**
Summing up, we can see that certain ideas in epistemology go hand in hand with certain metaphysical as well as moral and political assumptions. The point is not to argue for systematically interwoven sets of doctrines, but to show that the opposition of empiricism and rationalism is so much more than just a disagreement about whether our minds are “blank slates”. Our piecemeal approach to philosophical domains might have its upsides, but it blurs our vision when it comes to the tight connections between theoretical and practical questions that were clearly more obvious to our historical predecessors. Seen this way, you might try to see whether you’ll find pertinently coherent assumptions in historical or current authors or in yourself. I’m not saying you’re inconsistent if you diverge from a certain set of assumptions. But it might be worth asking if and why you conform or diverge.
* A number of ideas alluded to here would never have seen the light of day without numerous conversations with Laura Georgescu.
“… solipsism strictly carried out coincides with pure realism. The I in solipsism shrinks to an extensionless point and there remains the reality co-ordinated with it.” Wittgenstein, TLP 5.64
When was the last time you felt really and wholly understood? If this question is meaningful, then there are such moments. I’d say it does happen, but very rarely. If things move in a good direction, there is an overlap or some contiguity or a fruitful friction in your conversation. Much of the time, though, I feel misunderstood or I feel that I have misunderstood others. – Starting from such doubts, you could take this view to its extremes and argue that only you understand yourself or, more extreme still, that there is nothing external to your own mind. But I have to admit that I find these extreme brands of solipsism, as often discussed in philosophy, rather boring. They are highly implausible and don’t capture what I think is a crucial idea in solipsism. What I find crucial is the idea that each of us is fundamentally alone. However, it’s important to understand in what sense we are alone. As I see it, I am not alone in the sense that only I know myself or only my mind exists. Rather, I am alone insofar as I am different from others. Solitude, then, is not merely a feeling but also a fact about the way we are.* In what follows, I’d like to suggest reasons for embracing this view and how its acknowledgement might actually make us more social.
Throwing the baby out with the bathwater. – In 20th-century philosophy, solipsism has often had a bad name. Solipsism was and is mostly construed as the view that subjective experience is foundational. So you might think that you can only be sure about what’s going on in your own mind. If you hold that view, people will ridicule you as running into a self-defeating position, because subjective states afford no criteria to distinguish between what seems right and what is right. Having rejected subjective experience as a foundation for knowledge or theories of linguistic meaning, many people seemed to conclude that experience was a bad idea altogether. This led to an expulsion of experience from many fields in philosophy. Yes, it does seem misguided to build knowledge or meaning on subjective experience. But that doesn’t stop experience from playing an important part in our (mental) lives. Let me illustrate this issue a bit more so as to show where I see the problem. Take the word “station”. For the (public) meaning of this word, it doesn’t matter what your personal associations are. You might think of steam trains or find the sound of the word a bit harsh, but arguably nothing of this matters for understanding what the word means. And indeed, it would seem a bit much if my association of steam trains were a necessary ingredient for mastering the concept or using it in communication. This is a bit like saying: If we want to use the word “station” to arrange a meeting point, it doesn’t matter whether you walk to the station through the village or take the shortcut across the field. And yes, it doesn’t matter for the meaning or success of our use of the word whether you cut across the field. But hang on! While it doesn’t matter for understanding the use of the word, it does matter for understanding my interlocutor. Thinking of steam trains is different from not thinking of them. Cutting across the field is different from walking through the village.
This is a clear way in which the experience of interlocutors matters. Why? Well, because it is different. As speakers, we have a shared understanding of the word “station”; as interlocutors we have different experiences and associations we connect with that word. As I see it, it’s fine to say that experience doesn’t figure in the (public) meaning. But it is problematic to deny that the difference in experience matters.
A typical objection to this point is that private or subjective experience cannot be constitutive for meaning. But this goes only so far. As interlocutors, we are not only interested in understanding the language that someone uses, but also the interlocutor who is using it. This is not an easy task. For understanding language is rooted in grasping sameness across different contexts, while understanding my interlocutor is rooted in acknowledging difference (in using the same words). This is not a point about emphatic privacy or the idea that our experience constitutes meaning (it doesn’t). It’s a point about how differences can play out in practical interaction. To return to the earlier example: “Let’s go to the station” can mean very different things if the two of you want to go jointly but it turns out you have different routes in mind. So understanding the interlocutor involves not only a parsing of the sentence, but an acknowledgement of the differences in association. It requires acknowledging that we relate different experiences or expectations to this speech act. So while we have a shared understanding of language, we often lack agreement in associations. It is this lack of agreement that can make me vastly different from others. Accordingly, what matters in my understanding of solipsism is not that we have no public language (we do), but that we are alone (to some degree) with our associations and experiences.
Arguably, these differences matter greatly in understanding or misunderstanding others. Let me give an example: Since I started blogging, I can see how often people pick one or two ideas and run. Social media allow you to test this easily. Express an opinion and try to predict whether you’ll find yourself in agreement with at least a fair number of people. Some of my predictions failed really miserably. But even when predictions are fulfilled, most communication situations lack a certain depth of understanding. Why is this the case? A common response (especially amongst analytically inclined philosophers) is that our communication lacks clarity. If this were true, we should improve our ways of communicating. But if I am right, this doesn’t help. What would help is acknowledging the differences in experience. Accordingly, my kind of solipsism is not saying: Only I know myself. Or: Only my mind exists. Rather it says: I am different (from others).
This “differential solipsism” is clearly related to perspectivism and even standpoint theory. However, in emerging from the acknowledgement of solitude, it has a decidedly existential dimension. If a bit of speculation is in order, I would even say that the tendency to shun solipsism might be rooted in the desire to escape from solitude by denying it. It’s one thing to acknowledge solitude (rooted in difference); it’s another thing to accept the solitary aspects of our (mental) lives. Let’s look more closely at how these aspects play out.
Even if philosophers think that experience doesn’t figure in the foundations of knowledge and meaning, it figures greatly in many of our interactions.** We might both claim to like jazz, but if we go to a concert, it might be a disappointment when it turns out that we like it for very different reasons. So you might like the improvisations, while I don’t really care about this aspect, but am keen on the typical sound of a jazz combo. If the concert turns out to feature one but not the other aspect, our differences will result in disagreement. Likewise, we might disagree about our way to the station, about the ways of eating dinner etc. Now as I see it, the solitude we experience in such moments doesn’t sting because of the differences themselves. What makes such moments painful is rather when we endure and paper over these differences without acknowledging them.
If I am right, then I don’t feel misunderstood because you don’t happen to care about the sound of the combo. I feel misunderstood because the difference remains unacknowledged. Such a situation can typically spiral into a silly kind of argument about “what really matters”: the sound or the improvisation. But this is just silly: what matters for our mutual understanding is the difference, not one of the two perspectives. In a nutshell: True understanding does not lie in agreement, but in the detailed acknowledgement of disagreement.***
But why, you might ask, should this be right? Why would zooming in on differences in association or experience really amend the situation? The reason might be given in Wittgenstein’s claim that solipsism ultimately coincides with realism. How so? Well, acknowledging the different perspectives should hopefully end the struggle over the question of which perspective is more legitimate. Can we decide on the right way to the station? Or on the most salient aspect in a jazz concert? No. What we can do is articulate all the perspectives, acknowledging the reality that each view brings to the fore. (If you like, you can imagine all the people in the world articulating their different experiences, thereby bringing out “everything that is the case.”)
Writing this, I am reminded of a claim Evelina Miteva made in a conversation about writing literature: The more personal the description of events is, the more universal it might turn out to be. While this sounds paradoxical, the realism of differential solipsism makes palpable why this is true. The clear articulation of a unique experience does not block understanding. Quite the contrary: It allows us to localise that experience in opposition to different experiences of the same phenomenon. In all these cases, we might experience solitude through difference, but we will not feel lonely for being invisible.
* Of course, the title “Solitude standing” is also a nod to the great tune by Suzanne Vega.
*** And once again, I am reminded of Eric Schliesser’s discussion of Liam Bright’s post on subjectivism, hitting the nail on the following head: “Liam’s post (which echoes the loveliest parts of Carnap’s program with a surprisingly Husserlian/Levinasian sensibility) opens the door to a much more humanistic understanding of philosophy. The very point of the enterprise would be to facilitate mutual understanding. From the philosophical analyst’s perspective the point of analysis or conceptual engineering, then, is not getting the concepts right (or to design them for ameliorative and feasible political programs), but to find ways to understand, or enter into, one’s interlocutor life world.”
Say you would like to learn something about Kant: should you start by reading one of his books or rather with a good introduction to Kant? Personally, I think it’s good to start with primary texts, get confused, ask questions, and then look at the introductions to see some of your questions discussed. Why? Well, I guess it’s better to have a genuine question before looking for answers. However, even before the latest controversy on Twitter (amongst others between Zena Hitz and Kevin Zollman) took off, I had been confronted with quite different views. Taken as an opposition between extreme views, the question is whether you want to make philosophy about ideas or about people (and their writings). It’s probably inevitable that philosophy ends up being about both, but there is still the question of what we should prioritise.
Arguably, if you expose students to the difficult original texts, you might frighten them off. Thus, Kevin Zollman writes: “If I wanted someone to learn about Kant, I would not send them to read Kant first. Kant is a terrible writer, and is impossible for a novice to understand.” Accordingly, he argues that what should be prioritised are the ideas. In response, Zena Hitz raises a different educational worry: “You’re telling young people (and others) that serious reading is not for them, but only for special experts.” Accordingly, she argues for prioritising the original texts. As Jef Delvaux shows in an extensive reflection, both views touch on deeper problems relating to epistemic justice. A crucial point in his discussion is that we never come purely or unprepared to a primary text anyway. So an emphasis on the primary literature might be prone to a sort of givenism about original texts.
I think that all sides have a point, but when it comes to students wanting to learn about historical texts, there is no way around looking at the original. Let me illustrate my point with a little analogy:
Imagine you want to study music and your main instrument is guitar. It is with great excitement that you attend courses on the music of Bach whom you adore. The first part is supposed to be on his organ works, but already the first day is a disappointment. Your instructor tells you that you shouldn’t listen to Bach’s organ pieces themselves, since they might be far too difficult. Instead you’re presented with a transcription for guitar. Well, that’s actually quite nice because this is indeed more accessible even if it sounds a bit odd. (Taken as an analogy to reading philosophy, this could be a translation of an original source.) But then you look at the sheets. What is this? “Well”, the instructor goes on, “I’ve reduced the accompaniment to the three basic chords. That makes it easier to reproduce it in the exam, too. And we’ll only look at the main melodic motif. In fact, let’s focus on the little motif around the tonic chord. So, if you can reproduce the C major arpeggio, that will be good enough. And it will be a good preparation for my master class on tonic chords in the pre-classic period.” Leaving this music school, you’ll never have listened to any Bach pieces, but you have wonderful three-chord transcriptions for guitar, and after your degree you can set out on writing three-chord pieces yourself. If only there were still people interested in Punk!
Of course, this is a bit hyperbolic. But the main point is that too much focus on cutting things to ‘student size’ will create an artificial entity that has no relation to anything outside the lecture hall. But while I thus agree with Zena Hitz that shunning the texts because of their difficulties sends all sorts of odd messages, I also think that this depends on the purpose at hand. If you want to learn about Kant, you should read Kant just like you should listen to Bach himself. But what if you’re not really interested in Kant, but in a sort of Kantianism under discussion in a current debate? In this case, the purpose is not to study Kant, but some concepts deriving from a certain tradition. In this case, you might be more like a jazz player who is interested in building a vocabulary. Then you might be interested, for instance, in how Bach dealt with phrases over diminished chords and focus on this aspect first. Of course, philosophical education should comprise both a focus on texts and on ideas, but I’d prioritise them in accordance with different purposes.
That said, everything in philosophy is quite difficult. As I see it, a crucial point in teaching is to convey means to find out where exactly the difficulties lie and why they arise. That requires all sorts of texts, primary, secondary, tertiary etc.
I recognize that I could only start to write about this … once I related to it. I dislike myself for this; my scholarly pride likes to think I can write about the unrelatable, too. – Eric Schliesser
Philosophy students often receive the advice that they should focus on topics that they have a passion for. So if you have fallen for Sartre, ancient scepticism or theories of justice, the general advice is to go for one of those. On the face of it, this seems quite reasonable. A strong motivation might predict good results which, in turn, might motivate you further. However, I think that you might actually learn more by exposing yourself to material, topics and questions that you initially find remote, unwieldy or even boring. In what follows, I’d like to counter the common idea that you should follow your passions and interests, and try to explain why it might help to study things that feel remote.
Let me begin by admitting that this approach is partly motivated by my own experience as a student. I loved and still love to read Nietzsche, especially his aphorisms in The Gay Science. There is something about his prose that just clicks. Yet, I was always sure that I couldn’t write anything interesting about his work. Instead, I began to study Wittgenstein’s Tractatus and works from the Vienna Circle. During my first year, most of these writings didn’t make sense to me: I didn’t see why the authors found what they said significant; most of the terminology and writing style was unfamiliar. In my second year, I made things worse by diving into medieval philosophy, especially Ockham’s Summa Logicae and Quodlibeta. Again, not because I loved these works. In fact, I found them unwieldy and sometimes outright boring. So why would I expose myself to these things? Already at the time, I felt that I was actually learning something: I began to understand concerns that were alien to me; I learned new terminology; I learned to read Latin. Moreover, I needed to use tools, secondary literature and dictionaries. And for Ockham’s technical terms, there often were no translations. So I learned to move around in the dark. There was no passion for the topics or texts. But speaking with hindsight (and ignoring a lot of frustration along the way), I think I discovered techniques and ultimately even a passion for learning, for familiarising myself with stuff that didn’t resonate with me in the least. (In a way, it seemed to turn out that it’s a lot easier to say interesting things about boring texts than to say even boring things about interesting texts.)
Looking back at these early years of study, I’d now say that I discovered a certain form of scholarly explanation. While reading works I liked was based on a largely unquestioned understanding, reading these unwieldy new texts required me to explain them to myself. This, in turn, prompted two things: To explain these texts (to myself), I needed to learn about the new terminology etc. Additionally, I began to learn something new about myself. Discovering that certain things felt unfamiliar to me, while others seemed familiar, meant that I belonged to one kind of tradition rather than another. Make no mistake: Although I read Nietzsche with an unquestioned familiarity, this doesn’t mean that I could have explained, say, his aphorisms any better than the strange lines of Wittgenstein’s Tractatus. The fact that I thought I understood Nietzsche didn’t give me any scholarly insights about his work. So on top of my newly discovered form of explanation I also found myself in a new relation to myself or to my preferences. I began to learn that it was one thing to like Nietzsche and quite another to explain Nietzsche’s work, and still another to explain one’s own liking (perhaps as being part of a tradition).
So my point about not studying what you like is a point about learning, learning to get oneself into a certain mode of reading. Put more fancily: learning to do a certain way of (history of) philosophy. Being passionate about some work or way of thinking is something that needs explaining, just as much as a lack of passion or a feeling of unfamiliarity does. Such explanations are greatly aided by alienation. As I said in an earlier post, a crucial effect of alienation is a shift of focus. You can concentrate on things that normally escape your attention: the logical or conceptual structures, for instance, or ambiguities; things that seemed clear get blurred and vice versa. In this sense, logical formalisation or translation are great tools of alienation that help you to raise questions, and generally take an explanatory stance, even towards your most cherished texts.
As a student, discovering this mode of scholarly explanation instilled pride, a pride that can be hurt when explanations fail or evade us. It was remembering this kind of pain, described in the motto of this post, that prompted these musings. There is a lot to be said for aloof scholarship and the pride that comes with it, but sometimes it just doesn’t add up. Because there are some texts that require a more passionate or intuitive relation before we can attain a scholarly stance towards them. If the passion can’t be found, it might have to be sought. Just like our ears have to be trained before we can appreciate some forms of, say, very modern music “intuitively”.