The idea that “the self is an illusion” has become the bien pensant common sense of the day in many educated circles in the West. I think this is profoundly misguided, but I can see why it seems so compelling. In the English language, nouns label things while verbs label what those things do. If you mistook this linguistic quirk for a deep philosophical truth, then you would of course conclude that the self doesn’t exist, because there is no single thing that “I” or “you” refers to. That is because on close examination “I” turns out to be a verb disguised as a noun.
That might be an unconventional way of putting it, but the basic idea behind this claim is now the dominant view of the self held by professionals in Western philosophy and psychology. It’s a view that goes back to at least John Locke, who argued that while “human being” is a biological category, “person” is not. A parrot or alien that exhibited the same intelligence and sense of self as a typical human would not be a human but it would be a person, that is, “a thinking intelligent being, that has reason and reflection, and can consider itself as itself, the same thinking thing, in different times and places; which it does only by that consciousness which is inseparable from thinking, and, as it seems to me, essential to it.” This understanding defines persons not in terms of what they are physically, but in terms of what they do, psychologically. So although “person” is a noun, it is one whose reference is fixed by verbs: the acts of thinking, remembering, feeling, acting.
David Hume developed a similar argument to that of Locke, arguing that there was no indivisible self or ego but that the self was a “bundle” of thoughts, feelings and perceptions, sufficiently interrelated to create a sense of a single entity. The self is therefore not so much a thing as a series of mental events.
Contemporary psychology supports this conception. It tells us that there is no command and control center in the brain, no place where the sense of self can be located. Rather, the feeling of selfhood emerges through the interrelation of parallel and overlapping brain processes. This again links selfhood to mental activity: a brain in which nothing is going on is not a self.
We can therefore see self as active rather than static, ever-changing rather than permanent. This kind of self is not an illusion because all the activity is real. There is only an illusion if you mistake the activity for a thing, believing that because “I” takes the form of the noun it indicates the existence of a singular object. That we often do this merely shows that there are illusions of self, not that self is an illusion.
The centrality of the activity of living for selfhood is why there has been so much emphasis in recent decades on the importance of “narrative” for personal identity. This can be over-egged: some people appear to be very good at living in the present with little thought to their past and future selves. Nonetheless, it certainly seems to be true that without some kind of inner story based on memories we cannot have a proper sense of who we are as individuals.
The narrative metaphor suggests that there is good sense in Roger Ames’s suggestion that we would be better to talk in terms of human becoming rather than human being. If we are like books that are still being written or plays in mid-performance, then clearly we are still in the process of becoming who we are. But this is not the kind of becoming that ends in being, like the building of a house which ends with a complete construction. Rather, our only being is becoming: when the becoming ends, we end, too. A performance is thus a better metaphor for the human narrative than a novel.
This is an idea which, if understood correctly, could help bring together apparently conflicting ideas in Eastern and Western understandings of self. To simplify somewhat, the dominant tendency in Eastern thinking is to think of the self relationally, as situated in a network of social connections. This contrasts with the Western emphasis on individuality and the potential for self-actualization. These are of course only tendencies and it would be absurd to see East and West as binary opposites here. The Japanese have no problem identifying and valuing individuals and “no man is an island” is not an old Chinese proverb but one of the biggest clichés in the Western world.
There is, however, another key difference. Eastern philosophies tend to emphasize the need to cultivate and develop the self, whereas in the West our individuality is taken as a given. This suggests a curious paradox: in the apparently more individualistic West there is less concern given to the cultivation of the individual self than in the supposedly collectivist East.
The idea of “human becoming” could help bridge these differences. To the West, it is a reminder that if indeed our identities are not fixed at birth, our futures remain open and who we become depends a lot on what we choose to do. Too often this is read as the trite claim that we can “be whatever we want to be.” But even if we replace this with the lesser claim that we have the capacity to realize more than one potential, this will only happen if we try to cultivate the selves we want to become. This idea of “work on the self” is what is missing from a West which is both self-centered and curiously unfocused on the self. An enlightened individualism is one which sees individuals as needing to work on their own becoming rather than simply taking their actual being as the last word.
Once it is accepted that work on the self is necessary, it should become obvious that this in turn requires good social relations. Self-actualization cannot be achieved by individual selves alone: it requires education, family support, secure social structures. Even Hollywood, that great celebrator of the rugged American individual, always provides its rags-to-riches heroes with mentors and inspirations. Hence Eastern ideas of “relational selves” and “work on the self” which might seem alien begin to make sense as richer versions of more familiar ideals of becoming our best selves.
If the West re-conceives the value of individualism in this way then it would in turn start to look less alien to those in the East. Seeing how the idea of becoming has a natural home in the East is a reminder that individualism at its best is not some nasty Western import, but has a version with its own deep roots in Eastern traditions, too. The Western kind of work on self also provides a useful counterweight to versions which over-emphasize relationality to the extent that people come to believe their only task is to uncritically and unthinkingly accept their pre-determined social role and stick to it.
I am not of course suggesting that the idea of human becoming dissolves all differences between Eastern and Western conceptions of self. Rather it helps each tradition to see what it can learn from the other and to see that they are not mirror images of each other but simply occupy different places on a continuum.
The idea of human becoming has other advantages too, whatever tradition we identify with. It encourages us to let the past go, since whatever we once were, we have now become something else. This has long been a central ethical teaching in Buddhism, which has a view of self (anattā) remarkably similar to the Western, Humean one. In the popular imagination, however, the idea of self as verb has not taken hold, and people tend to assume that the self is some kind of unchanging Cartesian essence. Recognizing that, in fact, the Western tradition has its own indigenous version of anattā could be a useful way to counter the kind of consumerist greed and “grasping” which appalls many in the West as well as the East.
Finally, perhaps it can also help us to come to terms with death. In many religious traditions, there is a belief that death is not the end, although what exactly is to come is not always clear. In Christianity, for example, the afterlife is usually thought of as a relatively straightforward continuation of the personal self, whereas in Buddhism, that which continues is more like a flow of psychic energy than an individual. If we see ourselves as human beings, it seems we have to either accept that we will at some point cease to be or else hope that we live on indefinitely. For many in the modern world who accept the biological basis of our being, this is a choice between the gloomily inevitable and the optimistically implausible. But if we switch to seeing ourselves in terms of human becoming, it might be easier to accept that one day the becoming will come to an end. Why yearn for indefinite being when we have no being that persists indefinitely? It would be too much to suggest that this can make us totally sanguine about death: after all, the hallmark of any great performance is that we don’t want it to finish. But recognizing that all must end might at least help us to reconcile ourselves to that melancholic inevitability.