SCOPE: Personal individuality (1)

Arya’s Faces

“A girl is no one.”
— Game of Thrones

Most of us have all we can do to be who we need to be, to juggle the few social roles that seem necessary, and to hang on to, and nurture, some kind of authentic inner self.  Like true primates, though, we like to fiddle with things.  When we want to mentally fiddle with what more complex identity problems might be, we turn to fiction.

Fiction lacks punch without conflict.  The plot may engender conflict, but it is the internalization of that conflict in characters that really interests us.  We wonder what the character will do: stay or go, fight or flee, love or withdraw, take revenge or forgive.  When we are really engaged we are identifying with the character and thus are trying on an alternative self, risk free, like trying on a shirt at the store.

Currently the world is captivated by the story being told in the television serialization, Game of Thrones, based on George R.R. Martin’s A Song of Ice and Fire novels.  In that boiling stew of social strife no character tells us more about identity conflict than Arya Stark, princess of Winterfell.

Arya begins with a conflict between her own nature, which is active and warrior-like, and the expectations of her family and culture, which are for her to be a traditionally feminine princess, suitable for being traded in marriage as payment for political alliances.  Her sister Sansa, blind to his severe character flaws, is in fact betrothed to the future king Joffrey, but Arya sees through him and stands up to him, at her own peril.  So there is also in Arya some of the truly honorable character of her father Eddard, Lord of the North in Westeros.  Eddard is honest, just and fair, upright almost to a fault, lacking the casual cruelty that is rife in Westeros’s ruling class.

Arya takes the first steps towards a violent future when she tries to teach herself fencing with a servant boy.  Later her father recognizes her desire for fighting skill.  He hires her an elite fencing master who can teach her techniques suitable to her young age (around 11) and petite stature.  She acquires her own small sword, naming it “Needle” as a mockery of the needlework that her mother expected her to pursue.  She might have turned out like her father, becoming a fighting princess and thus harmonizing her “true self” and her social position.  But the death of the king and treachery from Joffrey lead her to witness the unjust beheading of her father Eddard.

In the long odyssey that follows her escape from her father’s fate, we see her adopting multiple aliases to avoid being identified.  She experiences many horrible things as she travels across a land overrun by a multi-sided war of succession.  Beneath the various commoner roles she has to play for survival, she is still the wronged aristocrat, clinging to a list of enemies whose names she recites before going to bed.  She thus keeps an internal fire burning for revenge against those who have wronged her, her family, and her friends.  She gradually adopts a darker side of honor that her noble father never had.  She accidentally kills an innocent stable boy.  Thus freed to be a killer, she finds various ways to kill people on the list, dispatching some of them by her own hand.  When she can, she surfaces the Arya identity so her victims know why they are dying.

She has an encounter with a criminal, Jaqen, who helps her kill some enemies because he owes her his life.  He invites her to join his cult to learn better techniques.  She demurs, but when they part she is astonished to see him transform into a person with an entirely different face.  He leaves her a coin that can be used as a token to find him. After other misadventures and the loss of her mother and brother to more treachery, she abandons a quest to find her remaining brother, and, seemingly on a whim, heads overseas to Braavos, the home city of Jaqen and his death cult, the Faceless Men.

The cult maintains a temple, the House of Black and White.  They offer painless death to people seeking a way out of life.  By magic means they store the faces of these people. The faces can be worn by their operatives who engage in the cult’s other “service”: killings for hire.  These are done for money but can also have some twisted rationale about which contracts should be accepted or refused.

Jaqen points out that Arya could continue her life in several ordinary ways, but to become a Faceless Man “a girl must be no one.”  Arya persists, throwing away the clothes and money belonging to Arya Stark, but secretly stashing Needle.  The cult proceeds to try to eradicate Arya’s birth identity and its baggage, like her enemies list.  She is taught to go out on the street and observe, adopting various street identities.  The goal is to learn how to become someone new for long enough to carry out assassination contracts.  Arya, long used to being someone else, has no trouble with the new identities, but gets distracted when she unexpectedly meets and kills people on her enemies list.

In an attempt to finally rid Arya of a permanent identity, the cult uses a drug to force her to be blind for a period of time.  She is also stressed by being forced into repeated mock combats with a sighted assassin known as The Waif.  Finally the time comes when, asked who she is, Arya says, “a girl is no one”, and Jaqen believes her.  He will give her one more chance, to assassinate a popular actress.  Arya poisons the actress’s drink but, conscience-struck, knocks it aside at the last second, and then implicates the jealous rival actress who had commissioned the act.

Marked for death as a failed acolyte, Arya is wounded by The Waif, but then defeats her.  Jaqen offers to let her now be a Faceless Man, but Arya decides to become Ned Stark’s daughter again.  She goes back to Westeros and uses the cult’s face-transplanting magic to destroy the entire extended family that had murdered much of her clan in the treacherous Red Wedding.

Arya’s fictional life stretches the idea of identity past the limits of what might be seen in any real life.  She has an underlying character, or true self, that gradually morphs from honorable warrior aristocrat to vengeful killer.  Her willingness and ability to use force go beyond what a girl of her age should be able to do.

She adopts many aliases, even changing gender for a time.  Her early aliases last for long periods measured in days or weeks, but at night her underlying identity briefly surfaces as she recites her litany of enemy names.  Under social pressure from the cult, she experiments with having a null identity, being “no one”, on top of which various poses such as urchin, street vendor, or beggar can be put on and off like masks.  Her ability to do this arises from social pressure in the form of military-like hazing and brainwashing, a forced sensory isolation via temporary blinding, as well as her own meditation-like mental practice and concentration.

Put to the test of carrying out an unjust killing, however, she returns to her former mode of being, a dynamically dual identity.  Inside she is Arya Stark, a princess of an honorable but persecuted family.  Outside, she is now a master at adopting the false personas that are the tool for destroying her enemies.

Few characters in fiction go to Arya’s extremes, and even fewer in real life. But we love stories about spies who have powers of identity transformation that we can’t imagine having.  Martin portrays this as happening under the enormous pressure of war and historical upheaval.  We shall see later many less extreme circumstances in which identity fission or plurality may occur.

BUNDLE: Identity problems (1)

What’s Been Studied

We have seen what causes people to have a unified personal identity. We’ve also looked at the parts of the mind.  We can use this knowledge to look deeper into the construction of identity.  Under the rock of unified identity there are some interesting problems crawling around.

The Three Problems.

The consciousness theorists glibly divide their studies into a Hard Problem (subjectivity) and an Easy Problem (everything else). Identity studies, in contrast, are often divided according to the time scale involved.  Table 1 shows a division of analysis into three parts: longer term continuity, shorter term coherence, and multiplicity. These divisions are often seen in the literature about the origins and significance of personal identity.

Table 1: Problems in Understanding Personal Identity

Problem: Lifetime
Focus: Diachronic (longer term cohesion).
Denoted by: Psychological continuity (John Locke); the true self.
Characterized by: Physical body (bodily self); interpersonal uniqueness (social self); life story/biography (narrative self).

Problem: Right Now
Focus: Synchronic (shorter term coherence).
Denoted by: Who I am right now; the minimal self (Shaun Gallagher).
Characterized by: Agency (volitional self); mental stance/POV (perspectival self); ownership (bounded self).

Problem: Plural
Focus: Different selves within one person.
Denoted by: Sub-self.
Characterized by: Coexistence, level of integration; competition, exclusion and control; degree of development.

The problem that we addressed in the UNITY chapter is the maintenance of longer term cohesion, which we might call the Lifetime Problem. This was the problem posed so clearly by John Locke, and then Thomas Reid, with his example of the callow youth, young lieutenant, and old general, all the same person.  When we talk about the person, their personality, or their true self we are in the territory of the Lifetime Problem.

The Right Now Problem is about shorter term coherence of personality.  We take for granted that whatever is happening in our consciousness at any one time is happening to a single person, ourself.  This happens on the “thin moment of the present” time scale of the perceiving self, fed by the Self’s body-grounded senses of agency and ownership.  However, for identity there is something more, an immediate sense, not just that “I am”, but “who I am (right now).”

The first two problems have technical names: diachronic (across time) and synchronic (at one time) identity. The third problem might be called the Plural Problem. We can seem, within ourselves and to others, to have multiple identities. These come and go on different time scales.  Different identities may overlap or alternate in time, compete or coexist.  Some identities can be stronger or more intense.  Multiplicity can vary: from mundane normality (such as playing different social roles or having imaginary playmates) to shocking pathology (dissociative identity disorder). The existence of multiplicity gives real spice to how we understand “who I am (right now)”.

The Lifetime Problem.

The Lifetime Problem of Identity, as we saw in the UNITY chapter, is fairly well understood.  Biological boundaries and our social environment are ultimately responsible for the coherence of a personal identity over one’s lifetime.  Throughout life we retain the material boundaries of body, brain and immune system, while social contacts and institutions reinforce our history to make us a particular social being. The Lifetime Problem is “easy”, but only in the analytic sense that we have access to the information needed to study it: the observations of others and of oneself about oneself.  It’s not like the social scientists, psychologists, philosophers and others haven’t fought over both theories and evidence since the time of John Locke.

Nor is it easy in terms of managing one’s life. First of all, far too many people find that maintaining an identity is a struggle against marginalizing prejudice and oppression. The magnitude and scope of this problem is such that, in the late twentieth century, schools of social criticism pronounced a coherent identity to be impossible, destroyed by power structures and the impotence of language itself.

Even for the more privileged in the world, life transitions such as graduation or parenthood are sometimes actual, problematic transitions and not just the next step.  Secondly, in our current culture many of us need or want to reinvent ourselves.  After a mess of some kind, we may need to start over.  Or we might be lucky enough to have the means and opportunities to, for instance, have a second career, or even a sequence or parallel collection of short-lived careers.  The old “modernist” industrial treadmill of one life trajectory no longer binds us, and, necessarily, no longer leads to a single lifetime identity.


Researchers have recently turned their attention to understanding mind wandering.  This is a mental activity that occupies some 2/3 of our conscious time.  Hundreds of times a day we involuntarily turn our mental attention from our outside surroundings or current task and start to mentally meander.  Sometimes a wander is a minor interruption from which we return to a more focused kind of thought.  Often it is reflections on our past, present, or future that may ultimately change how we behave.  And for a few people a wandering mind can be pathologically diverting from real life.  But the main and normal function is now believed to be knitting together pieces of our memories and imaginings, creating, in our friend T. Metzinger’s words[1], an “adaptive form of self-deception, namely, an illusion of personal identity across time.”

So for Metzinger and others mind wandering is the Right Now mental activity that solves the Lifetime Problem for us.  It’s our bridge between the present moment and our need for a continuous, cohesive identity.  We should perhaps consider it as the core activity of the narrative self.  We shall learn more later about how mind wandering fits in the spectrum of conscious states.

The Right Now Problem.

“The only Existence, of which we are certain, are Perceptions. … I never can catch myself at any Time without a Perception, and never can observe any Thing but the Perception … I may venture to affirm of the rest of Mankind, that they are nothing but a Bundle of Perceptions … ” David Hume[2], 1745

Identity’s Right Now problem perhaps began with David Hume reducing the self to a structureless bundle of perceptions.  He seems, posthumously, to have triumphed.  If there is any substantial consensus in the study of the conscious self, it is what is commonly referred to as “bundle theory” (honoring Hume).  As we have seen, experts say today that our minds are just a bubbling stew of disconnected, competing functions and mechanisms, adding up to an illusion like the Great Oz behind the curtain.  So you could say that bundle theory leads to the belief that the conscious self is an illusion. IF bundle theory THEN illusory self.

But what actually was behind that curtain?  In the Wizard of Oz, it was a man, a coherent and purposeful, if goofy, particular agent.  He had a life story.  Intuitively you might think that whether or not the self is an illusion, personal identity is a different kind of thing to understand, and it has its own properties and reality. Experts tend to agree – those who focus on the Self tend not to talk about identity, and vice-versa.  Nevertheless, the Right Now Problem of identity has a lot of overlap with study of the self and consciousness because they both are focused on the present moment.

Thus ideas we have already seen — being the agent of one’s actions, ownership of one’s body and perceptions, and the first person point of view — are important for understanding the internal sense of identity.  These are what give us the feeling of being a particular someone, a single coherent entity, at any given time.

The influence of bundle theory has also led research to focus instead on lower level issues: why there is (A) only one consciousness (B) with the illusion of a now, a present moment.  The estimable Thomas Metzinger calls these the one world problem and the now problem. You might have noticed that philosophers prize Problems over money.

The answers to these are being sought by research into brain activity happening during conscious experiences.  I mentioned gamma band synchrony before in connection with meditation.  Researchers have studied rhythmic synchrony (like the gamma band), other kinds of synchrony such as simultaneous activation of multiple broad brain regions, and exotic measures of the complexity of brain activity.  These are all taken to be “neural correlates”: brain measures that occur at the same time as conscious events and thus underlie or explain those events.  There are quite a few theories[3] that try to explain how the mechanisms of these neural events account for features such as the unity of conscious experience or the sense of now-ness and the passage of time.  However, no clear winner has emerged.

So for the Right Now Problem of Identity we have two lines of investigation.  We can look at the neuroscience about the present, unitary conscious moment.  We can also look at psychological studies of agency and ownership.  You might say that these emphasize the “right now” part of “who I am right now.”  We shall come back to how to approach the “who I am” later.

The Plural Problem.

‘Dear, dear! How queer everything is to-day! And yesterday things went on just as usual. I wonder if I’ve been changed in the night? Let me think: was I the same when I got up this morning? I almost think I can remember feeling a little different. But if I’m not the same, the next question is, Who in the world am I? Ah, THAT’S the great puzzle!’  – Alice, speaking from Wonderland

The Oxford don Charles Dodgson must have read the previous century’s philosophical arguments about identity depending on memory.  His character Alice was often confused about who she was, and the word “remember” appears 21 times in Alice’s Adventures in Wonderland[4].  Dodgson himself seems to have felt a deep divide between being a popular writer and being an unhappy mathematician.  He also juggled being a reluctant Anglican deacon, a noted photographer, a casual inventor, a love poet without any obvious lover, and — in his words — “a vile and worthless sinner”, whatever that might have meant.

We all have our divisions.  Some of us are different people when we’re hungry and when we’ve eaten.  Many of us have conflicts between social roles, like Lewis Carroll/Charles Dodgson.  Our conflicts can also be between current and future selves.  A few people lead such extraordinary or difficult lives that Arya Stark of the Game of Thrones saga might seem, to them, like a mere caricature.

The Plural Problem (last row in Table 1) is about accounting for all these different selves.  It tends to be the province of psychologists, psychiatrists, and people who share their orientation.  Consider that a recent encyclopedic history[5] of the idea of self and personal identity, written by two philosophers, has only a few lines about multiple personalities, and a few pages out of 380 about plurality of selves.

Plurality gets studied as a side effect of psychological curiosity about why we do what we do.  Over and over, new theories come up that try to account for the maddening inconsistency of our behavior.  America may be the home of rugged individualism, but it was two American philosophers, William James and George Herbert Mead, who brought the world’s attention to the now-obvious fact that we get nearly all of our identity from interaction with other people.  It was the philosophical interest in this kind of stuff that spun off the fields of psychology and sociology.  Indeed, James was also an early psychologist and Mead an early sociologist.

Since their beginnings in the early twentieth century, the two fields have struggled to find the right scientific methods.  How do you find a parsimonious explanation of the bewildering variety of our behavior and, even more challenging — the contradictions inherent therein?  The models that have been willing to tackle that problem have usually had some flavor of identity plurality in them.

“Life is a cabaret, old chum” — Cabaret, 1966

The song, Cabaret, has had many interpretations, but the broadest is that life is bittersweet and is like a stage production.  The musical came 10 years after the seminal dramaturgical theory[6] of the self from Erving Goffman.  He explained our playing of social roles with (you guessed it) an extended theatrical metaphor, in which we behave front stage, back stage, and off stage to make impressions on our social audience that harmonize with the roles they expect of us.  Even 60 years later this view of self presentation permeates culture, with Facebook, selfies, and personal “branding” expanding our opportunities to be “on stage.”

Goffman emphasized the self’s attempt to influence the other.  But much earlier a peer of James and Mead, Charles Horton Cooley, theorized[7] that we are in fact molded by how we imagine that others see us.  He called this the “looking glass self” because, although others are involved, they don’t so much tell us directly what we ought to be.  Instead we tell ourselves, based on our imaginative interpretation of what the others expect (remember his “I am what I think that you think I am”).  This is a subtle and nuanced idea for its time, early in the history of social science.

The latter half of the twentieth century was spent working out the details of these social connections that make each of us such a multifaceted person.  A lot of this development clung to the idea of a core or true self and how our multiple roles derived from, or interfered with, that central part.  We were integrated or fragmented, authentic or counterfeit, rational or automatic.  Some theorists emphasized developmental stages with major changes in our lifespan.  This was plurality where a single self held sway during a stage.  True plurality (multiple selves coexisting or alternating over a shorter time frame) would then occur in transitions between stages, when the new you and the old you were fighting it out.

Eventually, leading up to now, researchers started attributing divisions in self more to divisions in the underlying mind, as we have seen.  However, interest in plurality also cropped up from two other directions.

The first of these was microeconomics.  It started out with models of ideally rational behavior of individuals, but these were found to be simplistic.  Better models took into account the fact that we have conflicts behind our decisions.  Perhaps the first notice of, or impetus for, this work was an influential essay from Nobel economist Thomas Schelling.  He contributed this scientific insight to the erudite readers of the American Economic Review back in 1978[8]: “… everybody behaves like two people, one who wants clean lungs and long life and another who adores tobacco, or one who wants a lean body and another who wants dessert.”  It came to be understood that these types of conflicts are between our immediate self (Kahneman’s experiencing self, the I*) and our longer-term self (Kahneman’s remembering, narrating, planning self, the Me*).  Theorists took special notice of one self’s tricks to force “self control” on the other self.  This would include locking up the booze, or the cigs, or the potato chips, or the gun.  In our house the chocolate gets stashed in the fridge crisper along with the lettuce and parsley.  There’s also the dodge of mailing yourself the smartphone to keep it away for a while.  Or let somebody else do the policing: send the kid to boarding or military school; tax tobacco.

The other plurality-emphasizing movement came from depth psychology: Carl Jung and the Jungians in particular.  These theorists saw the conflicts within us not as due to social expectations, economic dilemmas, split brains, or (we’ll see this later) unconscious priming.  No, they attributed them to a mysterious shared unconscious full of mythical archetypes.  Different archetypes can compete to take over consciousness.  Jungian theory gradually came to be seen as non-scientific, though it remains a valuable tool in a certain type of psychoanalysis.

In the 90’s a new theory came along, focused completely on plurality, that seemed to have a bit of all preceding theories in it.  Dialogical Self Theory (DST) proposes that each human mind consists of multiple “self-positions” that act as semi-independent agents.  Some self-positions are internalized versions of specific outside parties such as significant others.  Other self-positions may be thought of as originating internally and include traditional social roles (son of my father, church goer, artist, soldier) and even points of view or interests (rock’n‘roll fan, foodie).  These positions are “dialogical” in the sense that they each have a voice inside your head, and will engage in dialog with some of the other self-positions.

DST has been used mainly for psychotherapy.  It uses questioning techniques designed to uncover self-positions, and then works to externalize their dialogs so that issues and conflicts can be processed in a therapeutic setting.  When DST came along there had been two decades of strong interest in something called multiple personality disorder.  A backlash was underway, with therapists being sued for apparently promoting the creation of multiple personalities.  DST seems to be a way to accept a therapist’s ability to promote either the uncovering or creation of multiple subselves as a positive goal.  But you can also see echoes of other theories.  The self-positions talk to each other like characters in a play (Goffman’s dramaturgical theory, the narrative self); external characters are internalized (William James, George H. Mead, Cooley’s Looking Glass Self, and many others); there is internal conflict (the microeconomics theorists, Jungians and many others); and there are multiple, sometimes archetypal, internal voices (Jungians).

There is very little research on DST, though its adherents would like that to happen. It appears to have some grassroots appeal to lay and self-taught therapists. We shall see later that it has been used in a cultural anthropology context to explain people who believe that they engage in “shape-shifting.”

The founders of DST acknowledge the influence of Mikhail Bakhtin, Russian philosopher, semioticist and literary critic.  They cite Bakhtin’s analysis of literary “polyphony” in which an author actually has, within a particular work, multiple voices as an author, but embedded in his character creations.  This hearkens back to Shakespeare’s use of sixth order Theory of Mind intentionality in his plays.  Presumably we would have to go up to seventh order for some works (“Dostoevsky envisions that the reader must understand that Raskolnikov believes <in the truth of some narrative chain of points of view> …”).

There is a lot more to say about plurality. There are many kinds of plurality, some are obscure but not always that rare. Some of these aspects of self are claimed to come from that invisible reservoir, the unconscious, but this can happen in different ways.  Later chapters will dig deeper.  But next we need to introduce what might be considered a neglected problem of personal identity, related to plurality and quite likely caused by the late death of the Self.

[1] The myth of cognitive agency: subpersonal thinking as a cyclically recurring loss of mental autonomy.  Thomas Metzinger, Frontiers in Psychology, 2013.

[2] A Letter from a Gentleman to His Friend in Edinburgh, David Hume, 1745.

[3] Models of Consciousness, Anil Seth, Scholarpedia, 2(1):1328, 2007.

[4] Alice’s Adventures in Wonderland, Lewis Carroll, 1865.

[5] The Rise and Fall of Soul and Self: an Intellectual History of Personal Identity. Raymond Martin and John Barresi, 2006.

[6] The Presentation of the Self in Everyday Life, Erving Goffman, 1956.

[7] Human Nature and the Social Order, Charles Horton Cooley, 1902.

[8] Egonomics, or the Art of Self-Management.  Thomas C. Schelling, American Economic Review, 1978.

UNITY: Coherence of the manifold self (5)

Third Person Narrative

Experts now don’t deny the importance of memory in maintaining our personal identity over time, but they do not find memory to be any more sufficient for the purpose than is the mere persistence of the body, the human organism itself. Philosophers and neuroscientists alike find it necessary to fill the gap by joining the social scientists, who have long asserted, while pounding their lecterns, that identity is a social construct. We can skip their theories about the social origin of identity. It’s enough just to look at the external facts.

Take the above-mentioned interruptions in memory.  If we get any help at all in filling the gaps, it will come from other people (“You should have seen what you did just before you passed out.”  “I remember when you were just two and a half and you looked up at me and said …”).

Culture surrounds us with reminders, talismans, and even enforcement of our identity. This starts very early. We all know the bitter fight about when in the period prior to birth that a nascent human becomes a person. The usual pattern is that a family prepares the way for a baby, both in setting up material possessions for its care, and announcing to their social circle that the new person is coming, and possibly its sex and name. Then birth certificates nail down who we are, who our parents are, and where we entered the world. We become a recognized person with legal rights. This document is drawn on throughout life to validate our identity in new contexts. It is also common for hospitals to store a part of a newborn, in the form of cells from a cheek swab or blood from a heel prick. DNA findings at this time can reveal health conditions, knowledge of which might need to be retained for life. Parents might decide to store umbilical cord blood for stem cells that can be used to repair the body of this new person indefinitely into the future.

The end of life is interesting because it is not the end of identity, although for most people, at least up until now, identity gradually gets unwound, making a smaller and smaller cultural footprint. Other people may memorialize us shortly after death, and collect memories and artifacts that demonstrate the continuity of our identity over time. A few more accomplished or notorious people have their lives and deeds more or less immortalized. An open question today is whether digital culture might grant a longer post-mortality to the non-famous, particularly people who are active on social media. Certainly those media are starting to move in that direction as more of their patrons die.

As we play our different roles in life — parent, student, customer, worker, boss, citizen — each of the corresponding constituencies wants to mold us, often pulling us in conflicting directions.  But none of them wants our continuity as a person to change, and indeed they reinforce it over and over.  We are always showing up, wearing the badge, signing our work.  If we go away on vacation, those identities will be waiting for us, eagerly wagging their tails in greeting, or cracking the whip to catch up and meet deadlines.  When he was inventing the modern theory of identity, John Locke said that it was “a forensic term, appropriating actions and their merit” as well as “all the right and justice of reward and punishment”.  In other words, our identity grounds our accountability to society.  Social science now would add that it also goes the other way: our accountability to others is a big contributor to our identity.

Society (at its whim, of course) punishes falseness of identity. Mistaken and stolen identities were a big thing in Elizabethan times, amid a historical rise in individualism. Think of all the mistaken identities in Shakespeare’s plays. Only 10 years before Elizabeth I became queen, the culture of the time was rocked with an archetypal case of identity theft.

Martin Guerre disappeared from his Pyrenees home in 1548.  Years later another man showed up and took Martin’s place, living with his wife and family as if he were Martin.  Eventually suspicions mounted and his identity was challenged in the legal system.  Suddenly the real Martin Guerre showed up.  The impostor, who was actually from a neighboring village (it was a small world back then), was hanged.  So much of this story intrigues us that it has been rehashed many times, as fact and in fiction.  We wonder — how could his wife and kids not know?  This is a puzzle because we know, down to our bones, how embedded our continuing identity is in the minds of those close to us.

Society still enforces its interest in our identity with occasional harshness. Some impersonations can be felonies, others, if they fail, just prevent you from buying booze. Every time you are arrested, the cops do their best to nail down who you really are. These days it’s woe to you if your faked passport is detected.

Turning back to the inward Self, what circumstances of social isolation would cause drift that is significant enough to erase identity? We may not know enough to predict this, but we are fascinated by stories of hermits in the woods, castaways on desert islands, prisoners chained in dungeons, and the like. The common belief is that people have to exert extreme mental discipline to come out the same person at the other end. This at least reflects our conviction, often implicit, that our identity is maintained by contact with others.

Unlike the Self, which is internal by definition, identity is two-faced. There’s your social identity, visible to the outside world and tagged by various markers, like debit cards and diplomas; artifacts, like your clothes and your money; and narratives, spoken and written. Other people see you as friend, mate, rival, voter, and you internalize this, owning it or resisting it as you struggle to build and harmonize your internal identity. The Self reflects identity back to the outside as you attempt to reinforce the identity that you want others to believe. This was a big emphasis in twentieth century social science: people had “identity crises”. Then the postmodernists said that it was all out of control, that the pressures were too great, the influences too pervasive, so that identity was “fractured.”

Technology has given us new channels through which we can project our image. For all too many, the channel flows inward as “celebrities” fight to capture their attention. Celebrity worship now has a new name, parasocial relationships, used by the marketers to normalize the practice and its cynical manipulation. The rest of us are encouraged to seize the same social media channels and promote ourselves. It’s an antidote to twentieth century dependence on one’s employer/job for identity. But these days not only is there no bad publicity, there is also no bad attention, so drivel and shock multiply like maggots in meat. The best teachers to counter this trend will be those who show how to use the medium to present yourself with authenticity, which allows genuine reinvigoration and reinforcement of identity.

UNITY: Coherence of the manifold self (4)

First Person Singular

“What do you see when you turn out the light? I can’t tell you but I know it’s mine.”
— The Beatles

You strive and sweat to maintain your physical body. It in turn protects and feeds a mind that is like a whole ecology inside your head, with myriads of actors like organisms, major and minor. Is there anyone in charge? If not, what accounts for our feeling that we are distinct, durable entities?

Many ancient mystical traditions say that our essence is a non-material soul. Plato made this a concept for all future philosophy when he said that our immaterial soul was indivisible, a whole without any parts. Anything that has no parts cannot decay, so the soul must be eternal, without beginning or end. Hundreds of years later this idea was still so popular that it was adopted by third century Christian authorities as dogma. The Church, as well as many other mystics and thinkers, still believes in a soul.

Not, however, the Buddhists. They say that the existence of a first person, perceiving self is an illusion, and there is also no unchanging, permanent thing, material or not, that could be called a soul in humans or other living beings. Indeed they believe that our constant embrace of this illusion is the source of all suffering. The sciences of the mind have at least come to largely agree with the illusory aspect.

There is no discrete self or ego living like a Minotaur in the labyrinth of the brain. And the feeling that there is — the sense of being perched somewhere behind your eyes, looking out at a world that is separate from yourself — can be altered or entirely extinguished.
[Waking Up: A Guide to Spirituality Without Religion. Sam Harris, Simon and Schuster, 2014.]

Some scientists also embrace the Buddhist belief. Nearly all scientific work on meditation, for example, uses the ancient Buddhist technique of vipassana, usually translated as mindfulness. The latter term is also widely used and misused (watered down) in pop culture. Sam Harris, the atheist cultural gadfly, is a lifelong vipassana meditator and advocate of the practice. Theoretical biologist Francisco Varela (mentioned in part three of this chapter) co-founded The Mind and Life Institute with the 14th Dalai Lama to foster dialog and research between scientists and contemplative practitioners.

Speaking of illusions, the soul concept does morph to reappear in some current accounts that say the brain is not the substrate of the mind, that there must be something else, some other mysterious thing involved. Overwhelmingly, scientists don’t buy that. There is too much evidence that specific mental functions correlate with measured brain activity, and changes in mental functions with brain lesions. Ditto for mental changes due to direct artificial stimulation of the brain, either with electrical current or psychoactive chemicals. This sort of evidence might have been on the Dalai Lama’s mind when he acknowledged that even the highest form of meditative awareness was dependent on (i.e., had as a material cause) the activity of the brain. For a religious leader this is a shocking break with deep tradition.

What’s new about the new sciences of the mind is that it is not as common as it once was to be a “reductionist”: to claim that the mind is “nothing but” brain activity.  More and more the lab people and the theory people are writing about the contents of consciousness and how they are being studied as mental phenomena. These two types of boffins now often work together. There seems to be a broad understanding that the correlation between mental activity and brain activity is a special realm, where each side can inform the other’s work.

Even forty years ago the mentalists and the physicalists were not speaking. Of course many still aren’t. The change might have happened, in part, due to a sort of hedge philosopher named Ken Wilber (for a readable intro, see A Brief History of Everything [A Brief History of Everything, Ken Wilber, Shambhala Publications, Inc., 2000.]), who started writing in the mid-seventies. One of Wilber’s main themes as an integrative philosopher/psychologist is that mental experience is just as real as the tangible concrete things that we perceive outside of ourselves, objective and measurable, the traditional food of science. Unlike some spiritual guides, who eschew the tangible as uninteresting or as a useless illusion, Wilber believes that we should study the mental and the physical as equally valid sources of knowledge for personal and social development. Citation of Wilber in popular books on the mind is hard to find. He is barely mentioned (one line) in the Internet Encyclopedia of Philosophy. He apparently is ignored by mainstream scholars, yet his 25 books have sold well enough that translations have been made into 30 languages. It’s hard for me to believe that he had no influence, direct or indirect, on the respectability of mental phenomena in current science.

Near death experiences and round trips to heaven remain a bone of contention for those inclined to believe that “there must be something else” to the mind, but their examples do not require that conclusion. Suppose someone has a flat EEG (“brain wave”) and their heart has stopped. Are they dead yet? Initially no, although they might be soon. But until they are dead some metabolic energy is still available inside cells, which will continue to try to function. That includes brain cells, which one reasonably might assume will have enough energy for a stunted level of functioning. Such a low level of activity might not add up to a measurable electrical signal (the EEG) outside the head. As for anyone who has been in a vegetative state and lived to say they visited another realm, well — they actually lived, didn’t they? Therefore their brain was keeping the automatic body processes going. It was there, it was busy, it could have hallucinated as well. There is also hard evidence. Steven Pinker reported back in 2007 that “… a team of Swiss neuroscientists reported that they could turn out-of-body experiences on and off by stimulating the part of the brain in which vision and bodily sensations converge.”

Pinker’s article in a popular magazine is a concise and accessible review for anyone wanting to catch up on the sciences of mind and the attempts to understand consciousness. The territory of these studies is tripartite.

All Gaul is divided into three parts.
— Julius Caesar, The Gallic Wars

First of all, the mind is not all conscious. There’s a huge amount of computational work that is unconscious, therefore often called automatic, but that can be shown to exist logically or in experiments. The unconscious mind is constantly and quickly piecing together data about what we see, hear and touch, coordinating the contraction of our muscles and assessing our position in space. There’s even well-developed research showing that many decisions we make actually occur neurologically before we are aware of consciously deciding them. In other words, conscious decisions are rationalizations after the fact, not deliberate causes of action.

Stuff like this is what makes the idea of free will seem like a farce. Inside each of us there is a big unconscious machine, chugging away based on inputs from our memories and the world around us, coming up with what we must inevitably do or think. What would widespread understanding of this mean to a world that needs, desperately, equally widespread moral guidance of behavior?  Pinker quotes Tom Wolfe on the consequences of science killing the soul: “the lurid carnival that will ensue may make the phrase ‘the total eclipse of all values’ seem tame.”  But Pinker thinks that the flip side is that widespread understanding of consciousness will increase empathy, reducing our ability to demonize, dehumanize or ignore other people. My guess is that it will depend on how new knowledge of consciousness is taught, and whether the knowledge itself gets vilified like other inconvenient truths have been of late.

The conscious part of the mind, these days usually called “the Self” (capitalization is part of the term), has been understood in a number of different ways. Typically it’s considered to have two aspects (our other two “parts” of the mind), but experts do not all agree on what they are. However, the modern origin of all such divisions of the conscious mind seems to come from an analysis by the nineteenth century American philosopher William James. He was an early practitioner of the direct introspective analysis of mental contents (“stream of consciousness” is his term), a practice that the behaviorists eclipsed in America for much of the twentieth century. James said that on the one hand there’s the subject, thinking part, the knowing “I.” On the other hand there’s the “Me”, which is an object, the known part: what the “I” knows about itself.

For Bruce Hood [The Self Illusion: How the Social Brain Creates Identity, Oxford University Press, 2012] and others, the “Me” is a narrative, the running story of a person’s life by which the “I” maintains the continuity of its identity. Sam Harris describes the difference this way. When you wake up from sleep, you initially are just aware of sensations: groggy, bad taste in your mouth, need to pee, etc. That’s the “I”. But the “I” quickly starts thinking about the “Me” which just had a stupid dream, has to get ready for work soon, remembers a social conflict to resolve, and wants to eat more healthy food today, but not for breakfast.

You could try to say that the “I” is just a perceiving machine, what Thomas Metzinger and others think of as the moment to moment apprehender of conscious reality, whose point of view is only a second or two wide. But during those moments it is also thinking about the “Me”, spinning those tales of the Self, dredging up memories, and making decisions. Clearly the subject/object distinction is at least a little muddy.  In defense of the difference, note that the “Me” addresses the question of the continuity of personal identity, while the “I” seems to reflect the uniqueness of identity. This is because nobody — but nobody — else has access to your “I.”  It’s unique and it’s yours alone.

Consciousness may be one of the most challenging topics of all time. One positive aspect is that the thing itself is the very definition of easily accessible. It’s with you virtually all the time. You are not always conscious, but note that even dreams are conscious. The serious obstacles to study are: your consciousness is not accessible to me, and none of us can step outside of our consciousness to examine it as a process. We can only examine what the philosophers call the contents of consciousness, the stuff that happens in that private world.

In 1994 a young denim-clad, decidedly non-stuffy Australian philosopher, David Chalmers, blew the minds of a gathering of consciousness luminaries in Tucson. He said that research and analysis would let us eventually understand much of the mind and the brain, but there was one and only one hard problem: how each of us has that singular, first person perception of our own world. [Why can’t the world’s greatest minds solve the mystery of consciousness? Oliver Burkeman, The Guardian, 2015.] This issue had been posed twenty years earlier by another philosopher, Thomas Nagel, who noted that his peers often ignored the question of why it is like (i.e., it feels like) something to be conscious — why subjectivity exists [What is it like to be a bat? Thomas Nagel, Philosophical Review, 1974]. But in ’94 the time was right. Soon every expert interested in consciousness was slapping themselves on the forehead, realizing that this was indeed (capitals not optional) the Hard Problem of Consciousness. Hardly a learned or popular discussion of the mind since then has failed to make note of the Hard Problem.

The most lucid and engaging account I have found of the Hard Problem and the related concept of the Self is Thomas Metzinger’s The Ego Tunnel. [The Ego Tunnel: The Science of the Mind and the Myth of the Self. Thomas Metzinger, Basic Books, 2009.] He starts by noting that reality is incomparably richer and more complex than we perceive. This is not necessarily a spiritual or drug-induced idea, but is just what we have been able to infer from the study of physics. In our environment there are no colors or objects as such, only differing activities and densities of innumerable particles, sparking in and out of existence at unbelievable speeds. That we see, hear and touch things is because our senses, indeed any animal’s senses, make a very simplified model of our immediate environment. This method of knowing the world is not just an option, a choice from among other methods made by evolution. No, Metzinger points out that for philosophers knowledge itself is the fact of representation. What evolution did choose for our model of reality is still extraordinary in its apparent detail and in how we are able to use it.

Metzinger calls it the phenomenal self-model, or PSM. The content of the human PSM also includes a model of our Self as a representational system (i.e., as a “knowing” entity). The PSM contains not only our moment to moment perceptions but access to memories and a sense of location within our body. There is a focusing mechanism to reflect on particular memories or perceptions, so that we can run the model backwards, thinking about the past, and forwards, thinking about the future. The scope of the PSM thus encompasses both James’s “I” and “Me”. The trick of consciousness is that it is “transparent”, which is philosopher-speak for the fact that we see right through it, without any access to the fact that it is a model. This transparency creates our first person point of view, which Metzinger and others call the “Ego”. When we open our eyes, the world is just there, immediately, without any mental access to the underlying truth that it is a continuous, real-time construction.

This Ego is the central mystery of the sciences of mind: how do we explain that patterns of neuronal firing create experience (the Hard Problem from a science point of view)?  We can find neural correlates of experience, but all agree that there is an “explanatory gap” about how the experiencing Ego can be produced from physical events in the nervous system. Quite a few experts subscribe to what is, perhaps jocularly, called “mysterianism”, the possibility that we will never know. That’s heady stuff from scientists. No wonder others are willing to step in and bring back the soul as an explanatory prop.

With this background let’s return to our concern about what creates the durability of personal identity. The beginning of modern thought on the subject is often traced to John Locke, the great British empiricist whose wide influence included the views on liberty held by the founders of the United States. Locke had the idea, revolutionary for the time (1690), that what sustains a person’s identity is psychological continuity, which he said was based on personal memories.

Psychological characteristics as the source of our continuity seem to folks today to be the obvious common sense answer. Studies usually present subjects with some kind of story about brains being transplanted to another body, or some variation on the Star Trek transporter. Subjects are then asked whether, after the transfer of psychological characteristics to some new body, the person moved to the new body, and what this implies. People generally think that the person’s location will follow their memories to a new body. However, they may waffle if the story includes something about bad consequences to the body that was left behind. Common sense is not consistent, especially when presented with stories about mind or brain transfer that really are currently impossible. However, it’s easy to prove to someone that their body and brain are not sufficient for survival of their Self. Just ask them if their Self would survive permanent coma or senile dementia.

As a philosopher Locke was looking for certainty beyond common sense. A man of his times, he still believed in the soul, but as the earliest Empiricist he wanted to ground his philosophy in facts of human experience. He claimed that, whatever the soul was, its experience of continuity was based somehow on remembering previous aspects of the person’s life.

Locke, who was English, did not live to receive the big smackdown of this idea, which came a few decades later at the hands of a truly dour-looking Scotsman, one Thomas Reid. The essence of his contradiction of Locke came in a simple parable. He imagined an elderly general who remembers a courageous charge when he was a young cavalry officer (this was getting personal, for Locke’s father was a captain of cavalry!). The young officer remembers himself as a lad, getting beaten for stealing apples. The general, however, has no recollection of the apple incident. By Locke’s criterion the general is the same person as the young officer, and the officer the same person as the lad, yet the general, sharing no memories with the lad, is not the same person as the lad. Faced with this contradiction, Reid concluded that memory is not sufficient to sustain the enduring Self.

Reid’s example is hardly the only problem, since we know of numerous ways that memory just cannot be the whole story. Most of us can hardly remember anything at all before we were three years old. At the other end of life, memories fade or else they are made into hash by dementia. Our psychological continuity is interrupted by dreamless sleep, by drunkenness, by being knocked out, by anesthesia, by coma, by disease, by brain damage, by dementia, by psychological trauma, and by sudden drops into, and back out of, amnesia for no apparent reason.

Furthermore, research shows that memory itself is all too often a tissue of lies, a second-order construction only loosely based on our first-order model, the PSM. A common view is that memory is a narrative whose details are invented to summarize and make sense of things, not to record them in veridical detail. I first ran into this at a seminar by Julian Jaynes, a psychologist who is famed for promoting a wild theory [The Origin of Consciousness in the Breakdown of the Bicameral Mind, 1976] that humans have only been conscious for about the last three thousand years. Jaynes said to us, “I want you to imagine as best you can, what it was like when you last went swimming.” He then asked us whether we remembered (A) our conscious point of view with things like eyes at water level, being wet, the feeling of moving and breathing in the water, or (B) an image from a point of view looking down at ourselves, a body in the water in the setting where we swam. For most of us it was, you guessed it, (B), an experience that we never had, but instead was a fictitious mental photograph, a third person perspective “edited for clarity.”