Autism Resources & Community (ARC)

Identity First: On Knowing Who You Are

Written by Signe M. Kastberg | 9/4/21

 

What is the difference between “people-first” and “identity-first” language with regard to Autism Spectrum Disorder (ASD)? How do culture, identity, and statistics play a role in this question? How do you define yourself; how do you prefer to be recognized; and how do you know what form of respectful address to use with others?

For the past decade or so, professionals and scholars have preferred a “people-first” approach, one in which the individual is at the center and any characteristics of that individual are secondary. The people-first approach would be stated this way: “person with autism.” Many autistic persons, however, reject the implication that autism is a negative descriptor, seeing autism instead as a central and positive part of who they are. An identity-first approach would be stated this way: “autistic person.” Who’s right? Spoiler alert: there’s a happy ending to this story, but along the way, we’ll explore the origins of this complex question, including history, culture, psychology, identity, values, and even statistics.

 

Identity and Culture

Many years ago, I had the wonderful opportunity to hear Rayna Green speak. Green was the curator of the American Indian Program at the Smithsonian Institution. She identified herself as Native American, Jewish, and German. She spoke of how each of those disparate groups would prefer that she identify herself only with that group; for example, only as Jewish. However, she said, “I am all of those things.” She refused the fragmentation of being identified by others; she embraced every part of her identity. She told us that “you have to name your name and claim your home” in order to be fully who you are and to move forward in life. In other words, know who you are and where you come from, and own it with pride.

Roy Grinker, author of Nobody’s Normal, says that stigma is a cultural invention, and we can say with some certainty that ASD has been stigmatized along with other mental disorders. In particular, Grinker suggests that the real culprit in stigmatizing mental disorders is capitalism. In capitalistic cultures, the true value of a person lies in their ability to work, “producing for oneself and the economy.”1 In individualistic societies, living independently is prized, and the inability to do these things is seen as a character defect. Historically, this has led persons with mental disorders to feel ashamed and flawed, and to attempt to hide their disorder or simply remain out of view to avoid public humiliation. It is important to note that persons with physical disabilities have fought this fight for just as long; Cara Liebowitz writes about wanting to be seen as a whole person rather than a person attached to a walker.2 She emphasizes that while she has mobility limitations, the truly disabling condition is created by society’s failure to provide accessibility for all (for example, the lack of ramps on sidewalks or into buildings). For Liebowitz and others, using identity-first language honors all of her.

It is interesting to note that in some cultures, especially those with collectivistic values, individuals tend to identify themselves first and foremost with their tribe or village. That identifies them as an integral part of a particular group, and the group’s importance supersedes the value of the individual. In the 1960s, Alan Watts famously stated that “the prevalent sensation of oneself as a separate ego enclosed in a bag of skin is a hallucination.”3 In other words, we are all interconnected and individuality is an illusion, a belief embraced by many cultures and world religions. In the current era in the United States, we identify our tribes by political party, state, profession, religious affiliation, and many other characteristics, but in this cultural reality, individual identification supersedes group status.

 

Psychology and Statistics

What the heck is “normal” anyway? “Normal” is really a statistical convention; that is, normal really means “average” and, more specifically, a numerical average. The common model for normal is called “the bell curve” or, more formally, “the normal curve.” It is shaped like a bell, as seen below:

The horizontal dimension of the graph shows “how much” of whatever characteristic is being measured; the vertical dimension shows “how many” people fall at each point of measurement. At the very center is the “average” of what is being measured. One example occurs naturally: height. Most of us are in the mid-range, with some very tall people on one end and some quite short people on the other (there are separate averages for males and females). Intelligence quotient (IQ) tests are a prime example of how this is used. The midpoint is an IQ of 100; this is considered “normal” or average. The big hump in the middle represents the majority of people who are average in intelligence; according to this model, about 68% of the population has an IQ between 85 and 115. At the far-right end of the curve are people with an exceptionally high IQ, while at the far-left end are those with a very low IQ; these far ends each represent roughly 1% of the population. But let’s be clear: the idea of an IQ of 100 being “average” or normal is a fictional invention, albeit one created with the good intention of measuring psychological differences.
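For readers who want to check the arithmetic, here is a minimal sketch in Python (not part of the original article) that computes these percentages, assuming the standard IQ scaling of a mean of 100 and a standard deviation of 15; the cutoffs of 135 and 65 used for the “far ends” are illustrative assumptions.

```python
# A minimal sketch of the bell-curve arithmetic, assuming the standard
# IQ scaling: mean 100, standard deviation 15.
from scipy.stats import norm

iq = norm(loc=100, scale=15)  # the "normal curve" for IQ scores

# Share of the population within one standard deviation of average (IQ 85-115).
middle = iq.cdf(115) - iq.cdf(85)

# Share at each far end of the curve (135 and 65 are illustrative cutoffs).
far_right = 1 - iq.cdf(135)   # exceptionally high IQ
far_left = iq.cdf(65)         # very low IQ

print(f"IQ 85-115:    {middle:.1%}")     # about 68%
print(f"IQ above 135: {far_right:.1%}")  # roughly 1%
print(f"IQ below 65:  {far_left:.1%}")   # roughly 1%
```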

I used IQ as an example for a particular reason. The folks with genius intelligence are not normal; by definition, they are abnormal. The folks with very low IQ are also abnormal because it’s statistically unlikely to be so far away from “average.” But as a society, we seem to shame and blame those with sub-normal intelligence, as if they were born with the wish, “I’d like to sign up for the lower IQ, please!” Those with high IQs did not earn it; neither did those with low IQs. This is where Grinker’s claim rings true: as a society, we stigmatize people for perceived deficiencies as if they are to blame for those deficiencies, and we elevate people with inborn assets as if they earned them. It is also true that IQ only measures a very narrow band of intelligence; Daniel Goleman and others have suggested that emotional intelligence is just as valuable as the kind of intelligence measured by standard IQ tests.

What is the medical/behavioral health community’s role in creating this focus on “normal” and its opposite, “abnormal”? To help people who are struggling, medical and mental health professionals created measurement standards and criteria for diagnoses. “Looks abnormal to me” is not a scientific method. Over not only decades but centuries, providers have attempted to better understand and treat mental disorders. With industrialization beginning in the mid-1800s came an interest in statistics to measure productivity, among other things. The word “normal” historically was a mathematical concept; however, in the early 1800s, it became (along with “pathological”) a way of describing points along a spectrum of differences in human behavior.4 Increasing numbers of people were committed to asylums in that era, often by family members. When one family member’s behavior was so inconvenient that it required another member’s supervision, it was seen as a step towards “social instability,” because the economy required everyone to produce.5 Institutionalization of persons with yet-to-be-named disorders such as mania, depression, and psychosis came to rely on a definition: what is included in this category? What should be excluded? The current Diagnostic and Statistical Manual of Mental Disorders (DSM), now in its fifth edition, had its start in the US military, where the World Wars generated many cases of what we now call Post-Traumatic Stress Disorder (PTSD) and providers needed to identify causes, standardize diagnoses, and provide treatments.6

While there is still disagreement about what the terms “normal” and “abnormal” really mean, diagnostic criteria have become increasingly specific. Standardization of diagnoses created a shared language that helped providers communicate with each other as they sought to alleviate suffering in patients. For example, Bipolar Disorder should mean the same thing in Nebraska as in London. Particularly given our increasingly mobile society, in which geographic moves and health insurance shifts cause people to change providers regularly, a shared lexicon of diagnoses and their meanings is essential. ASD is just one of many examples of the effort to standardize and define disorders.

 

In the process of standardization and definition, mistakes were made and corrected, often in response to research but also in response to the culture’s growing understanding of and tolerance for differences. For example, “hysteria” (translation: wandering uterus) was a diagnosis applied in the 1800s to women whose “symptom” might be nothing more than wanting a career. Homosexuality was once considered a mental disorder but has not been for several decades. Gender Identity Disorder described an individual who believed they were the opposite gender of the sexual anatomy with which they were born, and people were institutionalized for this!7 But with research has come a greater understanding of non-binary and transgender individuals, and this identity is no longer considered a disorder.

What does this have to do with statistics? Remember our friend, the normal curve. While many historical diagnostic practices were based on conceptions of normal and abnormal, in the current era, with greater tolerance of differences, the mental health community arrived at the idea of a spectrum of disorder: a continuum rather than a simple normal/abnormal dichotomy. In addition, there is a guideline for determining all diagnoses. That standard states that, whatever the unusual thought, feeling, or behavior of an individual, it must cause problems in one or more of their major spheres of life functioning in order to be considered a disorder. Do your symptoms cause problems in your love life? In your job or school? In your relationships with family and friends? Do they cause issues with your physical health and well-being? You might be “abnormal” in some way, but if it causes no problems in your life (not just according to you, but to significant others in your life), then you’re fine. If, however, you acknowledge problems in functioning, or your family, employer, teachers, spouse, or doctor identify problems that you deny, this is cause for a closer look and a potential diagnosis according to scientifically established criteria.

 

Self and Values

We’ve talked about identity as it is situated within a cultural framework: country, tribe, language. How is identity situated within a psychological framework? For example, are you a daughter, wife, mother, son, or grandfather? How important are these roles to you? Are you an employee, a business owner, or a student? Are you Caucasian, Asian, African, or Hispanic? Are you highly educated, economically privileged, homeless, retired, or diabetic? In different social settings, we might describe ourselves in different ways, but how we perceive ourselves over time is based on our core values. 

What is most important about you? You probably have a hierarchy of values. Typically, our values become a central predictor of how we define ourselves. Some aspects of your value structure are more vital to you than others. What do you put front and center as you describe and define yourself? If you are autistic, is that central to who you are? Are there other attributes that are important to you? There is no right or wrong answer here; it is what is true for you.

As a teacher of a graduate course in multicultural counseling, I can recall impressing upon my students the importance of addressing people with the descriptors they prefer. And that is, in fact, quite challenging, because one size does not fit all! I remember speaking with a number of persons about race and how they preferred to be referenced. One person of color preferred “Black,” another preferred “African-American,” one specified “Jamaican-American,” one said “African,” and another said “Negro.” Many of my students reacted quite negatively to the idea of calling someone “Negro” in this century, although that particular older man was very clear on his preference. How, my students wondered, could they know which descriptor to use without being offensive? Are indigenous peoples Native Americans? Indians? Am I a woman? A female? A girl? Is a person homosexual, gay, or queer? And that’s how we come full circle. You define yourself. Do you clearly communicate to others how you see yourself, and how you would like others to refer to your identity?

 

Acts of Resistance

Throughout history, people who have felt oppressed have pushed back against others’ definitions of them. Beginning with the suffragettes and continuing through the 1960s era of feminism, in which “the personal is political,” many women were determined to claim their right to define their values, to dictate their experiences, and to have a holistic view of themselves in both private and public spheres. The feminist movement refused to accept others’ attributions of what a woman is or should be. African-American men carried signs during the civil rights era stating “I Am a Man,” in response to hundreds of years of being called “boy.” Many gay men took back their power by reappropriating the formerly contemptuous term “queer.” Claiming a former term of insult is one way of gaining empowerment and fighting against stigma.

As a practicing therapist and teacher of therapists, I can say with confidence that most of us fight against stigma every day. Part of the fight is to put the individual at the center, and certainly with strengths-based approaches, we put the assets of individuals at the forefront of our therapeutic efforts. The “people-first” approach is one way that medical and mental health professionals acknowledge the ongoing struggle against stigma faced by persons with diagnosed disorders, and honor their strengths. At the same time, I am more than happy to use an “identity-first” approach and refer to persons as they request. If I don’t know someone’s preference, I hope that person will recognize that I am being respectful within the bounds of the information I have.

 

Conclusion 

The professionals are right, and autistic persons are right. It is courteous to use people-first language, and it is empowering to use identity-first language. Both approaches emphasize respect, counter stigma, and attempt to honor all persons.

Perhaps this is the time for autistic persons who have felt oppressed by the medical establishment to claim ownership of self-definition. Remember Rayna Green? Name your name, and claim your home. Embrace all of you.

This is a shared responsibility: your role is to tell your provider how you prefer to be referenced, and your provider’s responsibility is to be respectful of your right to make that determination and to call you what you prefer. In this way, progress towards equality will be made.

References

1. Grinker, Roy (2021). Nobody’s Normal: How Culture Created the Stigma of Mental Illness. W.W. Norton and Company: London, p. xxv.

2. Liebowitz, Cara (March 20, 2015). On Identity-First Versus People-First Language. The Body Is Not an Apology. https://thebodyisnotanapology.com/magazine/i-am-disabled-on-identity-first-versus-people-first-language/

3. Watts, Alan W. (1966). The Book: On the Taboo Against Knowing Who You Are. Collier-MacMillan Books, p. ix.

4. Smoller, Jordan (2012). The Other Side of Normal: How Biology Is Providing the Clues to Unlock the Secrets of Normal and Abnormal Behavior. HarperCollins Publishers: New York, p. 6.

5. Grinker, Roy (2021). Nobody’s Normal: How Culture Created the Stigma of Mental Illness. W.W. Norton and Company: London, pp. 46-47.

6. Ibid., p. xxviii.