Understanding the components and regional variations of cultural patterns and processes is critical to human geography. We studied the concepts of culture and cultural traits and learned how geographers assess the spatial and place dimensions of cultural groups as defined by language, religion, ethnicity, and gender, in the present as well as the past.
This module also explored cultural interaction at various scales, along with the adaptations, changes, and conflicts that may result. The geographies of language, religion, ethnicity, and gender are studied to identify and analyze the patterns and processes of cultural differences. We distinguished between languages and dialects, between ethnic and universal religions, between folk and popular cultures, and among ethnic political movements. These distinctions help students understand the forces that affect the geographic patterns of each cultural characteristic.
Another significant emphasis of the module was the way culture shapes relationships between humans and the environment. We learned how culture is expressed in landscapes and how land use, in turn, represents cultural identity. Built environments enable the geographer to interpret cultural values, tastes, symbolism, and beliefs.
3.1 Understanding Race and Ethnicity
Trayvon Martin was a seventeen-year-old black teenager. On the evening of February 26, 2012, he was visiting with his father and his father’s fiancée in the Sanford, Florida multi-ethnic gated community where his father’s fiancée lived. Trayvon left her home on foot to buy a snack from a nearby convenience store. As he was returning, George Zimmerman, a white Hispanic male and the community’s neighborhood watch program coordinator, noticed him. In light of a recent rash of break-ins, Zimmerman called the police to report a person acting suspiciously, which he had done on many other occasions. The 911 operator told Zimmerman not to follow the teen, but soon after Zimmerman and Martin had a physical confrontation. According to Zimmerman, Martin attacked him, and in the ensuing scuffle Martin was shot and killed (CNN Library 2014).
A public outcry followed Martin’s death. There were allegations of racial profiling—the use by law enforcement of race alone to determine whether to stop and detain someone—a national discussion about “Stand Your Ground Laws,” and a failed lawsuit in which Zimmerman accused NBC of airing an edited version of the 911 call that made him appear racist. Zimmerman was not arrested until April 11, when he was charged with second-degree murder by special prosecutor Angela Corey. In the ensuing trial, he was found not guilty (CNN Library 2014).
The shooting, the public response, and the trial that followed offer a snapshot of the social constructs of race. Do you think race played a role in Martin’s death or in the public reaction to it? Do you think race had any influence on the initial decision not to arrest Zimmerman, or on his later acquittal? Does society fear black men, leading to racial profiling at an institutional level? What about the role of the media? Was there a deliberate attempt to manipulate public opinion? If you were a member of the jury, would you have convicted George Zimmerman?
Defining Race and Ethnicity
The idea of race refers to superficial physical differences that a particular society considers significant, while ethnicity describes shared culture. The term “minority group” describes a subordinate group, one that lacks power in society regardless of skin color or country of origin. For example, in modern U.S. history, the elderly might be considered a minority group due to a diminished status that results from widespread prejudice and discrimination against them. Ten percent of nursing home staff admitted to physically abusing an older person in the past year, and 40 percent admitted to committing psychological abuse (World Health Organization 2011). In this chapter, we focus on racial and ethnic minorities.
Race, although often framed in biological terms, is a socially constructed way to identify humans based on physical characteristics assumed to result from shared genetic ancestry. Shared genetic ancestry is a result of geographic isolation, which has significantly decreased in most areas of the world since the era of colonization, and even before then. Less geographic isolation results in the mixing of populations. Thus, classifying people by their race with any accuracy is difficult.
Most biologists, geographers, and social scientists have taken an official position rejecting biological explanations of race. Over time, the typology of race that developed during early racial science has fallen into disuse, and the social construction of race is a more sociological way of understanding racial categories. Research in this school of thought suggests that race is not biologically identifiable and that previous racial categories were arbitrarily assigned, based on pseudoscience, and used to justify racist practices (Omi and Winant 1994; Graves 2003). When considering skin color, for example, the social construction of race perspective recognizes that the relative darkness or fairness of skin is an evolutionary adaptation to the available sunlight in different regions of the world.
Contemporary conceptions of race, therefore, which tend to be based on socioeconomic assumptions, illuminate how far removed modern understanding of race is from biological qualities. In modern society, some people who consider themselves “white” actually have more melanin (a pigment that determines skin color) in their skin than other people who identify as “black.” In some countries, such as Brazil, class is more important than skin color in determining racial categorization. People with high levels of melanin may consider themselves “white” if they enjoy a middle-class lifestyle. On the other hand, someone with low levels of melanin might be assigned the identity of “black” if he or she has little education or money.
The social construction of race is also reflected in the way names for racial categories change with changing times. It is worth noting that race, in this sense, is also a system of labeling that provides a source of identity; specific labels fall in and out of favor during different social eras. For example, the category “Negroid,” popular in the nineteenth century, evolved into the term “negro” by the 1960s, and then this term fell from use and was replaced with “African American.” This latter term was intended to celebrate the multiple identities that a black person might hold, but the word choice is a poor one: it lumps together a large variety of ethnic groups under an umbrella term while excluding others who could accurately be described by the label but who do not meet the spirit of the term. For example, actress Charlize Theron is a blonde-haired, blue-eyed “African American.”
PBS has created an exciting website called RACE – The Power of an Illusion that looks at whether race indeed is a biological characteristic of humans or a social construct. Take the Sorting People quiz and watch The Human Family Tree and Black in Latin America: An Island Divided to “witness” how migration and geography play a role in the complex issues surrounding race and ethnicity. Pay attention to how the racial and ethnic landscape of the island of Hispaniola impacts cultural identity and the geopolitics both within Hispaniola and beyond its shores.
Ethnicity is a term that describes shared culture – the practices, values, and beliefs of a group. This culture might include shared language, religion, and traditions, among other commonalities. Like race, the term ethnicity is difficult to describe, and its meaning has changed over time. Moreover, as with race, individuals may be identified or self-identify with ethnicities in complex, even contradictory, ways. For example, ethnic groups such as Irish, Italian American, Russian, Jewish, and Serbian might all be groups whose members are predominantly included in the “white” racial category.
Shared geography, language, and religion can often, but not always, factor into ethnic group categorizations. Ethnic groups distinguish themselves differently from one period to another. Ethnic identity can be used by individuals to identify themselves with others who have shared geographic, cultural, historical, linguistic, and religious ancestry; however, like race, ethnicity has been defined by the stereotypes created by dominant groups as a method of “Othering.” Othering is a process in which one group, usually the dominant group, views and represents themselves as “us/same” and another group as “them/other.”
Ethnicity, like race, continues to be an identification method that individuals and institutions use today—whether through the census, affirmative action initiatives, nondiscrimination laws, or simply in day-to-day personal relations.
Defining Minority Groups
Sociologist Louis Wirth (1945) defined a minority group as “any group of people who, because of their physical or cultural characteristics, are singled out from the others in the society in which they live for differential and unequal treatment, and who therefore regard themselves as objects of collective discrimination.” The term minority connotes discrimination, and as social scientists use it, the term subordinate group can be used interchangeably with minority, while dominant group is often substituted for the group that is in the majority. These definitions correlate to the concept that the dominant group is that which holds the most power in a given society, while subordinate groups are those who lack power compared to the dominant group.
Note that being a numerical minority is not a characteristic of being a minority group; sometimes, larger groups can be considered minority groups due to their lack of power. It is the lack of power that is the predominant characteristic of a minority, or subordinate group. For example, consider apartheid in South Africa, in which a numerical majority (the black inhabitants of the country) were exploited and oppressed by the white minority.
According to Charles Wagley and Marvin Harris (1958), a minority group is distinguished by five characteristics: (1) unequal treatment and less power over their lives, (2) distinguishing physical or cultural traits like skin color or language, (3) involuntary membership in the group, (4) awareness of subordination, and (5) high rate of in-group marriage. Additional examples of minority groups might include the LGBTQ+ community, religious practitioners whose faith is not widely practiced where they live, and people with disabilities.
Scapegoat theory, developed initially from Dollard’s (1939) Frustration-Aggression theory, suggests that the dominant group will displace its unfocused aggression onto a subordinate group. History has shown us many examples of the scapegoating of a subordinate group. An example from the last century is the way Adolf Hitler was able to blame the Jewish population for Germany’s social and economic problems. In the United States, recent immigrants have frequently been the scapegoat for the nation’s—or an individual’s—woes. Many states have enacted laws to disenfranchise immigrants; these laws are popular because they let the dominant group scapegoat a subordinate group.
Stereotypes, Prejudice, and Discrimination
The terms stereotype, prejudice, discrimination, and racism are often used interchangeably in everyday conversation. Stereotypes are oversimplified generalizations about groups of people. Stereotypes can be based on race, ethnicity, age, gender, sexual orientation – almost any characteristic. They may be positive (usually about one’s own group, such as when women suggest they are less likely to complain about physical pain) but are often negative (usually toward other groups, such as when members of a dominant racial group suggest that a subordinate racial group is stupid or lazy). In either case, the stereotype is a generalization that does not take individual differences into account.
New stereotypes are rarely created; instead, they are recycled from subordinate groups that have assimilated into society and are reused to describe newly subordinate groups. For example, many stereotypes that are currently used to characterize black people were used earlier in American history to characterize Irish and Eastern European immigrants.
Prejudice and Racism
Prejudice refers to the beliefs, thoughts, feelings, and attitudes that someone holds about a group. Prejudice is not based on experience; instead, it is a prejudgment, originating outside experience. A 1970 documentary called Eye of the Storm illustrates how prejudice develops, by showing how defining one category of people as superior (children with blue eyes) results in prejudice against people who are not part of the favored category.
While prejudice is not necessarily specific to race, racism is a stronger type of prejudice used to justify the belief that one racial category is somehow superior or inferior to others; it is also a set of practices used by a racial majority to disadvantage a racial minority. The Ku Klux Klan is an example of a racist organization; its members’ belief in white supremacy has encouraged over a century of hate crime and hate speech.
Institutional racism refers to how racism is embedded in the fabric of society. For example, the disproportionate number of black men arrested, charged, and convicted of crimes may reflect racial profiling, a form of institutional racism.
Colorism is another kind of prejudice, in which someone believes one type of skin tone is superior or inferior to another within a racial group. Studies suggest that darker skinned African Americans experience more discrimination than lighter skinned African Americans (Herring, Keith, and Horton 2004; Klonoff and Landrine 2000). For example, if a white employer believes a black employee with a darker skin tone is less capable than a black employee with a lighter skin tone, that is colorism. At least one study suggested that colorism affected racial socialization, with darker-skinned black male adolescents receiving more warnings about the danger of interacting with members of other racial groups than did lighter-skinned black male adolescents (Landor et al. 2013).
While prejudice refers to biased thinking, discrimination consists of actions against a group of people. Discrimination can be based on age, religion, health, and other indicators; race-based laws against discrimination strive to address this set of social problems.
Discrimination based on race or ethnicity can take many forms, from unfair housing practices to biased hiring systems. Overt discrimination has long been part of U.S. history. In the late nineteenth century, it was not uncommon for business owners to hang signs that read, “Help Wanted: No Irish Need Apply.” Moreover, southern Jim Crow laws, with their “Whites Only” signs, exemplified overt discrimination that is not tolerated today.
However, we cannot erase discrimination from our culture just by enacting laws to abolish it. Even if a society managed to eradicate racism from each individual’s psyche, society itself would maintain it. Social scientist Émile Durkheim called racism a social fact, meaning that it does not require the action of individuals to continue. The reasons for this are complex and relate to the educational, criminal, economic, and political systems that exist in our society.
For example, when a newspaper identifies by race individuals accused of a crime, it may enhance stereotypes of a particular minority. Another example of racist practices is racial steering, in which real estate agents direct prospective homeowners toward or away from specific neighborhoods based on their race. Racist attitudes and beliefs are often more insidious and harder to pin down than specific racist practices.
Prejudice and discrimination can overlap and intersect in many ways. To illustrate, here are four examples of how prejudice and discrimination can occur. Unprejudiced nondiscriminators are open-minded, tolerant, and accepting individuals. Unprejudiced discriminators might be those who unthinkingly practice sexism in their workplace by not considering females for certain positions that have traditionally been held by men. Prejudiced nondiscriminators are those who hold racist beliefs but do not act on them, such as a racist store owner who serves minority customers. Prejudiced discriminators include those who actively make disparaging remarks about others or who perpetrate hate crimes.
Discrimination also manifests in different ways. The scenarios above are examples of individual discrimination, but other types exist. Institutional discrimination occurs when a societal system has developed with embedded disenfranchisement of a group, such as the U.S. military’s historical nonacceptance of minority sexualities (the “don’t ask, don’t tell” policy reflected this norm).
Institutional discrimination can also involve the promotion of a group’s status, as in the case of white privilege, which consists of the benefits people receive simply by being part of the dominant group.
While most white people are willing to admit that nonwhite people live with a set of disadvantages due to the color of their skin, very few are willing to acknowledge the benefits they receive.
Theories of Race and Ethnicity
We can examine issues of race and ethnicity through three major perspectives: functionalism, conflict theory, and symbolic interactionism. As you read through these theories, ask yourself which one makes the most sense and why. Do we need more than one theory to explain racism, prejudice, stereotypes, and discrimination?
In the view of functionalism, racial and ethnic inequalities must have served an essential function in order to exist as long as they have. This concept, of course, is problematic. How can racism and discrimination contribute positively to society? A functionalist might look at “functions” and “dysfunctions” caused by racial inequality. Nash (1964) focused his argument on the way racism is functional for the dominant group, for example, suggesting that racism morally justifies a racially unequal society. Consider the way slave owners justified slavery in the antebellum South, by suggesting that black people were fundamentally inferior to white people and preferred slavery to freedom.
Another way to apply the functionalist perspective to racism is to discuss the way racism can contribute positively to the functioning of society by strengthening bonds between in-group members through the ostracism of out-group members. Consider how a community might increase solidarity by refusing to allow outsiders access. On the other hand, Rose (1951) suggested that dysfunctions associated with racism include the failure to take advantage of talent in the subjugated group, and that society must divert from other purposes the time and effort needed to maintain artificially constructed racial boundaries. Consider how much money, time, and effort went toward maintaining separate and unequal educational systems before the civil rights movement.
Conflict theories are often applied to inequalities of gender, social class, education, race, and ethnicity. A conflict theory perspective of U.S. history would examine the numerous past and current struggles between the white ruling class and racial and ethnic minorities, noting specific conflicts that have arisen when the dominant group perceived a threat from the minority group. In the late nineteenth century, the rising power of black Americans after the Civil War resulted in draconian Jim Crow laws that severely limited black political and social power. For example, Vivien Thomas (1910–1985), the black surgical technician who helped develop the groundbreaking surgical technique that saves the lives of “blue babies,” was classified as a janitor for many years, and paid as such, even though he was conducting complicated surgical experiments. The years since the Civil War have shown a pattern of attempted disenfranchisement, with gerrymandering and voter suppression efforts aimed at predominantly minority neighborhoods.
Social scientist Patricia Hill Collins (1990) further developed intersection theory, originally articulated in 1989 by Kimberlé Crenshaw, which suggests we cannot separate the effects of race, class, gender, sexual orientation, and other attributes. When we examine race and how it can bring us both advantages and disadvantages, it is essential to acknowledge that the way we experience race is shaped, for example, by our gender and class. Multiple layers of disadvantage intersect to create the way we experience race. For example, if we want to understand prejudice, we must understand that the prejudice focused on a white woman because of her gender is very different from the layered prejudice focused on a poor Asian woman, who is affected by stereotypes related to being poor, being a woman, and her ethnic status.
For symbolic interactionists, race and ethnicity provide powerful symbols as sources of identity. Some interactionists propose that the symbols of race, not race itself, are what lead to racism. Famed interactionist Herbert Blumer (1958) suggested that racial prejudice is formed through interactions between members of the dominant group: Without these interactions, individuals in the dominant group would not hold racist views. These interactions contribute to an abstract picture of the subordinate group that allows the dominant group to support its view of the subordinate group, and thus maintains the status quo.
An example of this might be an individual whose beliefs about a particular group are based on images conveyed in popular media, images that go unquestioned because the individual has never personally met a member of that group. Another way to apply the interactionist perspective is to look at how people define their own race and the race of others. As we discussed in relation to the social construction of race, some people who claim a white identity have a greater amount of skin pigmentation than some people who claim a black identity; how did they come to define themselves as black or white?
Culture of Prejudice
Culture of prejudice refers to the theory that prejudice is embedded in our culture. We grow up surrounded by images of stereotypes and casual expressions of racism and prejudice. Consider the casually racist imagery on grocery store shelves or the stereotypes that fill popular movies and advertisements. It is easy to see how someone living in the Northeastern United States, who may know no Mexican Americans personally, might gain a stereotyped impression from such sources as Speedy Gonzalez or Taco Bell’s talking Chihuahua. Because we are all exposed to these images and thoughts, it is impossible to know to what extent they have influenced our thought processes.
Intergroup relations (relationships between different groups of people) range along a spectrum between tolerance and intolerance. The most tolerant form of intergroup relations is pluralism, in which no distinction is made between minority and majority groups, but instead, there is equal standing. At the other end of the continuum are amalgamation, expulsion, ethnic cleansing, and even genocide – stark examples of intolerant intergroup relations.
Ethnic Cleansing and Genocide
The twentieth century was the deadliest century, in terms of war, in human history. It saw two world wars; multiple civil wars; genocides in Rwanda (against Tutsis and moderate Hutus), Sudan, and Yugoslavia; and the Holocaust, which decimated the Jewish population of Europe during WWII. In addition to WWI and WWII, the century experienced the Korean War, the Vietnam War, the Cold War, and the first Gulf War. It also saw regional and civil conflicts such as those in the Congo (where some 6 million people died), as well as an upsurge in child soldiers and modern slavery.
Some of the worst acts committed by humans involve ethnic cleansing and genocide. The United Nations Security Council established Resolution 780, which defines ethnic cleansing as “a purposeful policy designed by one ethnic or religious group to remove by violent and terror-inspiring means the civilian population of another ethnic or religious group from certain geographic areas.”
Genocide is usually defined as the intentional killing of large numbers of people targeted because of their ethnicity, political ideology, religion, or culture. At first glance, ethnic cleansing and genocide appear similar. With ethnic cleansing, the aim is to remove a group of people with similar ethnic backgrounds from a specific geographic region by any means possible. This could include forced migration, terror and rape, destruction of villages, and large-scale death. With genocide, the intent is the extermination of the targeted group itself. This has happened many times in recent history, including in Bosnia-Herzegovina, Burma, Cambodia, the Democratic Republic of the Congo, Rwanda, Sudan, and now Syria. Sadly, most of these ethnic conflicts were not officially declared genocides by the United Nations Security Council, even though the conditions on the ground and the reasons they occurred fit the definition.
Possibly the most well-known case of genocide is Hitler’s attempt to exterminate the Jewish people in the first half of the twentieth century. Also known as the Holocaust, the explicit goal of Hitler’s “Final Solution” was the eradication of European Jewry, as well as the destruction of other minority groups such as Catholics, people with disabilities, and LGBTQ+ individuals. With forced emigration, concentration camps, and mass executions in gas chambers, Hitler’s Nazi regime was responsible for the deaths of 12 million people, 6 million of whom were Jewish. Hitler’s intent was clear, and the high Jewish death toll certainly indicates that Hitler and his regime committed genocide. However, how do we understand genocide that is not so overt and deliberate?
The treatment of aboriginal Australians is also an example of genocide committed against indigenous people. Historical accounts suggest that between 1824 and 1908, white settlers killed more than 10,000 native Aborigines in Tasmania and Australia (Tatz 2006).
Another example is the European colonization of North America. Some historians estimate that Native American populations dwindled from approximately 12 million people in the year 1500 to barely 237,000 by the year 1900 (Lewy 2004). European settlers coerced American Indians off their lands, often causing thousands of deaths in forced removals, such as occurred in the Cherokee or Potawatomi Trail of Tears.
Settlers also enslaved Native Americans and forced them to give up their religious and cultural practices. However, the primary cause of Native American death was neither slavery nor war nor forced removal: it was the introduction of European diseases and Indians’ lack of immunity to them. Smallpox, diphtheria, and measles flourished among indigenous American tribes who had no exposure to the diseases and no ability to fight them. Quite simply, these diseases decimated the tribes. How planned this genocide was remains a topic of contention. Some argue that the spread of disease was an unintended effect of conquest, while others believe it was intentional, citing rumors of smallpox-infected blankets being distributed as “gifts” to tribes.
Genocide is not just a historical concept; it is practiced today. Recently, ethnic and geographic conflicts in the Darfur region of Sudan have led to hundreds of thousands of deaths. As part of an ongoing land conflict, the Sudanese government and their state-sponsored Janjaweed militia have led a campaign of killing, forced displacement, and systematic rape of Darfuri people. Although a treaty was signed in 2011, the peace is fragile.
Today, there are a few situations that may be classified as genocide. The first is in Myanmar, where the Buddhist government has been systematically driving out the Muslim Rohingya population.
There is also the situation in Yemen, where Saudi Arabia’s bombing of cities and towns with U.S. weaponry, ostensibly targeting Iranian-backed militants, had killed over 10,000 people and injured more than 40,000 as of 2019. Many human rights advocates claim the situation is approaching genocide. On top of that, the civil war is creating conditions that could lead to the largest famine the world has seen in over a century.
In July 2011, South Sudan became the world’s newest country when it voted to break away from Sudan. Yet by December 2013, fighting between the new government and rebel fighters created a new civil war within the new country. Thousands of civilians have been killed, with millions more displaced by the violence. Like Yemen, there is now growing concern that the civil war will create a nationwide famine.
Expulsion refers to a subordinate group being forced, by a dominant group, to leave a particular area or country. As seen in the examples of the Trail of Tears and the Holocaust, expulsion can be a factor in genocide. However, it can also stand on its own as a destructive group interaction. Expulsion has often occurred historically with an ethnic or racial basis. In the United States, President Franklin D. Roosevelt issued Executive Order 9066 in 1942, after the Japanese government’s attack on Pearl Harbor. The Order authorized the establishment of internment camps for anyone with as little as one-eighth Japanese ancestry (i.e., one great-grandparent who was Japanese). Over 120,000 legal Japanese residents and Japanese U.S. citizens, many of them children, were held in these camps for up to four years, even though there was never any evidence of collusion or espionage. (In fact, many Japanese Americans continued to demonstrate their loyalty to the United States by serving in the U.S. military during the War.) In the 1990s, the U.S. executive branch issued a formal apology for this expulsion; reparation efforts continue today.
Segregation refers to the physical separation of two groups, particularly in residence, but also in the workplace and social functions. It is essential to distinguish between de jure segregation (segregation that is enforced by law) and de facto segregation (segregation that occurs without laws but because of other factors). A stark example of de jure segregation is the apartheid movement of South Africa, which existed from 1948 to 1994. Under apartheid, black South Africans were stripped of their civil rights, and forcibly relocated to areas that segregated them physically from their white compatriots. Only after decades of degradation, violent uprisings, and international advocacy was apartheid finally abolished.
De jure segregation occurred in the United States for many years after the Civil War. During this time, many former Confederate states passed Jim Crow laws that required segregated facilities for blacks and whites. These laws were codified in 1896’s landmark Supreme Court case Plessy v. Ferguson, which stated that “separate but equal” facilities were constitutional. For the next five decades, blacks were subjected to legalized discrimination, forced to live, work, and go to school in separate—but unequal—facilities. It was not until 1954 and the Brown v. Board of Education case that the Supreme Court declared that “separate educational facilities are inherently unequal,” thus ending de jure segregation in the United States.
De facto segregation, however, cannot be abolished by any court mandate. Segregation is still alive and well in the United States, with different racial or ethnic groups often segregated by neighborhood, borough, or parish. Social scientists use segregation indices to measure racial segregation of different races in different areas. The indices employ a scale from zero to 100, where zero is the most integrated and 100 is the least. In the New York metropolitan area, for instance, the black-white segregation index was seventy-nine for the years 2005–2009. This means that 79 percent of either blacks or whites would have to move in order for each neighborhood to have the same racial balance as the whole metro region (Population Studies Center 2010).
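The figure cited above is an index of dissimilarity, the most widely used segregation index. As a rough sketch (the four-neighborhood counts below are hypothetical, not actual New York data), the index is half the sum, across neighborhoods, of the absolute difference between each neighborhood’s share of one group and its share of the other, scaled to 0–100:

```python
def dissimilarity_index(group_a, group_b):
    """Index of dissimilarity: 0 = fully integrated, 100 = fully segregated.

    group_a, group_b: population counts of each group per neighborhood,
    listed in the same neighborhood order.
    """
    total_a = sum(group_a)
    total_b = sum(group_b)
    # Half the sum of absolute differences between each neighborhood's
    # share of group A and its share of group B
    d = 0.5 * sum(abs(a / total_a - b / total_b)
                  for a, b in zip(group_a, group_b))
    return round(d * 100, 1)

# Hypothetical city: each group is clustered in two of four neighborhoods
black = [900, 800, 100, 200]
white = [100, 200, 900, 800]
print(dissimilarity_index(black, white))  # → 70.0
```

An index of 70 here means 70 percent of either group would have to move for every neighborhood to mirror the citywide racial balance, matching the interpretation of the New York figure above.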
Pluralism is represented by the ideal of the United States as a “salad bowl”: a great mixture of different cultures where each culture retains its own identity and yet adds to the flavor of the whole. Genuine pluralism is characterized by mutual respect on the part of all cultures, both dominant and subordinate, creating a multicultural environment of acceptance. In reality, true pluralism is a challenging goal to reach. In the United States, the mutual respect required by pluralism is often missing, and the nation’s historical model of the melting pot posits a society where cultural differences aren’t embraced as much as erased.
Assimilation describes the process by which a minority individual or group gives up its own identity by taking on the characteristics of the dominant culture. In the United States, which has a history of welcoming and absorbing immigrants from different lands, assimilation has been a function of immigration.
Most people in the United States have immigrant ancestors. In relatively recent history, between 1890 and 1920, the United States became home to around 24 million immigrants. In the decades since then, further waves of immigrants have come to these shores and have eventually been absorbed into U.S. culture, sometimes after facing extended periods of prejudice and discrimination. Assimilation may lead to the loss of the minority group’s cultural identity as they become absorbed into the dominant culture, but assimilation has minimal to no impact on the majority group’s cultural identity.
Some groups may keep only symbolic gestures of their original ethnicity. For instance, many Irish Americans may celebrate Saint Patrick’s Day, many Hindu Americans enjoy a Diwali festival, and many Mexican Americans may celebrate Cinco de Mayo (a May 5 acknowledgment of Mexico’s victory at the 1862 Battle of Puebla). However, for the rest of the year, other aspects of their originating culture may be forgotten.
Assimilation is antithetical to the “salad bowl” created by pluralism; rather than maintaining their cultural flavor, subordinate cultures give up their traditions in order to conform to their new environment. Social scientists measure the degree to which immigrants have assimilated to a new culture with four benchmarks: socioeconomic status, spatial concentration, language assimilation, and intermarriage. When faced with racial and ethnic discrimination, it can be difficult for new immigrants to assimilate fully. Language assimilation, in particular, can be a formidable barrier, limiting employment and educational options and therefore constraining growth in socioeconomic status.
Amalgamation is the process by which a minority group and a majority group combine to form a new group. Amalgamation creates the classic “melting pot” analogy; unlike the “salad bowl,” in which each culture retains its individuality, the “melting pot” ideal sees the combination of cultures that results in a new culture entirely.
Amalgamation, also known as miscegenation, is achieved through intermarriage between races. In the United States, anti-miscegenation laws flourished in the South during the Jim Crow era. It was not until 1967’s Loving v. Virginia that the Supreme Court declared anti-miscegenation laws unconstitutional, rendering all remaining such statutes unenforceable.
3.2 Understanding Culture
What are the rules when you pass an acquaintance at school, work, in the grocery store, or in the mall? Generally, we do not consider all of the intricacies of the rules of behavior. We may simply say, “Hello!” and ask, “How was your weekend?” or some other trivial question meant as a friendly greeting. Rarely do we physically embrace or even touch the individual. In fact, doing so may be viewed with scorn or distaste, since people in the United States have fairly rigid rules about personal space. However, we all adhere to various rules and standards that are created and maintained in culture. These rules and expectations carry meaning, and there are ways in which you may violate them. Consider what would happen if you stopped and informed everyone who said, “Hi, how are you?” exactly how you were doing that day, and in detail. You would more than likely violate cultural rules, specifically those of greeting. Perhaps in a different culture the question would be more literal and would require a response. Or if you are having coffee with a good friend, perhaps that question warrants a more detailed response. These examples are all aspects of culture: the shared beliefs, values, and practices that participants must learn. Sociologically, we examine in what situation and context certain behavior is expected and in which situations it is not. These rules are created and enforced by people who interact and share culture.
In everyday conversation, people rarely distinguish between the terms culture and society, but the terms have slightly different meanings, and the distinction is important to a geographer. A society describes a group of people who share a community and a culture. By “community,” social scientists refer to a definable region—as small as a neighborhood (Brooklyn, or “the east side of town”), as large as a country (Ethiopia, the United States, or Nepal), or somewhere in between (in the United States, this might include someone who identifies with Southern or Midwestern society). To clarify, a culture represents the beliefs and practices of a group, while society represents the people who share those beliefs and practices. Neither society nor culture could exist without the other. In this chapter, we examine the relationship between culture and society in greater detail and pay special attention to the elements and forces that shape culture, including diversity and cultural changes. A final discussion touches on the different theoretical perspectives from which human geographers research culture.
Humans are social creatures. Since the dawn of Homo sapiens nearly 250,000 years ago, people have grouped into communities in order to survive. Living together, people form everyday habits and behaviors – from specific methods of childrearing to preferred techniques for obtaining food. In modern-day Paris, many people shop daily at outdoor markets to pick up what they need for their evening meal, buying cheese, meat, and vegetables from different specialty stalls. In the United States, the majority of people shop once a week at supermarkets, filling large carts to the brim. How would a Parisian perceive U.S. shopping behaviors that Americans take for granted?
Almost every human behavior, from shopping to marriage to expressions of feelings, is learned. In the United States, people tend to view marriage as a choice between two people, based on mutual feelings of love. In other nations and in other times, marriages have been arranged through an intricate process of interviews and negotiations between entire families, or in other cases, through a direct system, such as a “mail-order bride.” To someone raised in New York City, the marriage customs of a family from Nigeria may seem strange or even wrong. Conversely, someone from a traditional Kolkata family might be perplexed by the idea of romantic love as the foundation for marriage and lifelong commitment. In other words, how people view marriage depends mostly on what they have been taught.
Behavior based on learned customs is not a bad thing. Being familiar with unwritten rules helps people feel secure and “normal.” Most people want to live their daily lives confident that their behaviors will not be challenged or disrupted — however, even an action as seemingly simple as commuting to work evidences a great deal of cultural propriety.
Culture consists of thoughts and tangible things. Material culture refers to the objects or belongings of a group of people. Nonmaterial culture, in contrast, consists of the ideas, attitudes, and beliefs of a society. Material and nonmaterial aspects of culture are linked, and physical objects often symbolize cultural ideas. These material and nonmaterial aspects of culture can vary subtly from region to region.
Often, a comparison of one culture to another will reveal obvious differences. However, all cultures also share common elements. Cultural universals are patterns or traits that are globally common to all societies. One example of a cultural universal is the family unit: every human society recognizes a family structure that regulates sexual reproduction and the care of children. Even so, how that family unit is defined and how it functions vary. In many Asian cultures, for example, family members from all generations commonly live together in one household. In these cultures, young adults continue to live in the extended household family structure until they marry and join their spouse’s household, or they may remain and raise their nuclear family within the extended family’s homestead. In the United States, by contrast, individuals are expected to leave home and live independently for a period before forming a family unit that consists of parents and their offspring. Other cultural universals include customs like funeral rites, weddings, and celebrations of births. However, each culture may view the ceremonies quite differently.
Anthropologist George Murdock first recognized the existence of cultural universals while studying systems of kinship around the world. Murdock found that cultural universals often revolve around basic human survival, such as finding food, clothing, and shelter, or around shared human experiences, such as birth and death or illness and healing. Through his research, Murdock identified other universals, including language, the concept of personal names, and, interestingly, jokes. Humor seems to be a universal way to release tensions and create a sense of unity among people (Murdock 1949). Social scientists consider humor necessary to human interaction because it helps individuals navigate otherwise tense situations.
Ethnocentrism and Cultural Relativism
Despite how much humans have in common, cultural differences are far more prevalent than cultural universals. For example, while all cultures have language, analysis of particular language structures and conversational etiquette reveals tremendous differences. In some Middle Eastern cultures, it is common to stand close to others in conversation. North Americans keep more distance and maintain an ample “personal space.” Even something as simple as eating and drinking varies significantly from culture to culture. If your professor comes into an early morning class holding a mug of liquid, what do you assume she is drinking? In the United States, it’s most likely filled with coffee, not Earl Grey tea, a favorite in England, or Yak Butter tea, a staple in Tibet.
The way cuisines vary across cultures fascinates many people. Some travelers pride themselves on their willingness to try unfamiliar foods, like celebrated food writer Anthony Bourdain, while others return home expressing gratitude for their native culture’s fare. Often, people in the United States express disgust at other cultures’ cuisine and think that it is gross to eat meat from a dog or guinea pig, for example, while they do not question their habit of eating cows or pigs. Such attitudes are an example of ethnocentrism, or evaluating and judging another culture based on how it compares to one’s cultural norms. Ethnocentrism, as social scientist William Graham Sumner (1906) described the term, involves a belief or attitude that one’s own culture is better than all others. Almost everyone is a little bit ethnocentric. For example, Americans tend to say that people from England drive on the “wrong” side of the road, rather than on the “other” side. Someone from a country where dog meat is standard fare might find it off-putting to see a dog in a French restaurant—not on the menu, but as a pet and patron’s companion. An example of ethnocentrism is referring to parts of Asia as the “Far East.” One might question, “Far East of where?”
A high level of appreciation for one’s own culture can be healthy; a shared sense of community pride, for example, connects people in a society. However, ethnocentrism can lead to disdain or dislike for other cultures and could cause misunderstanding and conflict. People with the best intentions sometimes travel to a society to “help” its people, because they see them as uneducated or backward – inherently inferior. In reality, these travelers are guilty of cultural imperialism, the deliberate imposition of one’s own cultural values on another culture. Europe’s colonial expansion, begun in the sixteenth century, was often accompanied by a severe cultural imperialism. European colonizers often viewed the people in the lands they colonized as uncultured savages who needed European governance, dress, religion, and other cultural practices. A more modern example of cultural imperialism may include the work of international aid agencies who introduce agricultural methods and plant species from developed countries while overlooking indigenous varieties and agricultural approaches that are better suited to the particular region.
Ethnocentrism can be so strong that when confronted with all of the differences of a new culture, one may experience disorientation and frustration, called culture shock. A traveler from Chicago might find the nightly silence of rural Montana unsettling, not peaceful. An exchange student from China might be annoyed by the constant interruptions in class as other students ask questions – a practice that is considered rude in China. Perhaps the Chicago traveler was initially captivated with Montana’s quiet beauty, and the Chinese student was initially excited to see a U.S.-style classroom firsthand. However, as they experience unanticipated differences from their own culture, their excitement gives way to discomfort and doubts about how to behave appropriately in the new situation. Eventually, as people learn more about a culture, they recover from culture shock.
Culture shock may appear because people are not always expecting cultural differences. Anthropologist Ken Barger (1971) discovered this when he conducted a participatory observation in an Inuit community in the Canadian Arctic. Initially, from Indiana, Barger hesitated when invited to join a local snowshoe race. He knew he would never hold his own against these experts. Sure enough, he finished last, to his mortification. However, the tribal members congratulated him, saying, “You really tried!” In Barger’s own culture, he had learned to value victory. To the Inuit people, winning was enjoyable, but their culture valued survival skills essential to their environment: how hard someone tried could mean the difference between life and death. Throughout his stay, Barger participated in caribou hunts, learned how to take shelter in winter storms, and sometimes went days with little or no food to share among tribal members. Trying hard and working together, two nonmaterial values, were indeed much more important than winning.
During his time with the Inuit tribe, Barger learned to engage in cultural relativism. Cultural relativism is the practice of assessing a culture by its own standards rather than viewing it through the lens of one’s own culture. Practicing cultural relativism requires an open mind and a willingness to consider, and even adapt to, new values and norms. However, indiscriminately embracing everything about a new culture is not always possible. Even the most culturally relativist people from egalitarian societies — ones in which women have political rights and control over their own bodies — would question whether the widespread practice of female genital mutilation in countries such as Ethiopia and Sudan should be accepted as a part of cultural tradition. Human geographers attempting to engage in cultural relativism, then, may struggle to reconcile aspects of their own culture with aspects of a culture that they are studying.
Sometimes when people attempt to rectify feelings of ethnocentrism and develop cultural relativism, they swing too far to the other end of the spectrum. Xenocentrism is the opposite of ethnocentrism, and refers to the belief that another culture is superior to one’s own. (The Greek root word xeno, pronounced “ZEE-no,” means “stranger” or “foreign guest.”) An exchange student who goes home after a semester abroad or a geographer who returns from the field may find it difficult to associate with the values of their own culture after having experienced what they deem a more upright or nobler way of living.
Perhaps the greatest challenge for geographers and other social scientists studying different cultures is the matter of keeping a perspective. It is impossible for anyone to keep all cultural biases at bay; the best we can do is strive to be aware of them. Pride in one’s own culture does not have to lead to imposing its values on others. Moreover, an appreciation for another culture should not preclude individuals from studying it with a critical eye.
Elements of Cultural Values and Beliefs
The first, and perhaps most crucial, elements of culture we will discuss are its values and beliefs. Values are a culture’s standard for discerning what is good and just in society. Values are deeply embedded and critical for transmitting and teaching a culture’s beliefs. Beliefs are the tenets or convictions that people hold to be true. Individuals in a society have specific beliefs, but they also share common values. To illustrate the difference, Americans commonly believe in the American Dream—that anyone who works hard enough will be successful and wealthy. Underlying this belief is the American value that wealth is useful and important.
Values help shape a society by suggesting what is right and wrong, beautiful and ugly, sought or avoided. Consider the value that the United States places upon youth. Children represent innocence and purity, while a youthful adult appearance signifies sexuality. Shaped by this value, individuals spend millions of dollars each year on cosmetic products and surgeries to look young and beautiful. The United States also has an individualistic culture, meaning people place a high value on individuality and independence. In contrast, many other cultures are collectivist, meaning the welfare of the group and group relationships are a primary value.
Living up to a culture’s values can be difficult. It is easy to value good health, but it is hard to quit smoking. Marital monogamy is valued, but many spouses engage in infidelity. Cultural diversity and equal opportunities for all people are valued in the United States, yet the country’s highest political offices have been dominated by white men.
Values often suggest how people should behave, but they do not accurately reflect how people do behave. Values portray an ideal culture: the standards society would like to embrace and live up to. However, ideal culture differs from real culture, the way society actually is, based on what occurs and exists. In an ideal culture, there would be no traffic accidents, murders, poverty, or racial tension. However, in real culture, police officers, lawmakers, educators, and social workers continuously strive to prevent or repair those accidents, crimes, and injustices.
One way societies strive to put values into action is through rewards, sanctions, and punishments. When people observe the norms of society and uphold their values, they are often rewarded. A boy who helps an elderly woman board a bus may receive a smile and a “thank you.” A business manager who raises profit margins may receive a quarterly bonus. People sanction certain behaviors by giving their support, approval, or permission, or by instituting formal actions of disapproval and nonsupport. Sanctions are a form of social control, a way to encourage conformity to cultural norms. Sometimes people conform to norms in anticipation or expectation of positive sanctions: good grades, for instance, may mean praise from parents and teachers. From a criminal justice perspective, properly used social control is also inexpensive crime control. Utilizing social control approaches pushes most people to conform to societal rules, regardless of whether authority figures (such as law enforcement) are present.
When people go against a society’s values, they are punished. A boy who shoves an older woman aside to board the bus first may receive frowns or even a scolding from other passengers. A business manager who drives away customers will likely be fired. Breaking norms and rejecting values can lead to cultural sanctions such as earning a negative label—lazy, no-good bum—or to legal sanctions, such as traffic tickets, fines, or imprisonment.
Values are not static; they vary across time and between groups as people evaluate, debate, and change collective societal beliefs. Values also vary from culture to culture. For example, cultures differ in their values about what kinds of physical closeness are appropriate in public. It is rare to see two male friends or coworkers holding hands in the United States, where that behavior often symbolizes romantic feelings. However, in many nations, masculine physical intimacy is considered natural in public. This difference in cultural values came to light when people reacted to photos of former president George W. Bush holding hands with the Crown Prince of Saudi Arabia in 2005. A simple gesture, such as hand-holding, carries significant symbolic differences across cultures.
So far, the examples in this chapter have often described how people are expected to behave in certain situations – for example, when buying food or boarding a bus. These examples describe the visible and invisible rules of conduct through which societies are structured, or what social scientists call norms. Norms define how to behave in accordance with what a society has defined as good, right, and important, and most members of the society adhere to them.
Formal norms are established, written rules. They are behaviors worked out and agreed upon in order to suit and serve the most people. Laws are formal norms, but so are employee manuals, college entrance exam requirements, and “no running” signs at swimming pools. Formal norms are the most specific and clearly stated of the various types of norms, and they are the most strictly enforced. However, even formal norms are enforced to varying degrees and are reflected in cultural values.
For example, money is highly valued in the United States, so monetary crimes are punished. It is against the law to rob a bank, and banks go to great lengths to prevent such crimes. People safeguard valuable possessions and install anti-theft devices to protect homes and cars. A less strictly enforced social norm is driving while intoxicated. While it is against the law to drive drunk, drinking is, for the most part, an acceptable social behavior. Moreover, though there are laws to punish drunk driving, there are few systems in place to prevent the crime. These examples show a range of enforcement regarding formal norms.
There are plenty of formal norms, but the list of informal norms – casual behaviors that are generally and widely conformed to – is longer. People learn informal norms through observation, imitation, and general socialization. Some informal norms are taught directly, while others are learned by observation, including observations of the consequences when someone else violates a norm. However, although informal norms define personal interactions, they extend into other systems as well. Most people do not commit even benign breaches of informal norms. Informal norms dictate appropriate behaviors without the need for written rules.
Norms may be further classified as either mores or folkways. Mores (mor-ays) are norms that embody the moral views and principles of a group. Violating them can have serious consequences. The strongest mores are legally protected with laws or other formal norms. In the United States, for instance, murder is considered immoral, and it is punishable by law (a formal norm). However, more often, mores are judged and guarded by public sentiment (an informal norm). People who violate mores are seen as shameful. They can even be shunned or banned from some groups. The mores of the U.S. school system require that a student’s writing be in the student’s own words or use special forms (such as quotation marks and a whole system of citation) for crediting other writers. Writing another person’s words as if they are one’s own has a name—plagiarism. The consequences of violating this norm are severe and usually result in expulsion.
Unlike mores, folkways are norms without any moral underpinnings. Rather, folkways direct appropriate behavior in the day-to-day practices and expressions of a culture. They indicate whether to shake hands or kiss on the cheek when greeting another person. Many folkways are actions we take for granted. People need to act without thinking in order to get seamlessly through daily routines; they cannot stop and analyze every action (Sumner 1906). Those who experience culture shock may find that it subsides as they learn the new culture’s folkways and can move through their daily routines more smoothly. Folkways might be small manners, learned by observation and imitated, but they are by no means trivial. Like mores and laws, these norms help people negotiate their daily lives within a given culture.
Folk and Popular Culture
It may seem obvious that there is a multitude of cultural differences between societies in the world. After all, we can easily see that people vary from one society to the next. It is natural that a young woman from rural Kenya would have a very different view of the world from an older man in Mumbai—one of the most populated cities in the world. Additionally, each culture has its own internal variations. Sometimes the differences between cultures are not nearly as significant as the differences inside cultures.
Do you prefer listening to opera or hip-hop music? Do you like watching horseracing or NASCAR? Do you read books of poetry or celebrity magazines? In each pair, one type of entertainment is considered highbrow and the other lowbrow. Social scientists use the term high culture to describe the pattern of cultural experiences and attitudes that exist in the highest-class segments of a society. People often associate high culture with intellectualism, political power, and prestige. In America, high culture also tends to be associated with wealth. Events considered high culture can be expensive and formal—attending a ballet, seeing a play, or listening to a live symphony performance.
The term popular culture, also called pop culture, refers to the pattern of cultural experiences and attitudes that exist in mainstream society. Popular culture events might include a parade, a baseball game, or the season finale of a television show. Rock and pop music – “pop” is short for “popular” – are part of popular culture. Popular culture is often expressed and spread via commercial media such as radio, television, movies, the music industry, publishers, and corporate-run websites. Unlike high culture, popular culture is known and accessible to most people. You can share a discussion of favorite football teams with a new coworker or comment on American Idol when making small talk in line at the grocery store. However, if you tried to launch into an in-depth discussion on the classical Greek play Antigone, few members of U.S. society today would be familiar with it.
Although high culture may be viewed as superior to popular culture, the labels of high culture and popular culture vary over time and place. Shakespearean plays, considered pop culture when they were written, are now part of our society’s high culture. Five hundred years from now, will our descendants associate Breaking Bad with the cultural elite?
Subculture and Counterculture
A subculture is just what it sounds like – a smaller cultural group within a broader culture; people of a subculture are part of the broader culture but also share a specific identity within a smaller group.
Thousands of subcultures exist within the United States. Ethnic and racial groups share the language, food, and customs of their heritage. Other subcultures are united by shared experiences. Biker culture revolves around a dedication to motorcycles. Some subcultures are formed by members who possess traits or preferences that differ from the majority of a society’s population. The body modification community embraces aesthetic additions to the human body, such as tattoos, piercings, and certain forms of plastic surgery. In the United States, adolescents often form subcultures to develop a shared youth identity. Alcoholics Anonymous offers support to those suffering from alcoholism. However, even as members of a subculture band together, they still identify with and participate in the larger society.
Human geographers and sociologists distinguish subcultures from countercultures, which are a type of subculture that rejects some of the larger culture’s norms and values. In contrast to subcultures, which operate relatively smoothly within the larger society, countercultures might actively defy larger society by developing their own set of rules and norms to live by, sometimes even creating communities that operate outside of greater society.
Cults, a word derived from culture, are also considered counterculture groups. The group “Yearning for Zion” (YFZ) in Eldorado, Texas, existed outside the mainstream and the limelight, until its leader was accused of statutory rape and underage marriage. The sect’s formal norms clashed too severely to be tolerated by U.S. law, and in 2008, authorities raided the compound and removed more than two hundred women and children from the property.
Culture is always evolving. New things are added to material culture every day, and they affect nonmaterial culture as well. Cultures change when something new (say, railroads or smartphones) opens up new ways of living and when new ideas enter a culture (say, as a result of travel or globalization).
Innovation: Discovery and Invention
Innovation refers to an object or concept’s initial appearance in society – it is innovative because it is markedly new. There are two ways to come across an innovative object or idea: discover it or invent it. Discoveries make known previously unknown but existing aspects of reality. In 1610, when Galileo looked through his telescope and observed Saturn’s rings for the first time, the rings were already there, but until then, no one had known about them. When Christopher Columbus encountered America, the land was, of course, already well known to its inhabitants. However, Columbus’s discovery was new knowledge for Europeans, and it opened the way to changes in European culture, as well as to the cultures of the discovered lands. For example, new foods such as potatoes and tomatoes transformed the European diet, and horses brought from Europe changed hunting practices of Native American tribes of the Great Plains.
Inventions result when something new is formed from existing objects or concepts—when things are put together in an entirely new manner. In the late 1800s and early 1900s, electric appliances were invented at an astonishing pace. Cars, airplanes, vacuum cleaners, lamps, radios, telephones, and televisions were all new inventions. Inventions may shape a culture when people use them in place of older ways of carrying out activities and relating to others, or as a way to carry out new kinds of activities. Their adoption reflects (and may shape) cultural values, and their use may require new norms for new situations.
Consider the introduction of modern communication technology, such as mobile phones and smartphones. As more and more people began carrying these devices, phone conversations were no longer restricted to homes, offices, and phone booths. People on trains, in restaurants, and in other public places became annoyed by listening to one-sided conversations. Norms were needed for cell phone use. Some people pushed for the idea that those who are out in the world should pay attention to their companions and surroundings. However, technology enabled workarounds such as texting, which allows quiet communication and has surpassed phoning as the leading way to satisfy today’s highly valued ability to stay in touch anywhere and everywhere.
When the pace of innovation increases, it can lead to generation gaps. A skeptical older generation sometimes dismisses technological gadgets that catch on quickly with a younger generation. A culture’s objects and ideas can cause not just generational but cultural gaps. Material culture tends to diffuse more quickly than nonmaterial culture; technology can spread through society in a matter of months, but it can take generations for the ideas and beliefs of society to change. Sociologist William F. Ogburn coined the term culture lag to refer to this time that elapses between the introduction of a new item of material culture and its acceptance as part of nonmaterial culture (Ogburn 1957).
Culture lag can also cause tangible problems. The infrastructure of the United States, built a hundred years ago or more, is having trouble supporting today’s more densely populated and fast-paced life. There is a lag in conceptualizing solutions to infrastructure problems. Rising fuel prices, increased air pollution, and traffic jams are all symptoms of culture lag. Although people are becoming aware of the consequences of overusing resources, the means to support changes take time to achieve.
Diffusion and Globalization
The integration of world markets and technological advances of the last decades have allowed for greater exchange between cultures through the processes of globalization and diffusion. Beginning in the 1980s, Western governments began to deregulate social services while granting greater liberties to private businesses. As a result, world markets became dominated by multinational companies in the 1980s, a new state of affairs at that time. We have since come to refer to this integration of international trade and finance markets as globalization. Increased communications and air travel have further opened doors for international business relations, facilitating the flow not only of goods but also of information and people (Scheuerman 2014). Today, many U.S. companies set up offices in other nations where the costs of resources and labor are cheaper. When a person in the United States calls to get information about banking, insurance, or computer services, the person taking that call may be working in another country.
Alongside the process of globalization is diffusion, or the spread of material and nonmaterial culture. While globalization refers to the integration of markets, diffusion relates to a similar process in the integration of international cultures. Middle-class Americans can fly overseas and return with a new appreciation of Thai noodles or Italian gelato. Access to television and the Internet has brought the lifestyles and values portrayed in U.S. sitcoms into homes around the globe. Twitter feeds from public demonstrations in one nation have encouraged political protesters in other countries. When this kind of diffusion occurs, material objects and ideas from one culture are introduced into another.
Theoretical Perspectives on Culture
Music, fashion, technology, and values—all are products of culture. However, what do they mean? How do human geographers perceive and interpret culture based on these material and nonmaterial items? Let us finish our analysis of culture by reviewing it in the context of three theoretical perspectives: functionalism, conflict theory, and symbolic interactionism.
Functionalists view society as a system in which all parts work—or function—together to create society as a whole. In this way, societies need culture to exist. Cultural norms function to support the fluid operation of society, and cultural values guide people in making choices. Just as members of a society work together to fulfill a society’s needs, culture exists to meet its members’ basic needs.
Functionalists also study culture in terms of values. Education is an essential concept in the United States because it is valued. The culture of education—including material culture such as classrooms, textbooks, libraries, and dormitories—supports the emphasis placed on the value of educating a society’s members.
Conflict theorists view social structure as inherently unequal, based on power differentials related to issues like class, gender, race, and age. For a conflict theorist, culture is seen as reinforcing issues of “privilege” for certain groups based upon race, sex, class, and so on. Women strive for equality in a male-dominated society. Senior citizens struggle to protect their rights, their health care, and their independence from a younger generation of lawmakers. Advocacy groups such as the ACLU work to protect the rights of all races and ethnicities in the United States.
Inequalities exist within a culture’s value system. Therefore, a society’s cultural norms benefit some people but hurt others. Some norms, formal and informal, are practiced at the expense of others. Women were not allowed to vote in the United States until 1920. Gay and lesbian couples have been denied the right to marry in some states. Racism and bigotry are very much alive today. Although cultural diversity is supposedly valued in the United States, many people still frown upon interracial marriages. Same-sex marriages are banned in most states, and polygamy—common in some cultures—is unthinkable to most Americans.
At the core of conflict theory is the effect of economic production and materialism: dependence on technology in rich nations versus a lack of technology and education in emerging nations. Conflict theorists believe that a society’s system of material production affects the rest of the culture. People who have less power also have less ability to adapt to cultural change. This view contrasts with the perspective of functionalism. In the U.S. culture of capitalism, to illustrate, we continue to strive toward the promise of the American dream, which perpetuates the belief that the wealthy deserve their privileges.
Symbolic interactionism is a sociological perspective that is most concerned with the face-to-face interactions between members of society. Interactionists see culture as being created and maintained by the ways people interact and in how individuals interpret each other’s actions. Proponents of this theory conceptualize human interactions as a continuous process of deriving meaning from both objects in the environment and the actions of others. This is where the term symbolic comes into play. Every object and action has a symbolic meaning, and language serves as a means for people to represent and communicate their interpretations of these meanings to others. Those who believe in symbolic interactionism perceive culture as highly dynamic and fluid, as it is dependent on how meaning is interpreted and how individuals interact when conveying these meanings.
We began this chapter by asking what culture is. Culture comprises all the practices, beliefs, and behaviors of a society. Because culture is learned, it includes how people think and express themselves. While we may like to consider ourselves individuals, we must acknowledge the impact of culture; we inherit language and ways of thinking that shape our perceptions and patterned behavior, including those concerning family and friends, faith, and politics.
To an extent, culture is a social comfort. After all, sharing a similar culture with others is precisely what defines societies. Nations would not exist if people did not coexist culturally. There could be no societies if people did not share heritage and language, and civilization would cease to function if people did not agree on similar values and systems of social control. Culture is preserved through transmission from one generation to the next, but it also evolves through processes of innovation, discovery, and cultural diffusion. We may be restricted by the confines of our own culture, but as humans, we can question values and make conscious decisions. No better evidence of this freedom exists than the amount of cultural diversity within our society and around the world. The more we study another culture, the better we become at understanding our own.
Defining Cultural Geography
Professor Don Mitchell argues that cultural geography as a subdiscipline did not come into existence merely to serve as a conduit through which geographers can describe and explain the various cultures of the world in the context of space and place. Instead, he contends that cultural geography is a product of “culture wars.” He builds this argument as follows:
In the nineteenth century, people in the Western World believed that Western civilization was superior to all others on earth, and they wanted to know why European culture was far more advanced (in their eyes) than any other. The British, in particular, were keen to pursue this line of research, but so, too, were the Germans, Americans, and French. After all, the nineteenth century was a time of almost unchallenged European imperialism. Therefore, nineteenth-century geographers tended to think of themselves as significant players in the imperial system.
Over time, the work of early cultural geographers split into two opposing camps. One group was epitomized by Carl Sauer, who is seen by many as the father of modern cultural geography, and the other by Friedrich Ratzel, Ellen Churchill Semple, and Ellsworth Huntington, who sought to deterministically connect human behavior to the physical environment.
Environmental determinism argues that both general features and regional variations of human cultures and societies are determined by the physical and biological forms that make up the earth’s many natural landscapes. Geographers influenced by Semple and Huntington tended to describe and explain what they believed to be “superior” European culture (civilization) through the application of the theory of environmental determinism. From their writings, it does not seem that they ever recognized the inaccuracies of their position, let alone the arrogant, racist foundation upon which it rested.
Although modern geographers rarely discuss the impacts of environmental determinism except to note its serious flaws as a model for spatial analysis, its basic concepts were used by the Third Reich to justify German expansion in the 1930s and 1940s. Friedrich Ratzel, a German geographer (the American geographer Ellen Churchill Semple was one of his students), argued that nation states are organic and, therefore, must grow in order to survive. In other words, states must continually seek additional “lebensraum” (living room). The state, a living thing, was a natural link between the people and the natural environment (blood and soil). Moreover, the state provided a living tie between people and a place. This application of environmental determinism, and Social Darwinism, eventually came to be more than a mere academic exercise because it was used to justify, or legitimize, the conquering of one people by another. At the height of European imperialism, academics depicted the tremendous colonial empires as natural extensions of superior European cultures that had developed in the beneficial natural surroundings of the mid-latitudes. The concept of “manifest destiny” was used similarly to justify the expansion of the United States from the Atlantic to Pacific shores, at the expense of indigenous people.
Although Ratzel, Semple, and Huntington never expected their ideas to be used to justify Adolf Hitler’s conquest of Europe, Nazi geographers and political scientists built upon their work to develop theories of Nordic racial and cultural superiority. Semple and Huntington wanted nothing more than to define the boundaries of their discipline and to explain the differences in “cultures” and “places” throughout the world. They were merely striving to carve out a piece of academic or intellectual turf for themselves and like-minded colleagues.
By the 1920s, environmental determinism was already under attack by people such as Carl Sauer (at the University of California, Berkeley). Nevertheless, many scholars continued to base their work on the belief that human beings are primarily a product of the environment in which they live. Frederick Jackson Turner, the American historian who eloquently described the westward expansion of the United States, and Sir Halford Mackinder, the British geographer who developed the “Heartland Theory,” explained away the conquering of indigenous people by Europeans as perhaps regrettable, but nonetheless, natural and unavoidable (given the superiority of cultures spawned in the mid-latitude environs of Western Europe).
The Cultural Landscape
Carl Sauer was probably the most influential cultural geographer of the twentieth century. Sauer’s work is characterized by a focus on the material landscape tempered with an abiding interest in human ecology, and the damaging impacts of humans on the environment. Additionally, and of equal importance, Sauer worked tirelessly to trace the origins and diffusions of cultural practices such as agriculture, the domestication of animals, and the use of fire.
Although there is no question that Sauer’s contributions to cultural geography are of great worth, some also criticize him for an anti-modern, anti-urban bias. Even so, his efforts to correct the inherent flaws associated with “environmental determinism” significantly strengthened the discipline of geography, and cultural geography in particular.
In 1925, Sauer published The Morphology of Landscape. In this work, he sought to demonstrate that nature does not create culture, but instead, culture working with and on nature, creates ways-of-life. Sauer considered human impacts on the landscape to be a manifestation of culture. Therefore, he argued, in order to understand a culture, a geographer must learn to read the landscape.
Sauer looked at “culture” holistically. Simply put, Sauer regarded “culture” as a way of life. Sauer, however, did not fully develop an explanation of what “culture” is. Instead, he left it to anthropologist Franz Boas to debunk “environmental determinism” and “social Darwinism” and to call for the analysis of cultures on “their” own terms (as opposed to using a hierarchical ranking system). Although Boas’s approach was rooted in “cultural relativism,” he was not necessarily interested in justifying cultural practices. To the contrary, he wanted to eliminate the application of personal biases when studying cultures (as in Mitchell, Don, Cultural Geography: A Critical Introduction).
3.3 Geography of World Languages
Language and religion are two essential cultural characteristics for human geographers to study. Geographers describe the historical and spatial distributions of language and religion across the landscape as a way of understanding cultural identity. Furthermore, when geographers study religion, they are less concerned with theology and more concerned with the diffusion and interaction of religious ideologies across time and space and the imprint they have on the cultural landscape.
Symbols and Language
Humans, consciously and subconsciously, are always striving to make sense of their surrounding world. Symbols – such as gestures, signs, objects, signals, and words – help people understand that world. They provide clues to understanding experiences by conveying recognizable meanings that are shared by societies.
The world is filled with symbols. Sports uniforms, company logos, and traffic signs are symbols. In some cultures, a gold ring is a symbol of marriage. Some symbols are highly functional; stop signs, for instance, provide useful instruction. As physical objects, they belong to material culture, but because they function as symbols, they also convey nonmaterial cultural meanings. Some symbols are valuable only in what they represent. Trophies, blue ribbons, or gold medals, for example, serve no other purpose than to represent accomplishments. However, many objects have both material and nonmaterial symbolic value.
A police officer’s badge and uniform are symbols of authority and law enforcement. The sight of an officer in uniform or a squad car triggers reassurance in some citizens, and annoyance, fear, or anger in others.
It is easy to take symbols for granted. Few people challenge or even think about stick figure signs on the doors of public bathrooms. However, those figures are more than just symbols that tell men and women which bathrooms to use. They also uphold the value, in the United States, that public restrooms should be gender exclusive. Even though stalls are relatively private, most places do not offer unisex bathrooms.
Symbols often get noticed when they are out of context. Used unconventionally, they convey strong messages. A stop sign on the door of a corporation makes a political statement, as does a camouflage military jacket worn in an antiwar protest. Together, the semaphore signals for “N” and “D” represent nuclear disarmament – and form the well-known peace sign (Westcott 2008). Today, some college students have taken to wearing pajamas and bedroom slippers to class, clothing that was formerly associated only with privacy and bedtime. Though students might deny it, the outfit defies traditional cultural norms and makes a statement.
Even the destruction of symbols is symbolic. Effigies representing public figures are burned to demonstrate anger at certain leaders. In 1989, crowds tore down the Berlin Wall, a decades-old symbol of the division between East and West Germany, communism, and capitalism.
While different cultures have varying systems of symbols, one symbol is common to all: language. Language is a symbolic system through which people communicate and through which culture is transmitted. Some languages contain a system of symbols used for written communication, while others rely on only spoken communication and nonverbal actions.
Societies often share a single language, and many languages contain the same essential elements. An alphabet is a written system made of symbolic shapes that refer to spoken sound. Taken together, these symbols convey specific meanings. The English alphabet uses a combination of twenty-six letters to create words; these twenty-six letters make up over 600,000 recognized English words (OED Online 2011).
Rules for speaking and writing vary even within cultures, most notably by region. Do you refer to a can of carbonated liquid as “soda,” “pop,” or “Coke”? Is a household entertainment room a “family room,” “rec room,” or “den”? When leaving a restaurant, do you ask your server for a “check,” the “ticket,” or your “bill”?
Language is continuously evolving as societies create new ideas. In this age of technology, people have adapted almost instantly to new nouns such as “e-mail” and “Internet,” and verbs such as “downloading,” “texting,” and “blogging.” Twenty years ago, the general public would have considered these nonsense words.
Even while it continually evolves, language continues to shape our reality. This insight was established in the 1920s by two linguists, Edward Sapir and Benjamin Whorf. They believed that reality is culturally determined, and that any interpretation of reality is based on a society’s language. To prove this point, they argued that every language has words or expressions specific to that language. In the United States, for example, the number thirteen is associated with bad luck. In Japan, however, the number four is considered unlucky, since it is pronounced similarly to the Japanese word for “death.”
The Sapir-Whorf hypothesis is based on the idea that people experience their world through their language, and that they, therefore, understand their world through the culture embedded in their language. The hypothesis, which has also been called linguistic relativity, states that language shapes thought (Swoyer 2003). Studies have shown, for instance, that unless people have access to the word “ambivalent,” they do not recognize an experience of uncertainty from having conflicting positive and negative feelings about one issue. Essentially, the hypothesis argues that, if a person cannot describe the experience, the person does not have the experience.
In addition to using language, people communicate without words. Nonverbal communication is symbolic, and, as in the case of language, much of it is learned through one’s culture. Some gestures are nearly universal: smiles often represent joy, and crying often represents sadness. Other nonverbal symbols vary across cultural contexts in their meaning. A thumbs-up, for example, indicates positive reinforcement in the United States, whereas, in Russia and Australia, it is an offensive curse (Passero 2002). Other gestures vary in meaning depending on the situation and the person. A wave of the hand can mean many things, depending on how it is done and for whom. It may mean “hello,” “goodbye,” “no, thank you,” or “I am royalty.” Winks convey a variety of messages, including “We have a secret,” “I am only kidding,” or “I am attracted to you.” From a distance, a person can understand the emotional gist of two people in conversation just by watching their body language and facial expressions. Furrowed brows and folded arms indicate a serious topic, possibly an argument. Smiles, with heads lifted and arms open, suggest a lighthearted, friendly chat.
Languages relate to each other in much the same way that family groups (think of a family tree) relate to each other. Language is a system of communication that provides meaning to a group of people through speech. Many languages around the world have a literary tradition: a system of written communication. Most nations have an official language. Most citizens of a nation with an official language speak and write in that language. Additionally, most official or governmental documents, monetary funds, and transportation signs are communicated in the official language. However, some regions, such as the European Union, have 23 official languages.
A language family is a collection of languages related through a common prehistoric ancestral language that makes up the main trunk of the language tree. A language branch is a collection of languages within a family related through a common ancestral language that existed thousands of years ago. Finally, a language group is a collection of languages within a single branch that share a common origin in the relatively recent past and display relatively few differences in grammar and vocabulary.
There are various dialects within any language, and English in the United States is no exception. A dialect is a regional variation of a language, such as English, distinguished by distinctive vocabulary, spelling, and pronunciation. In the United States, there is a dialect difference between southern, northern, and western states. We can all understand each other, but the way we say things may sound accented or “weird” to others. There is also a dialect difference between American English and English spoken in Britain, as well as other parts of the British Commonwealth.
Origins and Diffusions of Language
All modern languages originate from an ancient language. The origin of every language may never be known because many ancient languages existed and changed before the written record. Root words within languages are the best evidence that we have to indicate that languages originated in pre-written history. Root words can also hint at the geographic origin of an ancient language. For example, several languages have similar root words for winter and snow, but not for the ocean. This indicates that the original language originated in an interior location away from the ocean. It was not until people speaking this language migrated toward the ocean that the word ocean was added to the lexicon (a catalog of a language’s words).
There are many layers within the Indo-European language family, but we will focus on the major branches. Though they sound very different, German and English come from the same Germanic branch of the Indo-European language family. The Germanic branch is divided into High German and Low German. Most Germans speak High German, whereas English, Danish, and Flemish are considered subgroups of Low German. The Romance branch originated 2,000 years ago and is derived from Latin. Today, the Romance languages are Spanish, Portuguese, French, and Italian. The Balto-Slavic branch used to be considered one broad language, called Slavic, in the 7th century, but it subdivided into a variety of smaller groups over time. Today the Balto-Slavic branch is composed of the following groups: East Slavic, West Slavic, South Slavic, and Baltic. The Indo-European language branch spoken by most people around the world is Indo-Iranian, with over 100 individual languages.
The origin of Indo-European languages has long been a topic of debate among scholars and scientists. In 2012, a team of evolutionary biologists at the University of Auckland led by Dr. Quentin Atkinson released a study that found all modern Indo-European languages could be traced back to a single root: Anatolian — the language of Anatolia, now modern-day Turkey.
Distribution of Language Families
The next question that must be asked is why languages diffused where they did. Social scientists, specifically linguists and archaeologists, disagree on this issue because some believe that languages are diffused by war and conquest, whereas others believe diffusion occurs by peaceful and symbiotic means such as food and trade. For example, English is spoken by over 2 billion people and is the dominant language in 55 countries. Much of this diffusion has to do with British imperialism. The primary purpose of British imperialism was to appropriate as much foreign territory as possible to use as sources of raw materials. Imperialism involves diffusion of language through both conquest and trade.
The linguistic structure of the Sino-Tibetan language family is very complex and different from that of the Indo-European language family. Unlike European languages, Sino-Tibetan languages are based on hundreds of one-syllable spoken words. The other distinctive characteristic of this family is the way it is written. Rather than the letters used in Indo-European languages, the Chinese language is written using thousands of characters called ideograms, which represent ideas or concepts rather than sounds. The Sino-Tibetan language family exists mainly in China—the most populous nation in the world—and is over 4,000 years old. Of the over 1 billion Chinese citizens, 75 percent speak Mandarin, making it the most widely spoken first language in the world.
There is a large variety of other language families in Eastern and Southeast Asia: Austronesian in Indonesia; Austro-Asiatic, which includes Vietnamese; Tai-Kadai, spoken in Thailand and surrounding countries; and Korean and Japanese. In Southwest Asia (also called the Middle East), there are three dominant language families. The Afro-Asiatic family includes Arabic, spoken by over 200 million people in several countries and the written language of the Muslim holy book, the Quran. Hebrew is another Afro-Asiatic language and is the language of the Torah and Talmud (Jewish sacred texts).
The largest group of the Altaic language family is Turkish. The Turkish language used to be written with Arabic letters, but in 1928 the Turkish government required the use of the Roman alphabet in order to bring the nation’s cultural and economic communications in line with those of its Western European counterparts. Finally, the Uralic language family originated 7,000 years ago near the Ural Mountains in Russia. All European countries speak Indo-European languages except Estonia, Finland, and Hungary, which speak Uralic languages instead.
The countries that make up Africa have a rich and sophisticated array of languages. Africa has thousands of languages that have resulted from 5,000 years of isolation between the various tribes. Just like species that evolve differently over thousands of years of isolation, Africa’s languages have evolved into various tongues. However, there are three major African language families to focus on. The Niger-Congo language family is spoken by 95 percent of the people in sub-Saharan Africa. Within the Niger-Congo family is Swahili, which is the official language of only 800,000 people but is spoken as a secondary language by over 30 million Africans. Only a few million people in Africa speak languages from the Nilo-Saharan language family. The Khoisan language family is spoken by even fewer, but is distinctive because of the “clicking sounds” made when it is spoken.
In a world dominated by communication, globalization, science, and the Internet, English has grown to be the dominant global language. Today English is considered a lingua franca (a language mutually understood and commonly used in trade by people who have different native languages). It is now believed that 500 million people speak English as a second language. There are other lingua francas, such as Swahili in Eastern Africa and Russian in nations that were once part of the Soviet Union.
Pidgins and Creoles
Pidgins, also called contact languages, develop out of contact between at least two groups of people who do not share a common language. A pidgin language is usually a mixture of two or more languages, contains simplified grammar and vocabulary, and is used for communication between groups who speak different languages, usually for trading purposes. Pidgins are not first or native languages and are always learned as a second language. Many pidgins developed during the European colonization of Asia, Africa, and other areas of the world during the seventeenth to nineteenth centuries.
Creole languages are stable languages that develop from pidgins. Different from pidgins, creole languages are primary languages that are nativized by children. Additionally, creoles have their own formal grammar and vocabulary. The grammar of a creole language often has grammatical features that differ from those of both parent languages. However, the vocabulary of a creole is primarily taken from the language of the dominant contact group.
Endangered Languages and Preserving Language Diversity
An isolated language is one that is unrelated to any other language. Thus it cannot be connected to any language family. These remote languages, and many others, are experiencing a mass extinction and are quickly disappearing off the planet. It is believed that nearly 500 languages are in danger of being lost forever. Think about the language you speak, and the knowledge and understanding acquired and discovered through that language. What would happen to all that knowledge if your language suddenly disappeared? Would all of it be transferred to another language, or would major components be lost to time and be rewritten by history? What would happen to your culture if your language was lost to time? Ultimately, is it possible that the Information Age is causing a Dis-information Age as thousands of languages near extinction? Esri has published a story map on endangered languages that illustrates this crisis.
Consider the impact of language on culture, particularly religion. Most religions have some form of written or literary tradition or history, which allows for information to be transferred to future generations. However, some religions are only transferred verbally, and when that culture disappears (which is happening at a frightening rate), so does all of the knowledge and history of that culture.
The Endangered Languages Project serves as an online resource for samples and research on endangered languages, as well as a forum for advice and best practices for those working to strengthen linguistic diversity.
3.4 Geography of World Religions
Origins and diffusion of World Religions
Our world’s cultural geography is very complex, with language and religion being two cultural traits that contribute to the richness, diversity, and complexity of the human experience. Today the word “diversity” is gaining a great deal of attention, as nations around the world become more culturally, religiously, and linguistically complex and interconnected. With regard to religion specifically, these influential cultural institutions are no longer confined to their places of origin but have diffused into other realms and regions, carrying their religious history and cultural dominance with them. In some parts of the world, this has caused religious wars and persecution; in other regions, it has helped initiate cultural tolerance and respect for others.
These trends are, in some ways, the product of a history of migratory push and pull factors, along with demographic changes, that have brought together peoples of diverse religious and even linguistic backgrounds. It is critical that people learn about diverse cultures by understanding important cultural traits, such as the ways we communicate and maintain spiritual beliefs. Geographers need to be aware that even though our discipline might not be able to answer every question related to language structure or address unique aspects of theological opinion, it can provide insight by studying these cultural traits in a spatial context. In essence, geography provides us with the tools to understand the spread of cultural traits and the role of geographic factors, both physical and cultural, in that process. People will then see that geography has influenced the distribution and diffusion of differing ideologies, as well as the diverse ways people practice their spiritual traditions.
As is the case with languages, geographers have a method of classifying religions so people can better understand the geographic diffusion of belief systems. Although religions are themselves complex cultural institutions, the primary method for categorizing them is simple. In essence, there are two main groups: universalizing religions, which actively invite non-members to join them, and ethnic religions, which are associated with particular ethnic or national groups. Everyone can recount moments in his or her life involving interaction with individuals eager to share their spiritual beliefs and traditions with others. That same person might also have encountered individuals who are very private, perhaps secretive, about personal religious traditions they deem exclusive to their family and ethnic or national group. A discussion of these life experiences can generate fascinating examples that serve as testimony to our world’s cultural richness when it comes to different religious traditions.
Origins of World Religions
Each of the world’s largest universalizing religions has a precise hearth, or place of origin, based on events in the life of a key individual, and all of these hearths are in Asia. Of course, not all religions originated in Asia. From their hearths, the three largest universalizing religions—Christianity, Islam, and Buddhism—diffused to other regions of the world; together they have over 2.5 billion adherents. Below are links to websites that analyze the diffusion of Christianity, Islam, and Buddhism.
Religion is often the catalyst of conflict between local values or traditions and the issues and values that come with nationalism or even globalization. Religion tends to represent core beliefs tied to cultural values and identity, which, along with language, often reflect local rather than national or international ideology. There are several reasons why, including:
- Culture is often the manifestation of core belief systems determined by the interplay between language and religion.
- Universal religions try to appeal to the many, whereas ethnic religions focus on the few in a specific region.
- Cultural landscapes of language and religion are often represented in the physical landscape. When opposing forces threaten the physical landscape, they threaten the cultural landscape as well.
- Universal religions require the adoption of values that may conflict with local traditions and values. If a universal religion is forced upon adherents of another universal religion or an ethnic religion, conflict may ensue.
- Migrants tend to learn and assimilate the language of the region they migrate to, but keep the religion of their origin. This can be viewed as a threat by the people of the region to which they move.
Types of World Religions
The major religions of the world (Hinduism, Buddhism, Islam, Confucianism, Christianity, Taoism, and Judaism) differ in many respects, including how each religion is organized and the belief system each upholds. Other differences include the nature of belief in a higher power, the history of how the world and the religion began, and the use of sacred texts and objects.
Religions organize themselves – their institutions, practitioners, and structures – in a variety of fashions. For instance, when the Roman Catholic Church emerged, it borrowed many of its organizational principles from the ancient Roman military and turned senators into cardinals, for example. Human geographers and sociologists use different terms, like ecclesia, denomination, and sect, to define these types of organizations. Scholars are also aware that these definitions are not static. Most religions transition through different organizational phases. For example, Christianity began as a cult, transformed into a sect, and today exists as an ecclesia.
Cults, like sects, are new religious groups. In the United States today this term often carries pejorative connotations. However, almost all religions began as cults and gradually progressed to levels of greater size and organization. The term cult is sometimes used interchangeably with the term new religious movement (NRM). In its pejorative use, these groups are often disparaged as being secretive, highly controlling of members’ lives, and dominated by a single, charismatic leader.
A sect is a small and relatively new group. Most of the well-known Christian denominations in the United States today began as sects. For example, the Methodists and Baptists protested against their parent Anglican Church in England, just as Henry VIII protested against the Catholic Church by forming the Anglican Church. From “protest” comes the term Protestant.
Occasionally, a sect is a breakaway group that may be in tension with the larger society. They sometimes claim to be returning to “the fundamentals” or to contest the veracity of a particular doctrine. When membership in a sect increases over time, it may grow into a denomination. Often a sect begins as an offshoot of a denomination, when a group of members believes they should separate from the larger group.
Some sects last without growing into denominations. Social scientists call these established sects. Established sects, such as the Amish or Jehovah’s Witnesses, fall halfway between sect and denomination on the ecclesia–cult continuum because they have a mixture of sect-like and denomination-like characteristics.
A denomination is a large, mainstream religious organization, but it does not claim to be official or state-sponsored. It is one religion among many. For example, Baptist, African Methodist Episcopal, Catholic, and Seventh-day Adventist are all Christian denominations.
The term ecclesia, initially referring to a political assembly of citizens in ancient Athens, Greece, now refers to a congregation. In geography, the term is used to refer to a religious group that most, if not all, members of a society belong to. It is considered a nationally recognized, or official, religion that holds a religious monopoly and is closely allied with state and secular powers. The United States does not have an ecclesia by this standard; in fact, this is the type of religious organization that many of the first colonists came to America to escape.
One way to remember these religious organizational terms is to think of cults, sects, denominations, and ecclesia representing a continuum, with increasing influence on society, where cults are least influential, and ecclesia are most influential.
Scholars from a variety of disciplines have strived to classify religions. One widely accepted categorization that helps people understand different belief systems considers what or whom people worship (if anything). Using this method of classification, religions might fall into one of these basic categories.
Note that some religions may be practiced, or understood, in various categories. For instance, to some scholars the Christian notion of the Holy Trinity (God, Jesus, Holy Spirit) defies the definition of monotheism, a religion based on belief in a single deity. Similarly, many Westerners view the multiple manifestations of Hinduism’s godhead as polytheism, a religion based on belief in multiple deities, while Hindus might describe those manifestations as a monotheistic parallel to the Christian Trinity. Some Japanese practice Shinto, which follows animism, a religion that believes in the divinity of nonhuman beings, such as animals, plants, and objects of the natural world, while people who practice totemism believe in a divine connection between humans and other natural beings.
It is also important to note that every society also has nonbelievers, such as atheists, who do not believe in a divine being or entity, and agnostics, who hold that ultimate reality (such as God) is unknowable. While typically not an organized group, atheists and agnostics represent a significant portion of the population. It is essential to recognize that being a nonbeliever in a divine entity does not mean the individual subscribes to no morality. Indeed, many Nobel Peace Prize winners and other great humanitarians over the centuries would have classified themselves as atheists or agnostics.
Religions have emerged and developed across the world. Some have been short-lived, while others have persisted and grown. In this section, we will explore seven of the world’s major religions.
The oldest religion in the world, Hinduism originated in the Indus River Valley about 4,500 years ago in what is now modern-day northwest India and Pakistan. It arose contemporaneously with ancient Egyptian and Mesopotamian cultures. With roughly one billion followers, Hinduism is the third-largest of the world’s religions. Hindus believe in a divine power that can manifest as different entities. Three main incarnations—Brahma, Vishnu, and Shiva—are sometimes compared to the manifestations of the divine in the Christian Trinity.
Multiple sacred texts, collectively called the Vedas, contain hymns and rituals from ancient India and are mostly written in Sanskrit. Hindus generally believe in a set of principles called dharma, which refers to one’s duty in the world that corresponds with “right” actions. Hindus also believe in karma, or the notion that spiritual ramifications of one’s actions are balanced cyclically in this life or a future life (reincarnation).
Buddhism was founded by Siddhartha Gautama around 500 B.C.E. Siddhartha was said to have given up a comfortable, upper-class life to follow one of poverty and spiritual devotion. At the age of thirty-five, he famously meditated under a sacred fig tree and vowed not to rise before he achieved enlightenment (bodhi). After this experience, he became known as Buddha, or “enlightened one.” Followers were drawn to Buddha’s teachings and the practice of meditation, and he later established a monastic order.
Buddha’s teachings encourage Buddhists to lead a moral life by accepting the four Noble Truths: 1) life is suffering, 2) suffering arises from attachment to desires, 3) suffering ceases when attachment to desires ceases, and 4) freedom from suffering is possible by following the “middle way.” The concept of the “middle way” is central to Buddhist thinking, which encourages people to live in the present and to practice acceptance of others (Smith 1991). Buddhism also tends to deemphasize the role of a godhead, instead stressing the importance of personal responsibility (Craig 2002).
Confucianism was the official religion of China from 200 B.C.E. until 1949, when the new communist leadership discouraged its practice. The religion was developed by Kung Fu-Tzu (Confucius), who lived in the sixth and fifth centuries B.C.E. An extraordinary teacher, his lessons—which were about self-discipline, respect for authority and tradition, and jen (the kind treatment of every person)—were collected in a book called the Analects.
Some religious scholars consider Confucianism more of a social system than a religion because it focuses on sharing wisdom about moral practices but does not involve any specific worship; nor does it have formal objects. Its teachings were developed in the context of problems of social anarchy and a near-complete deterioration of social cohesion. Dissatisfied with the social solutions put forth, Kung Fu-Tzu developed his model of religious morality to help guide society (Smith 1991).
In Taoism, the purpose of life is inner peace and harmony. Tao is usually translated as “way” or “path.” The founder of the religion is generally recognized to be a man named Laozi, who lived sometime in the sixth century B.C.E. in China. Taoist beliefs emphasize the virtues of compassion and moderation.
The central concept of tao can be understood to describe a spiritual reality, the order of the universe, or the way of human life in harmony with the former two. The yin-yang symbol and the concept of polar forces are central Taoist ideas (Smith 1991). Some scholars have compared this Chinese tradition to its Confucian counterpart by saying that “whereas Confucianism is concerned with day-to-day rules of conduct, Taoism is concerned with a more spiritual level of being” (Feng and English 1972).
After their Exodus from Egypt in the thirteenth century B.C.E., Jews, a nomadic society, became monotheistic, worshipping only one God. The Jews’ covenant, or promise of a special relationship with Yahweh (God), is an essential element of Judaism, and their sacred text is the Torah, which Christians also follow as the first five books of the Bible. Talmud refers to a collection of sacred Jewish oral interpretation of the Torah. Jews emphasize moral behavior and action in this world as opposed to beliefs or personal salvation in the next world.
Probably one of the most misunderstood religions in the world is Islam. Though predominantly centered in the Middle East and North Africa, Islam is the fastest-growing religion in the world, with 1.3 billion adherents, second only to Christianity in membership. Islam is divided into two major branches: Sunni and Shiite. The Sunni branch is the largest, comprising 83 percent of all Muslims. The Shiite branch is concentrated in clusters such as Iran, Iraq, and Pakistan.
Islam is a monotheistic religion that follows the teachings of the prophet Muhammad, born in Mecca, in present-day Saudi Arabia, in 570 C.E. Muhammad is seen only as a prophet, not as a divine being, and he is believed to be the messenger of Allah (God), who is divine. The followers of Islam, whose U.S. population is projected to double in the next twenty years (Pew Research Forum 2011), are called Muslims.
Islam means “peace” and “submission.” The sacred text for Muslims is the Qur’an (or Koran). As with Christianity’s Old Testament, many of the Qur’an stories are shared with the Jewish faith. Divisions exist within Islam, but all Muslims are guided by five beliefs or practices, often called “pillars”: 1) Allah is the only god, and Muhammad is his prophet, 2) daily prayer, 3) helping those in poverty, 4) fasting as a spiritual practice, and 5) pilgrimage to the holy center of Mecca.
In Western nations, the primary loyalty of the population is to the state. In the Islamic world, however, loyalty to a nation-state is trumped by dedication to religion and loyalty to one’s family, extended family, tribal group, and culture. In regions dominated by Islam, tribalism and religion play determining roles in the operation of social, economic, cultural, and political systems. As a result, the nation states within the Islamic civilization are weak and generally ineffectual. Instead of nationalism, Muslims are far more interested in identifying with “ummah,” (Islamic civilization).
Furthermore, despite the lack of a core Islamic state, the leaders of the many Muslim nations created the Organization of the Islamic Conference in 1969 in order to foster a sense of solidarity between Muslim states. Almost all nations with large Muslim populations are now members of the organization. Additionally, some of the more powerful Muslim states have sponsored the World Muslim Conference and the Muslim League to bring Muslims together in a unified bloc.
It is instructive to notice that the concept of ummah rests on the notion that nation-states are the illegitimate children of Western Civilization, designed to further Western interests at the expense of others. Currently, Islamic Civilization has no identifiable core state, but nations such as Iran, Turkey, and Saudi Arabia could assume that role in the future.
It is common for Americans to suggest that they do not have a problem with Islam; only Islamic extremists. Huntington, however, argues that the lessons of history demonstrate the opposite. In fact, over the last fourteen hundred years, Christians and Muslims have almost always had stormy relations. After Muslims were able to take control of North Africa, Iberia, the Middle East, Persia, and Northern India in the seventh and eighth centuries, relatively peaceful boundaries between Islam and Christendom existed for about two hundred years. In 1095, however, Christian rulers launched the Crusades to regain control of the “Holy Land.” Despite some successes, they were eventually defeated in 1291. Not long after this, the Ottoman Empire spread Islam into Byzantium, North Africa, the Balkans, and other parts of Europe. The Ottomans eventually besieged Vienna, and for many years, Europe was under constant threat from Islamic forces. In the fifteenth century, Christians were able to regain control of Iberia, and the Russians were able to bring an end to Tatar rule. In 1683, the Ottomans again attacked Vienna but were defeated, and from that time on, the people of the Balkans sought to rid themselves of Ottoman rule. By the beginning of World War I, the Ottoman Empire was referred to as the “sick man of Europe.” By 1920, only four Islamic countries (Turkey, Saudi Arabia, Iran, and Afghanistan) were free of non-Muslim rule.
As Western colonialism began to wane in the twentieth century, the populations of about forty-five independent states were solidly Muslim. The independence of these Muslim nations was accompanied by a great deal of violence: fifty percent of the wars that occurred between 1820 and 1929 involved battles between Muslims and Christians. The conflicts were primarily products of two very different points of view. Whereas Christians believe in the separation of Church and State (God and Caesar), Muslims view religion and politics as one and the same. Additionally, both Christians and Muslims hold a universalistic view. Each believes that it is the one “true faith,” and both (to one extent or another) believe that they should convert others to their faith.
In addition to the importance of the religious foundations of the Western and Islamic Civilizations, practical, real-world factors also play important roles. For example, Muslim population growth has created large numbers of unemployed, angry youth who have been regularly recruited to Islamic causes. Furthermore, the resurgence of Islam has provided Muslims with confidence in the worth of their civilization relative to the West. Western policies and actions over the last century have also played a significant role in cracking the fault line between Islam and Christendom. From the Islamic point of view, the West (particularly the United States) has meddled in the internal affairs of the Islamic world far too often, and for far too long.
Huntington is convinced the Western and Islamic Civilizations are in for many years, perhaps more than a century, of conflict and tension. He points out that Muslims are growing increasingly anti-Western while at the same time, people in the Western Civilization are increasingly concerned about the intentions (and excesses) of modern Islamic states such as Iran. Europeans express a growing fear of (and impatience with) fundamentalist Muslims who threaten them with terrorist attacks. They are also growing weary of Islamic immigrants who refuse to adhere to European traditions, and in some cases, laws.
Huntington does not mince words. He boldly states, “…the underlying problem for the West is not Islamic fundamentalism. It is Islam; a different civilization whose people are convinced of the superiority of their culture, and are obsessed with the inferiority of their power.” He goes on to add, “…the problem for Islam is not the CIA or the U.S. Department of Defense. It is the West; a different civilization whose people are convinced of the universality of their culture, and believe that their superior, if declining, power imposes on them an obligation to extend that culture throughout the world.” From Huntington’s perspective, these differences will fuel conflict between Western and Islamic cultures for many years to come.
Many Western leaders do not agree with Huntington’s view. Instead, they argue that Americans need not fear Islam; only radical Islam. They point to the millions of Muslims living throughout the world in peace with their non-Muslim neighbors. If, they reason, Islam were indeed a religion of war and conquest, why do millions of Muslims lead peaceful lives? Instead of applying a negative stereotype to all Muslims, they believe our national security would be better served by making a greater effort to understand the motivations and goals of radical fundamentalists. In a sense, they are calling for in-depth cultural studies that will lead to accurate cultural intelligence about the nature of Islamic terrorists; to those who disagree with Huntington, simply branding all Muslims as potential terrorists is simplistic and dangerous.
Today the largest religion in the world, Christianity began 2,000 years ago in Palestine, with Jesus of Nazareth, a charismatic leader who taught his followers about caritas (charity) or treating others as you would like to be treated yourself.
The sacred text for Christians is the Bible. While Jews, Christians, and Muslims share many of the same historical religious stories, their beliefs diverge. In their shared sacred stories, it is suggested that the son of God, a messiah, will return to save God’s followers. While Christians believe that he already appeared in the person of Jesus Christ, Jews and Muslims disagree. Although they recognize Christ as a prominent historical figure, their traditions do not hold that he is the son of God, and their faiths see the prophecy of the Messiah’s arrival as not yet fulfilled.
Different Christian groups have variations among their sacred texts. For instance, Mormons, an established Christian sect, also use the Book of Mormon, which they believe details other parts of Christian doctrine and Jesus’ life that are not included in the Bible. Similarly, the Catholic Bible includes the Apocrypha, a collection that, while part of the 1611 King James translation, is no longer included in Protestant versions of the Bible. Although monotheistic, Christians often describe their god through three manifestations that they call the Holy Trinity: the father (God), the son (Jesus), and the Holy Spirit. The Holy Spirit is a term Christians often use to describe the religious experience, or how they feel the presence of the sacred in their lives. One foundation of Christian doctrine is the Ten Commandments, which decry acts considered sinful, including theft, murder, and adultery.
Holy Religious Places
Places that contributed to the foundation and development of a faith frequently gain sacred status, whether through the presence of a natural site ascribed as holy, as the stage for miraculous events, or through a historical event such as the erection of a temple. When a place gains that “sacred” reputation, it is not unusual to see people from different parts of the world travel, or make a pilgrimage, to the site in the hope of experiencing spiritual and physical renewal.
Buddhists recognize eight holy sites because of their special meaning tied to essential events in the Buddha’s life. The first is Lumbini, Nepal, where the Buddha was born around 563 B.C.E. The second is Bodh Gaya, India, where it is believed Siddhartha reached enlightenment and became the Buddha. The third is Sarnath, India, where he gave his first sermon. The fourth is Kusinagara, India, where the Buddha died at the age of 80 and attained nirvana. The other four holy sites are where the Buddha performed or experienced specific miracles. People who practice Buddhism or Shintoism erect pagodas to house relics and sacred texts; pagodas are also used for individual prayer and meditation.
Islam’s holiest sites are located in Saudi Arabia. The holiest city is Mecca, where the Prophet Muhammad was born. It is also the location of the religion’s holiest object, the Ka’ba, a cube-like structure believed to have been built by Abraham and Ishmael. The second holiest site is Medina, Saudi Arabia, where Muhammad began his leadership and gained his initial support. Every healthy and financially able Muslim is expected to make at least one pilgrimage to Mecca in his or her lifetime. For Muslims, a mosque is a holy site of worship, but also a place for community assembly. Mosques are usually organized around a courtyard, and the pulpit faces Mecca so that all Muslims pray toward their holiest site. Mosques often have a tower called a minaret, from which someone summons people to worship.
Derived from a word meaning lord, master, or power, a Christian church is a place of gathering and worship. Compared with the places of worship of other religions, churches play an especially important role because they are built to express religious values and principles. Churches are also prominent in the landscape; in earlier eras and in smaller towns, they tended to be the most significant buildings. Because of their importance, Christian denominations devote considerable money and commitment to building and maintaining their churches.