Moral psychology

Moral psychology is a field of study in both philosophy and psychology. Historically, the term "moral psychology" was used relatively narrowly to refer to the study of moral development.[1] More recently, however, the term has come to refer more broadly to various topics at the intersection of ethics, psychology, and philosophy of mind.[2][3][4] Some of the main topics of the field are moral judgment, moral reasoning, moral sensitivity, moral responsibility, moral motivation, moral identity, moral action, moral development, moral diversity, moral character (especially as related to virtue ethics), altruism, psychological egoism, moral luck, moral forecasting, moral emotion, affective forecasting, and moral disagreement.[5][6]

Psychologists who have worked in the field include Jean Piaget, Lawrence Kohlberg, Carol Gilligan, Elliot Turiel, Jonathan Haidt, Linda Skitka, Leland Saunders, Marc Hauser, C. Daniel Batson, Jean Decety, Joshua Greene, A. Peter McGraw, Philip Tetlock, Darcia Narvaez, Tobias Krettenauer, Aner Govrin, Liane Young, Daniel Hart, Suzanne Fegley, and Fiery Cushman. Philosophers who have worked in the field include Stephen Stich, John Doris, Joshua Knobe, John Mikhail, Shaun Nichols, Thomas Nagel, Robert C. Roberts, Jesse Prinz, Michael Smith, and R. Jay Wallace.

History

Moral psychology began with early philosophers such as Aristotle, Plato, and Socrates. They believed that "to know the good is to do the good", and they analyzed the ways in which people make decisions with regard to moral identity. Empirical studies of moral judgment go back at least as far as the 1890s with the work of Frank Chapman Sharp,[7] coinciding with the development of psychology as a discipline separate from philosophy. Since at least 1894, philosophers and psychologists have attempted to evaluate the morality of an individual empirically, especially attempting to distinguish adults from children in terms of their judgment, but these early efforts failed because they "attempted to quantify how much morality an individual had—a notably contentious idea—rather than understand the individual's psychological representation of morality".[8]

As psychology separated from philosophy, moral psychology expanded to include risk perception and moralization, morality with regard to medical practices, concepts of self-worth, and the role of emotions in one's moral identity. In most introductory psychology courses, students learn about moral psychology by studying the work of Lawrence Kohlberg,[9][10][11] who introduced his theory of moral development in 1969. The theory built on Piaget's observation that children develop intuitions about justice that they can only later articulate. Kohlberg proposed six stages, grouped into three categories of moral reasoning, which he believed to be universal to all people in all cultures.[12] Increasing sophistication in articulating one's reasoning is a sign of development. In this view, justice-centered moral cognition guides moral action and increases with development, culminating in a postconventional thinker who can "do no other" than what is reasoned to be the most moral action. But researchers using the Kohlberg model found a gap between what people said was most moral and the actions they took. Today, some psychologists and students[where?] rely instead on Augusto Blasi's self-model,[citation needed] which links moral judgment and action through moral commitment. Those with moral goals central to the self-concept are more likely to take moral action, as they feel a greater obligation to do so. Those who are so motivated attain a unique moral identity.[13]

Today, moral psychology is a thriving area of research spanning many disciplines,[14] with major bodies of research on the biological,[15][16] cognitive/computational[17][18][19] and cultural[20][21] basis of moral judgment and behavior, and a growing body of research on moral judgment in the context of artificial intelligence.[22][23]

Methods

The trolley problem, a commonly used moral dilemma in psychological research

Philosophers, psychologists and researchers from other fields have created various methods for studying topics in moral psychology. These include moral dilemmas such as the trolley problem, structured interviews and surveys as a means to study moral psychology and its development, as well as the use of economic games,[24] neuroimaging,[25] and studies of natural language use.[26]

Interview techniques

In 1963, Lawrence Kohlberg presented an approach to studying differences in moral judgment by modeling evaluative diversity as reflecting a series of developmental stages (à la Jean Piaget). His stages of moral development are:[27]

  1. Obedience and punishment orientation
  2. Self-interest orientation
  3. Interpersonal accord and conformity
  4. Authority and social-order maintaining orientation
  5. Social contract orientation
  6. Universal ethical principles

In practice, stages 1 and 2 are combined into a single stage labeled "pre-conventional", and stages 5 and 6 into a single stage labeled "post-conventional"; psychologists can consistently categorize subjects into the resulting four stages using the "Moral Judgement Interview", which asks subjects why they endorse the answers they give to a standard set of moral dilemmas.[10]

In 1999, some of Kohlberg's measures were put to the test when Anne Colby and William Damon published a study examining moral development in the lives of moral exemplars who exhibited high levels of moral commitment in their everyday behavior.[citation needed] The researchers used the Moral Judgement Interview (MJI) and two standard dilemmas to compare the 23 exemplars with a more ordinary group of people. The intention was to learn more about moral exemplars and to examine the strengths and weaknesses of the Kohlberg measure. They found that the MJI scores were not clustered at the high end of Kohlberg's scale; they ranged from stage 3 to stage 5. Half landed at the conventional level (stages 3, 3/4, and 4) and the other half at the postconventional level (stages 4/5 and 5). The exemplars' scores may have been somewhat higher than those of groups not selected for outstanding moral behaviour, and the researchers noted that "moral judgement scores are clearly related to subjects' educational attainment in this study". Among participants who had attained a college education or above, there was no difference in moral judgement scores between genders. The study concluded that although the exemplars' scores may have been higher than those of non-exemplars, one is not required to score at Kohlberg's highest stages in order to exhibit high degrees of moral commitment and exemplary behaviour.[10] Apart from their scores, the 23 participating moral exemplars described three similar themes within their moral development: certainty, positivity, and the unity of self and moral goals. The unity between self and moral goals was highlighted as the most important theme, as it is what truly sets the exemplars apart from 'ordinary' people: the moral exemplars see their morality as a part of their sense of identity and sense of self, not as a conscious choice or chore. The moral exemplars also showed a much broader range of moral concern than the ordinary people and went beyond the normal acts of daily moral engagement.

Rather than confirm the existence of a single highest stage, Larry Walker's cluster analysis of a wide variety of interview and survey variables for moral exemplars found three types: the "caring" or "communal" cluster was strongly relational and generative, the "deliberative" cluster had sophisticated epistemic and moral reasoning, and the "brave" or "ordinary" cluster was less distinguished by personality.[28]

Survey instruments

Between 1910 and 1930, in the United States and Europe, several morality tests were developed to classify subjects as fit or unfit to make moral judgments.[8][29] Test-takers would classify or rank standardized lists of personality traits, hypothetical actions, or pictures of hypothetical scenes. As early as 1926, catalogs of personality tests included sections specifically for morality tests, though critics persuasively argued that they merely measured awareness of social expectations.[30][page needed]

Meanwhile, Kohlberg inspired a new wave of morality tests. The Defining Issues Test (dubbed "Neo-Kohlbergian" by its proponents) scores relative preference for post-conventional justifications,[31] and the Moral Judgment Test scores the consistency of one's preferred justifications.[32] Both treat evaluative ability as similar to IQ (hence the single score), allowing categorization by high score versus low score.
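
The published instruments have their own materials and scoring procedures, but the general idea of scoring relative preference for post-conventional justifications can be illustrated with a minimal sketch. The stage tags, ranking weights, and example respondent below are invented for illustration and are not the actual Defining Issues Test:

    # Minimal, hypothetical sketch of a DIT-style "relative preference" score.
    # Each ranked justification item is tagged with the Kohlberg-style stage it
    # is meant to represent; the score is the share of ranking weight given to
    # post-conventional (stage 5-6) items, yielding a single 0-100 figure.

    def postconventional_preference(ranked_items):
        """ranked_items: list of (stage, rank_weight) pairs, where the
        respondent's top-ranked item carries the largest weight."""
        total = sum(weight for _, weight in ranked_items)
        post = sum(weight for stage, weight in ranked_items if stage >= 5)
        return 100.0 * post / total if total else 0.0

    # Hypothetical respondent: stages of their four top-ranked items, weighted 4..1
    example = [(5, 4), (4, 3), (6, 2), (3, 1)]
    print(postconventional_preference(example))  # 60.0 -> mostly post-conventional

As with the real instruments, a single number of this kind is what allows respondents to be sorted into high scorers and low scorers.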

The Moral Foundations Questionnaire is based on moral intuitions consistent across cultures: care/harm, fairness/cheating, loyalty/betrayal, authority/subversion, and sanctity/degradation. The questions ask respondents to rate various considerations in terms of how relevant they are to the respondent's moral judgments. The purpose of the questionnaire is to measure the degree to which people rely upon each of the five moral intuitions (which may coexist). The first two foundations cluster together with liberal political orientation and the latter three cluster with conservative political orientation.[33][34]
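
As a rough illustration of this kind of scoring, the sketch below averages relevance ratings into one score per foundation. The item names, foundation assignments, and rating scale are invented for illustration and are not the published questionnaire's items or scoring key:

    # Hypothetical sketch of aggregating MFQ-style relevance ratings (0 = not at
    # all relevant, 5 = extremely relevant) into a mean score per foundation.
    from statistics import mean

    ITEM_FOUNDATION = {  # invented item-to-foundation assignments
        "suffering": "care", "cruelty": "care",
        "unfair_treatment": "fairness", "denied_rights": "fairness",
        "betrayed_group": "loyalty", "love_of_country": "loyalty",
        "respect_authority": "authority", "caused_chaos": "authority",
        "disgusting_act": "sanctity", "violated_purity": "sanctity",
    }

    def foundation_scores(ratings):
        """ratings: dict mapping item name -> relevance rating (0-5)."""
        by_foundation = {}
        for item, rating in ratings.items():
            by_foundation.setdefault(ITEM_FOUNDATION[item], []).append(rating)
        return {f: mean(vals) for f, vals in by_foundation.items()}

    # Hypothetical respondent who emphasizes care and fairness over the rest
    print(foundation_scores({
        "suffering": 5, "cruelty": 4, "unfair_treatment": 5, "denied_rights": 4,
        "betrayed_group": 2, "love_of_country": 1, "respect_authority": 2,
        "caused_chaos": 1, "disgusting_act": 1, "violated_purity": 2,
    }))  # e.g. {'care': 4.5, 'fairness': 4.5, 'loyalty': 1.5, ...}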

Evolutionary origins

A substantial amount of research in recent decades has focused on the evolutionary origins of various aspects of morality.[35][36][37][38] In Unto Others: the Evolution and Psychology of Unselfish Behavior (1998), Elliott Sober and David Sloan Wilson demonstrated that diverse moralities could evolve through group selection. In particular, they dismantled the idea that natural selection will favor a homogeneous population in which all creatures care only about their own personal welfare and/or behave only in ways which advance their own personal reproduction.[39] Tim Dean has advanced the more general claim that moral diversity would evolve through frequency-dependent selection because each moral approach is vulnerable to a different set of situations which threatened our ancestors.[40]

Theories

Moral identity

Empirical studies show that reasoning and emotion only moderately predict moral action. Scholars such as Blasi began proposing identity as a source of moral motivation.[41] Blasi proposed the self model of moral functioning, which describes the effects on moral action of the judgment of responsibility to perform a moral action, one's sense of moral identity, and the desire for self-consistency. Blasi also elaborates on the structure of identity and its connection to morality. According to Blasi, identity has two aspects. One focuses on the specific contents that make up the self (objective identity content), which include moral ideals. The second refers to the ways in which identity is subjectively experienced (subjective identity experience). As the subjective side of identity matures, the objective side tends to lean towards internal contents such as values, beliefs, and goals, rather than external contents such as physical aspects, behaviors, and relationships. A mature subjective identity yearns for a greater sense of self-consistency; identity therefore serves as a motivation for moral action. Studies of moral exemplars have shown that exemplary moral action often results from the intertwining of personal goals and desires with moral goals, and studies of moral behavior also show a correlation between moral identity and action. S. Hardy and G. Carlo raise critical questions about Blasi's model as well, and propose that researchers should seek to better operationalize and measure moral identity and to apply the findings to moral education and intervention programs.[42]

Anne Colby and William Damon suggest that one's moral identity is formed through the synchronization of one's personal and moral goals. This unity of self and morality is what distinguishes moral exemplars from non-exemplars and makes them exceptional.[43] Colby and Damon studied moral identity through the narratives of the civil rights activist Virginia Foster Durr and of Suzie Valadez, who provided services for the poor; both women's behavior, actions, and life's work were considered morally exemplary by their communities and by those with whom they came into contact. Common characteristics of these moral exemplars are certainty, positivity (e.g. enjoyment of work and optimism), and unity of self and moral goals.[44] The research suggests that a "transformation of goals" takes place during the evolution of one's moral identity and development; it is therefore not an exercise in self-sacrifice but one undertaken with great joy, since moral exemplars see their personal goals and moral goals as synonymous. This transformation is not always deliberate and is most often gradual, but it can also be rapidly set off by a triggering event.[45] Triggering events can be anything from a powerful moment in a movie to a traumatic life event, or, as in the case of Suzie Valadez, the perception of a vision from God. In many of the moral exemplars interviewed, the triggering events and goal transformation did not take place until their 40s. Moral exemplars are said to have the same concerns and commitments as other moral people but to a greater degree, "extensions in scope, intensity and breadth".[46] Furthermore, exemplars possess an openness to new ideas and experiences, an "active receptiveness"[47] to things outside themselves.

Daniel Hart conducted a study of how adolescents who engaged in exemplary levels of prosocial behavior viewed themselves. To study self-concept empirically, he used four different conceptual models of the self: Self-Concept as Content, Self-Concept as Semantic Space, Self-Concept as Hierarchy of Selves, and Self-Concept as Theory. The findings suggested that adolescent caring exemplars formulated their self-concept differently from comparable peers. In the hierarchy-of-selves model, exemplars were shown to incorporate their "ideal self" into their "actual self". The exemplar group also showed more incorporation of parental representations into the "actual self" and, conversely, less incorporation of representations of their best friend or of the self expected by the best friend. It is theorized that this is because adolescents are less likely to pick a best friend who is a "goody-goody" and deeply involved in service, and because exemplars may have to set aside peer expectations in order to engage in service. In the self-concept-as-theory model, exemplars were most commonly at level 4, a level of self-theory rarely reached by adolescents. They were also more likely to emphasize academic goals and typical moral activities. There were no significant differences between the exemplars and the control group in moral knowledge. In the semantic space analysis, the moral exemplars tended to view their actual self as more integrated with their ideal and expected selves.[48]

David Wong proposes that we think of a culture as analogous to a conversation: people with different beliefs, values, and norms can voice their opinions loudly or quietly, and over time these factors can change. A moral culture provides its members with a kind of "language" that leaves plenty of room for different "dialects", allowing moral identities to be established and voiced. Opposing ideas can create conflict both with those who are close to us, such as family and friends, and with strangers, raising the stakes when deciding on the best course of action where either party will be affected. In essence, Wong's theory holds that defining our morality ultimately comes down to acceptance and accommodation within and between the world's cultures. He also believes that the concept of culture as conversation helps reduce the problem of boundaries between cultures, reconciles autonomy with the cultural aspect of moral identity, and calls into question received understandings of a healthy and well-developed moral identity.[49]

According to Blasi's theory of moral character, moral character is identified by a person's set of virtues and vices. He theorized that willpower, moral desire, and integrity enable a person to act morally, and that the virtues are hierarchically ordered: the "highest" and most complex virtues are expressed through willpower, while the "lowest" and simplest are expressed through integrity. He essentially stated that to have the lower virtues, one must have one or more of the higher virtues. The end goals of moral identity development are to establish and act upon core goals and to use one's strengths to make a difference.[50]

Moral self

A "moral self" is fostered by mutually responsive parenting in childhood. Children with responsive parents develop more empathy, prosociality, a moral self and conscience.[51] Darcia Narvaez describes the neurobiological and social elements of early experience and their effects on moral capacities.[52]

The moral self results when people integrate moral values into their self-concept.[53] Research on the moral self has mostly focused on adolescence as a critical period for the integration of self and morality[54] (i.e. self and morality are traditionally seen as separate constructs that become integrated in adolescence).[55] However, the moral self may be established as early as age 2–3 years.[56][57] In fact, children as young as 5 years old are able to consistently identify themselves as having certain moral behavioral preferences.[58] Children's moral self also becomes increasingly predictive of moral emotions with age.[58]

Moral values

Kristiansen and Hotte[citation needed] review many research articles regarding people's values and attitudes and whether these guide behavior. Drawing on the research they reviewed and their own extension of Ajzen and Fishbein's theory of reasoned action, they conclude that the value-attitude-behavior relation depends on the individual and on their moral reasoning. They also point out that there are good values and bad values. Good values guide our attitudes and behaviors and allow us to express and define ourselves, and they involve knowing when values are appropriate to the situation or person at hand. Bad values, by contrast, are relied on so heavily that they make one unresponsive to the needs and perspectives of others.

Another finding of Kristiansen and Hotte's research was that individuals tend to "create" values to justify their reactions to certain situations, which they called the "value justification hypothesis".[citation needed] The authors use an example from the feminist Susan Faludi's journal entry of how, during the period when women were fighting for the right to vote, a New Right group appealed to society's ideals of "traditional family values" as an argument against the new law, in order to mask its "anger at women's rising independence." Their theory is comparable to Jonathan Haidt's social intuitionist model, in which individuals justify their intuitive emotions and actions through post-hoc reasoning.

Kristiansen and Hotte also found that independent selves had actions and behaviors influenced by their own thoughts and feelings, whereas interdependent selves had actions, behaviors, and self-concepts based on the thoughts and feelings of others. Westerners have two dimensions of emotions, activation and pleasantness; the Japanese have a third, the range of their interdependent relationships. Markus and Kitayama found that these two different types of values are associated with different motives: Westerners' explanations show self-enhancing biases, whereas Easterners tend to show "other-oriented" biases.[59]

Psychologist S. H. Schwartz defines individual values as "conceptions of the desirable that guide the way social actors (e.g. organisational leaders, policymakers, individual persons) select actions, evaluate people and events, and explain their actions and evaluations."[60] Cultural values form the basis for social norms, laws, customs, and practices. While individual values vary from case to case (a result of unique life experience), the average of these values points to widely held cultural beliefs (a result of shared cultural values).

Moral virtues

Piaget and Kohlberg both developed stage theories to understand the timing and meaning of moral decisions. They were interested in placing people into moral categories or stages of development rather than in identifying how each individual's views and behaviors are affected by their background and personality. In 2004, D. Lapsley and D. Narvaez outlined how social cognition explains aspects of moral functioning.[61] Their social-cognitive approach to personality identifies six critical resources of moral personality: cognition, self-processes, affective elements of personality, changing social context, lawful situational variability, and the integration of other literature. Lapsley and Narvaez suggest that moral values and actions stem from more than our virtues and are controlled by a set of self-created schemas (cognitive structures that organize related concepts and integrate past events). They claim that schemas are "fundamental to our very ability to notice dilemmas as we appraise the moral landscape" and that over time, people develop greater "moral expertise".[62]

Triune ethics theory

The triune ethics meta-theory (TEM) has been proposed by Darcia Narvaez as a metatheory that highlights the relative contributions to moral development of biological inheritance (including human evolutionary adaptations), environmental influences on neurobiology, and the role of culture.[63] TEM proposes three basic mindsets that shape ethical behavior: self-protectionism (in several varieties), engagement, and imagination (in several varieties, fueled by either protectionism or engagement). A mindset influences perception, affordances, and rhetorical preferences. Actions taken within a mindset become an ethic when they trump other values. Engagement and communal imagination represent optimal human functioning, shaped by the evolved developmental niche (evolved nest) that supports optimal psychosocial neurobiological development.[64] Based on worldwide anthropological research (e.g., Hewlett and Lamb's Hunter-Gatherer Childhoods), Narvaez uses small-band hunter-gatherers as a baseline for the evolved nest and its effects. Empirical support for the link between early experience and the development of triune ethics dispositions ("moral temperament") is accumulating from studies of young children and adults.[citation needed]

Moral reasoning and development

Moral development and reasoning are two overlapping topics of study in moral psychology that have historically received a great amount of attention. Moral reasoning refers specifically to the study of how people think about right and wrong and how they acquire and apply moral rules.[65] Moral development refers more broadly to age-related changes in thoughts and emotions that guide moral beliefs, judgments and behaviors.[66]

Kohlberg's stage theory

Jean Piaget, watching children play games, noted how their rationales for cooperation changed with experience and maturation. He identified two stages, heteronomous (morality centered outside the self) and autonomous (internalized morality). Lawrence Kohlberg sought to expand Piaget's work, and his cognitive-developmental theory of moral reasoning dominated the field for decades. He focused on moral development as progression in the capacity to reason about justice. Kohlberg's interview method used hypothetical moral dilemmas or conflicts of interest (most notably, the Heinz dilemma). He proposed six stages across three levels of development, claiming that "anyone who interviewed children about dilemmas and who followed them longitudinally in time would come to our six stages and no others".[67] At the preconventional level, the first two stages were the punishment-and-obedience orientation and the instrumental-relativist orientation. The next, conventional level included the interpersonal concordance or "good boy – nice girl" orientation and the "law and order" orientation. The final, postconventional level consisted of the social-contract, legalistic orientation and the universal-ethical-principle orientation.[68] According to Kohlberg, an individual is considered more cognitively mature the higher their stage of moral reasoning, which advances with education and world experience.

Critics of Kohlberg's approach, such as Carol Gilligan and Jane Attanucci, argue that it over-emphasizes justice and under-emphasizes an additional perspective on moral reasoning, the care perspective. The justice perspective draws attention to inequality and oppression while striving for reciprocal rights and equal respect for all. The care perspective draws attention to detachment and abandonment while striving for attention and response to people who need it. The care orientation is relationally based and has a more situational focus, dependent on the needs of others, as opposed to the objectivity of the justice orientation.[69] However, reviews by others have found that Gilligan's theory was not supported by empirical studies, since moral orientation depends on the individual rather than on gender.[70][71] In fact, in neo-Kohlbergian studies with the Defining Issues Test, females tend to score slightly higher than males.[72][page needed]

The attachment approach to moral judgment

The attachment approach to moral judgment was proposed by Aner Govrin[73][74][75] and it is based on evidence from infant research, social psychology and moral psychology. According to this approach, through early interactions with the caregiver, the child acquires an internal representation of a system of rules that determine how right/wrong judgments are to be construed, used, and understood. By breaking moral situations down into their defining features, the attachment model of moral judgment outlines a framework for a universal moral faculty based on a universal, innate, deep structure that appears uniformly in the structure of almost all moral judgments regardless of their content.

Moral behaviour

Historically, major topics of study in the domain of moral behavior have included violence and altruism,[76] bystander intervention, and obedience to authority (e.g., the Milgram experiment and the Stanford prison experiment). Recent research on moral behavior has ranged from using experience sampling to estimate the actual prevalence of various kinds of moral behavior in everyday life[77][78] to using behavioral experiments to investigate how people weigh their own interests against other people's when deciding whether to harm others.[79]

James Rest reviewed the literature on moral functioning and identified at least four components necessary for a moral behavior to take place:[80][81]

  • Sensitivity – noticing and interpreting the situation
  • Reasoning and making a judgment regarding the best (most moral) option
  • Motivation (in the moment but also habitually, such as moral identity)
  • Implementation – having the skills and perseverance to carry out the action

Reynolds and Ceranic researched the effects of social consensus on one's moral behavior. Depending on the level of social consensus (high vs. low), moral behaviors will require greater or lesser degrees of moral identity to motivate an individual to make a choice and endorse a behavior. Also, depending on social consensus, particular behaviors may require different levels of moral reasoning.[82]

More recent attempts to develop an integrated model of moral motivation[83] have identified at least six different levels of moral functioning, each of which has been shown to predict some type of moral or pro-social behavior: moral intuitions, moral emotions, moral virtues/vices (behavioral capacities), moral values, moral reasoning, and moral willpower. This social intuitionist model of moral motivation[84] suggests that moral behaviors are typically the product of multiple levels of moral functioning, and are usually energized by the "hotter" levels of intuition, emotion, and behavioral virtue/vice. The "cooler" levels of values, reasoning, and willpower, while still important, are proposed to be secondary to the more affect-intensive processes.

Value-behavior consistency

In looking at the relations between moral values, attitudes, and behaviors, previous research suggests that there is no dependable correlation among the three, contrary to what might be assumed. Indeed, it appears to be more common for people to label their behaviors with a justifying value than to hold a value beforehand and then act on it. Some people are more likely to act on their personal values: those low in self-monitoring and high in self-consciousness, because they are more aware of themselves and less aware of how others may perceive them. (Self-consciousness here means being literally more conscious of oneself, not fear of judgement or social anxiety.) Social situations and the different categories of norms can indicate when people may act in accordance with their values, but this relationship is not well established either. People typically act in accordance with social, contextual, and personal norms, and these norms may also align with one's moral values. Although certain assumptions and situations suggest a major value-attitude-behavior relation, there is not enough research to confirm the phenomenon.

Moral willpower

Building on earlier work by Metcalfe and Mischel on delayed gratification,[85] Baumeister, Miller, and Delaney explored the notion of willpower by first defining the self as made up of three parts: reflexive consciousness, the person's awareness of their environment and of themselves as an individual; interpersonal being, which seeks to mold the self into one that will be accepted by others; and executive function.[86] They stated, "[T]he self can free its actions from being determined by particular influences, especially those of which it is aware".[87] The three prevalent theories of willpower describe it as a limited supply of energy, as a cognitive process, and as a skill developed over time. Research has largely supported that willpower works like a "moral muscle" with a limited supply of strength that may be depleted (a process referred to as ego depletion), conserved, or replenished, and that a single act requiring much self-control can significantly deplete the "supply" of willpower.[86] While exertion reduces the ability to engage in further acts of willpower in the short term, such exertions actually improve a person's ability to exert willpower for extended periods in the long run.[88] Additional research, however, may cast doubt on the idea of ego depletion.[89]

Moral intuitions

In 2001, Jonathan Haidt introduced his social intuitionist model, which claimed that, with few exceptions, moral judgments are based on socially derived intuitions. Moral intuitions occur immediately, automatically, and unconsciously, with reasoning largely serving to generate post-hoc rationalizations that justify one's instinctual reactions.[90] He provides four arguments for doubting the causal importance of reasoning. First, Haidt argues that since the brain makes automatic evaluations and assessments through a dual-process system, the same process should apply to moral judgment as well. Second, drawing on research on motivated reasoning, he claims that people behave like "intuitive lawyers", searching primarily for evidence that serves motives of social relatedness and attitudinal coherence. Third, Haidt found that people reason post hoc when faced with a moral situation; this after-the-fact explanation gives the illusion of objective moral judgment but in reality follows one's gut feeling. Finally, research has shown that moral emotion has a stronger link to moral action than moral reasoning does, citing Damasio's research on the somatic marker hypothesis and Batson's empathy-altruism hypothesis.[90]

In 2008, Joshua Greene argued, in contrast to Haidt's model, that genuine moral reasoning does take place. A "deontologist" is someone whose rule-based morality is mainly focused on duties and rights; a "consequentialist", by contrast, is someone who believes that only the best overall consequences ultimately matter.[91] Generally speaking, individuals who respond to moral dilemmas in a consequentialist manner take longer to respond and show frontal-lobe activity (associated with cognitive processing), whereas individuals who respond in a deontological manner answer more quickly and show brain activity in the amygdala (associated with emotional processing).

Moral Foundations Theory

In regard to moral intuitions, researchers Jonathan Haidt and Jesse Graham performed a study of the differences between the moral foundations of political liberals and political conservatives.[92] Haidt and Graham built on Richard Shweder's earlier three-ethics theory, which comprises the ethic of autonomy, the ethic of community, and the ethic of divinity, and extended it into five psychological systems that make up those three ethics in more detail. The importance of these five foundations of morality varies across cultures, which construct virtues based on the foundations they emphasize. The authors challenged individuals to question the legitimacy of their moral world and introduced the five psychological foundations of morality:

  • Harm/care, which starts with the sensitivity to signs of suffering in offspring and develops into a general dislike of seeing suffering in others and the potential to feel compassion in response.
  • Fairness/reciprocity, which is developed when someone observes or engages in reciprocal interactions. This foundation is concerned with virtues related to fairness and justice.
  • Ingroup/loyalty, which constitutes recognizing, trusting, and cooperating with members of one's ingroup as well as being wary of members of other groups.
  • Authority/respect, which concerns how someone navigates hierarchical ingroups and communities.
  • Purity/sanctity, which stems from the emotion of disgust that guards the body by responding to elicitors that are biologically or culturally linked to disease transmission.

The five foundations theory is both a nativist and a cultural-psychological theory. Modern moral psychology concedes that "morality is about protecting individuals" and focuses primarily on issues of justice (harm/care and fairness/reciprocity).[93] Their research found that "justice and related virtues…make up half of the moral world for liberals, while justice-related concerns make up only one fifth of the moral world for conservatives".[93] Liberals value harm/care and fairness/reciprocity significantly more than the other moralities, while conservatives value all five equally.

Moral emotions

Moral emotions are a variety of social emotion involved in forming and communicating moral judgments and decisions, and in motivating behavioral responses to one's own and others' moral behavior.[94][95][96] While moral reasoning has been the focus of most study of morality since Plato and Aristotle, the emotive side of morality was long looked upon with disdain.[94] In the last 30–40 years, however, a new front of research has emerged on moral emotions as the basis for moral behavior. This development began with a focus on empathy and guilt, but has since moved on to encompass scholarship on emotions such as anger, shame, disgust, awe, and elevation. With this new research, theorists have begun to question whether moral emotions might hold a larger role in determining morality, one that might even surpass that of moral reasoning.[95]

Moral conviction

Linda Skitka and colleagues have introduced the concept of moral conviction, which refers to a "strong and absolute belief that something is right or wrong, moral or immoral."[97] According to Skitka's integrated theory of moral conviction (ITMC), attitudes held with moral conviction, known as moral mandates, differ from strong but non-moral attitudes in a number of important ways. Namely, moral mandates derive their motivational force from their perceived universality, perceived objectivity, and strong ties to emotion.[98] Perceived universality refers to the notion that individuals experience moral mandates as transcending persons and cultures; additionally, they are regarded as matters of fact. Regarding association with emotion, ITMC is consistent with Jonathan Haidt's social intuitionist model in stating that moral judgments are accompanied by discrete moral emotions (i.e., disgust, shame, guilt). Importantly, Skitka maintains that moral mandates are not the same thing as moral values. Whether an issue will be associated with moral conviction varies across persons.

One of the main lines of ITMC research addresses the behavioral implications of moral mandates. Individuals prefer greater social and physical distance from attitudinally dissimilar others when moral conviction is high. This effect of moral conviction could not be explained by traditional measures of attitude strength, extremity, or centrality. Skitka, Bauman, and Sargis placed participants in either attitudinally heterogeneous or homogeneous groups to discuss procedures regarding two morally mandated issues, abortion and capital punishment. Those in attitudinally heterogeneous groups demonstrated the least goodwill towards other group members, the least cooperation, and the most tension and defensiveness. Furthermore, individuals discussing a morally mandated issue were less likely to reach a consensus than those discussing non-moral issues.[99]

Intersections with other fields

Sociological applications

Some research shows that people tend to self-segregate based on moral or moral-political values.[100][101]

Normative implications

Researchers have begun to consider what implications (if any) moral psychology research has for other subfields of ethics such as normative ethics and meta-ethics.[102][103][104] John Doris discusses the way in which social psychological experiments—such as the Stanford prison experiment, which involves the idea of situationism—call into question a key component of virtue ethics: the idea that individuals have a single, environment-independent moral character.[105][page needed] As a further example, Shaun Nichols (2004) examines how empirical data on psychopathology suggests that moral rationalism is false.[106][page needed]

Additionally, research in moral psychology is being used to inform debates in applied ethics around moral enhancement.[107][108]

Robotics and artificial intelligence

At the intersection of moral psychology and machine ethics, researchers have begun to study people's views regarding the potentially ethically significant decisions that will be made by self-driving cars.[23][22]

Notes

  1. ^ See, for example, Lapsley, Daniel K. (1996). Moral Psychology. Developmental psychology series. Boulder, Colorado: Westview Press. ISBN 978-0-8133-3032-7.
  2. ^ Doris, John; Stich, Stephen (2008), Zalta, Edward N. (ed.), "Moral Psychology: Empirical Approaches", The Stanford Encyclopedia of Philosophy, Metaphysics Research Lab, Stanford University
  3. ^ Wallace, R. Jay (November 29, 2007). "Moral Psychology". In Jackson, Frank; Smith, Michael (eds.). The Oxford Handbook of Contemporary Philosophy. OUP Oxford. pp. 86–113. ISBN 978-0-19-923476-9. Moral psychology is the study of morality in its psychological dimensions
  4. ^ Ellemers, Naomi; van der Toorn, Jojanneke; Paunov, Yavor; van Leeuwen, Thed (18 January 2019). "The Psychology of Morality: A Review and Analysis of Empirical Studies Published From 1940 Through 2017". Personality and Social Psychology Review. 23 (4): 332–366. doi:10.1177/1088868318811759. ISSN 1088-8683. PMC 6791030. PMID 30658545.
  5. ^ Doris & Stich 2008, §1.
  6. ^ Teper, R.; Inzlicht, M.; Page-Gould, E. (2011). "Are we more moral than we think?: Exploring the role of affect in moral behavior and moral forecasting". Psychological Science. 22 (4): 553–558. CiteSeerX 10.1.1.1033.5192. doi:10.1177/0956797611402513. PMID 21415242.
  7. ^ Sharp, Frank Chapman (January 1898). "An Objective Study of Some Moral Judgments". The American Journal of Psychology. 9 (2): 198–234. doi:10.2307/1411759. JSTOR 1411759.
  8. ^ a b Wendorf, Craig A (2001). "History of American morality research, 1894–1932". History of Psychology. 4 (3): 272–288. doi:10.1037/1093-4510.4.3.272.
  9. ^ Kohlberg, Lawrence (1958). The development of modes of moral thinking and choice in the years 10 to 16 (PhD thesis). Chicago. OCLC 1165315.
  10. ^ a b c Colby, Anne; Kohlberg, Lawrence (1987). The Measurement of Moral Judgment. Standard Issue Scoring Manual. 2. Cambridge: Cambridge University Press. ISBN 978-0-521-32565-3.
  11. ^ Kohlberg, L. (1969). "Stage and sequence: The cognitive development approach to socialization". In Goslin, David (ed.). Handbook of Socialization Theory and Research. Chicago: Rand McNally. pp. 347–480.
  12. ^ Kohlberg, Lawrence (1971-01-31), "1. Stages of moral development as a basis for moral education", Moral Education, University of Toronto Press, doi:10.3138/9781442656758-004, ISBN 9781442656758
  13. ^ Hardy, S. A.; Carlo, G. (2011). "Moral identity: What is it, how does it develop, and is it linked to moral action?". Child Development Perspectives. 5 (3): 212–218. doi:10.1111/j.1750-8606.2011.00189.x.
  14. ^ Doris & Stich (2008), §1.
  15. ^ Sevinc, Gunes; Spreng, R. Nathan; Soriano-Mas, Carles (4 February 2014). "Contextual and Perceptual Brain Processes Underlying Moral Cognition: A Quantitative Meta-Analysis of Moral Reasoning and Moral Emotions". PLoS ONE. 9 (2): e87427. Bibcode:2014PLoSO...987427S. doi:10.1371/journal.pone.0087427. PMC 3913597. PMID 24503959.
  16. ^ Moll, Jorge; Zahn, Roland; de Oliveira-Souza, Ricardo; Krueger, Frank; Grafman, Jordan (October 2005). "The neural basis of human moral cognition". Nature Reviews Neuroscience. 6 (10): 799–809. doi:10.1038/nrn1768. PMID 16276356.
  17. ^ Kleiman-Weiner, Max; Saxe, Rebecca; Tenenbaum, Joshua B. (October 2017). "Learning a commonsense moral theory". Cognition. 167: 107–123. doi:10.1016/j.cognition.2017.03.005. hdl:1721.1/118457. PMID 28351662.
  18. ^ Cushman, Fiery (16 July 2013). "Action, Outcome, and Value". Personality and Social Psychology Review. 17 (3): 273–292. doi:10.1177/1088868313495594. PMID 23861355.
  19. ^ Crockett, Molly J. (August 2013). "Models of morality". Trends in Cognitive Sciences. 17 (8): 363–366. doi:10.1016/j.tics.2013.06.005. PMC 3925799. PMID 23845564.
  20. ^ Henrich, Joseph; Boyd, Robert; Bowles, Samuel; Camerer, Colin; Fehr, Ernst; Gintis, Herbert; McElreath, Richard; Alvard, Michael; Barr, Abigail; Ensminger, Jean; Henrich, Natalie Smith; Hill, Kim; Gil-White, Francisco; Gurven, Michael; Marlowe, Frank W.; Patton, John Q.; Tracer, David (22 December 2005). ""Economic man" in cross-cultural perspective: Behavioral experiments in 15 small-scale societies" (PDF). Behavioral and Brain Sciences. 28 (6): 795–815. doi:10.1017/S0140525X05000142. PMID 16372952.
  21. ^ Purzycki, Benjamin Grant; Apicella, Coren; Atkinson, Quentin D.; Cohen, Emma; McNamara, Rita Anne; Willard, Aiyana K.; Xygalatas, Dimitris; Norenzayan, Ara; Henrich, Joseph (10 February 2016). "Moralistic gods, supernatural punishment and the expansion of human sociality" (PDF). Nature. 530 (7590): 327–330. Bibcode:2016Natur.530..327P. doi:10.1038/nature16980. PMID 26863190.
  22. ^ a b Awad, Edmond; Dsouza, Sohan; Kim, Richard; Schulz, Jonathan; Henrich, Joseph; Shariff, Azim; Bonnefon, Jean-François; Rahwan, Iyad (24 October 2018). "The Moral Machine experiment". Nature. 563 (7729): 59–64. Bibcode:2018Natur.563...59A. doi:10.1038/s41586-018-0637-6. hdl:10871/39187. PMID 30356211.
  23. ^ a b Bonnefon, J.-F.; Shariff, A.; Rahwan, I. (23 June 2016). "The social dilemma of autonomous vehicles". Science. 352 (6293): 1573–1576. arXiv:1510.03346. Bibcode:2016Sci...352.1573B. doi:10.1126/science.aaf2654. PMID 27339987.
  24. ^ Story, Giles W.; Vlaev, Ivo; Metcalfe, Robert D.; Crockett, Molly J.; Kurth-Nelson, Zeb; Darzi, Ara; Dolan, Raymond J. (30 October 2015). "Social redistribution of pain and money". Scientific Reports. 5 (1): 15389. Bibcode:2015NatSR...515389S. doi:10.1038/srep15389. PMC 4626774. PMID 26515529.
  25. ^ Moll, Jorge; de Oliveira-Souza, Ricardo; Eslinger, Paul J.; Bramati, Ivanei E.; Mourão-Miranda, Janaı́na; Andreiuolo, Pedro Angelo; Pessoa, Luiz (1 April 2002). "The Neural Correlates of Moral Sensitivity: A Functional Magnetic Resonance Imaging Investigation of Basic and Moral Emotions". The Journal of Neuroscience. 22 (7): 2730–2736. doi:10.1523/JNEUROSCI.22-07-02730.2002.
  26. ^ Sagi, Eyal; Dehghani, Morteza (31 October 2013). "Measuring Moral Rhetoric in Text". Social Science Computer Review. 32 (2): 132–144. doi:10.1177/0894439313506837.
  27. ^ Kohlberg, Lawrence (1973). "The Claim to Moral Adequacy of a Highest Stage of Moral Judgment". Journal of Philosophy. 70 (18): 630–646. doi:10.2307/2025030. JSTOR 2025030.
  28. ^ Walker, Lawrence J.; Frimer, Jeremy A.; Dunlop, William L. (2010). "Varieties of moral personality: beyond the banality of heroism". Journal of Personality. 78 (3): 907–942. doi:10.1111/j.1467-6494.2010.00637.x. PMID 20573130.
  29. ^ Verplaetse, Jan (2008). "Measuring the moral sense: morality tests in continental Europe between 1910 and 1930". Paedagogica Historica. 44 (3): 265–286. doi:10.1080/00309230701722721.
  30. ^ Kohlberg, Lawrence (1981). The Philosophy of Moral Development. Essays on Moral Development. 1. San Francisco: Harper & Row. ISBN 978-0-06-064760-5. OCLC 7307342.
  31. ^ Rest, James R. (1979). Development in Judging Moral Issues. Minneapolis: University of Minnesota Press. ISBN 978-0-8166-0891-1.
  32. ^ Lind, Georg (1978). "Wie misst man moralisches Urteil? Probleme und alternative Möglichkeiten der Messung eines komplexen Konstrukts" [How do you measure moral judgment? Problems and alternative ways of measuring a complex construct]. In Portele, G. (ed.). Sozialisation und Moral [Socialization and Morality] (in German). Weinheim: Beltz. pp. 171–201. ISBN 9783407511348. OCLC 715635639.
  33. ^ Graham, Jesse; Haidt, Jonathan; Nosek, Brian A. (2009). "Liberals and conservatives rely on different sets of moral foundations" (PDF). Journal of Personality and Social Psychology. 96 (5): 1029–1046. doi:10.1037/a0015141. PMID 19379034.
  34. ^ Graham, J.; Haidt, J.; Koleva, S.; Motyl, M.; Iyer, R.; Wojcik, S.; Ditto, P.H. (2013). Moral Foundations Theory: The pragmatic validity of moral pluralism (PDF). Advances in Experimental Social Psychology. 47. pp. 55–130. doi:10.1016/b978-0-12-407236-7.00002-4. ISBN 9780124072367.
  35. ^ Sinnott-Armstrong, Walter, ed. (2007). Moral Psychology, Volume 1: The Evolution of Morality: Adaptations and Innateness. ISBN 9780262693547.
  36. ^ Brosnan, S. F.; de Waal, F. B. M. (18 September 2014). "Evolution of responses to (un)fairness". Science. 346 (6207): 1251776. doi:10.1126/science.1251776. PMC 4451566. PMID 25324394.
  37. ^ Tomasello, Michael; Vaish, Amrisha (3 January 2013). "Origins of Human Cooperation and Morality". Annual Review of Psychology. 64 (1): 231–255. doi:10.1146/annurev-psych-113011-143812. hdl:10161/13649. PMID 22804772.
  38. ^ Hare, Brian (3 January 2017). "Survival of the Friendliest: Evolved via Selection for Prosociality". Annual Review of Psychology. 68 (1): 155–186. doi:10.1146/annurev-psych-010416-044201. PMID 27732802.
  39. ^ Sober, Elliott; Wilson, David Sloan (1998). Unto Others: The Evolution and Psychology of Unselfish Behavior. Cambridge: Harvard University Press. ISBN 9780674930469.
  40. ^ Dean, Tim (2012). "Evolution and moral diversity". Baltic International Yearbook of Cognition, Logic and Communication. 7. doi:10.4148/biyclc.v7i0.1775.
  41. ^ Blasi, Augusto (1980). "Bridging moral cognition and moral action: A critical review of the literature". Psychological Bulletin. 88 (1): 1–45. doi:10.1037/0033-2909.88.1.1. ISSN 0033-2909.
  42. ^ Hardy, S. A.; Carlo, G. (2005). "Identity as a source of moral motivation". Human Development. 48 (4): 232–256. doi:10.1159/000086859.
  43. ^ Colby, Anne; Damon, William (1999). "The Development of Extraordinary Moral Commitment". In Killen, Melanie; Hart, Daniel (eds.). Morality in Everyday Life: Developmental Perspectives. Cambridge University Press. pp. 362. ISBN 978-0-521-66586-5.
  44. ^ Colby & Damon 1999, pp. 361–362.
  45. ^ Colby & Damon 1999, p. 354.
  46. ^ Colby & Damon 1999, p. 364.
  47. ^ Colby & Damon 1999, p. 350.
  48. ^ Hart, D.; Fegley, S. (1995). "Prosocial behavior and caring in adolescence: Relations to self-understanding and social judgment" (PDF). Child Development. 66 (5): 1346–1359. doi:10.2307/1131651. JSTOR 1131651. PMID 7555220.
  49. ^ Narvaez, Darcia; Lapsley, Daniel K. (2009). "Chapter 4". Personality, Identity, and Character: Explorations in Moral Psychology. Cambridge University Press. pp. 79–105. ISBN 978-0-521-89507-1.
  50. ^ Blasi, Augusto (2005). "Moral character: A psychological approach". In Lapsley, Daniel; Power, F. (eds.). Character Psychology and Character Education. Notre Dame, Indiana: University of Notre Dame Press. pp. 67–100. ISBN 978-0-268-03371-2.
  51. ^ Kochanska, Grazyna (2002). "Mutually Responsive Orientation Between Mothers and Their Young Children: A Context for the Early Development of Conscience". Current Directions in Psychological Science. 11 (6): 191–195. doi:10.1111/1467-8721.00198. ISSN 0963-7214.
  52. ^ Narvaez, Darcia (2014). Neurobiology and the Development of Human Morality: Evolution, Culture, and Wisdom (Norton Series on Interpersonal Neurobiology). W. W. Norton & Company. ISBN 978-0-393-70967-4.
  53. ^ Krettenauer, T (2011). "The dual moral self: Moral centrality and internal moral motivation". The Journal of Genetic Psychology. 172 (4): 309–328. doi:10.1080/00221325.2010.538451. PMID 22256680.
  54. ^ Krettenauer (2013). "Revisiting the moral self construct: Developmental perspectives on moral selfhood". In Sokol, Bryan; Grouzet, Frederick; Müller, Ulrich (eds.). Self-Regulation and Autonomy. Cambridge University Press. pp. 115–140. ISBN 978-1-107-02369-7.
  55. ^ See, for example, Damon, William; Hart, Daniel (1988). Self-Understanding in Childhood and Adolescence. Cambridge University Press. ISBN 978-0-521-30791-8.
  56. ^ Emde, R.; Biringen, Z.; Clyman, R.; Oppenheim, D. (1991). "The moral self of infancy: Affective core and procedural knowledge" (PDF). Developmental Review. 11 (3): 251–270. doi:10.1016/0273-2297(91)90013-e.
  57. ^ Kochanska, G (2002). "Committed compliance, moral self, and internalization: A mediational model". Developmental Psychology. 38 (3): 339–351. doi:10.1037/0012-1649.38.3.339.
  58. ^ a b Krettenauer, T.; Campbell, S.; Hertz, S. (2013). "Moral emotions and the development of the moral self in childhood". European Journal of Developmental Psychology. 10 (2): 159–173. doi:10.1080/17405629.2012.762750.
  59. ^ Kristiansen, Connie M; Hotte, Alan M (1996). Morality and the self: Implications for the when and how of value-attitude-behavior relations. The Psychology of Values: The Ontario Symposium on Personality and Social Psychology. 8. Erlbaum Hillsdale, NJ. pp. 77–105.
  60. ^ Schwartz, S. H. (1999). "A Theory of Cultural Values and Some Implications for Work" (PDF). Applied Psychology: An International Review. 48 (1): 23–47. doi:10.1080/026999499377655.
  61. ^ Lapsley, Daniel K.; Narvaez, Darcia (2004). "A social-cognitive approach to the moral personality". Moral Development, Self, and Identity. Psychology Press. pp. 189–212. ISBN 978-1-135-63233-5.
  62. ^ Lapsley & Narvaez 2004, p. 197.
  63. ^ Narvaez, Darcia (March 1, 2008). "Triune ethics: The neurobiological roots of our multiple moralities". New Ideas in Psychology. 26 (1): 95–119. CiteSeerX 10.1.1.152.4926. doi:10.1016/j.newideapsych.2007.07.008. ISSN 0732-118X.
  64. ^ Narvaez, Darcia (2014). Neurobiology and the Development of Human Morality: Evolution, Culture and Wisdom. WWNorton. ISBN 978-0393706550.
  65. ^ Pizarro, David A. (2007). "Moral Reasoning". In Baumeister, Roy F; Vohs, Kathleen F (eds.). Encyclopedia of Social Psychology. SAGE Publications, Inc. pp. 591–592. doi:10.4135/9781412956253.n352. ISBN 9781412956253.
  66. ^ Barnett, Mark A. (2007). "Moral Development". In Baumeister, Roy F; Vohs, Kathleen D (eds.). Encyclopedia of Social Psychology. SAGE Publications, Inc. p. 587. doi:10.4135/9781412956253.n349. ISBN 9781412956253.
  67. ^ Kohlberg, Lawrence (1984). The Psychology of Moral Development: The Nature and Validity of Moral Stages. Essays on Moral Development. 2. Harper & Row. p. 195. ISBN 978-0-06-064761-2.
  68. ^ Crain, W.C. "Kohlberg's Stages of Moral Development". Theories of Development. Prentice-Hall. Archived from the original on October 4, 2011. Retrieved October 3, 2011.
  69. ^ Gilligan, Carol; Attanucci, Jane (1994). "Two Moral Orientations: Gender Differences and Similarities". In Puka, Bill (ed.). Moral Development: Caring Voices and Women's Moral Frames. 34. Taylor & Francis. pp. 123–237. ISBN 978-0-8153-1553-7.
  70. ^ Walker, Lawrence J.; Smetana, Judith (2005). "Gender and Morality". In Killen, Melanie (ed.). Handbook of Moral Development. Psychology Press. pp. 93–115. ISBN 978-1-135-61917-6.
  71. ^ Jaffee and Hyde (2001)[full citation needed]
  72. ^ Rest, James R.; Narvaez, Darcia; Thoma, Stephen J.; Bebeau, Muriel J. (1999). Postconventional Moral Thinking: A Neo-Kohlbergian Approach. Psychology Press. ISBN 978-1-135-70561-9.
  73. ^ Govrin, A. (2014). "The ABC of moral development: an attachment approach to moral judgment". Frontiers in Psychology. 5 (6): 1–14. This article contains quotations from this source, which is available under the Creative Commons Attribution 3.0 Unported (CC BY 3.0) license.
  74. ^ Govrin, A. (2019). Ethics and attachment - How we make moral judgments. London: Routledge
  75. ^ Govrin, A. (2014) "From Ethics of Care to Psychology of Care: Reconnecting Ethics of Care to Contemporary Moral Psychology", Frontiers in Psychology pp. 1-10
  76. ^ Staub, Ervin (2003). Psychology of good and evil: why children, adults, and groups help and harm others. Cambridge University Press. ISBN 978-0-511-07031-0.
  77. ^ Hofmann, W.; Wisneski, D. C.; Brandt, M. J.; Skitka, L. J. (11 September 2014). "Morality in everyday life". Science. 345 (6202): 1340–1343. Bibcode:2014Sci...345.1340H. doi:10.1126/science.1251560. PMID 25214626.
  78. ^ Hofmann, Wilhelm; Brandt, Mark J.; Wisneski, Daniel C.; Rockenbach, Bettina; Skitka, Linda J. (30 May 2018). "Moral Punishment in Everyday Life" (PDF). Personality and Social Psychology Bulletin. 44 (12): 1697–1711. doi:10.1177/0146167218775075. PMID 29848212.
  79. ^ Crockett, Molly J.; Kurth-Nelson, Zeb; Siegel, Jenifer Z.; Dayan, Peter; Dolan, Raymond J. (2 December 2014). "Harm to others outweighs harm to self in moral decision making". Proceedings of the National Academy of Sciences. 111 (48): 17320–17325. Bibcode:2014PNAS..11117320C. doi:10.1073/pnas.1408988111. PMC 4260587. PMID 25404350.
  80. ^ Rest, James R (1983). "Morality". Handbook of Child Psychology. 3: 556–629.
  81. ^ Narváez, Darcia; Rest, James (1995). "The four components of acting morally" (PDF). Moral Behavior and Moral Development: An Introduction: 385–400.
  82. ^ Reynolds, Scott J.; Ceranic, Tara L. (2007). "The effects of moral judgment and moral identity on moral behavior: An empirical examination of the moral individual" (PDF). Journal of Applied Psychology. 92 (6): 1610–1624. doi:10.1037/0021-9010.92.6.1610. ISSN 1939-1854. PMID 18020800.
  83. ^ Leffel, G. M. (2008). "Who cares? Generativity and the moral emotions, part 2: A "social intuitionist model" of moral motivation". Journal of Psychology and Theology. 36 (3): 182–201. doi:10.1177/009164710803600303.
  84. ^ Leffel 2008's model draws heavily on Haidt 2001's social intuitionist model of moral judgment.
  85. ^ Metcalfe, J.; Mischel, W. (1999). "A hot/cool-system analysis of delay of gratification: Dynamics of willpower". Psychological Review. 106 (1): 3–19. doi:10.1037/0033-295x.106.1.3. PMID 10197361.
  86. ^ a b Baumeister (2005). "Self and volition". In Miller, William; Delaney, Harold (eds.). Judeo-Christian Perspectives on Psychology: Human Nature, Motivation, and Change. Washington, DC: American Psychological Association. pp. 57–72. ISBN 978-1-59147-161-5.
  87. ^ Baumeister 2005, p. 68.
  88. ^ Muraven, Mark; Baumeister, Roy F.; Tice, Dianne M. (August 1, 1999). "Longitudinal Improvement of Self-Regulation Through Practice: Building Self-Control Strength Through Repeated Exercise" (PDF). The Journal of Social Psychology. 139 (4): 446–457. doi:10.1080/00224549909598404. ISSN 0022-4545. PMID 10457761.
  89. ^ "Hagger et al (2016) A Multilab Preregistered Replication of the Ego-Depletion Effect.pdf" (PDF).
  90. ^ a b Haidt, Jonathan (October 2001). "The Emotional Dog and Its Rational Tail" (PDF). Psychological Review. 108 (4): 814–834. CiteSeerX 10.1.1.620.5536. doi:10.1037/0033-295X.108.4.814. PMID 11699120.
  91. ^ Greene, Joshua (2008). "The secret joke of Kant's Soul". In Sinnott-Armstrong, Walter (ed.). Moral Psychology. 3. Cambridge, Massachusetts: MIT Press. pp. 35–80. ISBN 978-0-262-69355-4. OCLC 750463100.
  92. ^ Haidt, Jonathan; Graham, Jesse (23 May 2007). "When Morality Opposes Justice: Conservatives Have Moral Intuitions that Liberals may not Recognize". Social Justice Research. 20 (1): 98–116. doi:10.1007/s11211-007-0034-z.
  93. ^ a b Heidt & Graham 2007, p. 99.
  94. ^ a b Pizarro, David A. (2007). "Moral Emotions". In Baumeister, Roy F; Vohs, Kathleen D (eds.). Encyclopedia of Social Psychology. SAGE Publications, Inc. pp. 588–589. doi:10.4135/9781412956253.n350. ISBN 9781412956253.
  95. ^ a b Haidt, Jonathan (2003). "The Moral Emotions" (PDF). In Davidson, Richard; Scherer, Klaus; Goldsmith, H. (eds.). Handbook of Affective Sciences. Oxford University Press. p. 855. ISBN 978-0-19-512601-3.
  96. ^ Tangney, June Price; Stuewig, Jeff; Mashek, Debra J. (January 2007). "Moral Emotions and Moral Behavior" (PDF). Annual Review of Psychology. 58 (1): 345–372. doi:10.1146/annurev.psych.56.091103.070145. PMC 3083636. PMID 16953797.
  97. ^ Skitka, Linda (2002). "Do the means always justify the ends or do the ends sometimes justify the means? A value protection model of justice". Personality and Social Psychology Bulletin. 28 (5): 452–461. doi:10.1177/0146167202288003.
  98. ^ Morgan, G. S.; Skitka, L. J. (2011). "Moral conviction". In Christie, Daniel J. (ed.). Encyclopedia of Peace Psychology. Wiley-Blackwell. ISBN 978-1-4051-9644-4.
  99. ^ Skitka, L. J.; Bauman, C.; Sargis, E. (2005). "Moral conviction: Another contributor to attitude strength or something more?" (PDF). Journal of Personality and Social Psychology. 88 (6): 895–917. doi:10.1037/0022-3514.88.6.895. PMID 15982112.
  100. ^ Haidt, Jonathan; Rosenberg, Evan; Hom, Holly (2003). "Differentiating Diversities: Moral Diversity Is Not Like Other Kinds". Journal of Applied Social Psychology. 33 (1): 1–36. doi:10.1111/j.1559-1816.2003.tb02071.x.
  101. ^ Motyl, Matt; Iyer, Ravi; Oishi, Shigehiro; Trawalterl, Sophie; Nosek, Brian A. (2014). "How ideological migration geographically segregates groups". Journal of Experimental Social Psychology. 51: 1–14. doi:10.1016/j.jesp.2013.10.010.
  102. ^ Kahane, Guy (March 2011). "Evolutionary Debunking Arguments". Noûs. 45 (1): 103–125. doi:10.1111/j.1468-0068.2010.00770.x. PMC 3175808. PMID 21949447.
  103. ^ Greene, Joshua; Cohen, Jonathan (29 November 2004). "For the law, neuroscience changes nothing and everything". Philosophical Transactions of the Royal Society of London. Series B: Biological Sciences. 359 (1451): 1775–1785. doi:10.1098/rstb.2004.1546. PMC 1693457. PMID 15590618.
  104. ^ Berker, Selim (September 2009). "The Normative Insignificance of Neuroscience". Philosophy & Public Affairs. 37 (4): 293–329. doi:10.1111/j.1088-4963.2009.01164.x.
  105. ^ Doris, John M. (2002). Lack of Character: Personality and Moral Behavior. Cambridge University Press. ISBN 978-1-316-02549-9.
  106. ^ Nichols, Shaun (2004). Sentimental Rules: On the Natural Foundations of Moral Judgment. Oxford University Press. ISBN 978-0-19-988347-9.
  107. ^ Darby, R. Ryan; Pascual-Leone, Alvaro (22 February 2017). "Moral Enhancement Using Non-invasive Brain Stimulation". Frontiers in Human Neuroscience. 11: 77. doi:10.3389/fnhum.2017.00077. PMC 5319982. PMID 28275345.
  108. ^ Levy, Neil; Douglas, Thomas; Kahane, Guy; Terbeck, Sylvia; Cowen, Philip J.; Hewstone, Miles; Savulescu, Julian (2014). "Are You Morally Modified?: The Moral Effects of Widely Used Pharmaceuticals". Philosophy, Psychiatry, & Psychology. 21 (2): 111–125. doi:10.1353/ppp.2014.0023. PMC 4398979. PMID 25892904.

External links

From the Stanford Encyclopedia of Philosophy
From the Internet Encyclopedia of Philosophy