Usually only philosophers think about the nature of morality; to the average person, what is good and what is bad seems self-evident. We do not deliberate when making our moral judgments; they just 'come naturally'.
We can usually justify our judgments, but in more complicated situations it quickly becomes clear that rational reasoning leads to a different result than our immediate judgment, and that one person concludes one thing while another concludes something else. Is it a lie not to tell a terminally ill patient their diagnosis? Is it wrong to judge the strong and the weak by different standards? Can we tell the victims of a bank's currency fraud to bear the consequences of their own decision, or should we instead help them out with public money?
How is a moral judgment made?
The average person, and many great thinkers as well, hold that morality is derived from rational thinking. This makes sense: what would happen if everyone stole, killed, committed adultery, and so on? On this view, we learn moral rules as we grow up by experiencing sin and virtue. Except that for a minority it does not work, for some reason. Why does moral sense fail to develop in psychopaths, who are nevertheless able to think rationally? If morality really is so logical, why can't they grasp it?
Of course, moral sense is not the same as social norms, although there is obviously great overlap between an individual's morality and society's expectations. It is possible, for example, to believe that all people are equal even while the state enshrines in law the inferiority of certain ethnic groups.
The cognitive theory of morality
Since antiquity, the existence and development of morality has interested philosophers, and later psychologists.
Jean Piaget (1896-1980), born in Switzerland and later famous in France, began as a biologist and became a developmental psychologist; he was the first true observer and chronicler of how human cognitive abilities form and develop. He described how, as the intellect develops, children's thinking grasps reality more and more accurately. The concept of so-called conservation (conservation of matter, conservation of volume), for example, passes through several phases. When the same amount of liquid is poured into a tall transparent tube and into a flat bowl, young children find that there is "more" liquid in the tube. When one of two identical Plasticine balls is flattened, the flattened ball is "more" because it looks bigger. Only after a long development, which Piaget attributes largely to experience, do children, passing through several well-described phases, arrive at the insight that the amount of a substance does not depend on the form it takes. In his view, the child's moral development proceeds in the same way: the child starts from their own ideas, and morality acquires its final form through experience.
The American psychologist Lawrence Kohlberg (1927-1987) described the stages of children's moral thinking in the 1960s, working within Piaget's framework. Each stage is identified by asking a child to judge the actions of characters in standard stories.
In one of the best-known stories, in short, a man's wife is terminally ill; there is a medicine that could save her, but the man is poor and the pharmacist will not sell it cheaply, so the man breaks into the pharmacy at night to get the medicine. Children of different ages give different answers to the question of whether the man did the right thing. The youngest disapprove because the act is punishable; at the next level, children judge the issue from the point of view of the public good (what if everyone behaved that way?).
In adolescence and young adulthood, however, at the level of autonomous morality, young people become able to go beyond the standard of conventional morality in the name of universal morality: a person's life, for example, is worth more than rigid adherence to the "do not steal" principle (Cole and Cole, 2001).
Kohlberg's system is logical, agrees with experience, and in any case our world attaches great importance to common sense and free will. Moral development thus seems explained. Today's academic psychology, and most psychologists, believe in the omnipotence of upbringing and hold that a baby is born a blank slate. Of course, in the name of scientific correctness, everyone now acknowledges that there are also innate qualities, but these are not considered truly decisive.
In fact, many everyday experiences contradict the cognitive theory of moral judgment, but these are swept under the rug or simply ignored by the dominant theory.
When common sense breaks down
Jonathan Haidt begins his paper "The Emotional Dog and Its Rational Tail", whose title alludes to the saying "the tail wags the dog", with the following deliberately provocative story:
Julie and Mark are siblings. They are on summer vacation in France. One night they are alone together in their room by the sea. They decide that it would be an interesting and amusing experience to have sex with each other; after all, it would be something new for both of them. They both enjoy it, but they decide not to do it again. The night remains their shared secret, which binds them even closer together (Haidt, 2001).
The question is: did the young people do the right thing? Most people immediately declare that this was unacceptable behavior. When the researchers asked for a justification, many pointed to the risks of incestuous reproduction, but the researchers noted that Julie was using contraceptives and Mark used a condom. Respondents then argued that the episode would be an emotionally distressing memory for the young people, to which the researchers replied that the two found the sex very exciting and kept it as a beautiful memory. At that point most people shrugged and closed the debate with "I can't justify it, but I know it was wrong".
But what kind of moral judgment born of rational thinking is it that we cannot then explain? It is also telling that respondents insisted on their judgment despite all the counterarguments. The moral judgment apparently came first, and the rational justification only afterwards. If moral judgment were learned, it could be changed by arguments, just as other mistakes are corrected with the proper arguments. That is, when we scratch the surface, the rational veneer disappears, and the moral judgment seems to be a given, as if it had always been in our heads.
As early as the 18th century, the Scottish philosopher David Hume (1711-1776) wrote that moral judgment is like aesthetic judgment: it arises from an emotional basis, not from rational thinking. His famous dictum was that "reason is the slave of the passions", that is, the moral regulation of action cannot come from rational thinking (Haidt, 2001). Hume's view was not popular in the age of rationalism, and it remained little more than a curiosity in the history of philosophy.
However, the social intuitionist theory that grew out of the steadily developing field of evolutionary psychology warmed to Hume's view. According to this school, moral judgments are a kind of intuition that appears in our minds almost instantly, with little mental effort, while the much more slowly formed rational arguments serve only to justify these feelings afterwards. Rational reasoning, in other words, is not the cause but the consequence of moral judgment. Since the two are mostly in harmony, however, it is easy to believe that the moral judgment is a product of thinking, and it would then seem to follow that many moral judgments could be changed by reasoning.
But consider the following story: a runaway tram is about to kill four people, but if we divert it to another track by throwing a switch, only one person on that track will die. When asked whether we should do this, most people answer yes. Now take the version in which we can save the four by stopping the speeding tram with the body of one large man whom we push in front of it. Most people find this unacceptable (Greene and Haidt, 2002). Yet the end result, four lives saved at the cost of one, is the same; the two stories nevertheless lead to opposite moral judgments. Plenty of similar stories can be devised, and life itself presents us with comparable ones.
Although it did not actually happen this way, people long believed that the British Prime Minister Winston Churchill knew in advance of the German plan to bomb the city of Coventry but did not want to reveal the secret that the German code had been broken. To protect that secret the city was not evacuated, and as a result nearly six hundred people were killed and more than eight hundred seriously wounded. Most people find this acceptable, recognizing how many more lives Churchill ultimately saved. But ask yourself how you would feel if your mother, your wife, and your children were among the six hundred dead. I suspect you would immediately become uncertain whether Churchill's decision was really acceptable.
Or suppose there is a hard candy that cannot be split in two. To which of her two children should the mother give it? Simply handing it to one of them seems unfair; if, on the other hand, she decides by tossing a coin, the decision seems fair.
The difference in moral judgment is explained by the fact that throwing the switch, making an abstract military decision, or accepting the result of a coin toss is an impersonal act, whereas pushing an innocent person in front of the tram, sacrificing our own family to protect a war secret, or arbitrarily deciding who gets the candy is a personal one.
Many people support the death penalty, but they would not work as the executioner. In human evolutionary history, impersonal harm has appeared only in the last few thousand years; for the long ages before that, harm was inflicted only face to face, in "I do it to him" situations. Our moral sense therefore views the two kinds of situation quite differently, even if the end result is the same.
Evolution and moral sense
Over millions of years of human evolution, all our traits were selected for survival or successful reproduction. A tribe defended itself more successfully against attacks by other tribes and by beasts, and hunted more successfully, if its members accepted the authority of the tribal leader, if reciprocity and cooperation characterized tribal life, and if those in need were helped. As a result, these traits became more and more prevalent from generation to generation; their beginnings can also be observed in more advanced animal species. Because rules of behavior fundamentally determined tribal life, whether someone lived by them or broke them became a moral issue. Over millions of years of human coexistence, five important moral foundations crystallized. These are: care versus harm; fairness versus cheating; loyalty versus betrayal; authority versus subversion; sanctity versus degradation (Graham et al., 2013).
Of these, the principle of sanctity may need explanation. For primitive tribes, as throughout human history, tradition and respect for the god(s) were an important cohesive force, and violating them could be fatal. It is well known that breaking a taboo led to exile, which could even end in voodoo death. The Bible speaks of unclean spirits; strangers (because they may carry infection) are likewise unclean, as are the sick and the mentally ill, who therefore had to be shut away. Contaminated or spoiled food is unclean and provokes disgust, but greed and avarice are unclean as well.
Like many other abilities and qualities, moral sense is genetically fixed in us.
Violations of these principles, for example when a mother does not care for her child, when someone deceives others, betrays their own people, a child slaps their father, or someone spits on the national flag, immediately trigger a strong emotional reaction in us. All of this is the work of the moral sense we are born with. Of course, if asked, we can justify why we condemn such behavior, but it is not those reasons that make us feel what we feel; the moral judgment is the emotional expression of an unpleasant bodily state.
The evolutionary roots of virtue
All our mental traits and propensities were shaped by evolution, so human virtues such as altruism, willingness to cooperate, solidarity among kin, honesty, and generosity emerged and became heritable traits because they enhance an individual's success in community life (Bereczkei, 2009). A quality such as altruism could spread because if everyone is selfless when they have the opportunity, then in the long run altruism pays off: whoever is in need can count on the help of others. Cheaters exploit precisely this reciprocity by taking advantage of the altruism of others. The sophistication of evolution's effect on the formation of mental traits is shown by the fact that the human brain has developed a "cheater detector" that spots cheaters automatically (Van Lier et al., 2013). Our analysis of faces is so accurate that we can identify likely cheaters even among unfamiliar faces (Yamagishi et al., 2003). In this light, the automatism with which we judge someone's moral or rule-breaking behavior is not so surprising. It is important to keep in mind, however, that while these mental mechanisms worked well in the Stone Age, in modern circumstances they sometimes fail or mislead us.
Physical signs and moral decisions
How are these "intuitive" moral judgments born? Just think of phrases like "my stomach clenched", "it took my breath away", "I froze at the thought". They all stem from the old observation that our body responds to every perception and thought with a characteristic pattern of bodily states. These are not simply the physical states that accompany emotions; they are the emotions themselves. Mental tension literally involves muscle tension, and sedatives relieve anxiety and relax the muscles at the same time. This bodily-mental unity explains how prolonged anxiety and stress can cause physical illness (diarrhea, high blood pressure, and so on). In a sense, we "think" with our bodies: the brain detects bodily signals and reacts to them with behavior or thoughts (Damasio, 2006).
When we observe the violation of a moral rule, we experience unpleasant bodily feelings, and a moral judgment is formed. Those who showed greater disgust at the idea of, say, drinking after someone else or using a toilet after others were more likely to condemn "unclean" things such as gay marriage or abortion (Inbar et al., 2009). That is, gay marriage and abortion provoked disgust in them, like everything else that is unclean. If this sounds strange, just think of how often your body signals that you are afraid of heights, disgusted by a rat, uncomfortable around certain people, or excited about performing in front of others. This account of moral judgment is unusual only because we believe our judgments to be the result of thinking. Of course, sometimes they are. There are moral judgments we have "practiced", so that we "know" what to think in a given situation. And our knowledge of things also shapes us. If you believe that life begins at conception, you will condemn abortion from that point on, whereas if you consider only the fully developed fetus a living being, that is the point from which you will condemn it. If a doctor views a gratuity as recompense for his work, he does not feel guilty about accepting the payment.
When we perceive emotions and the body's vegetative patterns and make moral decisions, certain areas of the brain are especially active. It supports the account of intuitive moral judgment outlined here that when these areas are damaged, thinking is not necessarily impaired, yet the moral sense is lost. Psychopaths have long been described as "morally insane", because they have no moral sense even though they can comprehend moral rules. Modern brain-imaging procedures show that certain brain areas do not function in them; that is, when they violate a moral rule, their bodies do not produce the patterns that deter others from breaking it (Greene and Haidt, 2002).
Moral sense and ideology
People differ in how intensely their bodies react to various stimuli, and in which stimuli they react to. According to social intuitionist theory, the difference between conservative and liberal ideology ultimately stems from the fact that, of the five evolved moral foundations, followers of the two ideologies consider different ones important. The foundations fall into two groups: Care and Fairness are individual values, and liberals emphasize these; Loyalty, Authority, and Sanctity (respect for the sacred) are community values, which conservatives regard as fundamental. Of course, both ideologies attach some importance to the other foundations as well, but while liberals stress individual freedom, conservatives stress individual responsibility (Graham et al., 2009).
If ideologies can indeed be traced back to moral foundations, then conservatives' bodily reaction patterns to various stimuli should differ fundamentally from those of liberals. Numerous studies have confirmed that conservatives respond more strongly to negative, fear-inducing, uncertain, or disgusting things (Renshon et al., 2015; Oxley et al., 2008). The psychological basis of conservative ideology is thus that the individual feels better in a stable, secure, closed society in which patriotism, a cult of leadership, and tradition shape existence, there are no immigrants, and the social hierarchy is fixed; that is, there are no social upheavals, riots, or rebellions. On this view, conservatism is a way of thinking and behaving that prevents or reduces feelings of anxiety and threat (Jost et al., 2003). Of course, the liberal strategy has plenty of downsides too; just think of the wildness of early capitalism, built on the total freedom of the individual. Both approaches have their virtues and their weaknesses.
References
Bereczkei T: Az erény természete.[The nature of virtue] Typotex, 2009.
Cole, M; Cole, S: The Development of Children. Fourth ed. Worth Publ, New York, 2001.
Damasio, AR: Descartes' Error: Emotion, Reason and the Human Brain. Vintage Publ, 2006.
Graham J, Haidt J, Nosek BA. Liberals and conservatives rely on different sets of moral foundations. J Pers Soc Psychol. 2009 May;96(5):1029-46.
Graham, J; Haidt, J; Koleva, S; Motyl, M; Iyer, R; Wojcik, SP; Ditto, PH: Moral Foundations Theory: The Pragmatic Validity of Moral Pluralism. in: Zanna, MP; Devine, P; Olson, JM; Plant, A (eds.): Advances in Experimental Social Psychology Volume 47, 2013, Elsevier, pp: 55-130.
Greene J, Haidt J. How (and where) does moral judgment work? Trends Cogn Sci. 2002 Dec 1;6(12):517-523.
Haidt J. The emotional dog and its rational tail: a social intuitionist approach to moral judgment. Psychol Rev. 2001 Oct;108(4):814-34.
Inbar, Y; Pizarro, DA; Bloom, P: Conservatives are more easily disgusted than liberals. Cognition and Emotion, 2009, 23(4):714-725.
Jost JT, Glaser J, Kruglanski AW, Sulloway FJ. Political conservatism as motivated social cognition. Psychol Bull. 2003 May;129(3):339-75.
Oxley DR, Smith KB, Alford JR, Hibbing MV, Miller JL, Scalora M, Hatemi PK, Hibbing JR. Political attitudes vary with physiological traits. Science. 2008 Sep 19;321(5896):1667-70.
Renshon, J; Lee, JJ; Tingley, D: Physiological Arousal and Political Beliefs. Political Psychology, 2015, 36(5):569-585.
Van Lier J, Revlin R, De Neys W. Detecting cheaters without thinking: testing the automaticity of the cheater detection module. PLoS One. 2013;8(1):e53827.
Yamagishi, T., Tanida, S., Mashima, R., Shimoma, E., & Kanazawa, S. (2003). You can judge a book by its cover: Evidence that cheaters may look different from cooperators. Evolution and Human Behavior, 24(4), 290-301