Moralistic fallacy
The moralistic fallacy is the informal fallacy of assuming that an aspect of nature which has socially unpleasant consequences cannot exist. Its typical form is "if X were true, then Z would happen!", where Z is something morally, socially, or politically undesirable. What ought to be moral is assumed a priori to also occur naturally. The moralistic fallacy is sometimes presented as the inverse of the naturalistic fallacy. It could, however, be seen as a variation of the very same naturalistic fallacy; the difference between them could be considered pragmatic, depending on the intentions of the person using the argument: it is the naturalistic fallacy if the user wants to justify existing social practices on the ground that they are natural, and the moralistic fallacy if the user wants to combat existing social practices by denying that they are natural.
Steven Pinker writes that "[t]he naturalistic fallacy is the idea that what is found in nature is good. It was the basis for social Darwinism, the belief that helping the poor and sick would get in the way of evolution, which depends on the survival of the fittest. Today, biologists denounce the naturalistic fallacy because they want to describe the natural world honestly, without people deriving morals about how we ought to behave (as in: If birds and beasts engage in adultery, infanticide, cannibalism, it must be OK)." Pinker goes on to explain that "[t]he moralistic fallacy is that what is good is found in nature. It lies behind the bad science in nature-documentary voiceovers: lions are mercy-killers of the weak and sick, mice feel no pain when cats eat them, dung beetles recycle dung to benefit the ecosystem and so on. It also lies behind the romantic belief that humans cannot harbor desires to kill, rape, lie, or steal because that would be too depressing or reactionary."
Examples of the moralistic fallacy include:
- Warfare is destructive and tragic, and so it is not part of human nature.
- Eating meat harms animals and the environment, and so no one has physiological use for it.
- Men and women ought to be given equal opportunities, and so women and men can do everything equally well.
- Unfaithfulness is immoral, and so it is unnatural to feel desire for others when in a monogamous relationship.
- The pill I am taking should have therapeutic effects on me, and so it does have therapeutic effects on me. (An instance of the placebo effect.)
Conversely, examples of the naturalistic fallacy include:
- Warfare must be allowed because human violence is instinctive.
- Veganism is foolish because humans have eaten meat for thousands of years.
- Men and women should not have the same roles in society because men have more muscle mass and women can give birth.
- Adultery is acceptable because people can naturally want more sexual partners.
Effects on science and society
Basic scientific findings or interpretations are sometimes rejected, or their discovery, development, or acknowledgement opposed or restricted, on the ground that the knowledge could be misused or is harmful in itself.
In the late 1970s, Bernard Davis, in response to growing political and public calls to restrict basic research (versus applied research), amid criticisms of dangerous knowledge (versus dangerous applications), applied the term moralistic fallacy toward its present use.
(The term had been used as early as 1957, with at least somewhat similar, if differing, import.)
In natural science, the moralistic fallacy can result in the rejection or suppression of basic science, whose goal is understanding the natural world, on account of its potential misuse in applied science, whose goal is the development of technology or technique. The fallacy thus blurs the line between scientific assessment, conducted in the natural sciences (such as physics or biology), and significance assessment, weighed in the social sciences (such as social psychology, sociology, and political science) or in the behavioral sciences (such as psychology).
Davis asserted that in basic science, the descriptive, explanatory, and thus predictive power of information is primary, not its origin or its applications, since knowledge cannot be ensured against misuse, and misuse cannot falsify knowledge. Both the misuse of scientific work and the suppression of scientific knowledge can have harmful effects. In the early 20th century, the development of quantum physics made the atomic bomb possible in the mid 20th century; without quantum physics, however, much of the technology of communications and imaging might have been impossible.
Scientific theories with abundant research support can still be discarded in public debate, where general agreement is decisive yet can be utterly mistaken. The obligation of basic scientists to inform the public, however, can be stymied by contrary claims from others, both rousing alarm and touting assurances that the public is being protected. Davis argued that greater and clearer public familiarity with the uses and limitations of science would more effectively prevent the misuse of knowledge or resulting harm.
Natural science can help humans understand the natural world, but it cannot make policy, moral, or behavioral decisions. Questions involving values (what people should do) are more effectively addressed through discourse in the social sciences, not by restricting basic science. Misunderstanding of the potential of science, and misplaced expectations of it, have resulted in moral and decision-making impediments, but suppressing science is unlikely to resolve these dilemmas.
Seville Statement on Violence
The Seville Statement on Violence was adopted on 16 May 1986 by an international meeting of scientists convened in Seville, Spain, by the Spanish National Commission for UNESCO. UNESCO adopted the statement on 16 November 1989, at the twenty-fifth session of its General Conference. The statement purported to refute "the notion that organized human violence is biologically determined".
Some, including Steven Pinker, have criticized the Seville Statement as an example of the moralistic fallacy. Research in evolutionary psychology and neuropsychology suggests that human violence has biological roots.
- Sailer, Steve (October 30, 2002). "Q&A: Steven Pinker of 'Blank Slate'". UPI. Archived from the original on December 5, 2015. Retrieved December 5, 2015.
- Davis BD (1978). "The moralistic fallacy". Nature. 272 (5652): 390. doi:10.1038/272390a0. PMID 11643452.
- Moore EC (1957). "The Moralistic Fallacy". The Journal of Philosophy. 54 (2): 29–42. doi:10.2307/2022356. JSTOR 2022356.
- Davis BD (2000). "The scientist's world". Microbiol Mol Biol Rev. 64 (1): 1–12. doi:10.1128/MMBR.64.1.1-12.2000. PMC 98983. PMID 10704471.
- Kreutzberg GW (2005). "Scientists and the marketplace of opinions". EMBO Rep. 6 (5): 393–96. doi:10.1038/sj.embor.7400405. PMC 1299311. PMID 15864285.
- Davis BD (2000), section "Technology".
- Davis BD (2000), section "Limited scope of science".
- Suter, Keith (2005). 50 Things You Want to Know About World Issues... But Were Too Afraid to Ask. Milson's Point, NSW, Australia: Transworld Publishers. ISBN 978-1-86325-503-5.
- Pinker, Steven. How the Mind Works. New York: W. W. Norton & Company, 1997, pp. 44, 49.
- Jones D (2008). "Human behaviour: Killer instincts". Nature. 451 (7178): 512–15. doi:10.1038/451512a. PMID 18235473.
- May ME & Kennedy CH (2009). "Aggression as positive reinforcement in mice under various ratio- and time-based reinforcement schedules". Journal of the Experimental Analysis of Behavior. 91 (2): 185–96. doi:10.1901/jeab.2009.91-185. PMC 2648522. PMID 19794833.