Omission Bias

Primary author: Phil Guntle

Omission bias is the tendency for humans to prefer inaction over action, even when inaction leads to greater risk or a worse outcome. Connolly and Reb (2003) examined the vaccination decisions of undergraduate college students and nonparent adults. They found that attitudes toward vaccination reflected less a bias toward action or inaction and more the regret people expected to feel if they did not get the vaccine. This study illustrates how difficult it is to design a scientific study that measures this bias. Omission bias is part of a larger group of cognitive constructs, each of which focuses on mental processes inferred from behavior (Kutluk, Sengel, & Kilickaya, 2010). There are many different biases, and it can be hard to tell the underlying thought processes apart because so many things affect behavior. Omission bias is often mistaken for the status quo bias, the tendency not to change an established behavior unless the incentive to change is compelling (Samuelson & Zeckhauser, 1988).

Omission bias often comes up when discussing the morality of a decision. Families confront this bias frequently because it requires people to make moral judgments. For instance, a teenager who becomes unintentionally pregnant may struggle to tell her parents. The moral choice between taking action and speaking to them versus remaining inactive and hiding the pregnancy for as long as possible weighs on that teenager's shoulders. It is hard to determine whether omission bias was present in such a case because so many factors influence her decision.

A famous thought experiment called //the trolley problem// illustrates the idea of omission bias. The problem runs: //A trolley is running out of control down a track. In its path are five people who have been tied to the track by a mad philosopher. Fortunately, you could flip a switch, which will lead the trolley down a different track to safety. Unfortunately, there is a single person tied to that track. Should you flip the switch or do nothing?// Mikhail (2007) used this problem in cognitive science research to examine whether age, gender, education, or cultural background changes the decision. He found that those factors do influence moral judgment, and he suggested that when people are placed in a situation where morality is already compromised, decisions are made by an unconscious moral grammar that selects the morals best suited to each situation.

Stanovich and West (2008), across seven different studies, found that omission bias is uncorrelated with cognitive ability. When the brain is presented with an opportunity for omission bias, the cognitive process tries to rationalize the decision, making the bias hard for some people to avoid. In my own words, the thought process tries to make the best decision possible, so it attempts to avoid any moral conflict; if that is not possible, it tries to minimize any immediate negative reaction. This relates to an example from earlier in the semester: given the choice of 50 dollars now or 75 dollars in one year, many people select the 50 dollars now. The immediate inaction in omission bias is like choosing the 50 dollars right now while losing out on the larger reward later, which is the worse outcome.

References

Connolly, T., & Reb, J. (2003). Omission bias in vaccination decisions: Where's the omission? Where's the bias? //Organizational Behavior and Human Decision Processes, 91//(2), 186-202. Retrieved from http://cat.inist.fr/aModele=afficheN&cpsidt=1499151

Mikhail, J. (2007). Universal moral grammar: Theory, evidence and the future. //Trends in Cognitive Sciences, 11//(4), 143-152.

Samuelson, W., & Zeckhauser, R. (1988). Status quo bias in decision making. //Journal of Risk and Uncertainty, 1//, 7-59. Retrieved from dtserv2.compsy.uni-jena.de/.../samuelson_zeckhauser_1988.pdf

Stanovich, K., & West, R. (2008). On the relative independence of thinking biases and cognitive ability. //Journal of Personality and Social Psychology, 94//(4), 672-695.