Philosophy 162F
Lecture 6 Notes
Corporate Character and Individual Responsibility
Russell P. Boisjoly, Ellen Foster Curtis and Eugene Mellican, "Roger Boisjoly and the Challenger Disaster: The Ethical Dimensions"

The Challenger disaster was an example of an accident that many people, experts on the nature of the accident in question, tried to prevent. Businesses rely on experts to deliver the basis on which to make business decisions. Businesses, whether for-profit or non-profit, must often take risks. Ideally, businesses assess the probability of a risk along with the nature, extent, and magnitude of the possible harms and benefits. Some risks are deemed too great for certain outcomes. Generally, risks involving loss of life are acceptable only if the probability is low and the potential benefits are great.

For any business that relies on expert testimony in making decisions of dire risk, it is important that the management structure of that business take the best account of expert testimony that it can. In cases of dire risk, the organizational structure of a business may not be merely a bottom-line concern; it is an ethical concern due to the dangers involved.

"By focusing on the problematic relationship between individual and organizational responsibility, this analysis reveals that the organizational structure governing the space shuttle program became the locus of responsibility in such a way that not only did it undermine the responsibilities of individual decision makers within the process, but it also became a means of avoiding real, effective responsibility throughout the entire system." (Pg. 131)

The claim made by Boisjoly, et al. is that the organizational structure impeded the activity of ethical decision making.

Two types of responsibility:
The second type of responsibility can be seen as a way to get corporations to act in a way that is analogous to the way that individual responsibility (in this sense) operates. Organizations have no mind to assess and act on a situation. To mimic this process, an organization can have formal rules for specific agents to review information, make recommendations, and take action. These formal rules often have strict requirements of evidence before a decision can be endorsed by the organization. Strict rules that impede or slow down action are good when attempting to prevent the initiation of courses of action that may cause harm. However, these same rules can be a problem if immediate action must be taken by the organization to prevent ongoing harm or harm that is likely to occur in the near future.

Example of Structure Impeding Decision: Level III to Level II (pp. 131-132)
The formal structure of decision making within the organization required individual managers to base their decisions upon formal memos and strict chain of command. Thus a serious flaw in the shuttle was legitimately not part of the knowledge of individual managers:
This type of situation is not unique to businesses that work on the basis of engineering reports. In any business with employees, there are certain concerns about those employees that, while important, must nevertheless be put aside for organizational reasons. For example, an employer or a manager may have to assign duties, or even extra reward, to an employee whom they do not trust with the task.
Boisjoly, et al. identify part of the organizational problem at NASA and MTI as groupthink. Their definition of groupthink comes from Irving L. Janis (Victims of Groupthink, 1972): "a mode of thinking that people engage in when they are deeply involved in a cohesive in-group, when the members' strivings for unanimity override their motivation to realistically appraise alternative courses of action. Groupthink refers to the deterioration of mental efficiency, reality testing, and moral judgment that results from in-group pressures." (Janis, pg. 9)

Boisjoly, et al. suggest that instances of groupthink influenced the way that the information about the O-ring problems was processed at different points in the organization. The organizational structure turned from a demand for documentation in order to assess actual probabilities of risk into an adversarial system that placed a high burden of proof on arguments against the status quo.
Though this analysis is useful, Boisjoly, et al. do not want the organizational analysis of the disaster to obfuscate the actions of individuals as parts of this organization. The organizational structure makes demands of individuals at different stages to take personal responsibility for different aspects of the launch. This must be recognized and highlighted. Even though the organizational structure should change, individuals must take responsibility for their actions in refusing to acknowledge the evidence in their possession.
The commission investigating the disaster reported that, "There was no system in place which made it imperative that launch constraints be considered by all levels of management." (Report of the Presidential Commission on the Space Shuttle Challenger Accident, 1986, pg. 104) This contradicts the findings of the Commission that there had been a violation of such a system (pg. 113 of text) and that the individual responsible for this failure had been identified (pg. 104).
Criteria for holding individuals responsible for an outcome:
The Challenger disaster shows that no matter the organizational structure, individuals can still play key roles in the outcome of the actions of the organization.
Lisa Belkin, "How Can We Save the Next Victim?"

Belkin's piece comes to a position that is almost the opposite of that of Boisjoly, et al. Boisjoly, et al. argue that we must look past systematic features to reveal the human causes of the outcome. Belkin argues that we must look past individual human actions to see the systematic effects that brought about the outcome. The recent history of medical error is an example of systems where there is no blame to attach to any individual. Systematic hospital errors, she claims, are often the result of trusting in the ability of individuals to correct for mistakes that can only be corrected for systematically. Belkin draws our attention to ways that we can improve these systems, in medicine and in other organizations.

Belkin does allow that there may be some difference between systemic medical error and error in other systems. Other organizations usually only have to deal with difficulties in personnel and equipment. Medical organizations have an additional complication in the form of patients. It is also worth noting that Belkin is writing about error. What she says may not be applicable to cases where there is genuine malfeasance on the part of an organization or its management.
Jose Martinez Case
The fallout from this incident was that the hospital in question was placed on accreditation watch by a regulatory body. This designation was intended to encourage the hospital to report on and address the systematic processes that had led to the death, thereby reducing the likelihood of similar errors.
In many of the cases that Belkin cites, the original source of error is innocuous. This innocuous source of error is then compounded when individuals within the system rely upon the results of the initial error in carrying out their tasks. Successful organizations, she seems to suggest, are those that adopt procedures to independently recheck crucial elements of their processes. An important part of this is making sure that information flows back and forth between the individuals who carry out the diverse elements of complicated procedures. This is not something that can be accounted for simply by individuals acting as individuals.