Raphael Portela Chalhub
CHBE 443
November 13, 2013

RISK-BENEFIT ASSESSMENT: Three Mile Island & Chernobyl

In ethics, safety relates to the matter of people judging risks as acceptable or unacceptable, assuming they possess knowledge of those risks and are consequently basing their choices on their most settled value perspectives. Moreover, risk is an ultimately abstract notion and must thus be clarified: it is the potential that something unwanted or harmful may occur, such as bodily harm, economic loss, or environmental degradation. Factors that play a role in determining the acceptability of such risks during a safety assessment include whether the risks are accepted voluntarily, the effects of probability knowledge, the magnitude and proximity of the risks, whether potential victims are identifiable beforehand, and many others. In society today, many large projects are justified on the basis of risk-benefit analysis, which employs these ideas in a holistic manner; placing the two opposing concepts on a rational gauge and examining their many facets allows one to decide whether or not pursuing a venture is safe and viable. In order to illustrate these central topics in engineering ethics, the Three Mile Island and Chernobyl incidents are discussed below, focusing on their similarities and differences regarding their respective risk versus benefit assessments. [1]

Risk-benefit analyses are studies that attempt to answer the question of whether a product's worth or benefit outweighs the costs and risks connected with its use. Because risk-free products are essentially nonexistent, these examinations are of the utmost importance in all sorts of fields, especially those affecting the public at large, such as engineering. Although at first one might perceive such an investigation as something simple, consisting of just quantifying the risk as a function of benefit or vice versa, many complications tend to arise. Difficulties inherent in the process include the uncertainty in predicting benefit and risk, seeing as these lie in the future; the matter of applying relatively reliable discounts when accounting for delayed effects; converting the individual factors that affect the risk or benefit quantification to equivalent units; and so forth. [1]
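As a minimal numerical sketch of the discounting difficulty mentioned above (the cost C, discount rate r, and time horizon t below are invented for illustration and are not drawn from either case study), a delayed harm is often compared with present-day benefits by converting it to a present value:

\[
PV = \frac{C}{(1+r)^{t}}, \qquad \text{e.g.} \quad \frac{\$1{,}000{,}000}{(1.05)^{20}} \approx \$377{,}000 .
\]

Under this sketch, a harm costing one million dollars twenty years from now "counts" as only about $377,000 today, which illustrates how sensitive the whole analysis is to the assumed rate r and horizon t, and why discounting delayed effects is contested ground.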
The first accident studied refers to the Three Mile Island (TMI) incident, which occurred on March 28, 1979 in Pennsylvania. [1] By examining the events that led to the disaster, it is clear that the accident occurred because of mechanical failures in conjunction with human errors. For instance, an example of a detected mechanical failure was the fact that the valve responsible for lowering the pressure in the reactor remained open, contrary to what the control panel indicated; likewise, block valves had been left closed after maintenance work done two days earlier, leaving the auxiliary pumps unable to supply feedwater to the steam generator. [1]

In analyzing whether a reasonable risk versus benefit assessment was done prior to the episode, it is important to start by pointing out that the possibility of unsafe conditions at the plant had already been brought to light by the media. Instead of taking the accusations into consideration, the president of Metropolitan Edison – the power company responsible for TMI – chose to completely ignore this red flag, deeming the allegations to be "something less than a patriotic act". [1] Such action can most certainly be classified as irresponsible. As president of a nuclear power plant, it was his responsibility to conduct a thorough inspection to ascertain that the articles were indeed erroneous. Moreover, it was his duty to keep the neighboring populace under safe conditions, and thus it would have been wise to issue a public statement refuting the accusations in a thorough and rational manner, backed by tests and facts, instead of publicly dismissing the raised concern. Therefore, although the newspaper's claims might have seemed incorrect in the eyes of the president, it is clear that he committed the first of a series of risk-benefit analysis failures. [1]

The ambiguity of the control panel's display, for example, could have been avoided at little cost and high benefit: apparently the light in the control panel was not wrongfully indicating that the valve was closed, but instead signaled the status of the solenoid in the reactor. [2] Such a small detail would most likely have helped the operators diagnose the issue hours earlier. Another important point regards the fact that the plant's personnel and operators seemed not to have a thorough knowledge of the fundamentals behind TMI-2. For instance, once the SCRAM had begun, anyone with basic knowledge would have known that heat was still being generated by decay and had to leave somehow; such understanding would most certainly have helped in resolving the issue in a timelier manner. [1]

Moreover, a clear failure in performing reasonable risk-benefit analysis can be evidenced by examining the design of the respirators used by the personnel who stayed behind after the forced evacuation. These apparently made communication difficult, and seeing as such equipment is meant to be used during emergencies, it is evident that communication is of the utmost importance in such a scenario. An easy fix providing high benefit for low cost and risk would have been to install radio communicators in the equipment, ensuring that the men's performance in fixing such a time-sensitive issue was not further hindered. [1] All in all, the fact remains that the risk-benefit analysis performed by Metropolitan Edison was not satisfactory in terms of general equipment selection and personnel training, human errors included.

The second case discussed pertains to the Chernobyl incident, also a failure at a nuclear power plant complex. Unlike the TMI accident, where the catastrophe was the result of combined mechanical and human failures, the Chernobyl incident was mostly caused by operator imprudence and a series of explicit safety violations. [1] On April 25, 1986, a test was scheduled on Reactor 4 in order "to determine how long the mechanical inertia of the turbine-generator's rotating mass could keep the generator turning and producing electric power after the steam supply was shut off" [1]. However, Kiev requested an expected demand from the unit and the test had to be delayed. The first violation occurred at exactly that moment: the personnel had disabled the core-cooling emergency system for the test, and it remained turned off despite the postponement. [1]
Moreover, another error took place when the control device was not correctly reprogrammed to maintain the power at the desired lower level – between 700 and 1000 megawatts – yielding an extremely low output of 30 megawatts, a level at which the reactor behaves unstably. Instead of interrupting the test, the personnel decided to keep it running by raising the control rods, despite being aware of the low output. The measure was only sufficient to raise the power to 200 megawatts, significantly below the desired range, yet the test proceeded. It is clear that by this stage the risk-benefit analysis being employed by the operating crew was already quite flawed. Even without much knowledge of how nuclear power plants operate, the fact that successive measures to restore normal test conditions had failed is by itself a red flag; the personnel should have recognized that the risk of tweaking and re-tweaking the system to perform the test by the desired deadline outweighed the benefit at this point. [1]

An even more unambiguous display of their inability to perform a reasonable risk versus benefit assessment came subsequently, when "the operators at this point recognized that because of the instabilities in this reactor and the way the xenon poisoning builds up, once the reactor is shut down, they would have to wait a long time before starting it up again" [3]. Despite the mistakes mentioned above, they were aware of the instabilities of the reactor; but instead of worrying about how that affected the plant's safety, they focused on how such a condition would delay their work and performance as employees. Such a realization makes it obvious that the crew, despite possessing a reasonable understanding of the reactor, failed to sensibly judge the risks and benefits of continuing the examination. [1]

Furthermore, yet another safety violation took place when emergency signals and automatic shutdown controls were blocked so that the test could continue. The carelessness of the operators becomes even more apparent in this scenario, where the operation was clearly running under dangerous conditions. After the reactor reached an uncontrollable state, explosions occurred and radioactivity spread. [1]

An additional faulty risk-benefit analysis is illustrated by the fact that Sweden detected the increase in radioactivity before Moscow did. Clearly the U.S.S.R. was not appropriately prepared to handle such radioactive consequences; Moscow evidently had not developed a vital safe-exit strategy. It is absurd to think that villages a few miles away from the plant were only evacuated the next day. A prompt evacuation strategy is most certainly a measure with high benefit relative to cost that should have been in place from the very start of the plant's operation. [1]

In summary, despite the similarities and differences between the two accidents described above, the truth remains that regardless of the difficulty of performing risk-benefit analysis for all sorts of products and public services, such assessments cannot be taken lightly. Clearly, in both incidents preventive measures could have saved the lives of many; these investigations are of the utmost importance, for such pre-emptive steps can mitigate catastrophic consequences in the future. The truth is that building a completely safe product or plant is virtually impossible, and thus it is important to build systems that can adequately respond to failure. For this reason, in the engineering world, where one's contributions often affect the general public, engineers must constantly and repeatedly strive to employ reasonable risk-benefit analysis in their everyday work to ensure the safety of their communities.

Bibliography

[1] Martin, Mike W., and Roland Schinzinger. Ethics in Engineering. 4th ed. New York, NY: McGraw-Hill, 2005. 117-142. Print.
[2] Norman, Donald. The Design of Everyday Things. New York, NY: Basic Books, 1988. 43-44. Print.
[3] Ahearne, John F. "Nuclear Power After Chernobyl." Science 236 (1987): 673-679. Print.