November 28, 2022

Large-Scale Disasters: Mechanistic Framework for Prediction, Control and Mitigation

Mohamed Gad-el-Hak, Ph.D.


The subject of large-scale disasters is broadly introduced in this article. Both the art and science of predicting, preventing and mitigating natural and manmade disasters are discussed. A universal, quantitative metric that puts all natural and manmade disasters on a common scale is proposed. Issues of prediction, control and mitigation of catastrophes are presented. The laws of nature govern the evolution of any disaster. In some cases, as for example weather-related disasters, the first-principles laws of classical mechanics could be written in the form of field equations, but exact solutions of these often nonlinear differential equations are impossible to obtain particularly for turbulent flows, and heuristic models together with intensive use of supercomputers are necessary to proceed to a reasonably accurate forecast. In other cases, as for example earthquakes, the precise laws are not even known and prediction becomes more or less a black art. Management of any type of disaster is more art than science. Nevertheless, much can be done to alleviate the resulting pain and suffering. The expansive presentation of the broad field of large-scale disasters precludes a detailed coverage of any one of the many topics touched upon. Three take-home messages are conveyed, however: a universal metric for all natural and manmade disasters is presented; all facets of the genre are described; and a proposal is made to view all disasters as dynamical systems governed for the most part by the laws of classical mechanics.


In this article, the subject of large-scale disasters is broadly introduced. Both the art and science of predicting, preventing and mitigating natural and manmade disasters are discussed. A universal, quantitative metric that puts all natural and manmade disasters on a common scale is proposed. Issues of prediction, control and mitigation of catastrophes are presented. The expansive presentation of the many facets of disaster research precludes a detailed coverage of any one of the many topics covered. We merely scratch the surface of a broad subject that may be of interest to all those who view the world mechanistically. The hope is that readers of the Journal of Critical Incident Analysis who are not already involved in this aspect of disaster research will benefit from this particular viewpoint, whose practical importance cannot be overstated. The article is excerpted from Chapter 2 of the book edited by Gad-el-Hak (2008).

Are Disasters a Modern Curse?

Although it appears that way when the past few years are considered, large-scale disasters have been with us since Homo sapiens set foot on this third planet from the Sun. Frequent disasters struck the Earth even before then, as far back as the time of its formation around 4.5 billion years ago. In fact, the geological Earth that we know today is believed to be the result of agglomeration of the so-called planetesimals and subsequent impacts of bodies of similar mass (Huppert, 2000). The planet was left molten after each giant impact, and its outer crust was formed on radiative cooling to space. Those were the “good” disasters perhaps. On the bad side, there have been several mass extinctions throughout the Earth’s history. The dinosaurs, along with about 70% of all species existing at the time, became extinct because a large meteorite struck the Earth 65 million years ago and the resulting airborne dust partially blocked the Sun, thus making it impossible for cold-blooded animals to survive. However, if we concern ourselves with our own warm-blooded species, then starting 200,000 years ago, ice ages, famines, infections, and attacks from rival groups and animals were constant reminders of human vulnerability. On average, there are about three large-scale disasters that strike the Earth every day, but only a few of these natural or manmade calamities make it to the news. Humans have survived because we were programmed to do so. We return to this point in Section 6.


Because of the nature of the subject, a few of the topics discussed are not mainstream for this journal, for example the mechanistic aspects of disasters. The mechanics of disasters are covered more extensively, but even here we begin the conversation rather than actually solve specific problems. Appropriate references are provided, however, to close the gap.

The article is organized as follows. We begin by proposing a metric by which disasters are sized in terms of the number of people affected and/or the extent of the geographic area involved. In Section 3, the different facets of large-scale disasters are described. The science, particularly the mechanics, of disasters is outlined in Section 4. Sections 5–7 respectively cover the art of disaster management, a bit of sociology, and a few recent disasters as examples. Finally, brief concluding remarks are given in Section 8.

Disaster Scope


There is no easy answer to the question of whether a particular disaster is large or small. The mild injury of one person may be perceived as catastrophic by that person or by his or her loved ones. What we consider herein, however, is the adverse effects of an event on a community or an ecosystem. What makes a disaster a large-scale one is the number of people affected by it and/or the extent of the geographic area involved. Such a disaster taxes the resources of local communities and central governments. Under the weight of a large-scale disaster, a community diverges substantially from its normal social structure. Return to normalcy is typically a slow process that depends on the severity, but not the duration, of the antecedent calamity as well as the resources and efficiency of the recovery process.

The extreme event could be natural, manmade, or a combination of the two in the sense of a natural disaster made worse by humans’ past actions. Examples of naturally occurring disasters include earthquakes, wildfires, pandemics, volcanic eruptions, mudslides, floods, droughts, and extreme weather phenomena such as ice ages, hurricanes, tornadoes, and sandstorms. Human foolishness, folly, meanness, mismanagement, gluttony, unchecked consumption of resources, or simply sheer misfortune may cause wars, energy crises, economic collapse of nations or corporations, market crashes, fires, global warming, famine, air/water pollution, urban sprawl, desertification, deforestation, bus/train/airplane/ship accidents, oil slicks, or terrorist acts. Citizens suffering under the tyranny of a despot or a dictator can also be considered under the duress of a disaster, and, of course, genocide, ethnic cleansing and other types of mass murder are gargantuan disasters that often test the belief in our own humanity. Although technological advances have exponentially increased human prosperity, they have also provided humans with more destructive power. Manmade disasters caused the death of at least 200 million people during the twentieth century, a cruel age without equal in the history of man (de Boer & van Remmen, 2003).

In addition to the degree or scope of a disaster, there is also the issue of the rapidity of the calamity. Earthquakes, for example, occur over extremely short time periods measured in seconds, whereas anthropogenic catastrophes such as global warming and air and water pollution are often slowly evolving disasters, their duration measured in years and even decades or centuries, although their devastation, over the long term, can be worse than that of a rapid, intense calamity (McFedries, 2006). The painful, slow death of a cancer patient who contracted the dreadful disease as a result of pollution is just as tragic as the split-second demise of a human at the hands of a crazed suicide bomber. The latter type of disaster makes the news, but the former does not. This is quite unsettling because the death of many spread over years goes unnoticed for the most part. The fact that 100 persons die in a week in a particular country as a result of starvation is not a typical news story. However, 100 humans perishing in an airplane crash will make CNN headlines for days.

For the disaster’s magnitude, how large is large? Much the same as is done to individually size hurricanes, tornadoes, earthquakes, and, very recently, winter storms, we propose herein a universal metric by which all types of disaster are sized in terms of the number of people affected and/or the extent of the geographic area involved. This quantitative scale applies to both natural and manmade disasters. The suggested scale is nonlinear, logarithmic in fact, much the same as the Richter scale used to measure the severity of an earthquake. Thus, moving up the scale requires an order of magnitude increase in the severity of the disaster as it adversely affects people or an ecosystem. Note that a disaster may affect only a geographic area without any direct and immediate impact on humans. For example, a wildfire in an uninhabited forest may have long-term adverse effects on the local and global ecosystem, although no human is immediately killed, injured, or dislocated as a result of the event.


Figure 1.  Classification of disaster severity.


The scope of a disaster is determined if at least one of two criteria is met, relating to either the number of displaced/tormented/injured/killed people or the adversely affected area of the event. We classify disaster types as being of Scopes I to V, according to the scale pictorially illustrated in Figure 1. For example, if 70 persons were injured as a result of a wildfire that covered 20 km2, this would be considered Scope III, large disaster (the larger of the two categories II and III). However, if 70 persons were killed as a result of a wildfire that covered 2 km2, this would be considered Scope II, medium disaster. An unusual example, at least in the sense of even attempting to classify it, is the close to 80 million citizens of Egypt (area slightly larger than 1 million sq. km) who have been tormented for more than a half-century by a virtual police state.1 This manmade cataclysm is readily stigmatized by the highest classification, Scope V, gargantuan disaster.
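The rule that the larger of the two criteria governs can be sketched in a few lines of code. The numeric scope boundaries below are read off the order-of-magnitude scale described in the text (Figure 1 itself is not reproduced here), and the function names are invented for illustration:

```python
# Sketch of the proposed metric: the scope is the larger of two
# order-of-magnitude brackets, one for persons affected and one for area.
# Boundaries are taken from the text: 10..10,000 persons, 1..1,000 km^2.
PERSON_BOUNDS = (10, 100, 1_000, 10_000)   # Scope I below 10; Scope V at 10,000+
AREA_BOUNDS = (1, 10, 100, 1_000)          # in km^2

SCOPES = ("I", "II", "III", "IV", "V")     # text names II medium, III large, V gargantuan

def bracket(value, bounds):
    """Index 0..4 of the order-of-magnitude bracket containing value."""
    return sum(value >= b for b in bounds)

def disaster_scope(persons_affected, area_km2):
    """The larger of the two criteria governs, as in the wildfire examples."""
    return SCOPES[max(bracket(persons_affected, PERSON_BOUNDS),
                      bracket(area_km2, AREA_BOUNDS))]

print(disaster_scope(70, 20))  # wildfire, 70 injured over 20 km^2 -> "III"
print(disaster_scope(70, 2))   # wildfire, 70 killed over 2 km^2   -> "II"
```

Both wildfire examples from the text, as well as the Egypt example (tens of millions of people, about a million square kilometers), classify as the text states.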

The quantitative metric introduced herein is contrasted to the conceptual scale devised by Fischer (2003a; 2003b), which is based on the degree of social disruption resulting from an actual or potential disaster. His ten disaster categories are based on the scale, duration, and scope of disruption and adjustment of a normal social structure, but those categories are purely qualitative. For example, Disaster Category 3 (DC-3) is indicated if the event partially strikes a small town (major scale, major duration, partial scope), whereas DC-8 is reserved for a calamity massively striking a large city (major scale, major duration, major scope). The recent article by Gad-el-Hak (2010) provides further discussion on the continuing debate of defining, scoping and categorizing disasters and their impact.

The primary advantage of having a universal classification scheme such as the one proposed herein is that it gives officials a quantitative measure of the magnitude of the disaster so that a proper response can be mobilized and adjusted as warranted. The metric suggested applies to all types of disaster. It puts them on a common scale, which is more informative than the variety of scales currently used for different disaster types: the Saffir–Simpson scale for hurricanes, the Fujita scale for tornadoes, the Richter scale for earthquakes, and the recently introduced Northeast Snowfall Impact Scale (notable, significant, major, crippling, extreme) for the winter storms that occasionally strike the northeastern region of the United States. Of course, the individual scales also have their utility; for example, knowing the range of wind speeds in a hurricane as provided by the Saffir–Simpson scale is a crucial piece of information to complement the number of casualties the proposed scale supplies. In fact, a prediction of wind speed allows estimation of potential damage to people and property. The proposed metric also applies to disasters, such as terrorist acts or droughts, for which no quantitative scale is otherwise available to measure their severity.

In formulating all scales, including the proposed one, a certain degree of arbitrariness is unavoidable. In other words, none of the scales are totally objective. The range of 10 to 100 persons associated with a Scope II disaster, for example, could very well be 20 to 80, or some other range. What is important is the relative comparison among various disaster degrees; a Scope IV disaster causes an order of magnitude more damage than a Scope III disaster, and so on. One could arbitrarily continue beyond five categories, always increasing the influenced number of people and geographic area by an order of magnitude, but it seems that any calamity adversely affecting more than 10,000 persons or 1,000 km2 is so catastrophic that a single Scope V is adequate to classify it as a gargantuan disaster. The book Catastrophe is devoted to analyzing the risk of and response to unimaginable but not impossible calamities that have the potential of wiping out the human race (Posner, 2004). Curiously, its author, Richard A. Posner, is a judge in the U.S. Seventh Circuit Court of Appeals.

In the case of certain disasters, the scope can be predicted in advance to a certain degree of accuracy; otherwise, the scope can be estimated shortly after the calamity strikes, with frequent updates as warranted. The magnitude of the disaster should determine the size of the first-responder contingent to be deployed; which hospitals to mobilize and to what extent; whether military forces should be involved; and what resources, such as food, water, medicine, and shelter, should be stockpiled and delivered to the stricken area. Predicting the scope should facilitate the subsequent recovery and accelerate the return to normalcy. The proposed metric is systematically applied in Section 7.13 to the thirteen disasters used as prototypical examples in Sections 7.1–7.12.

Facets of Large-Scale Disasters

A large-scale disaster is an event that adversely affects a large number of people, devastates a large geographic area, and taxes the resources of local communities and central governments. Although disasters can occur naturally, humans can cause their share of devastation. There is also the possibility of human actions causing a natural disaster to become more damaging than it would otherwise be. An example of such an anthropogenic calamity is the intense coral reef mining off the Sri Lankan coast, which removed the sort of natural barrier that could mitigate the force of waves. As a result of such mining, the 2004 Indian Ocean tsunami devastated Sri Lanka much more than it would have otherwise. A second example is the soil erosion caused by overgrazing, farming, and deforestation. In April 2006, wind from the Gobi Desert dumped 300,000 tons of sand and dust on Beijing, China. Such gigantic dust tempests—exacerbated by soil erosion—blow around the globe, making people sick, killing coral reefs, and melting mountain snow packs continents away. Examples such as this incited the 1995 Nobel laureate and Dutch chemist Paul J. Crutzen to dub the present geological period the Anthropocene, to characterize humanity’s adverse effects on global climate and ecology.

What could make the best of a bad situation is the ability to predict the disaster’s occurrence, location, and severity. This can help in preparing for the calamity and in evacuating large segments of the population out of harm’s way. For certain disaster types, evolution equations can be formulated, mostly from a mechanistic viewpoint. Predictions can then be made to different degrees of success using heuristic models, empirical observations, and supercomputers. Once formed, the path and intensity of a hurricane, for example, can be predicted to a reasonable degree of accuracy up to one week in the future. This provides sufficient warning to evacuate several medium or large cities in the path of the extreme event. However, smaller-scale severe weather such as tornadoes can only be predicted up to 15 minutes in the future, giving a very small window for action. Earthquakes cannot be predicted beyond stating that there is a certain probability of occurrence of a certain magnitude earthquake at a certain geographic location during the next 50 years. Such predictions are almost as useless as stating that the Sun will burn out in a few billion years.

Once disaster strikes, mitigating its adverse effects becomes the primary concern: how to save lives, take care of the survivors’ needs, and protect property from further damage. Dislocated people need shelter, water, food, and medicine. Both the physical and the mental health of the survivors, as well as of relatives of the deceased, can be severely jeopardized. Looting, price gouging, and other law-breaking activities need to be contained, minimized, or eliminated. Hospitals need to prioritize and even ration treatments, especially in the face of the practical fact that the less seriously injured tend to arrive at emergency rooms first, perhaps because they transported themselves there. Roads need to be operable and free of landslides, debris, and traffic jams for the unhindered flow of first responders and supplies to the stricken area, and of evacuees and ambulances from it. This is not always the case, especially if the antecedent disaster damages most if not all roads, as occurred after the 2005 Kashmir Earthquake. Buildings, bridges, and roads need to be rebuilt or repaired, and power, potable water, and sewage systems need to be restored.

Figure 2 depicts the different facets of large-scale disasters. The important thing is to judiciously employ the finite resources available to improve the science of disaster prediction, and to artfully manage the resulting mess to minimize loss of life and property.

The Science of Disaster Prediction and Control


Science, particularly classical mechanics, can help predict the course of certain types of disasters. When, where, and how intensely would a severe weather phenomenon strike? Are the weather conditions favorable for extinguishing a particular wildfire? What is the probability of a particular volcano erupting? How about an earthquake striking a population center? How much air and water pollution is going to be caused by the addition of a factory cluster to a community? How would a toxic chemical or biological substance disperse in the atmosphere or in a body of water? Below a certain concentration, certain dangerous substances are harmless, and “safe” and “dangerous” zones could be established based on the dispersion forecast. The degree of success in answering these and similar questions varies dramatically. Once formed, the course and intensity of a hurricane (tropical cyclone), which typically lasts from inception to dissipation for a few weeks, can be predicted about one week in advance. The path of the much smaller and shorter-lived, albeit more deadly, tornado can be predicted only about 15 minutes in advance, although weather conditions favoring its formation can be predicted a few hours ahead.

Earthquake prediction is far from satisfactory but is seriously attempted nevertheless. The accuracy of predicting volcanic eruptions is somewhere in between those of earthquakes and severe weather. Patané, de Gori, Chiarabba and Bonaccorso (2006) report on the ability of scientists to “see” inside Italy’s Mount Etna and forecast its eruption using seismic tomography, a technique similar to that used in computed tomography scans in the medical field. The method yields time photographs of the three-dimensional movement of rocks to detect their internal changes. The success of the technique is in no small part due to the fact that Europe’s biggest volcano Mount Etna is equipped with a high-quality monitoring system and seismic network, tools that are not readily available for most volcanoes.

Science and technology can also help control the severity of a disaster, but here the achievements to date are much less spectacular than those in the prediction arena. Cloud seeding to avert drought is still far from being a routine, practical tool. Nevertheless it has been tried since 1946. In 2008, Los Angeles county officials used the technique as part of a drought-relief project that used silver iodide to seed clouds over the San Gabriel Mountains to ward off fires. China employed the same technology to bring some rain and clear the air before the 2008 Beijing Summer Olympics. Despite the difficulties, cloud seeding is still a notch more rational than the then Governor of Texas George W. Bush’s 1999 call in the midst of a dry period to “pray for rain.”

Slinging a nuclear device toward an asteroid or a meteor to avert its imminent collision with Earth remains solidly in the realm of science fiction (in the 1998 film Armageddon, a Texas-size asteroid was courageously nuked from its interior!). In contrast, employing scientific principles to combat a wildfire is doable, as is the development of scientifically based strategies to reduce air and water pollution, moderate urban sprawl, evacuate a large city, and minimize the probability of accident for air, land, and water vehicles. Structures could be designed to withstand an earthquake of a given magnitude, wind of a given speed, and so on. Dams could be constructed to moderate the flood–drought cycles of rivers, and levees/dikes could be erected to protect land below sea level from the vagaries of the weather. Storm drains, fire hydrants, fire-retardant materials, sprinkler systems, pollution control, simple hygiene, strict building codes, traffic rules and regulations in air, land and sea, and many other examples are the measures a society should take to mitigate or even eliminate the adverse effects of certain natural and manmade disasters. Of course, there are limits to what we can do. Although much better fire safety will be achieved if a firehouse is erected, equipped, and manned around every city block, and less earthquake casualties will occur if every structure is built to withstand the strongest possible tremor, the prohibitive cost of such efforts clearly cannot be justified or even afforded.

At the extreme scale, geoengineering is defined as options that would involve large-scale engineering of our environment in order to combat or counteract the effects of changes in atmospheric chemistry. Along those lines, Nobel laureate Paul Crutzen has proposed a method of artificially cooling the global climate by releasing particles of sulfur in the upper atmosphere, which would reflect sunlight and heat back into space. Scientists are taking the controversial proposal seriously because Crutzen has a proven track record in atmospheric research. Sponsored by the U.S. National Science Foundation, a scientific meeting was held in 2008 to explore far-fetched strategies to combat hurricanes and tornadoes.

In contrast to natural disasters, manmade ones are generally somewhat easier to control, but more difficult to predict. The war on terrorism is a case in point. Who could predict the behavior of a crazed suicide bomber? A civilized society spends its valuable resources on intelligence gathering, internal security, border control, and selective/mandatory screening to prevent (control) such devious behavior, whose dynamics (i.e., time evolution) obviously cannot be distilled into a differential equation to be solved. However, even in certain disastrous situations that depend on human behavior, predictions can sometimes be made; crowd dynamics being a prime example, where the behavior of a crowd in an emergency can to some degree be modeled and anticipated so that adequate escape or evacuation routes can be properly designed (Adamatzky, 2005). Helbing, Farkas, and Vicsek (2002) write on simulation of panic situations and other crowd disasters modeled as nonlinear dynamical systems. All such models are heuristic and do not stem from the first-principles laws of classical mechanics.
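The crowd-dynamics idea can be caricatured in one dimension: each pedestrian relaxes toward a desired walking speed while being repelled by the person immediately ahead. The sketch below is a heavily simplified illustration in the spirit of such models, not the published social-force model of Helbing and colleagues, and every parameter value in it is an invented, uncalibrated assumption:

```python
import math

# One-dimensional caricature of a social-force crowd model: each pedestrian
# accelerates toward desired speed v0 (relaxation time tau) and is pushed
# back by an exponential repulsion from the person ahead (strength A,
# range B). All parameter values are illustrative assumptions.
def evacuation_time(start_positions, exit_x=50.0, v0=1.5, tau=0.5,
                    A=2.0, B=0.5, dt=0.05, t_max=300.0):
    """Time [s] until every pedestrian has passed exit_x (capped at t_max)."""
    xs = sorted(start_positions)
    vs = [0.0] * len(xs)
    t = 0.0
    while t < t_max and xs[0] < exit_x:  # xs stays sorted; xs[0] is rearmost
        new_vs = []
        for i, (x, v) in enumerate(zip(xs, vs)):
            drive = (v0 - v) / tau                       # urge to walk at v0
            gap = xs[i + 1] - x if i + 1 < len(xs) else float("inf")
            repel = A * math.exp(-gap / B)               # pushback from ahead
            new_vs.append(max(0.0, v + dt * (drive - repel)))
        vs = new_vs
        xs = [x + dt * v for x, v in zip(xs, vs)]
        t += dt
    return t

print(evacuation_time([0.0, 5.0, 10.0, 15.0]))   # well-spaced group
print(evacuation_time([0.0, 0.5, 1.0, 1.5]))     # densely packed group clears later
```

Even this toy version reproduces the qualitative point of the text: the dynamics are nonlinear, and crowding (here, the packed initial condition) degrades evacuation performance in a way that can be anticipated by simulation.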

The tragedy of the numerous manmade disasters is that they are all preventable, at least in principle. We cannot prevent a hurricane, at least not yet, but using less fossil fuel and seeking alternative energy sources could at least slow global warming trends down. Conflict resolution strategies can be employed between nations to avert wars. Speaking of wars, the Iraqi–American poet Dunya Mikhail, lamenting the many manmade disasters, calls the present period “The Tsunamical Age.” A bit more humanity, commonsense, selflessness, and moderation, as well as a bit less greed, meanness, selfishness, and zealotry, and the world will be a better place for having fewer manmade disasters.

Modeling the Disaster’s Dynamics


For disasters that involve (fluid) transport phenomena, such as severe weather, fire, and the release of toxic substances, the governing equations can be formulated subject to some assumptions, the fewer the better. Modeling is usually in the form of nonlinear partial differential equations with an appropriate number of initial and boundary conditions. Integrating those field equations leads to the time evolution, or the dynamics, of the disaster. In principle, marching from the present (initial conditions) to the future gives the potent predictability of classical mechanics and ultimately leads to the disaster’s forecast.

However, the first-principles equations are typically impossible to solve analytically, particularly if the fluid flow is turbulent, which unfortunately is the norm for the high Reynolds number flows encountered in the atmosphere and oceans. Furthermore, initial and boundary conditions are required for both analytical and numerical solutions, and massive amounts of data need to be collected to determine those conditions with sufficient resolution and accuracy. Computers are not big enough either, so numerical integration of the instantaneous equations (direct numerical simulation) for high Reynolds number natural flows is computationally prohibitive, if not outright impossible, at least for now and the foreseeable future. Heuristic modeling then comes to the rescue, but at a price. Large eddy simulations, spectral methods, probability density function models, and the more classical Reynolds stress models are examples of such closure schemes, which are not as computationally intensive as direct numerical simulations, but are not as reliable either. This type of second-tier modeling is phenomenological in nature and does not stem from first principles. The more heuristic the modeling is, the less accurate the expected results are. Together with massive ground, sea, and sky data to provide, at least in part, the initial and boundary conditions, the models are run on supercomputers to produce a forecast, whether it is a prediction of a severe thunderstorm that is yet to form, the future path and strength of an existing hurricane, or the impending concentration of a toxic gas that was released in a faraway location some time in the past. The issue of non-integrability of certain dynamical systems is an additional challenge and opportunity that is revisited in Section 4.6.
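The sensitivity that ultimately limits such forecasts can be illustrated with the simplest “weather-like” dynamical system, the three-variable convection model of Lorenz (1963), which is not discussed in this article but makes the point compactly. The sketch below uses a crude Euler integration and Lorenz’s standard parameter values purely for illustration:

```python
# Classic Lorenz (1963) convection system integrated with a simple Euler
# scheme: a minimal sketch of why nonlinear atmospheric-like dynamics defeat
# long-range forecasting. sigma, rho, beta are Lorenz's standard choices.
def lorenz_trajectory(x, y, z, n_steps, dt=0.01,
                      sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    for _ in range(n_steps):
        dx = sigma * (y - x)
        dy = x * (rho - z) - y
        dz = x * y - beta * z
        x, y, z = x + dt * dx, y + dt * dy, z + dt * dz
    return x, y, z

a = lorenz_trajectory(1.0, 1.0, 1.0, 3000)
b = lorenz_trajectory(1.0, 1.0, 1.0 + 1e-8, 3000)  # perturbed initial state
separation = sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5
print(separation)  # the O(1e-8) perturbation has grown by many orders of magnitude
```

Two initial states differing by one part in a hundred million end up on entirely different parts of the attractor after thirty time units, which is precisely why data of ever finer resolution and accuracy are needed to push the forecast horizon outward.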

For other types of disasters such as earthquakes, the precise laws are not even known mostly because proper constitutive relations are lacking. Additionally, deep underground data are difficult to gather to say the least. Predictions in those cases become more or less a black art.

In the next seven subsections, we focus on the prediction of disasters involving fluid transport. This important subject has seen spectacular successes within the past few decades, for example in being able to predict the weather a few days in advance. The accuracy of today’s 5-day forecast is the same as that of the 3-day and 1.5-day forecasts in 1976 and 1955, respectively. The 3-day forecast of a hurricane’s strike position is accurate to within 100 km, about a 1-hour drive on the highway (Gall & Parsons, 2006). The painstaking advances made in fluid mechanics in general and turbulence research in particular, together with the exponential growth of computer memory and speed, undoubtedly contributed immeasurably to those successes.

The British physicist Lewis Fry Richardson was perhaps the first to make a scientifically based weather forecast. Based on data taken at 7:00 am, May 20, 1910, he made a 6-hour “forecast” that took him 6 weeks to compute using a slide rule. The belated results2 were totally wrong as well! In his remarkable book, Richardson (1922) wrote, “Perhaps some day in the dim future it will be possible to advance the computations faster than the weather advances and at a cost less than the saving to mankind due to the information gained. But that is a dream” (p. vii). We are happy to report that Richardson’s dream is one of the few that came true. A generation ago, the next day’s weather was hard to predict. Today, the 10-day forecast is available 24/7 online for almost any city in the world. While perhaps not the most accurate, this is far better than Richardson’s pioneering 6-hour forecast.

The important issue is to state precisely the assumptions needed to write the evolution equations, which are basically statements of the conservation of mass, momentum and energy, in a certain form. The resulting equations and their eventual analytical or numerical solutions are only valid under those assumptions. This seemingly straightforward fact is often overlooked, and wrong answers readily result when the situation we are trying to model differs from that assumed. Much more detail on the science of disaster prediction is provided in the book edited by Gad-el-Hak (2008).

The Fundamental Transport Equations

The fundamental laws of fluid mechanics and heat transfer—conservation of mass, momentum, and energy—are listed first in their raw form (i.e., assuming only that the speeds involved are non-relativistic and that the fluid is a continuum). In non-relativistic situations, mass and energy are conserved separately and are not interchangeable. This is the case for all normal fluid velocities that we deal with in everyday situations—far below the speed of light. The continuum assumption ignores the grainy (microscopic) structure of matter. It implies that the derivatives of all the dependent variables exist in some reasonable sense. In other words, local properties such as density and velocity are defined as averages over elements that are large compared with the microscopic structure of the fluid, but small enough in comparison with the scale of the macroscopic phenomena to permit the use of differential calculus to describe them. The resulting equations therefore cover a broad range of situations, the exception being flows with spatial scales that are not much larger than the mean distance between the fluid molecules, as for example in the case of rarefied gas dynamics, shock waves that are thin relative to the mean free path, or flows in micro- and nanodevices. Thus, at every point in space–time in an inertial (i.e., non-accelerating/non-rotating), Eulerian frame of reference, the three conservation laws for non-chemically reacting fluids, respectively, read in Cartesian tensor notations

$$\frac{\partial \rho}{\partial t} + \frac{\partial}{\partial x_k}\left(\rho\, u_k\right) = 0 \qquad\qquad (1)$$

$$\rho \left(\frac{\partial u_i}{\partial t} + u_k \frac{\partial u_i}{\partial x_k}\right) = \frac{\partial \Sigma_{ki}}{\partial x_k} + \rho\, g_i \qquad\qquad (2)$$

$$\rho \left(\frac{\partial e}{\partial t} + u_k \frac{\partial e}{\partial x_k}\right) = -\frac{\partial q_k}{\partial x_k} + \Sigma_{ki} \frac{\partial u_i}{\partial x_k} \qquad\qquad (3)$$

where $\rho$ is the fluid density, $u_i$ is an instantaneous velocity component $(u, v, w)$, $\Sigma_{ki}$ is the second-order stress tensor (surface force per unit area), $g_i$ is the body force per unit mass, $e$ is the internal energy per unit mass, and $q_k$ is the sum of heat flux vectors due to conduction and radiation. The independent variables are time $t$ and the three spatial coordinates $x_1$, $x_2$ and $x_3$, or $(x, y, z)$. Finally, Einstein’s summation convention applies to all repeated indices. Gad-el-Hak (2000) provides a succinct derivation of the previous conservation laws for a continuum, non-relativistic fluid.

Closing the Equations

Equations (1), (2) and (3) constitute five differential equations for the seventeen unknowns ρ, u_i, Σ_ki, e, and q_k. Absent any body couples, the stress tensor is symmetric, having only six independent components, which reduces the number of unknowns to fourteen. To close the conservation equations, a relation between the stress tensor and the deformation rate, a relation between the heat flux vector and the temperature field, and appropriate equations of state relating the different thermodynamic properties are needed. Thermodynamic equilibrium implies that the macroscopic quantities have sufficient time to adjust to their changing surroundings. In motion, exact thermodynamic equilibrium is impossible because each fluid particle is continuously having volume, momentum, or energy added or removed, and so in fluid dynamics and heat transfer we speak of quasi-equilibrium. The second law of thermodynamics imposes a tendency to revert to the equilibrium state, and the defining issue here is whether the flow quantities adjust fast enough. The reversion rate will be very high if the molecular time and length scales are very small compared with the corresponding macroscopic flow scales. This guarantees that numerous molecular collisions occur in a sufficiently short time to equilibrate fluid particles whose properties vary little over distances comparable to the molecular length scales. Gas flows are considered in a state of quasi-equilibrium if the Knudsen number—the ratio of the mean free path to a characteristic length of the flow—is less than 0.1. In such flows, the stress is linearly related to the strain rate, and the (conductive) heat flux is linearly related to the temperature gradient. Empirically, common liquids such as water follow the same laws under most flow conditions. Gad-el-Hak (1999) provides extensive discussion of situations in which the quasi-equilibrium assumption is violated. 
These may include gas flows at great altitudes, flows of complex liquids such as long-chain molecules, and even ordinary gas and liquid flows when confined in micro- and nanodevices.
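The continuum criterion just stated is easy to quantify. The following sketch—an illustration, not part of the article—estimates the mean free path of air from kinetic theory and the resulting Knudsen number for several characteristic lengths; the effective molecular diameter is an assumed round value for air.

```python
import math

# Mean free path of a dilute gas from kinetic theory:
#   lambda = k_B * T / (sqrt(2) * pi * d^2 * p)
K_B = 1.380649e-23      # Boltzmann constant, J/K
D_AIR = 3.7e-10         # effective molecular diameter of air, m (assumed)

def mean_free_path(T_kelvin, p_pascal, d=D_AIR):
    """Mean free path of a dilute gas, in meters."""
    return K_B * T_kelvin / (math.sqrt(2) * math.pi * d**2 * p_pascal)

def knudsen(char_length, T_kelvin=288.0, p_pascal=101325.0):
    """Knudsen number: mean free path over a characteristic flow length."""
    return mean_free_path(T_kelvin, p_pascal) / char_length

# A meter-scale flow of sea-level air is deep in the continuum regime,
# whereas a 10-nm channel is not:
for L in (1.0, 1e-6, 1e-8):
    Kn = knudsen(L)
    regime = "quasi-equilibrium continuum" if Kn < 0.1 else "non-equilibrium"
    print(f"L = {L:.0e} m  ->  Kn = {Kn:.2e}  ({regime})")
```

For sea-level air the mean free path comes out near 65 nm, so ordinary weather-scale flows have Knudsen numbers many orders of magnitude below the 0.1 threshold.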

For a Newtonian, isotropic, Fourier,3 ideal gas, for example, those constitutive relations read

\[
\Sigma_{ki} = -p\,\delta_{ki} + \mu\left(\frac{\partial u_i}{\partial x_k} + \frac{\partial u_k}{\partial x_i}\right) + \lambda\,\frac{\partial u_j}{\partial x_j}\,\delta_{ki} \qquad (4)
\]

\[
q_k = -\kappa\,\frac{\partial T}{\partial x_k} + \text{Heat flux due to radiation} \qquad (5)
\]

and

\[
de = c_v\,dT, \qquad p = \rho R T \qquad (6)
\]

where p is the thermodynamic pressure, μ and λ are the first and second coefficients of viscosity, respectively, δ_ki is the unit second-order tensor (Kronecker delta), κ is the thermal conductivity, T is the temperature field, c_v is the specific heat at constant volume, and R is the gas constant. The Stokes hypothesis relates the first and second coefficients of viscosity, λ + (2/3)μ = 0, although the validity of this assumption has occasionally been questioned (Gad-el-Hak, 1995). With the previous constitutive relations, and neglecting radiative heat transfer,4 Equations (1), (2) and (3), respectively, read

\[
\frac{\partial \rho}{\partial t} + \frac{\partial}{\partial x_k}\left(\rho\, u_k\right) = 0 \qquad (7)
\]

\[
\rho\left(\frac{\partial u_i}{\partial t} + u_k\,\frac{\partial u_i}{\partial x_k}\right) = -\frac{\partial p}{\partial x_i} + \frac{\partial}{\partial x_k}\left[\mu\left(\frac{\partial u_i}{\partial x_k} + \frac{\partial u_k}{\partial x_i}\right) + \lambda\,\frac{\partial u_j}{\partial x_j}\,\delta_{ki}\right] + \rho\, g_i \qquad (8)
\]

\[
\rho\, c_v\left(\frac{\partial T}{\partial t} + u_k\,\frac{\partial T}{\partial x_k}\right) = \frac{\partial}{\partial x_k}\left(\kappa\,\frac{\partial T}{\partial x_k}\right) - p\,\frac{\partial u_k}{\partial x_k} + \phi \qquad (9)
\]

The three components of the vector equation (8) are the Navier–Stokes equations expressing the conservation of momentum (or, more precisely, stating that the rate of change of momentum is equal to the sum of all forces) for a Newtonian fluid. In the thermal energy equation (9), φ is the always-positive (as required by the second law of thermodynamics) dissipation function expressing the irreversible conversion of mechanical energy to internal energy as a result of the deformation of a fluid element. The second term on the right-hand side of Equation (9) is the reversible work done (per unit time) by the pressure as the volume of a fluid material element changes. For a Newtonian, isotropic fluid, the viscous dissipation rate is given by

\[
\phi = \frac{1}{2}\,\mu\left(\frac{\partial u_i}{\partial x_k} + \frac{\partial u_k}{\partial x_i}\right)^2 + \lambda\left(\frac{\partial u_j}{\partial x_j}\right)^2 \qquad (10)
\]

There are now six unknowns, ρ, u_i, p and T, and the five coupled Equations (7), (8) and (9), plus the equation of state relating pressure, density and temperature. These six equations, together with a sufficient number of initial and boundary conditions, constitute a well-posed, albeit formidable, problem. The system of equations (7)–(9) is an excellent model for the laminar or turbulent flow of most fluids, such as air and water under most circumstances, including high-speed gas flows for which the shock waves are thick relative to the mean free path of the molecules.

Polymers, rarefied gases, and flows in micro- and nanodevices are not equilibrium flows and have to be modeled differently. In those cases, higher-order relations between the stress tensor and rate of strain tensor, and between the heat flux vector and temperature gradient, are used. In some cases, the continuum approximation is abandoned altogether, and the fluid is modeled as it really is—a collection of molecules. The molecular-based models used for those unconventional situations include molecular dynamics simulations, direct simulation Monte Carlo methods, and the analytical Boltzmann equation (Gad-el-Hak, 2006). Under certain circumstances, hybrid molecular–continuum formulation is required.

Returning to the continuum, quasi-equilibrium equations, considerable simplification is achieved if the flow is assumed incompressible, usually a reasonable assumption provided that the characteristic flow speed is less than 0.3 of the speed of sound and other conditions are satisfied. The incompressibility assumption, discussed in greater detail by Panton (2005), is readily satisfied for almost all liquid flows and for many gas flows. In such cases, the density is assumed either a constant or a given function of temperature (or species concentration).
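The Mach-number rule of thumb can be checked in a few lines. This sketch—illustrative, not from the article—computes the speed of sound of an ideal gas, a = √(γRT), and tests whether a given flow speed qualifies as incompressible:

```python
import math

GAMMA_AIR = 1.4   # ratio of specific heats for air
R_AIR = 287.0     # specific gas constant of air, J/(kg K)

def speed_of_sound(T_kelvin, gamma=GAMMA_AIR, R=R_AIR):
    """Speed of sound in an ideal gas: a = sqrt(gamma * R * T)."""
    return math.sqrt(gamma * R * T_kelvin)

def is_incompressible(speed, T_kelvin=288.0):
    """Rule-of-thumb check: Mach number below 0.3."""
    return speed / speed_of_sound(T_kelvin) < 0.3

# Even a hurricane-force wind of ~70 m/s is low subsonic (Ma ~ 0.2),
# so the incompressible equations apply to most weather-related flows:
a = speed_of_sound(288.0)
print(f"a = {a:.0f} m/s, Ma(70 m/s) = {70/a:.2f}")
```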

The governing equations for such flows are

\[
\frac{\partial u_k}{\partial x_k} = 0 \qquad (11)
\]

\[
\rho\left(\frac{\partial u_i}{\partial t} + u_k\,\frac{\partial u_i}{\partial x_k}\right) = -\frac{\partial p}{\partial x_i} + \frac{\partial}{\partial x_k}\left[\mu\left(\frac{\partial u_i}{\partial x_k} + \frac{\partial u_k}{\partial x_i}\right)\right] + \rho\, g_i \qquad (12)
\]

\[
\rho\, c_p\left(\frac{\partial T}{\partial t} + u_k\,\frac{\partial T}{\partial x_k}\right) = \frac{\partial}{\partial x_k}\left(\kappa\,\frac{\partial T}{\partial x_k}\right) + \phi \qquad (13)
\]

These are five equations for the five dependent variables u_i, p and T. Note that the left-hand side of Equation (13) has the specific heat at constant pressure c_p and not c_v. This is the correct incompressible-flow limit—of a compressible fluid—as discussed in detail in Section 10.9 of Panton (2005); a subtle point perhaps, but one that is frequently missed in textbooks. The system of equations (11)–(13) is coupled if either the viscosity or the density depends on temperature; otherwise, the energy equation is uncoupled from the continuity and momentum equations and can therefore be solved after the velocity and pressure fields are determined from solving Equations (11) and (12). For most geophysical flows, the density depends on temperature and/or species concentration, and the previous system of five equations is coupled.
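As a small numerical illustration of Equation (11)—hypothetical, not from the article—the snippet below checks that a classical divergence-free velocity field, the two-dimensional Taylor–Green vortex, satisfies continuity to discretization accuracy:

```python
import math

# Velocity field of the classical 2-D Taylor-Green vortex (an illustrative,
# exactly divergence-free solution; not taken from the article):
def u(x, y):
    return math.sin(x) * math.cos(y)

def v(x, y):
    return -math.cos(x) * math.sin(y)

def divergence(x, y, h=1e-6):
    """Central-difference approximation of du/dx + dv/dy."""
    dudx = (u(x + h, y) - u(x - h, y)) / (2 * h)
    dvdy = (v(x, y + h) - v(x, y - h)) / (2 * h)
    return dudx + dvdy

# Continuity, Equation (11), holds everywhere to discretization accuracy:
print(abs(divergence(0.7, 1.3)) < 1e-8)   # True
```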

In non-dimensional form, the incompressible flow equations read

\[
\frac{\partial u_k}{\partial x_k} = 0 \qquad (14)
\]

\[
\frac{\partial u_i}{\partial t} + u_k\,\frac{\partial u_i}{\partial x_k} = -\frac{\partial p}{\partial x_i} + \frac{1}{Re}\,\frac{\partial}{\partial x_k}\left[\overline{\mu}\left(\frac{\partial u_i}{\partial x_k} + \frac{\partial u_k}{\partial x_i}\right)\right] + \frac{Gr}{Re^{2}}\,T\, g_i \qquad (15)
\]

\[
\frac{\partial T}{\partial t} + u_k\,\frac{\partial T}{\partial x_k} = \frac{1}{Pe}\,\frac{\partial}{\partial x_k}\left(\frac{\partial T}{\partial x_k}\right) + \frac{Ec}{Re}\,\phi \qquad (16)
\]

where μ̄ is a dimensionless function that characterizes the viscosity variation with temperature, and Re, Gr, Pe and Ec are, respectively, the Reynolds, Grashof, Péclet and Eckert numbers. These dimensionless parameters determine the relative importance of the different terms in the equations.
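To get a feel for the magnitudes involved, the following back-of-the-envelope sketch evaluates the four dimensionless groups for an illustrative atmospheric boundary-layer flow; every number below is an assumption chosen for illustration, not data from the article.

```python
# Order-of-magnitude estimates of the dimensionless groups for an
# illustrative atmospheric flow.  All values are assumed, for illustration.
U     = 10.0      # wind speed, m/s
L     = 1000.0    # length scale, m
nu    = 1.5e-5    # kinematic viscosity of air, m^2/s
alpha = 2.1e-5    # thermal diffusivity of air, m^2/s
g     = 9.81      # gravitational acceleration, m/s^2
beta  = 1/288.0   # thermal expansion coefficient of an ideal gas at 288 K, 1/K
dT    = 5.0       # characteristic temperature difference, K
cp    = 1005.0    # specific heat of air at constant pressure, J/(kg K)

Re = U * L / nu                      # inertia / viscous forces
Gr = g * beta * dT * L**3 / nu**2    # buoyancy / viscous forces
Pe = U * L / alpha                   # advection / heat conduction
Ec = U**2 / (cp * dT)                # kinetic energy / enthalpy difference

print(f"Re = {Re:.1e}, Gr = {Gr:.1e}, Pe = {Pe:.1e}, Ec = {Ec:.1e}")
```

The enormous Reynolds and Péclet numbers that result are what justify neglecting the transport terms away from solid walls, as done next.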

For both the compressible and the incompressible equations of motion, the transport terms are neglected away from solid walls in the limit of infinite Reynolds number (i.e., zero Knudsen number). The flow is then approximated as inviscid, non-conducting and non-dissipative; in other words, it is considered in perfect thermodynamic equilibrium. The corresponding equations in this case read (for the compressible case):

\[
\frac{\partial \rho}{\partial t} + \frac{\partial}{\partial x_k}\left(\rho\, u_k\right) = 0 \qquad (17)
\]

\[
\rho\left(\frac{\partial u_i}{\partial t} + u_k\,\frac{\partial u_i}{\partial x_k}\right) = -\frac{\partial p}{\partial x_i} + \rho\, g_i \qquad (18)
\]

\[
\rho\, c_v\left(\frac{\partial T}{\partial t} + u_k\,\frac{\partial T}{\partial x_k}\right) = -p\,\frac{\partial u_k}{\partial x_k} \qquad (19)
\]

The Euler equation (18) can be integrated along a streamline, and the resulting Bernoulli’s equation provides a direct relation between the velocity and the pressure.
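As a minimal illustration of that relation—with an assumed sea-level air density—Bernoulli's equation recovers the flow speed from a measured stagnation-to-static pressure difference:

```python
import math

RHO_AIR = 1.225  # sea-level air density, kg/m^3 (assumed)

def bernoulli_speed(p_stagnation, p_static, rho=RHO_AIR):
    """Speed from Bernoulli's equation along a streamline:
    p + 0.5*rho*v^2 = const  ->  v = sqrt(2*(p0 - p)/rho)."""
    return math.sqrt(2.0 * (p_stagnation - p_static) / rho)

# A pitot-tube reading 613 Pa above static pressure implies ~31.6 m/s:
print(f"{bernoulli_speed(101938.0, 101325.0):.1f} m/s")
```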

Other Complexities

Despite their already complicated nature, other effects could further entangle the transport equations previously introduced. We list herein a few examples. Geophysical flows occur at such large length scales as to invalidate the inertial-frame assumption made previously. The Earth’s rotation affects these flows, and terms such as centrifugal and Coriolis forces enter into the equations rewritten in a non-inertial frame of reference fixed with the rotating Earth. Oceanic and atmospheric flows are more often than not turbulent flows that span an enormous range of length scales of nine decades, from a few millimeters to thousands of kilometers (Garrett, 2000; McIntyre, 2000).

Density stratification is important for many atmospheric and oceanic phenomena. Buoyancy forces are produced by density variations in a gravitational field, and those forces drive significant convection in natural flows (Linden, 2000). In the ocean, those forces are further complicated by the competing influences of temperature and salt (Linden, 2000). The competition affects the large-scale global ocean circulation and, in turn, climate variability. For weak density variations, the Boussinesq approximation permits the use of the coupled incompressible flow equations, but more complexities are introduced in situations with strong density stratification, such as when strong heating and cooling are present. Complex topography further complicates convective flows in the ocean and atmosphere.

The air–sea interface governs many of the important transport phenomena in the ocean and atmosphere and plays a crucial role in determining the climate. The location of that interface is itself not known a priori and is thus a source of further complexity in the problem. Even worse, the free-boundary nature of the liquid–gas interface, in addition to the possibility of that interface breaking and forming bubbles and droplets, introduces new nonlinearities that augment or compete with the customary convective nonlinearity (Davis, 2000). Chemical reactions are obviously important in fires and are even present in some atmospheric transport problems. When liquid water or ice is present in the air, a two-phase treatment of the equations of motion may need to be considered, further complicating even the relevant numerical solutions.

Even in the complex situations described previously, however, simplifying assumptions can be made rationally to facilitate solving the problem. Any spatial symmetries in the problem should be exploited, as should any time independence of the mean quantities.

An extreme example of simplification that surprisingly yields reasonable results is the modeling of the swirling giants depicted in Figure 3. Here, an oceanic whirlpool, a hurricane, and a spiral galaxy are each simply modeled as a rotating, axisymmetric viscous core matched to an external inviscid vortex, in the manner of a Burgers vortex. The viscous core gives a circumferential velocity proportional to the radius r, and the inviscid outer vortex gives a velocity proportional to 1/r. This model leads to surprisingly good results, in some narrow sense, for those exceedingly complex flows.

Figure 3.  Simple modeling of an oceanic whirlpool, a hurricane and a spiral galaxy.


A cyclone’s central pressure is the best indicator of its intensity because it can be measured precisely, whereas winds have to be estimated. The simple model above yields the maximum wind speed from measurements of the center pressure, the ambient pressure, and the size of the eye of the storm. Note that it is the difference between the hurricane’s pressure and that of its environment that actually gives the storm its strength. This change in pressure over a distance is known as the “pressure gradient”, and it is this gradient that drives the wind: the bigger the gradient, the faster the winds that are generated. If two cyclones have the same minimum pressure, but one is in an area of higher ambient pressure than the other, that one is in fact stronger: it must be more intense to drive its central pressure commensurately lower, and its larger pressure gradient makes its winds faster.
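A hedged sketch of this estimate: for a Rankine-type vortex in cyclostrophic balance, the total pressure deficit is of order ρv²_max, so v_max ≈ √(Δp/ρ). The numbers below (air density, central and ambient pressures) are assumed for illustration; they show why, of two cyclones with the same central pressure, the one embedded in higher ambient pressure has the faster winds.

```python
import math

RHO_AIR = 1.15  # warm near-surface air density, kg/m^3 (assumed)

def max_wind_rankine(p_ambient, p_center, rho=RHO_AIR):
    """Maximum circumferential wind of the simple vortex model:
    cyclostrophic balance across a Rankine-type vortex gives
    v_max ~ sqrt((p_ambient - p_center) / rho)."""
    return math.sqrt((p_ambient - p_center) / rho)

# Two cyclones with the same 950-hPa center; the one embedded in higher
# ambient pressure has the larger gradient and hence the faster winds:
for p_amb in (1005e2, 1015e2):
    v = max_wind_rankine(p_amb, 950e2)
    print(f"ambient {p_amb/100:.0f} hPa -> v_max ~ {v:.0f} m/s")
```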



Thus far in this section, we have discussed prediction of the type of disaster involving fluid transport phenomena, weather-related disasters being the most rampant. Predictions are possible in those cases, and improvements in forecast accuracy and extent are continually being made as a result of enhanced understanding of flow physics, increased accuracy and resolution of global measurements, and exponentially expanded computer power. Other types of disasters do not fare as well, earthquakes being calamities that thus far cannot be accurately predicted. Prediction of weather storms is possible in part because the atmosphere is optically transparent, which facilitates measurements that in turn provide not only the initial and boundary conditions necessary for integrating the governing equations but also a deeper understanding of the physics. The oceans are not as accessible, but measurements there are possible as well, and scientists have learned a great deal in the past few decades about the dynamics of both the atmosphere and the ocean (Garrett, 2000; McIntyre, 2000). Our knowledge of terra firma, in contrast, does not fare as well, mostly because of its inaccessibility to direct observation (Huppert, 2000). What we know about the Earth’s solid inner core, liquid outer core, mantle, and lithosphere comes mainly from inferences drawn from observations at or near the planet’s surface, including the study of the propagation, reflection, and scatter of seismic waves. Deep underground measurements are not very practical, and the exact constitutive equations of the different constituents of the “solid” Earth are not known. All of that inhibits us from writing down and solving the precise equations, and their initial and boundary conditions, for the dynamics of the Earth’s solid part. 
That portion of the planet contains three orders of magnitude more volume than all the oceans combined and six orders of magnitude more mass than the entire atmosphere, and it is a true pity that we know relatively little about the solid Earth.

The science behind earthquakes basically began shortly after the infamous rupture of the San Andreas Fault that devastated San Francisco a little more than a century ago. Before then, geologists had examined seismic faults and even devised primitive seismometers to measure shaking. However, they had no idea what caused the ground to heave without warning. A few days after the Great Earthquake struck on April 18, 1906, Governor George C. Pardee of California charged the state’s leading scientists with investigating how and why the Earth’s crust had ruptured for hundreds of miles with such terrifying violence. The foundation for much of what is known today about earthquakes was laid two years later, and the resulting report (Lawson, 1908) carried the name of the famed geologist Andrew C. Lawson.

Earthquakes are caused by stresses in the Earth’s crust that build up deep inside a fault until it ruptures with a jolt. Prior to the Lawson Report (1908), many scientists believed earthquakes created the faults instead of the other way around. The San Andreas Fault system marks the boundary between two huge moving slabs of the Earth’s crust: the Pacific Plate and the North American Plate. As the plates grind constantly past each other, strain builds until it is released periodically in a full-scale earthquake. A few small sections of the San Andreas Fault had been mapped by scientists years before 1906, but Lawson and his team discovered that the entire zone stretched for more than 950 km along the length of California. By measuring land movements on either side of the fault, the team learned that the earthquake’s motion had moved the ground horizontally, from side to side, rather than just vertically as scientists had previously believed.

A century after the Lawson Report (1908), its conclusions remain valid, but it has stimulated modern earthquake science to move far beyond. Modern scientists have learned that major earthquakes are not random events—they apparently come in cycles. Although pinpoint prediction remains impossible, research on faults throughout the San Francisco Bay Area and other fault locations enables scientists to estimate the probability that strong quakes will jolt a region within the coming decades. Sophisticated broadband seismometers can measure the magnitude of earthquakes within a minute or two of an event and determine where and how deeply on a fault the rupture started. Orbiting satellites now measure within fractions of an inch how the Earth’s surface moves as strain builds up along fault lines, and again how the land is distorted after a quake has struck. “Shakemaps”, available on the Internet and by e-mail immediately after every earthquake, can swiftly tell disaster workers, utility companies and residents where damage may be greatest. Supercomputers, simulating ground motion from past earthquakes, can show where shaking might be heaviest when new earthquakes strike. The information can then be relayed to the public and to emergency workers.

One of the latest and most important ventures in understanding earthquake behavior is the borehole-drilling project at Parkfield in southern Monterey County, California, where the San Andreas Fault has been heavily instrumented for many years. The hole is about 3.2 km deep and crosses the San Andreas underground. For the first time, sensors can actually be inside the earthquake machine to catch and record the earthquakes right where and when they are occurring.

The seismic safety of any structure depends on the strength of its construction and the geology of the ground on which it stands—a conclusion reflected in all of today’s building codes in the United States. Tragically, the codes in some earthquake-prone countries are just as strict as those in the United States, but are for the most part not enforced. In other nations, building codes are either not sufficiently strict or non-existent altogether.

The Butterfly Effect


There are two additional issues to ponder for all disasters that could be modeled as nonlinear dynamical systems. The volume edited by Bunde, Kropp, and Schellnhuber (2002) is devoted to this topic, and is one of very few books to tackle large-scale disasters purely as a problem to be posed and solved using scientific principles. The modeling could be in the form of a number of algebraic equations or, more likely, ordinary or partial differential equations, with nonlinear term(s) appearing somewhere within the finite number of equations. First, we examine the bad news. Nonlinear dynamical systems are capable of producing chaotic solutions, which limit the ability to predict too far into the future, even if infinitely powerful computers are available. Second, we examine the (potentially) good news. Chaotic systems can be controlled, in the sense that a very small perturbation can lead to a significant change in the future state of the system. In this subsection, we elaborate on both issues.

In the theory of dynamical systems, the so-called “butterfly effect” (a lowly diurnal lepidopteran flapping its wings in Brazil may set off a future tornado in Texas) denotes the sensitive dependence of nonlinear differential equations on initial conditions, with phase-space solutions initially very close together separating exponentially. Massachusetts Institute of Technology’s atmospheric scientist Edward Lorenz originally used a seagull’s wings for the metaphor in a paper for the New York Academy of Sciences (Lorenz, 1963), but in subsequent speeches and papers he used the more poetic butterfly. For a complex system such as the weather, initial conditions of infinite resolution and infinite accuracy are clearly never going to be available, further ensuring that precise long-term predictions will never be achievable.

The solution of nonlinear dynamical systems of three or more degrees of freedom5 may be in the form of a strange attractor whose intrinsic structure contains a well-defined mechanism to produce chaotic behavior without requiring random forcing (Ott, 1993). Chaotic behavior is complex, aperiodic, and, although deterministic, appears to be random. The dynamical system in that case is non-integrable,6 and our ability for long-term forecast is severely hindered because of the extreme sensitivity to initial conditions. One can predict the most probable weather, for example, a week from the present, with a narrow standard deviation to indicate all other possible outcomes. We speak of a 30% chance of rain 7 days from now, and so on. That ability to provide reasonably accurate predictions diminishes as time progresses because the sensitivity to initial conditions intensifies exponentially; Lorenz (1967) proposed a theoretical limit of 20 days for predicting weather. This means that regardless of how massive future computers become, weather prediction beyond 20 days will always be meaningless. Nevertheless, there is still a way to go to double the extent of the current 10-day forecast.

Weather and climate should not be confused, however. The latter describes the long-term variability of the climate system, whose components comprise the atmosphere, hydrosphere, cryosphere, pedosphere, lithosphere and biosphere. Climatologists apply models to compute the evolution of the climate a hundred years or more into the future (Fraedrich & Schönwiese, 2002; Hasselmann, 2002). Seemingly paradoxically, meteorologists use similar models but have difficulty forecasting the weather beyond just a few days. Both weather and climate are nonlinear dynamical systems, but the former concerns the evolution of the system as a function of the initial conditions with fixed boundary conditions, whereas the latter, especially as influenced by human misdeeds, concerns the response of the system to changes in boundary conditions with fixed initial conditions. For long time periods, the dependence of the time-evolving climate state on the initial conditions becomes asymptotically negligible.

The Art of Disaster Management

The laws of nature are the same regardless of what type of disaster is considered. A combination of first-principles laws of classical mechanics, heuristic modeling, data collection, and computers may help, to different degrees of success, the prediction and control of natural and manmade disasters, as discussed in Section 4. Once a disaster strikes, mitigating its adverse effects becomes the primary concern. Disaster management is more art than science, but the management principles are similar for most types of disasters, especially those that strike suddenly and intensely. The organizational skills and resources needed to mitigate the adverse effects of a hurricane are not much different from those required in the aftermath of an earthquake. The scope of the disaster (Section 2) determines the extent of the required response. Slowly evolving disasters such as global warming or air pollution are different and their management requires a different set of skills, response, and political will. Although millions of people may be adversely affected by global warming, the fact that that harm may be spread over decades and thus diluted in time does not provide immediacy to the problem and its potential mitigation. Political will to solve long-range problems—not affecting the next election—is typically non-existent except in the case of the rare visionary leader.

In his book, Auf der Heide (1989) states that disasters are the ultimate test of emergency response capability. Once a large-scale disaster strikes, mitigating its adverse effects becomes the primary concern. There are concerns about how to save lives, take care of the survivors’ needs, and protect property from any further damage. Dislocated people need shelter, water, food, and medicine. Both the physical and the mental health of the survivors, as well as relatives of the deceased, can be severely jeopardized. Looting, price gouging, and other law-breaking activities need to be contained, minimized, or eliminated. Hospitals need to prioritize and even ration treatments, especially in the face of the practical fact that the less seriously injured tend to arrive at emergency rooms first, perhaps because they transported themselves there. Roads need to be operable and free of landslides, debris, and traffic jams for the unhindered flow of first responders and supplies to the stricken area, and of evacuees and ambulances from the same. This is not always the case, especially if the antecedent disaster damages most if not all roads, as occurred after the 2005 Kashmir Earthquake. Buildings, bridges, and roads need to be rebuilt or repaired, and power, potable water and sewage need to be restored.

Lessons learned from one calamity can be applied to improve the response to subsequent ones (Cooper & Block, 2006; Olasky, 2006). Disaster mitigation is not a trial-and-error process, however. Operations research (operational research in Britain) is the discipline that uses the scientific approach to decision making, which seeks to determine how best to design and operate a system, usually under conditions requiring the allocation of scarce resources (Winston, 1994). Churchman, Ackoff, and Arnoff (1957) similarly define the genre as the application of scientific methods, techniques, and tools to problems involving the operations of systems so as to provide those in control of the operations with optimum solutions to the problems. Operations research and engineering optimization principles are skillfully used to facilitate recovery and return to normalcy following a large-scale disaster (Altay & Green, 2006). The always-finite resources available must be utilized so as to maximize their beneficial impact. A lot of uncoordinated, incoherent activities are obviously not a good use of scarce resources. For example, sending huge amounts of perishable food to a stricken area that has no electricity makes little sense. Silly as that seems, such decisions, made in the heat of the moment, are not difficult to find.
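A toy example in this spirit, with entirely hypothetical numbers: allocating limited convoy capacity across supply types by benefit per ton, using the greedy rule that is optimal for the linear fractional-knapsack model. Note how the perishable food of the example above ends up last in line.

```python
# A toy resource-allocation problem in the spirit of operations research:
# a relief convoy has limited capacity, and each supply type has an assumed
# "benefit" score per ton.  The fractional-knapsack greedy rule (take the
# highest benefit-per-ton first) is optimal for this simple linear model.
def allocate(capacity_tons, supplies):
    """supplies: list of (name, available_tons, benefit_per_ton)."""
    plan, remaining = [], capacity_tons
    for name, avail, rate in sorted(supplies, key=lambda s: -s[2]):
        take = min(avail, remaining)
        if take > 0:
            plan.append((name, take))
            remaining -= take
    return plan

supplies = [                          # hypothetical numbers, illustration only
    ("water",               40, 10.0),
    ("medical kits",         5, 50.0),
    ("non-perishable food", 30,  8.0),
    ("perishable food",     20,  1.0),  # low value without refrigeration
]
print(allocate(60, supplies))
# -> [('medical kits', 5), ('water', 40), ('non-perishable food', 15)]
```

Real relief logistics involve uncertainty, multiple objectives, and transport constraints, which is why the literature cited above formulates them as full optimization models rather than one-line greedy rules.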

Most books on large-scale disasters are written from either a sociologist’s or a tactician’s point of view, in contrast to the scientist’s viewpoint of this article. There are few popular science or high school-level books on disasters, e.g., Engelbert, Deschenes, Nagal, and Sawinski (2001) and Allen (2005), and even fewer more advanced science books, such as Bunde et al. (2002) and Gad-el-Hak (2008). The other books deal, for the most part, with the behavioral response to disasters and the art of mitigating their aftermath. Current topics of research include disaster preparedness and behavioral and organizational responses to disasters. A small sample of recent books includes Pickett and White (1985), Smith (1991), Alexander (1993; 2000), Smith and Dickie (1993), Tobin (1997), Burby (1998), Fischer (1998), Quarantelli (1998), Kunreuther and Roth (1998), Gist and Lubin (1999), Mileti (1999), Steinberg (2000), Cutter (2001), Tierney, Lindell, and Perry (2001), Childs and Dietrich (2002), de Boer and van Remmen (2003), Pelling (2003), Stein and Wysession (2003), Bankoff, Frerks, and Hilhorst (2004), Posner (2004), Wallace and Webber (2004), Abbott (2005), Dilley, Chan, Deichmann, Lerner-Lam, and Arnold (2005), Vale and Campanella (2005), and McKee and Guthridge (2006).

A Bit of Sociology

Although it appears that large-scale disasters have become more frequent in the past few years, they have actually been with us since Homo sapiens set foot on Earth. Frequent disasters struck the planet as far back as the time of its formation. The dinosaurs went extinct because a meteorite struck the Earth 65 million years ago. If we concern ourselves with humans, however, then starting 200,000 years ago, ice ages, famines, attacks from rival groups or animals, and infections were constant reminders of humans’ vulnerability. We survived because we were programmed to do so.

Humans deal with natural and manmade disasters with an uncanny mix of dread, trepidation, curiosity, and resignation, but they often rise to the challenge with acts of resourcefulness, courage, and unselfishness. Disasters are common occurrences in classical and modern literature. William Shakespeare’s comedy The Tempest opens with a storm that becomes the driving force of the plot and tells of reconciliation after strife. Extreme weather forms the backdrop to three of the bard’s greatest tragedies: Macbeth, Julius Caesar, and King Lear. In Macbeth, the tempest is presented as unnatural and is preceded by “portentous things”. Men enveloped in fire walked the streets, lions became tame, and night birds howled in the midday sun. Order is inverted, man acts against man, the gods and elements turn against humanity and mark their outrage with “a tempest dropping fire”. In Julius Caesar, humanity’s abominable actions are accompanied by violent weather. Caesar’s murder is plotted while the sea swells, rages and foams, and “All the sway of earth shakes like a thing unfirm”. In King Lear, extreme weather conditions mirror acts of human depravity. The great storm that appears in Act 2, Scene 4 plays a crucial part in Lear’s tragic decline deeper into insanity.

On the popular culture front, disaster movies flourish in Hollywood, particularly in times of tribulation. Witness the following sample of the movie genre: San Francisco (1936); A Night to Remember (1958); Airport (1970); The Poseidon Adventure (1972); Earthquake (1974); Towering Inferno (1974); The Hindenburg (1975); The Swarm (1978); Meteor (1979); Runaway Train (1985); Outbreak (1995); Twister (1996); Titanic (1997); Volcano (1997); Armageddon (1998); Deep Impact (1998); Flight 93 (2006); United 93 (2006); and World Trade Center (2006).

Does disaster bring out the worst in people? Thomas Glass, professor of epidemiology at the Johns Hopkins University, argues the opposite (Glass, 2001; 2002). From an evolutionary viewpoint, disasters bring out the best in us. It almost has to be that way. Humans survived ice ages, famines, and infections not because we were strong or fast, but because in a state of extreme calamity we tend to be resourceful and cooperative—except when there is a profound sense of injustice, that is, when some group has been mistreated or the system has failed. In such events, greed, selfishness, and violence do occur. A sense of a breach of fairness can trigger the worst in people. Examples of such breaches include distributing a bird flu vaccine to the rich and mighty first, and the captain and crew escaping a sinking ferry before the passengers. The first of these two examples has not yet occurred, but the second is a real tragedy that recently took place (the sinking of the ferry Al-Salam Boccaccio 98 in the Red Sea on February 3, 2006).

History records that the bird flu (or a similar flu) wiped out much of the European population in the seventeenth century before the disease was brought under control. The bright side of a disaster is the reconstruction phase. Disasters are not always bad, even if we think they are: we need to look at what we learn and how we grow to become stronger after a disaster. For example, it is certain that local, state and federal officials in the United States are now learning painful lessons from Hurricane Katrina and will try to avoid the same mistakes again. It is up to us humans to learn from mistakes and not to forget them. However, human nature is forgetful, political leaders are not historians, and facts get buried.

The sociologist Henry W. Fischer, III (Fischer, 1998) argues that certain human depravities commonly perceived to emerge during disasters (e.g., mob hysteria, panic, shock looting) are the exception, not the rule. The community of individuals does not break down, and the norms that we tend to follow during normal times hold during emergencies. Emergencies bring out the best in us, and we become much more altruistic. Supporting his views with several case studies, Fischer (1998) writes about people who pulled through a disaster:

    Survivors share their tools, their food, their equipment, and especially their time. Groups of survivors tend to emerge to begin automatically responding to the needs of one another. They search for the injured, the dead, and they begin cleanup activities. Police and fire personnel stay on the job, putting the needs of victims and the duty they have sworn to uphold before their own personal needs and concern. The commonly held view of behavior is incorrect (pp. 18–19).

Fischer’s observations are commonly accepted among modern sociologists. Indeed, as stated previously, we survived the numerous disasters encountered throughout the ages because we were programmed to do so.

A Few Recent Disasters


It is always useful to learn from past disasters and to prepare better for the next one. Losses of lives and property from recent years are staggering. Not counting the manmade disasters that were tallied in Section 2, some frightening numbers from natural calamities alone are:

  • Seven hundred natural disasters in 2003, which caused 75,000 deaths (almost seven times the number in 2002), 213 million people adversely affected to some degree, and $65 billion in economic losses.
  • In 2004, 244,577 persons killed globally as a result of natural disasters.
  • In 2005, $150 billion in economic losses, with hurricanes Katrina and Rita, which ravaged the Gulf Coast of the United States, responsible for 88% of that amount.
  • Within the first half of 2006, natural disasters already caused 12,718 deaths and $2.3 billion in economic damages.

In the following twelve subsections we briefly recall a few manmade and natural disasters. The handful—as compared to the hundreds striking Earth every year—were chosen because they either were in the news recently or share a certain generality/universality with similar extreme events. The information herein and the accompanying photographs are mostly as reported in the online encyclopedia Wikipedia. The numerical data were cross-checked against archival media reports from sources such as The New York Times and ABC News. The numbers did not always match, and more than one source was consulted to reach the most reliable figures; absolute accuracy is not guaranteed, however. The dozen or so disasters sampled herein are by no stretch of the imagination comprehensive, merely a few examples that may hold important lessons for future calamities. Remember, disasters strike Earth at an average rate of three per day! The metric developed in Section 2 is applied in Section 7.13 to the thirteen disasters sampled in the present section.

San Francisco Earthquake


A major earthquake of magnitude 7.8 on the Richter scale struck the city of San Francisco, California, at around 5:12 am, Wednesday, April 18, 1906. The Great Earthquake, as it became known, was along the San Andreas Fault with its epicenter close to the city. Its violent shocks were felt from Oregon to Los Angeles and inland as far as central Nevada. The earthquake and resulting fires would go down in history as one of the worst natural disasters to hit a major U.S. city.

At the time only 478 deaths were reported, a figure concocted by government officials who believed that reporting the true death toll would hurt real estate prices and efforts to rebuild the city. This figure has been revised to today’s conservative estimate of more than 3,000 victims. Most of the deaths occurred in San Francisco, but 189 were reported elsewhere across the San Francisco Bay Area. Other places in the Bay Area such as Santa Rosa, San Jose, and Stanford University also received severe damage.

Between 225,000 and 300,000 people were left homeless, out of a population of about 400,000. Half of these refugees fled across the bay to Oakland, in an evacuation reminiscent of the Dunkirk Evacuation that would occur decades later. Newspapers at the time described Golden Gate Park, the Panhandle, and the beaches between Ingleside and North Beach as covered with makeshift tents. The overall cost of the damage from the earthquake was estimated at the time to be around $400 million. The earthquake’s notoriety rests in part on the fact that it was the first natural disaster of its magnitude to be captured by photography. Furthermore, it occurred at a time when the science of seismology was blossoming. Figures 4 and 5 depict the devastation.

Eight decades after the Great Earthquake, another big one struck the region. This became known as the Loma Prieta Earthquake. At 5:04 pm on October 17, 1989, a magnitude 7.1 earthquake on the Richter scale severely shook the San Francisco and Monterey Bay regions. The epicenter was located at 37.04°N latitude, 121.88°W longitude, near Loma Prieta peak in the Santa Cruz Mountains, approximately 14 km northeast of Santa Cruz and 96 km south–southeast of San Francisco. The tremor lasted for 15 seconds and occurred when the crustal rocks comprising the Pacific and North American Plates abruptly slipped as much as 2 m along their common boundary—the San Andreas Fault system (Section 4.5). The rupture initiated at a depth of 18 km and extended 35 km along the fault, but it did not break the surface of the Earth.

This major earthquake caused severe damage as far as 110 km away, most notably in San Francisco, Oakland, the San Francisco Peninsula, and in areas closer to the epicenter in the communities of Santa Cruz, the Monterey Bay, Watsonville, and Los Gatos. Most of the major property damage in the more distant areas resulted from liquefaction of soil that had been used over the years to fill in the waterfront and was then built upon. The magnitude of the earthquake and the distance of the severe damage to the north surprised geotechnologists. Subsequent analysis indicates that the damage was likely due to reflected seismic waves—the reflection from well-known deep discontinuities in the Earth’s gross structure, about 25 km below the surface.

There were at least 66 deaths and 3,757 injuries as a result of this earthquake. The highest concentration of fatalities, 42, occurred in the collapse of the Cypress structure on the Nimitz Freeway (Interstate 880), where a double-decker portion of the freeway collapsed, crushing the cars on the lower deck. One 15 m section of the San Francisco–Oakland Bay Bridge also collapsed, causing two cars to fall to the deck below and leading to a single fatality. The bridge was closed for repairs for 1 month.

Because this earthquake occurred during the evening rush hour, there could have been a large number of cars on the freeways at the time, which on the Cypress structure could have endangered many hundreds of commuters. Very fortunately, and in an unusual convergence of events, the two local Major League Baseball teams, the Oakland Athletics and the San Francisco Giants, were about to start the third game of the World Series, scheduled for shortly after 5:30 pm. Many people had left work early or were participating in early after-work group viewings and parties. As a consequence, the usually crowded highways were carrying exceptionally light traffic at the time.

Extensive damage also occurred in San Francisco’s Marina District, where many expensive homes built on filled ground collapsed. Fires raged in some sections of the city as water mains broke. San Francisco’s fireboat Phoenix was used to pump salt water from San Francisco Bay through hoses dragged through the streets by citizen volunteers. Power was cut to most of San Francisco and was not fully restored for several days. Deaths in Santa Cruz occurred when brick storefronts and sidewalls in the historic downtown, then called the Pacific Garden Mall, tumbled down on people exiting the buildings. A sample of the devastation is shown in the photograph in Figure 6. The earthquake also caused an estimated $6 billion in property damage, the costliest natural disaster in U.S. history at the time. It was the largest earthquake to occur on the San Andreas Fault since the Great Earthquake. Private donations poured in to aid relief efforts, and on October 26, 1989, President George H. W. Bush signed a $3.45-billion earthquake relief package for California.




Figure 6.  Land break near the Loma Prieta quake’s epicenter. The man taking a photograph provides a length scale for the width of the hole.


Hyatt Regency Walkway Collapse

The Hyatt Regency Hotel was built in Kansas City, Missouri, in 1978. A state-of-the-art facility, the hotel boasted a forty-story tower and conference facilities. These two components were connected by an open-concept atrium, within which three suspended walkways connected the hotel and conference facilities on the second, third, and fourth levels. Because of their suspension, these walkways were referred to as “floating walkways” or “skyways”. The atrium covered 1,580 m2 and was 15 m high. It seemed incredible that such an architectural masterpiece could be involved in the United States’ most devastating structural failure (not caused by earthquake, explosion, or airplane crash) in terms of loss of life and injuries.

It was July 17, 1981, when the guests at Kansas City Hyatt Regency Hotel witnessed the collapse of a main walkway. Approximately 2,000 people were gathered to watch a dance contest in the hotel lobby. Although the majority of the guests were on the ground level, some were dancing on the floating walkways on the second, third, and fourth levels. At about 7:05 pm, a loud crack was heard as the second- and fourth-level walkways collapsed onto the ground level. This disaster took the lives of 114 people and left more than 200 injured.

What did we learn from this manmade disaster? The project for constructing this particular hotel began in 1976, with Gillum–Colaco International, Inc., as the consulting structural engineering firm. Gillum–Colaco Engineering (G.C.E.) provided input into various plans being made by the architect and owner, and was contracted in 1978 to provide “all structural engineering services for a 750-room hotel project.” Construction began in the spring of 1978. In the winter of 1978, Havens Steel Company entered into a contract to fabricate and erect the atrium steel for the project under the standards of the American Institute of Steel Construction for steel fabricators. During construction in October 1979, part of the atrium roof collapsed. An inspection team was brought in to investigate the collapse, and G.C.E. vowed to review all steel connections in the structure, including those of the roof.

The proposed structure details of the three walkways were as follows:

  • Wide-flange beams were to be used on either side of the walkway, which was hung from a box beam.
  • A clip angle was welded to the top of the box beam, which connected to the flange beams with bolts.
  • One end of the walkway was welded to a fixed plate, while the other end was supported by a sliding bearing.
  • Each box beam of the walkway was supported by a washer and a nut that were threaded onto the supporting rod. Because the bolt connection to the wide flange had virtually no movement, it was modeled as a hinge. The fixed end of the walkway was also modeled as a hinge, while the bearing end was modeled as a roller.


Due to disputes between the G.C.E. and Havens, design changes from a single- to a double-hanger, rod-box beam connection were implemented. Havens did not want to have to thread the entire rod in order to install the washer and nut. This revised design consisted of the following:

  • One end of each support rod was attached to the atrium’s roof cross-beams.
  • The bottom end went through the box beam where a washer and nut were threaded on to the supporting rods.
  • The second rod was attached to the box beam 10 cm from the first rod.

Additional rods suspended downward to support the second level in a similar manner.

Why did the design fail? Due to the addition of another rod in the actual design, the load on the nut connecting the fourth-floor segment was increased. The original load for each hanger rod was to be 90 kN, but with the design alteration the load was doubled to 181 kN for the fourth-floor box beam. Because the box beams were longitudinally welded, as proposed in the original design, they could not hold the weight of the two walkways. During the collapse, the box beam split and the support rod pulled through the box beams resulting in the fourth- and second-level walkways falling to the ground level.
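The load shift described above can be checked with elementary statics. The following is a minimal sketch, using the 90 kN figure from the text and deliberately ignoring beam self-weight and dynamic crowd loads:

```python
# Simplified statics of the Hyatt Regency hanger-rod connections.
# Load figure from the text; self-weight and dynamic loads ignored.

walkway_load_kN = 90.0  # design load per hanger-rod connection (original design)

# Original design: one continuous rod from roof to second floor.
# The nut under the fourth-floor box beam carries only the fourth-floor
# walkway; the second-floor walkway hangs from the rod itself.
load_original = walkway_load_kN

# Revised design: the rod is split at the fourth floor, so the
# second-floor walkway hangs FROM the fourth-floor box beam. The
# fourth-floor connection must now carry both walkways.
load_revised = walkway_load_kN + walkway_load_kN

print(load_original, load_revised)  # 90.0 180.0
```

The doubled value matches the 181 kN quoted in the text (the small discrepancy presumably reflects rounding of the original imperial units). The welded box-beam seam, already marginal at 90 kN, could not sustain twice that load.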

The following paradigm clarifies the design failure of the walkways quite well. Suppose a long rope is hanging from a tree, and two people are holding onto the rope, one at the top and one near the bottom. Under the conditions that each person can hold their own body weight and that the tree and rope can hold both people, the structure would be stable. However, if one person was to hold onto the rope, and the other person was hanging onto the legs of the first, then the first person’s hands must hold both people’s body weights, and thus the grip of the top person would be more likely to fail. The initial design is similar to the two people hanging onto the rope, while the actual design is similar to the second person hanging from the first person’s legs. The first person’s grip is comparable to the fourth-level hanger-rod connection. The failure of this grip caused the walkway collapse.

Who was responsible? One of the major problems with the Hyatt Regency project was the lack of communication between parties. In particular, the drawings prepared by G.C.E. were only preliminary sketches, but were interpreted by Havens as finalized drawings and used to fabricate the components of the structure. Another large error was G.C.E.’s failure to review the final design, which would have allowed the firm to catch the increased load on the connections. As a result, the engineers employed by G.C.E. who had affixed their seals to the drawings lost their engineering licenses in the states of Missouri and Texas, and G.C.E. itself lost its license to practice as an engineering firm.

An engineer has a responsibility to his or her employer and, most important, to society. In the Hyatt Regency case, the lives of the public hinged on G.C.E.’s ability to design a structurally sound walkway system. The firm’s insufficient review of the final design led to the failure of the design and a massive loss of life. Cases such as the Hyatt Regency walkway collapse are a constant reminder of how an error in judgment can create a catastrophe. It is important that events of the past be remembered so that engineers will always fulfill their responsibility to society.

Izmit Earthquake


On August 17, 1999, the Izmit Earthquake with a magnitude of 7.4 struck northwestern Turkey. It lasted 45 seconds and killed more than 17,000 people according to the government report. Unofficial albeit credible reports of more than 35,000 deaths were also made. Within 2 hours, 130 aftershocks were recorded and two tsunamis were observed.

The earthquake had a rupture length of 150 km from the city of Düzce to the Sea of Marmara along the Gulf of Izmit. Movements along the rupture were as large as 5.7 m. The rupture passed through major cities that are among the most industrialized and urban areas of Turkey, including oil refineries, several car companies, and the navy headquarters and arsenal in Gölcük, thus increasing the severity of the life and property loss.

This earthquake occurred in the North Anatolian Fault Zone (NAFZ). The Anatolian Plate, which comprises most of Turkey, is being pushed west by about 2 to 2.5 cm/yr because it is squeezed between the Eurasian Plate to the north and both the African and Arabian Plates to the south. Most of the large earthquakes in Turkey result from slip along the NAFZ or a second fault to the east, the Eastern Anatolian Fault.
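These figures permit a rough elastic-rebound estimate not given in the text: if the relative plate motion were stored entirely as elastic strain and released seismically, the 5.7 m of slip would represent roughly two and a half centuries of accumulated motion. This is a crude sketch (real strain accumulation is neither uniform nor fully seismic):

```python
# Back-of-envelope recurrence estimate for the Izmit rupture.
# Assumes all plate motion is stored elastically and released in one quake.

slip_m = 5.7              # maximum coseismic slip along the rupture (from text)
plate_rate_cm_yr = 2.25   # mid-range of the 2-2.5 cm/yr quoted in the text

# Years of steady plate motion needed to accumulate the released slip
recurrence_yr = (slip_m * 100.0) / plate_rate_cm_yr
print(round(recurrence_yr))  # ~253 years
```

The order of magnitude (centuries) is consistent with large events being rare on any one segment of the NAFZ, even though the zone as a whole produces frequent earthquakes.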

The impacts of the earthquake were vast. In the short term, 4,000 buildings were destroyed, including an army barracks and an ice skating rink, and refrigerated lorries had to be used as mortuaries; cholera, typhoid, and dysentery spread; and homelessness and post-traumatic stress disorder affected around 25% of those living in the tent city set up by officials for the homeless. An oil refinery leaked into the water supply and Izmit Bay and subsequently caught fire. Because of the leak and the fire, the already highly polluted bay saw a two- to three-fold increase in polycyclic aromatic hydrocarbon levels compared to 1984 samples, and dissolved oxygen and chlorophyll reached their lowest levels in 15 years. Economic development was set back 15 years, and direct property damage was estimated at $18 billion, a huge sum for a developing country. A curious scene of the damage in the town of Gölcük, 100 km east of Istanbul, is depicted in Figure 7.

September 11

A series of coordinated suicide attacks upon the United States was carried out on Tuesday, September 11, 2001, in which 19 hijackers took control of four domestic commercial airliners. The terrorists crashed two planes into the World Trade Center in Manhattan, New York City, one into each of the two tallest towers, about 18 minutes apart. Within 2 hours, both towers had collapsed. The hijackers crashed the third aircraft into the Pentagon, the U.S. Department of Defense headquarters, in Arlington County, Virginia. The fourth plane crashed into a rural field in Somerset County, Pennsylvania, 129 km east of Pittsburgh, following passenger resistance. The official count records 2,986 deaths in the attacks, including the hijackers, the worst act of war against the United States on its own soil. (The Imperial Japanese Navy’s surprise attack on Pearl Harbor, Oahu, Hawaii, on the morning of December 7, 1941 was aimed at the Pacific Fleet and killed 2,403 Americans, including 68 civilians.)

The National Commission on Terrorist Attacks Upon the United States (9/11 Commission) states in its final report that the nineteen hijackers who carried out the attack were terrorists affiliated with the Islamic Al-Qaeda organization. The report named Osama bin Laden, a Saudi national, as the leader of Al-Qaeda, and as the person ultimately suspected as being responsible for the attacks, with the actual planning being undertaken by Khalid Shaikh Mohammed. Bin Laden categorically denied involvement in two 2001 statements, before admitting a direct link to the attacks in a subsequent taped statement.

The 9/11 Commission reported that these hijackers turned the planes into the largest suicide bombs in history. The 9/11 attacks are among the most significant events to have occurred so far in the twenty-first century in terms of the profound economic, social, political, cultural, psychological, and military effects that followed in the United States and many other parts of the world.

Following the September 11 disaster, the United States launched the Global War on Terrorism, enlisting the support of NATO members and other allies, with the stated goal of ending international terrorism and state sponsorship of the same. The difficulty of the war on terrorism, now raging for more than five years, is that it is mostly a struggle between a superpower and a nebulously defined enemy: thousands of stateless, loosely connected, disorganized, undisciplined religious fanatics scattered around the globe, particularly in Africa, the Middle East, South Asia, and Southeast Asia.

Indian Ocean Tsunami


A tsunami is a series of waves generated when water in a lake or a sea is rapidly displaced on a massive scale. Earthquakes, landslides, volcanic eruptions, and large meteorite impacts all have the potential to generate a tsunami. The effects of a tsunami can range from unnoticeable to devastating. The Japanese term “tsunami” means “harbor wave”. The term was coined by fishermen who returned to port to find the area surrounding the harbor devastated, although they had not been aware of any wave in the open water. A tsunami is not a subsurface event in the deep ocean; it simply has a much smaller amplitude offshore and a very long wavelength (often hundreds of kilometers), which is why it generally passes unnoticed at sea, forming only a passing “hump” in the ocean.
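The long-wavelength behavior described above follows from shallow-water wave theory: when the wavelength greatly exceeds the ocean depth, the wave speed is c = sqrt(g·h), independent of wavelength. A quick check (the formula is standard; the depths below are illustrative, not from the text):

```python
import math

g = 9.81  # gravitational acceleration, m/s^2

def tsunami_speed_kmh(depth_m: float) -> float:
    """Shallow-water wave speed c = sqrt(g*h), converted from m/s to km/h."""
    return math.sqrt(g * depth_m) * 3.6

# In the deep ocean (~4,000 m) the wave travels at jet-airliner speed,
# yet with sub-meter amplitude, so ships do not notice it passing.
print(round(tsunami_speed_kmh(4000)))  # ~713 km/h

# Near shore the wave slows sharply and its energy piles up into the
# violent onrushing surge described in the text.
print(round(tsunami_speed_kmh(10)))    # ~36 km/h
```

The same relation explains why a tsunami crossed the entire Indian Ocean in a matter of hours on December 26, 2004.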

Tsunamis have been historically referred to as tidal waves because as they approach land, they take on the characteristics of a violent onrushing tide rather than the more familiar cresting waves that are formed by wind action on the ocean. However, because tsunamis are not actually related to tides, the term is considered misleading and its usage is discouraged by oceanographers.

The 2004 Indian Ocean Earthquake, known to the scientific community as the Sumatra–Andaman Earthquake, was an undersea earthquake that occurred at 00:58:53 UTC (07:58:53 local time) on December 26, 2004. According to the U.S. Geological Survey (USGS), the earthquake and its tsunami killed more than 283,100 people, making it one of the deadliest disasters in modern history. Indonesia suffered the worst loss of life, at more than 168,000. The disaster is known in Asia and the media as the Asian Tsunami; in Australia, New Zealand, Canada and the United Kingdom it is known as the Boxing Day Tsunami because it took place on Boxing Day, although it was still Christmas Day in the Western Hemisphere when the disaster struck.

The earthquake originated in the Indian Ocean just north of Simeulue Island, off the western coast of northern Sumatra, Indonesia. Various values were given for the magnitude of the earthquake that triggered the giant wave, ranging from 9.0 to 9.3 (which would make it the second largest earthquake ever recorded on a seismograph), although authoritative estimates now put the magnitude at 9.15. In May 2005, scientists reported that the earthquake itself lasted close to 10 minutes even though most major earthquakes last no more than a few seconds; it caused the entire planet to vibrate at least a few centimeters. It also triggered earthquakes elsewhere, as far away as Alaska.

The resulting tsunami devastated the shores of Indonesia, Sri Lanka, South India, Thailand, and other countries with waves up to 30-m high. The tsunami caused serious damage and death as far as the east coast of Africa, with the furthest recorded death due to the tsunami occurring at Port Elizabeth in South Africa, 8,000 km away from the epicenter. Figures 9 and 10 show examples of the devastation caused by one of the deadliest calamities of the twenty-first century. The plight of the many affected people and countries prompted a widespread humanitarian response.

Unlike the Pacific Ocean, the Indian Ocean had no organized tsunami alert service, partly because of the absence of major tsunami events between 1883 (the Krakatoa eruption, which killed 36,000 people) and 2004. In light of the 2004 Indian Ocean Tsunami, UNESCO and other world bodies have called for a global tsunami monitoring system.

Human actions made this particular natural disaster more damaging than it would otherwise have been. Intense coral reef mining off the Sri Lankan coast had removed the sort of natural barrier that can mitigate the force of incoming waves, amplifying the tsunami’s disastrous effects. As a result of such mining, the 2004 tsunami devastated Sri Lanka much more than it otherwise would have.

Hurricane Katrina


Hurricane Katrina was the eleventh named tropical storm, fourth hurricane, third major hurricane, and first category 5 hurricane of the 2005 Atlantic hurricane season. It was the third most powerful storm of the season, behind Hurricane Wilma and Hurricane Rita, and the sixth strongest storm ever recorded in the Atlantic basin. It first made landfall as a category 1 hurricane just north of Miami, Florida, on August 25, 2005, resulting in a dozen deaths in South Florida and spawning several tornadoes, which fortunately did not strike any dwellings. In the Gulf of Mexico, Katrina strengthened into a formidable category 5 hurricane with maximum winds of 280 km/h and minimum central pressure of 902 mbar. It weakened considerably as it was approaching land, making its second landfall on the morning of August 29th along the Central Gulf Coast near Buras-Triumph, Louisiana, with 200 km/h winds and 920 mbar central pressure, making it a strong category 3 storm, having just weakened from category 4 as it was making landfall.

The sheer physical size of Katrina caused devastation far from the eye of the hurricane; it was possibly the largest hurricane of its strength ever recorded, although estimating the size of storms from the pre-satellite era (before the 1960s) is difficult to impossible. On August 29th, Katrina’s storm surge breached the levee system that protected New Orleans from Lake Pontchartrain and the Mississippi River. Most of the city was subsequently flooded, mainly by water from the lake. Heavy damage was also inflicted on the coasts of Mississippi and Alabama, making Katrina the most destructive and costliest natural disaster in the history of the United States and the deadliest since the 1928 Okeechobee Hurricane.

The official combined direct and indirect death toll now stands at 1,836, the fourth highest in U.S. history, behind the Galveston Hurricane of 1900, the 1893 Sea Islands Hurricane, and possibly the 1893 Cheniere Caminada Hurricane, and ahead of the Okeechobee Hurricane of 1928. As of December 20, 2005, more than 4,000 people remained unaccounted for, so the death toll may still grow. As of November 22, 2005, 1,300 of those missing were either in heavily damaged areas or were disabled and “feared dead”; if all 1,300 were confirmed dead, Katrina would surpass the Okeechobee Hurricane to become the second-deadliest hurricane in U.S. history and the deadliest in over a century.

More than 1.2 million people were under an evacuation order before landfall. In Louisiana, the hurricane’s eye made landfall at 6:10 am CDT on Monday, August 29th. After 11:00 am CDT, several sections of the levee system in New Orleans collapsed. By early September, people were being forcibly evacuated, mostly by bus to neighboring states. More than 1.5 million people were displaced—a humanitarian crisis on a scale unseen in the United States since the Great Depression. The damage is now estimated at about $81.2 billion (2005 U.S. dollars), more than double the cost of Hurricane Andrew, the previously most expensive storm, making Katrina the most expensive natural disaster in U.S. history.

Federal disaster declarations blanketed 233,000 km2 of the United States, an area almost as large as the United Kingdom. The hurricane left an estimated 3 million people without electricity; some places waited several weeks for power to be restored (though this was faster than the 4 months originally predicted). Referring to the hurricane itself plus the flooding of New Orleans, Homeland Security Secretary Michael Chertoff on September 3rd described the aftermath of Hurricane Katrina as “probably the worst catastrophe, or set of catastrophes” in U.S. history.

A small sample of the devastation of Katrina is depicted in Figure 11. The aftermath of the hurricane produced the perfect political storm, whose winds lasted long after the hurricane itself. Congressional investigations reaffirmed what many had suspected: governments at all levels failed. The city of New Orleans, the state of Louisiana, and the United States let the citizenry down. The whole episode was a study in ineptitude—and in buck-passing that fooled no one. The then-director of the Federal Emergency Management Agency, Michael Brown, did not know that thousands of New Orleans residents were trapped in the Superdome in subhuman conditions. In the middle of the bungled response, President George W. Bush uttered his infamous phrase, “Brownie, you’re doin’ a heckuva job”. Several books were published in the aftermath of the calamity, mostly offering scathing criticism of the government as well as more sensible strategies for handling future crises (Cooper & Block, 2006; Olasky, 2006).

On October 23, 2007, slightly more than two years after the Katrina debacle, the new FEMA Deputy Administrator, Vice Admiral Harvey E. Johnson, held a news conference as wildfires raged in California. The briefing went very well and was carried live on several news outlets. The only problem: all those present were FEMA staffers playing reporters! FEMA once again became the subject of national ridicule. In a Washington Post column entitled “FEMA Meets the Press, Which Happens to Be…FEMA” (Kamen, 2007, p. A19), Al Kamen derided the notorious government agency: “FEMA has truly learned the lessons of Katrina. Even its handling of the media has improved dramatically.”

Kashmir Earthquake

The Kashmir Earthquake—also known as the Northern Pakistan Earthquake or South Asia Earthquake—of 2005 was a major seismological disturbance that occurred at 08:50:38 Pakistan Standard Time (03:50:38 UTC, 09:20:38 India Standard Time) on October 8, 2005, with its epicenter in the Pakistan-administered region of the disputed territory of Kashmir in South Asia. It registered 7.6 on the Richter scale, making it a major earthquake similar in intensity to the 1935 Quetta Earthquake, the 2001 Gujarat Earthquake, and the 1906 San Francisco Earthquake.
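Comparing earthquakes by magnitude alone understates their differences, because the scale is logarithmic. By the standard Gutenberg–Richter relation (a textbook result, not stated in the text), radiated seismic energy scales as 10^(1.5 M), so the magnitude-9.15 Sumatra–Andaman event discussed earlier released roughly 200 times the energy of this magnitude-7.6 quake:

```python
def energy_ratio(m_small: float, m_large: float) -> float:
    """Ratio of radiated seismic energy; Gutenberg-Richter: E ~ 10**(1.5*M)."""
    return 10 ** (1.5 * (m_large - m_small))

# Kashmir 2005 (magnitude 7.6) vs. Sumatra-Andaman 2004 (~9.15)
print(round(energy_ratio(7.6, 9.15)))  # ~211

# A single magnitude unit corresponds to a factor of ~31.6 in energy
print(round(energy_ratio(7.0, 8.0), 1))  # 31.6
```

That the smaller event nonetheless killed tens of thousands underscores a recurring theme of this article: human exposure and building quality, not released energy alone, determine a disaster's toll.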

Most of the casualties from the earthquake were in Pakistan, where the official death toll is 73,276, exceeding the massive scale of destruction of the Quetta Earthquake of May 31, 1935. Most of the affected areas were in mountainous regions, and access was impeded by landslides that blocked the roads. An estimated 3.3 million people were left homeless in Pakistan. According to Indian officials, nearly 1,400 people died in the Indian-administered Kashmir region. The United Nations (UN) reported that more than 4 million people were directly affected, many of them at risk of dying from cold and the spread of disease as winter began. Pakistan’s Prime Minister Shaukat Aziz appealed to survivors on October 26th to come down to the valleys and cities for relief. Damages have been estimated at well over US$5 billion. Three of the five crossing points on the Line of Control between India and Pakistan have been opened. Figure 12 depicts a small sample of the utter devastation.

Hurricane Wilma

In the second week of October 2005, a large and complex area of low pressure developed over the western Atlantic and eastern Caribbean with several centers of thunderstorm activity. This area of disturbed weather southwest of Jamaica slowly organized on October 15, 2005 into tropical depression number 24. It reached tropical storm strength at 5:00 am EDT on October 17th, making it the first storm ever to use a “W” name since alphabetical naming began in 1950, and tying the 1933 record for most storms in a season. Moving slowly over warm water with little wind shear, tropical storm Wilma strengthened steadily and became a hurricane on October 18th. This made it the twelfth hurricane of the season, tying the record set in 1969.

Hurricane Wilma was the sixth major hurricane of the record-breaking 2005 Atlantic hurricane season. Wilma set numerous records for both strength and seasonal activity. At its peak, it was the most intense tropical cyclone ever recorded in the Atlantic Basin. It was the third category 5 hurricane of the season (the other two being hurricanes Katrina and Rita), the only time this has occurred in the Atlantic, and only the third category 5 to develop in October. Wilma was only the second twenty-first storm recorded in any season, and the earliest-forming twenty-first storm by nearly a month.

Wilma made several landfalls, with the most destructive effects experienced in the Yucatan Peninsula of Mexico, Cuba, and the U.S. state of Florida. At least 60 deaths were reported, and damage is estimated at between $18 billion and $22 billion, with $14.4 billion in the United States alone, ranking Wilma among the top ten costliest hurricanes ever recorded in the Atlantic and the fifth costliest storm in U.S. history.

Figures 13 and 14 show different aspects of Hurricane Wilma. Around 4:00 pm EDT on October 18, 2005, the storm began to intensify rapidly. During a 10-hour period, Hurricane Hunter aircraft measured a 78-mbar pressure drop. In the 24-hour period from 8:00 am EDT on October 18th to the following morning, the pressure fell 90 mbar. In this same 24-hour period, Wilma strengthened from a strong tropical storm with 110 km/h winds to a powerful category 5 hurricane with 280 km/h winds. In comparison, Hurricane Gilbert of 1988—the previous record-holder for lowest Atlantic pressure—recorded a 72-mbar pressure drop in a 24-hour period, a rate of 3 mbar/h. Wilma’s fall is a record for the Atlantic Basin and one of the most rapid deepening phases ever undergone by a tropical cyclone anywhere on Earth—the record holder is 100 mbar by Super Typhoon Forrest in 1983.
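The deepening rates quoted above are easy to check (figures taken from the text; "deepening" here means the fall of central pressure):

```python
# Hurricane Wilma's rapid deepening, October 18-19, 2005 (figures from text)
drop_10h_mbar = 78   # pressure fall measured by Hurricane Hunter aircraft
drop_24h_mbar = 90   # pressure fall over the full 24-hour period

rate_peak = drop_10h_mbar / 10   # mbar/h during the 10-hour burst
rate_day = drop_24h_mbar / 24    # mbar/h averaged over the day

print(rate_peak, round(rate_day, 2))  # 7.8 3.75
```

At its peak Wilma was thus deepening more than twice as fast as Gilbert's 3 mbar/h daily-average rate, which illustrates why the episode ranks among the most rapid intensification phases on record.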

During its intensification on October 19, 2005, the eye’s diameter shrank to 3 km, one of the smallest eyes ever seen in a tropical cyclone. Soon thereafter, Wilma set a record for the lowest pressure ever recorded in an Atlantic hurricane when its central pressure dropped to 884 mbar at 8:00 am EDT, and then to 882 mbar 3 hours later, before rising slowly in the afternoon while the storm remained a category 5 hurricane. At 11:00 pm EDT that day, Wilma’s pressure dropped again, to 894 mbar, as the storm weakened to a category 4 with winds of 250 km/h. Wilma was the first hurricane in the Atlantic Basin, and possibly the first tropical cyclone in any basin, to have a central pressure below 900 mbar while at category 4 intensity. In fact, only two other recorded Atlantic hurricanes have ever had lower central pressures: the previous Atlantic record holder, Hurricane Gilbert of 1988, and the Labor Day Hurricane of 1935.

Although Wilma was the most intense hurricane (i.e., a tropical cyclone in the Atlantic, Central Pacific, or Eastern Pacific) ever recorded, there have been many more intense typhoons in the Pacific; Super Typhoon Tip is the most intense tropical cyclone on record, at 870 mbar. Hurricane Wilma existed within an area of ambient pressure that was unusually low for the Atlantic Basin, below 1,010 mbar, values closer to ambient pressures in the northwest Pacific Basin. Indeed, under normal circumstances, the Dvorak matrix would equate an 890-mbar storm in the Atlantic Basin—a current intensity (CI) number of 8—with an 858-mbar storm in the Pacific. Such a conversion, if normal considerations were in play, would suggest that Wilma was more intense than Tip. However, Wilma’s winds were much slower than the 315 km/h implied by a CI of 8. A speed of 280+ km/h may seem incredibly fast, but for an 882-mbar hurricane it is actually quite slow; in comparison, Hurricane Gilbert had a pressure of 888 mbar but winds of 300 km/h. In fact, at one point after its period of peak intensity, Wilma had a pressure of 894 mbar yet was not even a category 5, with winds of just 250 km/h. Before Wilma, it had been unheard of for a storm to go below 900 mbar and not be a category 5. These wind speeds indicate that the low ambient pressure surrounding Wilma made its 882-mbar central pressure less significant than it would have been under normal circumstances, because the pressure gradient driving the winds was smaller. By the gradient standard, it is entirely possible that Hurricane Gilbert, and not Wilma, is still the strongest North Atlantic hurricane on record.

Hurricane Wilma’s southeastern eyewall passed over the greater Key West area in the lower Florida Keys in the early morning hours of October 24, 2005. At this point, the storm’s eye was approximately 56 km in diameter, and the north end of the eyewall crossed into the southern and central sections of Palm Beach County as the system cut a diagonal swath across the southern portion of the Florida peninsula. Several cities in the South Florida Metropolitan Area, which includes Palm Beach, Fort Lauderdale, and Miami, suffered severe damage from the intense winds of the rapidly moving system. The center of the eye was directly over the South Florida Metropolitan Area at 10:30 am on Monday, October 24th. After the hurricane had passed, a 3-m storm surge from the Gulf of Mexico completely inundated a large portion of the lower Keys. Most of the streets in and near Key West were flooded with at least 1 m of salt water, destroying tens of thousands of vehicles. Many houses were also flooded with 0.5 m of seawater.

Despite significant wind shear in the Gulf, Hurricane Wilma regained some strength before making a third landfall just north of Everglades City, Florida, near Cape Romano, at 6:30 am EDT on October 24, 2005, as a category 3 hurricane. The re-intensification was due to Wilma’s interaction with the Gulf Loop Current; at landfall, the storm had sustained winds of 200 km/h. Over the Florida peninsula, Wilma weakened slightly to a category 2 hurricane, and it exited Florida into the Atlantic at that strength about 6 hours later. Unexpectedly, Wilma regained strength over the Gulf Stream and once again became a category 3 hurricane north of the Bahamas, recovering within 12 hours all the strength it had lost. On October 25th, however, the storm gradually began weakening, and it became extratropical late that afternoon south of Nova Scotia, although it still maintained hurricane strength and affected a large area of land and water with stormy conditions.

Hajj Stampede of 2006


The Hajj is the annual Islamic pilgrimage to the city of Mecca, Saudi Arabia, and there have been many serious incidents during it that have led to the loss of hundreds of lives. There are an estimated 1.3 billion Muslims living today, and during the month of the Hajj, the city of Mecca must cope with as many as 4 million pilgrims. The Muslim world follows a lunar calendar, and therefore the Hajj month shifts from year to year relative to the Western, solar calendar.

Jet travel also makes Mecca and the Hajj more accessible to pilgrims from all over the world. As a consequence, the Hajj has become increasingly crowded, and city officials are required to control large crowds and provide food, shelter, and sanitation for millions. Unfortunately, they have not always been able to prevent disasters, which are hard to avoid with so many people. The worst of the incidents have occurred during the ritual stoning of the devil, an event near the tail end of the Hajj. Saudi authorities had replaced the pillar, which had represented the devil in the past, with an oval wall padded around the edges to protect the crush of pilgrims. Officials had also installed cameras and dispatched about 60,000 security personnel to monitor the crowds.

On January 12, 2006, a stampede during the ritual stoning of the devil on the last day of the Hajj in Mina, Saudi Arabia, killed at least 346 pilgrims and injured at least 289 more. The stoning ritual is the most dangerous part of the pilgrimage because people can be crushed, particularly as they traverse the massive two-layer, flyover-style Jamarat Bridge (Figure 15) that affords access to the pillars. The incident occurred shortly after 1:00 pm local time, when a passenger bus shed its load of travelers at the eastern access ramps to the bridge. This caused pilgrims to trip, rapidly resulting in a lethal crush. An estimated 2 million people were performing the ritual at the time. The stampede was the second deadly incident of the Islamic month of Dhu al-Hijjah in 2006: on January 5, the Al Ghaza Hotel had collapsed, killing seventy-six people and injuring sixty-four.

Hajj stampedes have a long and tragic history. Surging crowds, trekking from one station of the pilgrimage to the next, can set off a stampede; panic spreads, pilgrims jostle to avoid being trampled, and hundreds of deaths can result. A list of stampedes and other accidents during the Hajj season follows.

  • In December 1975, an exploding gas cylinder caused a fire in a tent colony; 200 pilgrims were killed.
  • On November 20, 1979, a group of approximately 200 militant Muslims occupied Mecca’s Grand Mosque. They were driven out by special commandos—allowed into the city under these special circumstances despite their being non-Muslims—after bloody fighting that left 250 people dead and 600 wounded.
  • On July 31, 1987, Iranian pilgrims rioted, causing the deaths of more than 400 people.
  • On July 9, 1989, two bombs exploded, killing one pilgrim and wounding sixteen. Saudi authorities beheaded sixteen Kuwaiti Shiite Muslims for the bombings after originally suspecting Iranian terrorists.
  • On July 2, 1990, a stampede inside a pedestrian tunnel—Al-Ma’aisim tunnel—leading out from Mecca toward Mina and the Plains of Arafat led to the deaths of 1,426 pilgrims.
  • On May 23, 1994, a stampede killed at least 270 pilgrims at the stoning of the devil ritual.
  • On April 15, 1997, 343 pilgrims were killed and 1,500 injured in a tent fire.
  • On April 9, 1998, at least 118 pilgrims were trampled to death and 180 injured in an incident on Jamarat Bridge.
  • On March 5, 2001, 35 pilgrims were trampled in a stampede during the stoning of the devil ritual.
  • On February 11, 2003, the stoning of the devil ritual claimed 14 pilgrims’ lives.
  • On February 1, 2004, 251 pilgrims were killed and another 244 injured in a stampede during the stoning ritual in Mina.
  • A concrete multistory building located in Mecca close to the Grand Mosque collapsed on January 5, 2006. The building—Al Ghaza Hotel—housed a restaurant, a convenience store, and a hostel, which was reported to have been housing pilgrims to the 2006 Hajj. It is not clear how many pilgrims were in the hotel at the time of the collapse. The death toll was 76, and the number of injured was 64.

Critics say that the Saudi government should have done more to prevent such tragedies. The Saudi government insists that any such mass gatherings are inherently dangerous and difficult to handle, and that they have taken a number of steps to prevent problems.

One of the biggest, and most controversial, steps is a new system of registrations, passports, and travel visas to control the flow of pilgrims. The system is designed to encourage and accommodate first-time visitors to Mecca while imposing restrictions on those who have already made the trip multiple times. Pilgrims who have the means and desire to perform the Hajj several times have protested what they see as discrimination, but the Hajj Commission has stated that it sees no alternative if further tragedies are to be prevented.

Following the 2004 stampede, Saudi authorities embarked on major construction work in and around the Jamarat Bridge area. Additional access-ways, footbridges, and emergency exits were built, and the three cylindrical pillars were replaced with longer and taller oblong walls of concrete to enable more pilgrims’ simultaneous access to them without the jostling and fighting for position of recent years. The government has also announced a multimillion-dollar project to expand the bridge to five levels; the project is planned for completion in time for the 1427 AH Hajj (December 2006–January 2007).

Smith and Dickie’s (1993) book on engineering for crowd safety lists dozens of crowd disasters, including the recurring Hajj stampedes. Helbing et al. (2002) discuss the simulation of panic situations from the point of view of nonlinear dynamical systems theory.

Al-Salam Boccaccio 98


Al-Salam Boccaccio 98 was an Egyptian ROPAX (passenger roll-on/roll-off) ferry, operated by al-Salam Maritime Transport, that sank on February 3, 2006 in the Red Sea en route from Duba, Saudi Arabia, to Safaga in southern Egypt. Its last known position was 100 km from Duba, when it lost contact with the shore at about 22:00 EET (20:00 UTC).

The vessel was built by the Italian company Italcantieri in 1970, with IMO number 6921282, and named the Boccaccio at Castellammare di Stabia, Italy. It was originally intended for Italian domestic service. The ship was 130.99 m long overall, with a 23.60-m beam and a 5.57-m draft. The main engines were rated at 16,560 kW for a maximum speed of 19 knots (35 km/h). The vessel had an original capacity of 200 automobiles and 500 passengers. Five sister ships were built.

The vessel was rebuilt in 1991 by INMA at La Spezia, maintaining the same outer dimensions albeit with a higher superstructure, which changed the draft to 5.90 m. At the same time, its automobile capacity was increased to 320 and its passenger capacity to 1,300. The most recent gross registered tonnage was 11,799.

The Boccaccio was purchased in 1999 by al-Salam Maritime Transport, headquartered in Cairo and the largest private shipping company in Egypt and the Middle East, and renamed al-Salam Boccaccio 98; the registered owner was Pacific Sunlight Marine, Inc., of Panama. The ferry was also referred to as Salam 98.

On the doomed voyage, the ship was carrying 1,312 passengers and 96 crew members, according to Mamdouh Ismail, head of al-Salam Maritime Transport. Originally, an Egyptian embassy spokesman in London had mentioned 1,310 passengers and 105 crew, while the Egyptian presidential spokesman mentioned 98 crew and the Transport Minister said 104. The majority of passengers are believed to have been Egyptians working in Saudi Arabia; passengers also included pilgrims returning from the Hajj in Mecca. The ship, pictured in Figure 16, was also carrying about 220 vehicles.

Early statements by survivors indicated that smoke from the engine room was followed by a fire that continued for some time. There were also reports of the ship listing soon after leaving port, and that after some hours the list became severe and the ship capsized within 10 minutes as the crew fought the fire. In a BBC radio news broadcast, an Egyptian ministerial spokesman said that the fire had started in a storage area, was controlled, but then started again. The ship turned around, and as it turned, it capsized. The significance of the fire was supported by statements attributed to crewmembers, who reportedly claimed that the firefighters essentially sank the ship when the sea water they used to battle the fire collected in the hull because the drainage pumps were not working.


The Red Sea is known for its strong winds and tricky local currents, not to mention killer sharks. The region had been experiencing high winds and dust storms for several days at the time of the sinking. These winds may have contributed to the disaster and may have complicated rescue efforts.

There are several theories expressed about possible causes of the sinking:

  • Fire: Some survivors dragged from the water reported that there was a large fire on board before the ship sank, and there were eyewitness accounts of thick black smoke coming from the engine rooms.
  • Design flaws: The al-Salam Boccaccio 98 was a roll on–roll off (ro–ro) ferry, a design that allows vehicles to drive on at one end and drive off at the other, so that neither the ship nor any of the vehicles need to turn around at any point. It also means that the cargo hold is one long chamber running through the ship. For this to work, the vehicle bay doors must be very near the waterline, so if they are sealed improperly, water may leak through. Even a small amount of water moving about inside can gain momentum and capsize the ship, a phenomenon known as the free surface effect.
  • Modifications: In the 1980s, the ship was reported to have undergone several modifications, including the addition of two passenger decks and the widening of the cargo decks. These would have made the ship less stable than it was designed to be, particularly as its draft was only 5.9 m. Combined with high winds, the top-heavy ship could have been toppled easily.
  • Vehicle movement: Another theory is that the rolling of the ship could have caused one or more of the 220 vehicles in its hold to break loose and puncture a hole in the side of the ship.

At 23:58 UTC on February 2, 2006, the air–sea rescue control room at RAF Kinloss in Scotland detected an automatic distress signal relayed by satellite from the ship’s position. The alert was passed on via France to the Egyptian authorities, but almost 12 hours passed before a rescue attempt was launched. On February 3, 2006, lifeboats and bodies were seen in the water, and it was believed that there were still survivors. At least 314 survivors and around 185 bodies were recovered. Reuters reported that “dozens” of bodies were floating in the Red Sea.

Rescue boats and helicopters, including four Egyptian frigates, searched the area. Britain diverted the warship HMS Bulwark, which would have arrived in a day and a half, but reports conflict as to whether the ship was indeed recalled. Israeli sources reported that an offer of search-and-rescue assistance from the Israeli Navy was declined. Egyptian authorities did, however, accept a United States offer of a P-3 Orion maritime patrol aircraft after initially saying that the help was not needed.

The sinking of the al-Salam Boccaccio 98 is being compared to that of the M/S Herald of Free Enterprise in 1987, which killed 193 passengers, and to other incidents. In 1991, another Egyptian ferry, the Salem Express, sank off the coast of Egypt after hitting a small habili reef; 464 Egyptians lost their lives. The ship is now a landmark shipwreck for SCUBA divers, along with the SS Thistlegorm. In 1994, the M/S Estonia sank, claiming 853 lives. On September 26, 2002, the M/S Joola, a Senegalese government-owned ferry, capsized off the coast of Gambia, resulting in the deaths of at least 1,863 people. On October 17, 2005, the Pride of al-Salam 95, a sister ship of the al-Salam Boccaccio 98, also sank in the Red Sea after being struck by the Cypriot-registered cargo ship Jebal Ali. In that accident, 2 people were killed and another 40 injured, some perhaps during a stampede to leave the sinking ship. After all the ferry’s passengers and crew were evacuated, the Jebal Ali went astern, and the Pride of al-Salam 95 sank in about 3 minutes.

What is most tragic about the al-Salam Boccaccio 98 incident is the utter ineptness, corruption, and collusion of both the Egyptian authorities and the holding company’s staff, particularly its owner, a member of the upper chamber of Parliament and a close friend of an even more powerful politician in the inner circle of the president. The 35-year-old ferry was not fit for sailing, and was in fact barred from doing so in European waters, yet it was licensed to ferry passengers despite past violations and other mishaps by this and other ships owned by the same company. The captain of the doomed ferry refused to turn the ship around to its nearer point of origin despite the fire on board, and a passing ship owned by the same company ignored the call for help from the sinking ferry. Rescue attempts by the government did not start until almost 12 hours after the sinking, despite a distress signal from the ship that went around the globe and was reported back to the Egyptian authorities. Many officials failed to react promptly because an “important” soccer game was being televised. Rescued passengers told tales of the ship’s crew, including the captain, taking the few available lifeboats for themselves before attempting to help the helpless passengers. The company’s owner and his family were allowed to flee the country shortly after the disaster despite a court order forbidding them from leaving Egypt. The local news media provided inaccurate reporting, then ignored the story altogether within a few weeks to focus on another important soccer event. Victims and their relatives were left to fend for themselves, all because they were the poorest of the poor, insignificant to the rich, powerful, and mighty. Disasters occur everywhere, but in a civilized country an inept response such as occurred in Egypt would have meant the fall of the government, the punishment of a few criminals and, most important, a less tragic loss of life.

Bird Flu


A pandemic is a global disease outbreak. A flu pandemic occurs when a new influenza virus emerges for which people have little or no immunity, and for which there is no vaccine. The disease spreads easily from person to person, causes serious illness, and can sweep across countries and around the world in a very short time. It is difficult to predict when the next influenza pandemic will occur or how severe it will be. Wherever and whenever a pandemic starts, everyone around the world is at risk. Countries might, through measures such as border closures and travel restrictions, delay arrival of the virus, but they cannot prevent or stop it.

The highly pathogenic H5N1 avian flu is caused by the influenza A virus, which occurs naturally among birds. There are different subtypes of these viruses because of changes in certain proteins (hemagglutinin [HA] and neuraminidase [NA]) on the surface of the influenza A virus and the ways the proteins combine. Each combination represents a different subtype. All known subtypes of influenza A viruses can be found in birds. The avian flu currently of concern is the H5N1 subtype.

Wild birds worldwide carry avian influenza viruses in their intestines, but they usually do not get sick from them. Avian influenza is very contagious among birds and can make some domesticated birds, including chickens, ducks, and turkeys, very sick and even kill them. Infected birds shed influenza virus in their saliva, nasal secretions, and feces. Domesticated birds may become infected with avian influenza virus through direct contact with infected waterfowl or other infected poultry, or through contact with surfaces (e.g., dirt or cages) or materials (e.g., water or feed) that have been contaminated with the virus.

Avian influenza infection in domestic poultry causes two main forms of disease that are distinguished by low and high extremes of virulence. The “low pathogenic” form may go undetected and usually causes only mild symptoms such as ruffled feathers and a drop in egg production. However, the highly pathogenic form spreads more rapidly through flocks of poultry. This form may cause disease that affects multiple internal organs and has a mortality rate that can reach 90% to 100%, often within 48 hours.

Human influenza virus usually refers to those subtypes that spread widely among humans. There are only three known A subtypes of influenza viruses (H1N1, H1N2, and H3N2) currently circulating among humans. It is likely that some genetic parts of current human influenza A viruses originally came from birds. Influenza A viruses are constantly changing, and other strains might adapt over time to infect and spread among humans. The risk from avian influenza is generally low to most people because the viruses do not usually infect humans. H5N1 is one of the few avian influenza viruses to have crossed the species barrier to infect humans, and it is the most deadly of those that have crossed the barrier.

Since 2003, a growing number of human H5N1 cases have been reported in Azerbaijan, Cambodia, China, Egypt, Indonesia, Iraq, Thailand, Turkey, and Vietnam. More than half of the people infected with the H5N1 virus have died. Most of these cases are believed to have been caused by exposure to infected poultry (e.g., domesticated chickens, ducks, and turkeys) or surfaces contaminated with secretions/excretions from infected birds. There has been no sustained human-to-human transmission of the disease, but the concern is that H5N1 will evolve into a virus capable of it. The virus has raised concerns about a potential human pandemic because it is especially virulent; it is being spread by migratory birds; it can be transmitted from birds to mammals and, in some limited circumstances, to humans; and, like other influenza viruses, it continues to evolve.

In 2005, animals killed by the bird flu were left in the muddy streets of a village in Egypt, exacerbating an already dire situation. Rumors were rampant that the entire water supply of Egypt, which comes from the Nile River, would be contaminated. Cases of the deadly H5N1 bird flu virus have been reported in at least fifteen governorates, and widespread panic among Egyptians has been reported. The Egyptian government has ordered the slaughter of all poultry kept in homes as part of an effort to stop the spread of bird flu in the country, and a ban on the movement of poultry between governorates is in place. Measures already announced include a ban on the import of live birds, and officials say there have been no human cases of the disease. The government has called on Egyptians to stay calm and not to dispose of slaughtered or dead birds in the roads, irrigation canals, or the Nile River.

Symptoms of avian influenza in humans have ranged from typical human influenza-like symptoms (e.g., fever, cough, sore throat, muscle aches) to eye infections, pneumonia, severe respiratory diseases such as acute respiratory distress, and other severe and life-threatening complications. The symptoms of avian influenza may depend on which virus caused the infection.

A pandemic may come and go in waves, each of which can last for six to eight weeks. An especially severe influenza pandemic could lead to high levels of illness, death, social disruption, and economic loss. Everyday life would be disrupted because so many people in so many places would become seriously ill at the same time. Impacts can range from school and business closings to the interruption of basic services such as public transportation and food delivery.

If a pandemic erupts, a substantial percentage of the world’s population will require some form of medical care. Health care facilities can be overwhelmed, creating a shortage of hospital staff, beds, ventilators, and other supplies. Surge capacity at non-traditional sites such as schools may need to be created to cope with demand. The need for vaccines is likely to outstrip supply, and the supply of antiviral drugs is also likely to be inadequate early in a pandemic. Difficult decisions will need to be made regarding who gets antiviral drugs and vaccines. Death rates are determined by four factors: 1) the number of people who become infected, 2) the virulence of the virus, 3) the underlying characteristics and vulnerability of affected populations, and 4) the availability and effectiveness of preventive measures.

The U.S. government site lists the following pandemic death tolls since 1900:

  • 1918–1919; United States 675,000+; worldwide 50 million+
  • 1957–1958; United States 70,000+; worldwide 1–2 million
  • 1968–1969; United States 34,000+; worldwide 700,000+


The United States is collaborating closely with eight international organizations, including the UN’s World Health Organization (WHO), the Food and Agriculture Organization (FAO), also of the UN, and the World Organization for Animal Health, as well as with 88 foreign governments, to address the situation through planning, greater monitoring, and full transparency in reporting and investigating avian influenza occurrences. The United States and its international partners have led global efforts to encourage countries to heighten surveillance for outbreaks in poultry and for significant numbers of deaths in migratory birds, and to rapidly introduce containment measures. The U.S. Agency for International Development and the U.S. Department of State, Department of Health and Human Services, and Department of Agriculture are coordinating future international response measures on behalf of the White House, with departments and agencies across the federal government. Together, steps are being taken to minimize the risk of further spread in animal populations, reduce the risk of human infections, and further support pandemic planning and preparedness. Ongoing, detailed, mutually coordinated onsite surveillance and analysis of human and animal H5N1 avian flu outbreaks are being conducted and reported by the USGS National Wildlife Health Center, the Centers for Disease Control and Prevention, the WHO, the European Commission, and others.

Energy Crisis / Global Warming

The energy crisis and the intimately related problem of global warming are two examples of slowly evolving disasters that, at least until recently, have not received the attention they deserve. An energy crisis is defined as any great shortfall (or price rise) in the supply of energy resources to an economy. There is no immediacy to this type of calamity, despite its adverse effects on the health and the economic and social well-being of billions of people around the globe. Herein I offer a few personal reflections on energy, global warming, and the looming crisis, with the United States in mind. The arguments made, however, may apply with equal force to many other countries.

Nothing can move, let alone survive, without energy. Yet, until a gallon of gas hit $4, the word “energy” was rarely uttered during the 2008 presidential campaign. Promises to somehow lower the price of gas at the pump, or of a Federal gas tax break during the summer, are at best a short-term band-aid for what should be a much broader and longer-term national debate. During two visits to Saudi Arabia, on January 15, 2008 and May 16, 2008, President Bush pleaded with King Abdullah to open the oil spigots, while the King told his eminent visitor how worried he was about the impact of oil prices on the world economy. The spigots did not open; and even if they had, such pleas and worries are not going to solve the energy problem or the global warming crisis.

Much like company executives, politicians ponder, envision, and articulate issues in terms of years, not decades. A four-year horizon is about right: this is the term of a president, twice that of a representative, and two-thirds of a senate term. The tenure of a typical CEO is even shorter than a senator’s. But the debate on energy should ideally be framed in terms of a human lifespan, currently about 75 years. The reason is twofold. First, fossil fuels, such as oil, gas, and coal, are being consumed at a much faster rate than nature can make them; these are not renewable resources. Considering the anticipated population growth (with the conservative, albeit unrealistic, assumption of no increase in per capita demand) and the known reserves of these energy sources, the world supply of oil is estimated to be exhausted in 0.5 lifespan, of gas in 1 lifespan, and of coal in 1.5 lifespans. Second, alternative energy sources must be developed to prevent a colossal disruption of our way of life. Barring miracles, those cannot be found overnight, but rather over several decades of intensive research and development. The clock is ticking, and few people seem to be listening to the current whisper and, inevitably, the future thunder.
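The lifespan bookkeeping above amounts to dividing the years of remaining supply by a 75-year human lifespan. A minimal sketch, with placeholder years-of-supply figures chosen only to reproduce the text's estimates (they are not authoritative reserve data):

```python
# Express years of remaining fuel supply in units of a 75-year human lifespan.
LIFESPAN_YEARS = 75

def lifespans_of_supply(years_of_supply):
    """Convert years of remaining supply into human lifespans."""
    return years_of_supply / LIFESPAN_YEARS

# Hypothetical years of supply at flat consumption, chosen to match the text:
supply_years = {"oil": 37.5, "gas": 75.0, "coal": 112.5}
for fuel, years in supply_years.items():
    print(f"{fuel}: ~{lifespans_of_supply(years):.1f} lifespans")
```

Note that the flat-demand assumption is conservative: any growth in per capita demand shortens these horizons further.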

Uranium fission power plants currently supply about 8% of the total U.S. energy need, which is about 100 Quad/year, or 10^20 joules/year. (Total energy is consumed in the form of electricity [40%], the burning of fossil fuel to directly generate heat for buildings and industrial processes [30%], and mechanical energy for transportation systems [30%].) Coal, natural gas, and nuclear power plants generate, respectively, 50%, 20%, and 20% of our electricity; the corresponding numbers in France are 4%, 4%, and 80%. Even at that modest rate of consumption and with current nuclear reactor technology, the United States will exhaust its supply of uranium in about two lifespans. Real and imagined concerns about the safety of nuclear energy and the disposal of spent fuel brought all new construction to a halt after the mid-1970s. Happily, 2007 breathed new life into the nuclear issue. There are now 7 new nuclear reactors in the early planning stages for the U.S. market, and over 65 more for China, Japan, India, Russia, and South Korea.
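The quoted shares are mutually consistent: nuclear's 8% share of total energy is the product of electricity's 40% share of total energy and nuclear's 20% share of electricity, and 100 Quad/year is indeed about 10^20 joules/year. A quick sketch (the Quad-to-joule conversion factor is the standard one; the shares are from the text):

```python
# Cross-check of the energy shares and the Quad-to-joule conversion above.
QUAD_IN_JOULES = 1.055e18  # 1 Quad = 10^15 BTU, roughly 1.055e18 J

total_energy_j = 100 * QUAD_IN_JOULES  # ~1.06e20 J/year, i.e. ~10^20 J/year
nuclear_share = 0.40 * 0.20            # electricity share * nuclear's share of it

print(f"total U.S. energy use: {total_energy_j:.2e} J/year")
print(f"nuclear share of total energy: {nuclear_share:.0%}")
```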

Fission-based power generation not only can reduce the country’s insatiable appetite for fossil fuel, but also generates no carbon dioxide or other heat-trapping gases. In contrast, along with other pollutants, a coal-fired power plant annually releases 10 billion kg of carbon dioxide into the atmosphere for each 1,000 MW of (fully utilized) electric capacity. Nuclear power generation must be part of the solution to both the energy and global warming crises.
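The 10 billion kg figure can be checked with a back-of-envelope estimate; the thermal efficiency and coal emission factor below are assumed round numbers, not data from the text:

```python
# Rough plausibility check of ~10 billion kg CO2/year for a fully utilized
# 1,000-MW coal-fired plant. Efficiency and emission factor are assumptions.
SECONDS_PER_YEAR = 365.25 * 24 * 3600  # ~3.16e7 s
ELECTRIC_POWER_W = 1_000e6             # 1,000 MW of electric capacity
THERMAL_EFFICIENCY = 0.33              # assumed plant efficiency
CO2_KG_PER_THERMAL_J = 0.095e-6        # assumed ~0.095 kg CO2 per thermal MJ

electric_energy_j = ELECTRIC_POWER_W * SECONDS_PER_YEAR    # J delivered per year
thermal_energy_j = electric_energy_j / THERMAL_EFFICIENCY  # J of coal burned
co2_kg = thermal_energy_j * CO2_KG_PER_THERMAL_J

print(f"~{co2_kg:.1e} kg CO2/year")  # on the order of 10 billion kg
```

With these assumptions the estimate lands within about 10% of the figure quoted in the text.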

Controlled nuclear fusion, also a non-polluting source of energy, has the potential to supply all of our energy needs inexhaustibly, but, even in the laboratory, we are far from achieving the breakeven point (getting more energy out of the reactor than is needed to sustain the reaction).

With 5% of the world's population, the United States consumes 25% of the world's annual energy, generating in the process a proportional share of greenhouse gases. Conservation alone is not going to solve the problem; it will merely postpone the anticipated crises. A whopping 20% conservation effort this year will be wiped out by a 1% annual population increase over the next 20 years. But that does not mean it should not be done. Without conservation, the situation will be that much worse.
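The compounding arithmetic behind that claim is easy to verify: a one-time 20% cut in per capita consumption against 1% annual population growth, with per capita demand otherwise held constant.

```python
# A one-time 20% conservation cut vs. 1% annual population growth.
growth, years = 1.01, 20
population_factor = growth ** years      # ~1.22 after 20 years
consumption = 0.80 * population_factor   # total demand relative to today
print(f"{consumption:.3f}")              # ~0.976: the savings are nearly gone
```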

The energy crisis exemplified by the 1973 Arab oil embargo brought about a noticeable shift of attitudes toward energy conservation. During the 1970s and 1980s, governments, corporations and citizens around the world, particularly in the industrialized countries, invested valuable resources searching for ways to conserve energy. Dwellings and other buildings became better insulated, and automobiles and other modes of transportation became more energy efficient. Plentiful fossil fuel supplies during the 1990s and typically short memories of the long gas lines of 1973 have, unfortunately, somewhat dulled the urgency and enthusiasm for energy conservation research as well as practice. Witness, at least in the United States, the awakening of the long-hibernating gas-guzzler automobile and the recent run on house-size sport utility vehicles, a.k.a. land barges. The $140-plus barrel of crude oil this year has reignited interest in conservation. But in my opinion, gas at the pump needs to skyrocket to a painful $10 per gallon to have the required shock value. The cost is close to that much in Europe, and the difference in attitudes between the two continents is apparent.

Conservation or not, talk of energy independence is just that unless alternative energy sources are developed. The United States simply does not have traditional energy sources in sufficient quantities to become independent. In fact, our energy dependence has increased steadily since the 1973 oil crisis. The non-traditional sources are currently either non-existent or too expensive to compete even with $4-per-gallon gas at the pump. But a $10 price tag will do the trick, one day.

How do we go from here to there? We need to work on both the supply side and the demand side. On the latter, consumers need to moderate their insatiable appetite for energy. Homes do not have to be as warm in the winter as a crowded bus, or as cold in the summer as a refrigerator. A car with a 300-horsepower engine (equivalent to 300 live horses, really) is not needed to take one person to work via congested city roads. Additionally, new technology can provide even more efficient air, land and sea vehicles than exist today. Better insulated buildings, less wasteful energy conversion, storage and transmission systems, and many other measures save energy; every bit helps.

On the supply side, we need to develop the technology to deliver non-traditional energy sources inexpensively, safely and with minimum impact on the environment. The U.S. and many other countries are already searching for those alternative energy sources. But are we searching with sufficient intensity? With enough urgency? I think not, simply because the problem does not affect, with sufficient pain, this or the next presidential election, but rather the 5th or 10th one down the road. Who is willing to pay more taxes now for something that will benefit the next generation? Witness the unceremonious demise of former President Carter's Energy Security Corporation, which was supposed to kick off with the issuance of $5 billion in energy bonds. One way to assuage the energy problem is to increase usage taxes, thus helping curb demand, and to use the proceeds to develop new supplies. Amazingly, a few politicians are even considering decreasing those taxes.

Let us briefly appraise the non-traditional sources known or even (sparingly) used today. The listing herein is not exhaustive, and other technologies unforeseen today may be developed in the future. Shale oil comes from sedimentary rock containing dilute amounts of near-solid fossil fuel. The cost, in dollars as well as in energy, of extracting and refining that last drop of oil is currently prohibitive. Moreover, the resulting fuel is no less polluting than other fossil fuels. There are also the so-called renewable energy sources. Though the term is a misnomer, because once energy is used it is gone forever, those sources are inexhaustible in the sense that they cannot be used faster than nature makes them. The Sun is the source of all energy on Earth, providing heat, light, photosynthesis, winds, waves, life and its eventual albeit very slow decay into fossil fuel, and so on. Renewable energy sources will always be here as long as the Sun stays alight, hopefully for a few more billion years.

Using the Sun's radiation, when available, to generate either heat or electricity is limited by the available area, the cost of the heat collector or photovoltaic cell, and the number of years of operation it takes the particular device to recover the energy used in its manufacture. The U.S. is blessed with its enormous land and can in principle generate all of its energy need via solar cells utilizing less than 3% of available land area. Belgium, in contrast, would require an unrealistic 25% of its land area to supply its energy need using the same technology. Solar cells are presently inefficient as well as expensive. They also require about 5 years of constant operation just to recover the energy spent on their manufacture. Extensive R & D is needed to improve on all those fronts.
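An order-of-magnitude version of the U.S. land-area estimate can be sketched as follows; the average insolation and overall system efficiency are assumed round numbers for illustration, not figures from the source:

```python
# Order-of-magnitude estimate of U.S. land area needed for all-solar supply.
SECONDS_PER_YEAR = 3.156e7
us_demand_J = 1.0e20                # ~100 Quad/year total U.S. energy need
insolation_W_m2 = 200.0             # assumed day/night/cloud-averaged value
efficiency = 0.10                   # assumed overall system efficiency
yield_J_m2 = insolation_W_m2 * efficiency * SECONDS_PER_YEAR
area_km2 = us_demand_J / yield_J_m2 / 1e6
us_land_km2 = 9.16e6                # approximate U.S. land area
pct = 100 * area_km2 / us_land_km2
print(f"{pct:.1f}% of U.S. land")   # a few percent, consistent with <3%
```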

Wind energy, though not constant, is also inexhaustible, but it has limitations similar to those of solar cells. Without tax subsidies, generating electricity via windmills currently cannot compete with fossil fuel or even nuclear power generation. Other types of renewable energy sources include hydroelectric power; biomass; geophysical and oceanic thermal energy; and ocean waves and tides. Food-based biomass is a low-carbon fuel when compared to fossil oil. Depending on how they are produced, however, biofuels may or may not offer a net reduction of carbon dioxide emissions (Science, DOI: 10.1126/science.1153747, published online February 7, 2008). Hydrogen provides clean energy, but it has to be made using a different source of energy, for example photovoltaic cells. Despite all the hype, the hydrogen economy is not a net energy saver, though it has other advantages nevertheless. Even such a noble cause as hydrogen-fueled or battery-powered automobiles will reduce pollution and dependence on fossil fuel only if nuclear power or other non-fossil, non-polluting energy sources are used to produce the hydrogen or to generate the electricity needed to charge the batteries.

Are we investing enough to solve the energy crisis? We recite some alarming statistics provided in a recent article (Domenici, 2006) by the then chair of the U.S. Senate Energy and Natural Resources Committee, Pete V. Domenici. Federal funding for energy research and development has been declining for years, and the shortfall is not being made up by increased private-sector R & D expenditure. Over the period from 1978 to 2004, federal appropriations fell from $6.4 billion to $2.75 billion in constant 2000 dollars, nearly a 60% reduction. Private-sector investment fell from about $4 billion to $2 billion during the period from 1990 to 2006. Compared to high-technology industries, energy R & D expenditure is the least intensive. For example, private-sector R & D investment is about 12% of sales in the pharmaceuticals industry and 15% in the airline industry, while the combined federal and private-sector energy R & D expenditure is less than 1% of total energy sales.

What is now needed is a visionary leader who will inspire the nation to accept the pain necessary to solve its energy problems and, in the process, help the world slow down global warming. The goal is to reduce significantly the country's dependence on foreign and domestic fossil fuel, replenishing the deficit with renewable, non-polluting sources of energy. The scale of the challenge is likely to be substantially larger than that of the 1940s Manhattan Project or the 1960s Apollo program. In his 'malaise' speech of July 15, 1979, Jimmy Carter lamented, "Why have we not been able to get together as a nation to resolve our serious energy problem?" Why not indeed, Mr. President.

Scope of the Sample Disasters


In this subsection, we evaluate the scope of the thirteen case studies (two earthquakes are discussed in a single subsection, 7.1) used as examples of natural and manmade disasters. The metric introduced in Section 2 is utilized to rank those disasters. Recall that the scope of a disaster is based on the number of people adversely affected by the extreme event (killed, injured, evacuated, etc.) or the extent of the stricken geographical area. The results are summarized in Table 1.

Disaster | Date | Scope | Descriptor | Basis
San Francisco earthquake | 18 April 1906 | V | Gargantuan | 3,000 deaths; 300,000 homeless
Hyatt Regency walkway collapse | 17 July 1981 | III | Large | 114 deaths; 200 injured
Loma Prieta earthquake | 17 October 1989 | IV | Enormous | 66 deaths; 3,757 injured
Izmit earthquake | 17 August 1999 | V | Gargantuan | 17,000–35,000 deaths
September 11 | 11 September 2001 | IV | Enormous | 2,993 deaths; 6,291 injured
Pacific tsunami | 26 December 2004 | V | Gargantuan | 283,100 deaths
Hurricane Katrina | 25 August 2005 | V | Gargantuan | 1,836 deaths; 1.2 million evacuated
Kashmir earthquake | 8 October 2005 | V | Gargantuan | 73,276 deaths; 3.3 million homeless
Hurricane Wilma | 18 October 2005 | V | Gargantuan | 63 deaths in US; 500,000 evacuated in Cuba
Hajj stampede | 12 January 2006 | III | Large | 346 deaths; 289 injured
Al-Salam Boccaccio 98 | 3 February 2006 | IV | Enormous | 1,094 deaths
Bird flu | 2003–present | III | Large | Number of stricken in the hundreds
Energy crisis/global warming | Since industrial revolution | V | Gargantuan | Covers entire Earth

Table 1.  Scope of the disasters described.


For the Izmit earthquake, the number of deaths reported by the government differs from that widely believed to be the case, hence the range shown in the table. Either number puts the disaster in the worst possible category (V), and therefore the number of injured or homeless becomes immaterial to the categorization; the scope cannot get any higher.

Of note is the scope of the September 11 manmade disaster, which is less than that of, say, Hurricane Katrina. The number of people directly and adversely affected by September 11 is smaller than in the case of Katrina (the number of deaths is not the only measure). On the other hand, September 11 had a huge aftereffect in the United States and elsewhere, shifting geopolitical realities and triggering the ensuing war on terrorism that still rages many years later. The number of people adversely affected by that war is not considered in assigning a scope to September 11.

The avian influenza is still in its infancy and fortunately has not yet materialized into a pandemic, hence the relatively low scope. The energy crisis and its intimately related global warming problem have not yet resulted in widespread deaths or injuries, but both events are global in extent, essentially affecting the entire world. Thus the descriptor gargantuan assigned to both is based on the size of the adversely affected geographical area.
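The person-count side of the categorization can be sketched in code. The decade thresholds below are a hypothetical reconstruction inferred from the entries of Table 1 (for example, Hyatt with 314 affected falls in III, September 11 with roughly 9,300 in IV, and Izmit with well over 10,000 in V); they are illustrative, not the formal definition of the metric in Section 2.

```python
# Hypothetical decade banding for the number of persons adversely
# affected (killed + injured + evacuated, etc.), inferred from Table 1.
SCOPE_BANDS = [(10, "I", "Small"), (100, "II", "Medium"),
               (1_000, "III", "Large"), (10_000, "IV", "Enormous")]

def scope(persons_affected: int) -> tuple[str, str]:
    """Map a count of adversely affected persons to a scope category."""
    for upper, numeral, descriptor in SCOPE_BANDS:
        if persons_affected < upper:
            return numeral, descriptor
    return "V", "Gargantuan"

print(scope(114 + 200))        # Hyatt Regency   -> ('III', 'Large')
print(scope(2_993 + 6_291))    # September 11    -> ('IV', 'Enormous')
print(scope(283_100))          # Pacific tsunami -> ('V', 'Gargantuan')
```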

Concluding Remarks

The prediction, control, and mitigation of both natural and manmade disasters constitute a vast field of research that no one article can cover in any meaningful detail. In this article, we defined what constitutes a large-scale disaster, introduced a metric to evaluate its scope, and described the different facets of disaster research. Basically, any natural or manmade event that adversely affects many humans or an expanded ecosystem is a large-scale disaster. Such catastrophes tax the resources of local communities and central governments and disrupt social order. The number of people tormented, displaced, injured or killed and the size of the area adversely affected determine the disaster's scope.

In this paper, we showed how science can help in predicting different types of disaster and reducing their resulting adverse effects. We listed a number of recent disasters to provide a few examples of what can go right or wrong with managing the mess left behind by every large-scale disaster.

The laws of nature, reflected in the science portion of any particular calamity, and even crisis management, reflected in the art portion, should be the same, or at least quite similar, no matter where or what type of disaster strikes. Humanity should benefit from the science and the art of predicting, controlling, and managing large-scale disasters, as extensively and thoroughly discussed in this paper.

The last annus horribilis, in particular, has shown the importance of being prepared for large-scale disasters, and how the world can get together to help alleviate the resulting pain and suffering. In its own small way, this article better prepares scientists, engineers, first responders, and, above all, politicians to deal with manmade and natural disasters.

The most significant contribution of this article is perhaps the proposal to consider all natural and manmade disasters as dynamical systems. Though not always easy, looking at the problem from that viewpoint, armed with the modern tools of dynamical systems theory, may allow better prediction, control and mitigation of future disasters. It is hoped that readers of the Journal of Critical Incident Analysis who are not already involved in the mechanistic viewpoint of disaster research will benefit from this particular framework, whose practical importance cannot be overstated.
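The dynamical-systems viewpoint can be illustrated with the classical Lorenz (1963) convection model, the archetype of the chaotic behavior discussed in the notes: two states initially differing by one part in a hundred million soon diverge macroscopically, the sensitive dependence on initial conditions that limits long-range forecasting. This is a minimal sketch using the conventional parameter values and a simple fourth-order Runge–Kutta integrator.

```python
# Sensitive dependence on initial conditions in the Lorenz (1963) system.
def lorenz(s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = s
    return (sigma * (y - x), x * (rho - z) - y, x * y - beta * z)

def rk4_step(s, dt):
    def shift(a, b, c):  # a + c*b, component-wise
        return tuple(ai + c * bi for ai, bi in zip(a, b))
    k1 = lorenz(s)
    k2 = lorenz(shift(s, k1, dt / 2))
    k3 = lorenz(shift(s, k2, dt / 2))
    k4 = lorenz(shift(s, k3, dt))
    return tuple(si + dt / 6 * (p + 2 * q + 2 * r + w)
                 for si, p, q, r, w in zip(s, k1, k2, k3, k4))

a = (1.0, 1.0, 1.0)
b = (1.0, 1.0, 1.0 + 1e-8)        # initial states differ by one part in 1e8
for _ in range(2500):              # integrate both to t = 25 with dt = 0.01
    a, b = rk4_step(a, 0.01), rk4_step(b, 0.01)
separation = sum((ai - bi) ** 2 for ai, bi in zip(a, b)) ** 0.5
print(f"{separation:.2f}")         # macroscopic separation from a 1e-8 seed
```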




1 Of course, the number of residents of Egypt was far less than 80 million when the disaster commenced in 1952. Fortunately, the youth revolution, commencing on 25 January 2011, seems to have lifted the long nightmare.


2 Actually delayed by a few years due to World War I and relocation to France. Richardson chose that particular time and date because upper air and other measurements were available to him some years before.


3 Newtonian implies a linear relation between the stress tensor and the symmetric part of the deformation tensor (rate of strain tensor). The isotropy assumption reduces the 81 constants of proportionality in that linear relation to two constants. Fourier fluid is that for which the conduction part of the heat flux vector is linearly related to the temperature gradient, and again isotropy implies that the constant of proportionality in this relation is a single scalar.


4 An assumption that obviously needs to be relaxed for most atmospheric flows, where radiation from the Sun during the day and to outer space during the night plays a crucial role in weather dynamics. Estimating radiation in the presence of significant cloud cover is one of the major challenges in atmospheric science.


5 The number of first-order ordinary differential equations, each of the form dx_i/dt = F_i(x_1, x_2, ..., x_N), which together completely describe the autonomous system's evolution, is in general equal to the number of degrees of freedom N. The latter number is in principle infinite for a dynamical system whose state is described by partial differential equation(s). For example, a planar pendulum has two degrees of freedom, a double planar pendulum has three, a single pendulum that is free to oscillate in three dimensions has four, and a turbulent flow has an infinite number of degrees of freedom. The single pendulum is incapable of producing chaotic motion in a plane; the double pendulum does if its oscillations have sufficiently large (nonlinear) amplitude; the single, non-planar, nonlinear pendulum is also capable of producing chaos; and turbulence is spatiotemporal chaos whose infinite degrees of freedom can be reduced to a finite but large number under certain circumstances.


6 Meaning analytical solutions of the differential equations governing the dynamics are not obtainable, and numerical integrations of the same lead to chaotic solutions.




References

Abbott, P. (2005). Natural disasters. San Diego, CA: McGraw-Hill.

Adamatzky, A. (2005). Dynamics of crowd-minds: Patterns of irrationality in emotions, beliefs and actions. London: World Scientific.

Alexander, D. (1993). Natural disasters. Berlin: Springer.

Alexander, D. (2000). Confronting catastrophe: New perspectives on natural disasters. London: Oxford University Press.

Allen, J. (2005). Predicting natural disasters. Farmington Hills, Michigan: Thomson Gale.

Allen, I. (Producer), & Allen, I. (Director). (1978). The swarm [Motion picture]. United States: Warner Bros. Pictures.

Allen, I. (Producer), Marshall, S. (Producer), Broidy, S. (Producer), & Neame, R. (Director). (1972). The Poseidon adventure [Motion picture]. United States: Twentieth Century Fox Film Corporation, & Kent Productions.

Allen, I. (Producer), Marshall, S. (Producer), & Guillermin, J. (Director). (1974). The towering inferno [Motion picture]. United States: Irwin Allen Productions, Twentieth Century Fox Film Corporation, & Warner Bros. Pictures.

Altay, N., & Green III, W. G. (2006). OR/MS research in disaster operations management. European Journal of Operational Research, 175, 475–493.

Bankoff, G., Frerks, G., & Hilhorst, D. (2004). Mapping vulnerability: Disasters, development, and people. Gateshead, United Kingdom: Earthscan Publications.

Bradshaw, J. (Producer), Parkes, W. F. (Producer), Spielberg, S. (Producer), Brown, D. (Producer), Zanuck, R. D. (Producer), Easton, D. S. (Producer), & Leder, M. (Director). (1998). Deep impact [Motion picture]. United States: Paramount Pictures, DreamWorks SKG, Zanuck/Brown Productions, & Manhattan Project.

Bunde, A., Kropp, J., & Schellnhuber, H. J. (2002). The science of disasters: Climate disruptions, heart attacks, and market crashes. Berlin: Springer.

Burby, R. J. (1998). Cooperating with nature: Confronting natural hazards with land-use planning for sustainable communities. Washington, DC: Joseph Henry Press.

Chasin, L. (Producer), Hayward, D. (Producer), Bevan, T. (Producer), Fellner, E. (Producer), Levin, L. (Producer), Bronner, M. (Producer), Solomon, K. (Producer), Bett, M. (Producer), & Greengrass, P. (Director). (2006). United 93 [Motion picture]. United States, United Kingdom, & France: Universal Pictures, Studio Canal, Sidney Kimmel Entertainment, & Working Title Films.

Childs, D. R., & Dietrich, S. (2002). Contingency planning and disaster recovery: A small business guide. New York: Wiley.

Churchman, C. W., Ackoff, R. L., & Arnoff, E. L. (1957). Introduction to operations research. New York: Wiley.

Cooper, C., & Block, R. (2006). Disaster: Hurricane Katrina and the failure of Homeland Security. New York: Times Books.

Craig, D. (Producer), Fine, D. (Producer), Gerber, D. (Producer), George, C. (Producer), & Markle, P. (Director). (2006). Flight 93 [Motion picture]. United States, & Canada: A&E.

Cutter, S. L. (2001). American hazardscapes: The regionalization of hazards and disasters. Washington, DC: Joseph Henry Press.

Davis, S. H. (2000). Interfacial fluid dynamics. In G. K. Batchelor, H. K. Moffatt, & M. G. Worster (Eds.), Perspectives in fluid dynamics: A collective introduction to current research (pp. 1–51). London: Cambridge University Press.

de Boer, J., & van Remmen, J. (2003). Order in chaos: Modelling medical disaster management using emergo metrics. Culemborg, The Netherlands: LiberChem Publication Solution.

der Heide, E. A. (1989). Disaster response: Principles of preparation and coordination. St. Louis, MO: C. V. Mosby.

Dilley, M., Chen, R. S., Deichmann, U., Lerner-Lam, A. L., & Arnold, M. (2005). Natural disaster hotspots: A global risk analysis. Washington, DC: World Bank Publications.

Domenici, P. V. (2006). Meeting our long-term energy needs through federal R&D. APS News, 15(9), 8.

Emerson, J. (Producer), Hyman, B. H. (Producer), & Van Dyke, W. S. (Director). (1936). San Francisco [Motion picture]. United States: Metro-Goldwyn-Mayer.

Engelbert, P., Deschenes, B., Nagel, R., & Sawinski, D. M. (2001). Dangerous planet—The science of natural disasters. Farmington Hills, Michigan: Thomson Gale.

Fischer, III, H. W. (1998). Response to disaster: Fact versus fiction & its perpetuation: The sociology of disaster (2nd ed.). Lanham, Maryland: University Press of America.

Fischer, III, H. W. (2003a, August). The sociology of disaster: Definition, research questions, measurements in a post-September 11, 2001 environment. Presented at the 98th Annual Meeting of the American Sociological Association, Atlanta, GA.

Fischer, III, H. W. (2003b). The sociology of disaster: Definition, research questions, and measurements. Continuation of the discussion in a post-September 11 environment. International Journal of Mass Emergencies & Disasters, 21, 91–107.

Fraedrich, K., & Schönwiese, C. D. (2002). Space–time variability of the European climate. In A. Bunde, J. Kropp, & H. J. Schellnhuber (Eds.), The science of disasters: Climate disruptions, heart attacks, and market crashes (pp. 105–139). Berlin: Springer.

Gad-el-Hak, M. (1995). Questions in fluid mechanics: Stokes’ hypothesis for a Newtonian, isotropic fluid. Journal of Fluids Engineering, 117, 3–5.

Gad-el-Hak, M. (1999). The fluid mechanics of micro devices—The Freeman scholar lecture. Journal of Fluids Engineering, 121, 5–33.

Gad-el-Hak, M. (2000). Flow control: Passive, active, and reactive flow management. London: Cambridge University Press.

Gad-el-Hak, M. (2006). The MEMS handbook (2nd ed., Vols. I–IV). Boca Raton, FL: CRC Taylor & Francis.

Gad-el-Hak, M. (2008). Large-scale disasters: Prediction, control, and mitigation. London: Cambridge University Press.

Gad-el-Hak, M. (2010). Facets and scope of large-scale disasters. Natural Hazards Review, 11, 1–6.

Gall, R., & Parsons, D. (2006). It’s hurricane season: Do you know where your storm is? IEEE Spectrum, 43, 27–32.

Garcia, R. (Producer), Globus, Y. (Producer), Golan, M. (Producer), Goldston, R. A. (Producer), Raz, M. (Producer), Weinstein, H. T. (Producer), Whitmore, R. (Producer), & Konchalovskiy, A. (Director). (1985). Runaway train [Motion picture]. United States: Golan-Globus Productions, & Northbrook Films.

Garrett, C. (2000). The dynamic ocean. In G. K. Batchelor, H. K. Moffatt, & M. G. Worster (Eds.), Perspectives in fluid dynamics: A collective introduction to current research (pp. 507–556). London: Cambridge University Press.

Gist, R., & Lubin, B. (1999). Response to disaster: Psychosocial, community, and ecological approaches. Philadelphia, PA: Bruner/Mazel.

Glass, T. A. (2001). Understanding public response to disasters. Public Health Reports, 116.

Glass, T. A., & Schoch-Spana, M. (2002). Bioterrorism and the people: How to vaccinate a city against panic. Clinical Infectious Diseases, 34, 217–223.

Golightly, N. (Producer), Lee, Jr., D. J. (Producer), Borman, M. (Producer), Hill, D. (Producer), Shamberg, M. (Producer), Sher, S. (Producer), Wilson, R. S. (Producer), Feghali, C. (Producer), & Stone, O. (Director). (2006). World trade center [Motion picture]. United States: Paramount Pictures, Double Feature Films, Intermedia Films, & Kernos Filmproduktionsgesellschaft & Company.

Hasselmann, K. (2002). Is climate predictable? In A. Bunde, J. Kropp, & H. J. Schellnhuber (Eds.), The science of disasters: Climate disruptions, heart attacks, and market crashes (pp. 141–169). Berlin: Springer.

Helbing, D., Farkas, I. J., & Vicsek, T. (2002). Crowd disasters and simulation of panic situations. In A. Bunde, J. Kropp, & H. J. Schellnhuber (Eds.), The science of disasters: Climate disruptions, heart attacks, and market crashes (pp. 331–350). Berlin: Springer.

Henderson, D. (Producer), Kopelson, A. (Producer), Katz, G. (Producer), Peterson, W. (Producer), Brown, S. (Producer), Greenwald, N. (Producer), Panitch, S. (Producer), & Peterson, W. (Director). (1995). Outbreak [Motion picture]. United States: Warner Bros. Pictures, Arnold Kopelson Productions, & Punch Productions.

Hensleigh, J. (Producer), Oman, C. (Producer), Van Wyck, J. (Producer), Bay, M. (Producer), Bruckheimer, J. (Producer), Hurd, G. A. (Producer), Sandston, P. (Producer), Waldman, B. H. (Producer), Bates, K. (Producer), & Bay, M. (Director). (1998). Armageddon [Motion picture]. United States: Touchstone Pictures, Jerry Bruckheimer Films, & Valhalla Motion Pictures.

Howard, S. (Producer), Katzka, G. (Producer), Orgolini, A. H. (Producer), Parvin, T. R. (Producer), Shaw, R. R. (Producer), & Neame, R. (Director). (1979). Meteor [Motion picture]. United States: American International Pictures, Meteor Joint Venture, & Palladium Productions.

Hunter, R. (Producer), Mapes, J. (Producer), & Seaton, G. (Director). (1970). Airport [Motion picture]. United States: Universal Pictures.

Huppert, H. E. (2000). Geological fluid mechanics. In G. K. Batchelor, H. K. Moffatt, & M. G. Worster (Eds.), Perspectives in fluid dynamics: A collective introduction to current research (pp. 447–506). London: Cambridge University Press.

Kamen, A. (2007, October 26). FEMA meets the press, which happens to be…FEMA. The Washington Post, p. A19.

Kunreuther, H., & Roth, Sr., R. J. (1998). The status and role of insurance against natural disasters in the United States. Washington, DC: Joseph Henry Press.

Lang, J. (Producer), Robson, M. (Producer), Donnenfeld, B. (Producer), & Robson, M. (Director). (1974). Earthquake [Motion picture]. United States: Universal Pictures, & The Filmakers Group.

Lawson, A. C. (1908). The California earthquake of April 18, 1906: Report of the State Earthquake Investigation Commission (Vols. I & II). Carnegie Institution of Washington Publication 87, Washington, DC.

Linden, P. F. (2000). Convection in the environment. In G. K. Batchelor, H. K. Moffatt, & M. G. Worster (Eds.), Perspectives in fluid dynamics: A collective introduction to current research (pp. 289–345). London: Cambridge University Press.

Lorenz, E. N. (1963). Deterministic nonperiodic flow. Journal of the Atmospheric Sciences, 20.

Lorenz, E. N. (1967). The nature and theory of the general circulation of the atmosphere. Geneva, Switzerland: World Meteorological Organization.

MacDonald, L. (Producer), Molen, G. R. (Producer), Parkes, W. F. (Producer), Salloum, G. (Producer), Spielberg, S. (Producer), Bryce, I. (Producer), Crichton, M. (Producer), Kennedy, K. (Producer), & de Bont, J. (Director). (1996). Twister [Motion picture]. United States: Warner Bros. Pictures, Universal Pictures, Amblin Entertainment, & Constant C Productions.

MacQuitty, W. (Producer), St. John, E. (Producer), & Baker, R. W. (Director). (1958). A night to remember [Motion picture]. United Kingdom: The Rank Organisation.

McFedries, P. (2006). Changing climate, changing language. IEEE Spectrum, 43(8), 60.

McIntyre, M. E. (2000). On global-scale atmospheric circulations. In G. K. Batchelor, H. K. Moffatt, & M. G. Worster (Eds.), Perspectives in fluid dynamics: A collective introduction to current research (pp. 557–624). London: Cambridge University Press.

McKee, K., & Guthridge, L. (2006). Leading people through disasters: An action guide. San Francisco, CA: Berrett-Koehler.

Mileti, D. S. (1999). Disasters by design: A reassessment of natural hazards in the United States. Washington, DC: Joseph Henry Press.

Olasky, M. (2006). The politics of disaster: Katrina, big government, and a new strategy for future crises. Nashville, TN: W Publishing Group.

Ott, E. (1993). Chaos in dynamical systems. London: Cambridge University Press.

Panton, R. L. (2005). Incompressible flow (3rd ed.). Hoboken, NJ: Wiley.

Patanè, D., de Gori, P., Chiarabba, C., & Bonaccorso, A. (2003). Magma ascent and the pressurization of Mount Etna’s volcanic system. Science, 299, 2061–2063.

Pelling, M. (2003). The vulnerability of cities: Natural disaster and social resilience. Gateshead, United Kingdom: Earthscan Publications.

Pickett, S. T. A., & White, P. S. (1985). The ecology of natural disturbance and patch dynamics. San Diego, CA: Academic Press.

Posner, R. A. (2004). Catastrophe: Risk and response. Oxford: Oxford University Press.

Quarantelli, E. L. (1998). What is a disaster: Perspectives on the question. London:

Richardson, L. F. (1922). Weather prediction by numerical process (Reissued in 1965). New York: Dover.

Sanchini, R. (Producer), Cameron, J. (Producer), Landau, J. (Producer), Easley, P. (Producer), Giddings, A. (Producer), Hill, G. (Producer), Mann, S. (Producer), & Cameron, J. (Director). (1997). Titanic [Motion picture]. United States: Twentieth Century Fox Film Corporation, Paramount Pictures, & Lightstorm Entertainment.

Shuler-Donner, L. (Producer), Stuber, S. (Producer), Davis, A. Z. (Producer), Moritz, N. H. (Producer), Chaffin, S. (Producer), Fottrell, M. (Producer), Cotton, M. R. (Producer), & Jackson, M. (Director). (1997). Volcano [Motion picture]. United States: Twentieth Century Fox Film Corporation, Donn/Shuler-Donner Productions, Moritz Original, & Fox 2000 Pictures.

Smith, K. (1991). Environmental hazards: Assessing risk and reducing disaster. London:

Smith, R. A., & Dickie, J. F. (1993). Engineering for crowd safety. Amsterdam: Elsevier.

Stein, S., & Wysession, M. (2003). An introduction to seismology, earthquakes, and earth structure. Boston: Blackwell Publishing.

Steinberg, T. (2000). Acts of God: The unnatural history of natural disasters in America. London: Oxford University Press.

Tierney, K. J., Lindell, M. K., & Perry, R. W. (2001). Facing the unexpected: Disaster preparedness and response in the United States. Washington, DC: Joseph Henry Press.

Tobin, G. A. (1997). Natural hazards: Explanation and integration. New York: Guilford Press.

U.S. Department of Health and Human Services. About the flu. Retrieved from

Vale, L. J., & Campanella, T. J. (2005). The resilient city: How modern cities recover from disaster. London: Oxford University Press.

Wallace, M., & Webber, L. (2004). The disaster recovery handbook. New York: Amacom.

Winston, W. L. (1994). Operations research: Applications and algorithms (3rd ed.). Belmont, CA: Duxbury Press.

Wise, R. (Producer), & Wise, R. (Director). (1975). The Hindenburg [Motion picture]. United States: The Filmmakers Group, & Universal Pictures.