The System Was Blinking Red: Awareness Contexts and Disasters

Vivian B. Martin, Central Connecticut State University, USA

Abstract

The awareness context has been a source of inspiration for grounded theories for more than 50 years; yet little has been done to extend the theory beyond nursing and the medical field, apart from a few works on identity. This paper extends the awareness context by examining its role in several high-profile disasters, natural and man-made, where gaining a clear sense of what was going on was often blocked by poor information flow and general communication failures, interpersonal and technological. Selective coding and the introduction of new concepts after analyzing hundreds of pages of documents issued by special commissions in the aftermath of the 9/11 attacks, Hurricane Katrina, the Deepwater Horizon oil spill in the Gulf, and the Sago Mine disaster not only explain various processes around awareness in the midst of crisis, but also illuminate pre-crisis patterns that, if attended to, could have mitigated the impact of the disasters.

Keywords: Awareness context, crisis communication, sociology of disaster, situational awareness, 9/11 attacks, Hurricane Katrina, Deepwater Horizon explosion, Sago Mine Disaster.

Introduction

Whether it is in personal interactions, professional life, or community activities, we are always communicating and processing information. Some of this information is innocuous and of no immediate consequence, while other information may have direct bearing on our wellbeing, or that of our families or colleagues. In such high-stakes situations, it is important to have immediate access to information that is complete and credible. Seen from this perspective, Glaser and Strauss’s awareness context (1964, 1965) addresses a fundamental communication process of everyday life. We move in and out of awareness contexts throughout daily life. The identification of a typology of awareness in which interactions among health professionals and patients are shaped by whether a patient is aware of a terminal diagnosis was a critical intervention in nursing and medical studies, and continues to be a starting point for much research (Nathaniel & Andrews, 2010).

The subject—dying—and the discipline in which this theory has been embedded and extended across numerous illnesses and concerns may mask the essential work of the awareness context as a theory about the managing and sharing of information, a concern throughout organizations and institutions. Of course, the awareness context has not been limited to health issues. The role of identity, and the interactions that occur when people are uncertain of the identity of another, is highlighted in the American Sociological Review article Glaser and Strauss (1964) published prior to the release of Awareness of Dying; Ekins’s (1997) work on cross-dressing is a successful extension of the awareness context into this realm. But awareness as a concept offers many more possibilities for explaining phenomena that impede the distribution of critical communication across many spheres.

This paper extends the awareness context by examining its role in several high-profile disasters, natural and man-made, where gaining a clear sense of what was going on was often blocked by poor information flow and general communication failures, interpersonal and technological. Selective coding and the introduction of new concepts from analyzing hundreds of pages of documents issued by special commissions in the aftermath of the 9/11 attacks, Hurricane Katrina, the Deepwater Horizon oil spill in the Gulf, and the Sago Mine disaster not only explain various processes around awareness in the midst of crisis, but also illuminate pre-crisis patterns which, if attended to, could have mitigated the crises. The awareness context thus becomes an important contribution to crisis communication and organizational communication. Concepts such as abridging awareness, discounting awareness, and situational awareness, along with information gaps and information rationing, help tease out the ways awareness is undermined in and across agencies assigned to work together. This paper is a methodological essay and brief discussion of ongoing theory development on awareness processes. It is also a challenge to grounded theorists to identify areas in their fields where the awareness context might have greater explanatory power than current theories allow.

Extending the Concept

The awareness context offers a typology explaining a mix of interactions determined by whether patients were aware they had a terminal diagnosis; in other words, whether they knew they were dying. In closed awareness situations, where the patient was not aware of the diagnosis, health professionals worked to avoid disclosures, blocking and reframing information that might make its way to the patient:

To prevent the patient’s comprehension of the truth, the personnel utilize a number of “situation as normal” interaction tactics. They seek to act in his presence as if he were not dying but only ill. They talk to him as if he were going to live. They converse about his future, thus enhancing his belief that he will regain his health. They tell him stories about others (including themselves) who have recovered from similar or worse illnesses. By such indirect signaling they offer him a false biography. Of course, they may directly assure him that he will live, lying with a clear purpose. (Glaser & Strauss, 1964, p. 672)

The staff cannot control the flow of information fully; thus the typology explains other types of awareness and the interactions that flow out of them. The other types—suspicious, pretense, open—have attendant behaviors, all of which require ways of managing information and interactions. The power of the concept lies in its processual nature, as it captures the transitions between various types of awareness and the interplay of interactions and structures indicative of different awareness contexts. The particulars of the typology have been discussed throughout the grounded theory literature over the decades, so it is not necessary to give an extensive account; however, it is important to reiterate what Glaser and Strauss (1964) meant by awareness and how it differs from concepts such as consciousness and attention, which have become more active areas of scholarship since the introduction of awareness. The concept of awareness itself has competing definitions, including conflation with consciousness and attention in some disciplines. A footnote from their 1964 article provides a definition and suggests the potential broad applicability of the awareness context:

A more general definition of awareness context is the total combination of what specific people in groups, organizations, communities or nations know what about a specific issue. Thus, this structural concept can be used for the study of virtually any problem entailing awareness at any structural level of analysis. (p. 670)

I proceed with this definition, making a distinction between awareness and the more intentional behavior of attention. Awareness can lead to attention, but not necessarily. My original exploration of the awareness context revolved around news attending, as it became evident that news attending occurs in an awareness context (Martin, 2008). This context became important for understanding my theory of purpose attending, which describes a loop in which awareness triggers some initial attention, though relevance is needed to sustain it and make news attending more purposeful. Increasing awareness based on relevance and attending recalibrates what is deemed worth attending in the next cycle. The wrench here, however, is that emergent awareness has limits and is often disrupted. Much news or information does not make it through everyday filters: people have limited interest or context and are often embedded in social networks that enable the filters.

Discounting Awareness

My work subsequently led to my interest in developing the concept of discounting awareness to better understand how people avoid information they tag as uncomfortable.

Discounting awareness is evident in everyday communication “from the most innocuous decision-making, such as how much credence one should give a weather forecast of rain, to behaviors that marginalize others and poison public discourse” (Martin, 2011, p. 300). It is the triage that sorts memoranda as important and less important or lends credibility to some testimony and discredits others. The image of a child with his hands over his ears to avoid hearing his parents order him to bed or deliver news he does not care to hear visually captures the concept in its more comic form. Some discounting awareness is childish and may just create annoyance for others, but as I address here, discounting awareness, in the form of dismissing, ignoring, or shrouding information in secrecy has also resulted in the loss of lives.

The concept is not fully my discovery. In Awareness of Dying, Glaser and Strauss (1965) devote a chapter to discounting awareness, a process researchers observed medical professionals engaging in when they spoke openly in the presence of premature babies, comatose patients, and the senile and dying, whom they assumed to have no awareness of what was being said. In situations where professionals discounted the awareness of patients, they made no effort to hide information and maintain a sense of everything as routine—the ritual they enacted in closed awareness. I embraced these conceptions but expanded discounting awareness as a broader behavior working on intrapersonal, interpersonal, and macro-communicative levels.

I initially tried out the concept with some selective coding using news reports and observations on a number of different phenomena. I also became intrigued with the many questions raised by the 9/11 attacks and subsequent claims that the signs of an impending terrorist attack had been evident but ignored. As it became public that the national security team in the Bush administration had not given adequate attention to a series of memos and communications indicating a strong threat of an imminent attack in 2001—“the system was blinking red” during the summer prior to the attacks, CIA Director George Tenet told the commission (National Commission on Terrorist Attacks Upon the United States, 2004, p. 277)—I decided to do selective coding for such incidents in the 9/11 Commission Report. Discounting awareness was evident across the Clinton and Bush administrations, but more important for my analysis, the blocks to the circulation of information across agencies seemed to be a complicated phenomenon that spoke to the awareness context more broadly (Martin, 2011). Incidents across the commission report revealed missed signals, failure to share information, lack of trust across agencies, weak distribution channels, and generally what has been described as a “failure of imagination” to connect the dots between available information.

My next question was whether the incidents in the 9/11 Commission Report were anomalies or whether there was a pattern of discounting awareness regularly enacted across other institutions leading up to and during various disasters. This question led me to sample other commission reports created in the aftermath of large-scale tragedies to map discounting and other awareness processes based on questions raised during memo-writing. The reports were created in response to Hurricane Katrina (2005), where the bursting of the levee system and flooding following a near-category-four hurricane led to the deaths of approximately 1,100 people, destroyed sections of New Orleans, and revealed a government unprepared to respond; the Sago Mine disaster (2006), where 12 miners died and others were injured in a mine explosion in West Virginia; and the BP Deepwater Horizon oil rig explosion (2010) that killed 11, injured 16, and dumped roughly four million barrels of oil into the Gulf of Mexico.

Commission reports are useful for researchers, including grounded theorists. Typically launched with bipartisan cooperation, these government-empowered inquiries have access to most leading participants in agencies and others with special knowledge about the disasters and their aftermath. For some events there is a series of reports, or different parties issue their own—for example, the miners’ union after Sago—along with numerous supplements. In some cases, such as the panel charged with investigating the problems during and leading up to Katrina that contributed to the deaths of approximately 1,100 people in New Orleans, including many who were trapped in their homes after failing to evacuate, the interviews include ordinary citizens alongside government officials and first responders in the community.

The data, like any, come with imperfections but provide an opportunity to examine patterns after the initial media interest and conventional wisdom have moved on to other topics. As observed by Vaughan (1997), who studied hundreds of pages of official reports and conducted interviews following the explosion of the Challenger on January 28, 1986, which was launched despite engineers’ reservations about the impact of the cold on the O-rings that held together sections of the shuttle, the public narrative that emerges is often simplistic or incorrect. After the Challenger blew up, the general view was that the costs and politics of sending a teacher into space with the astronauts put extra pressure on NASA to push forward with the launch and ignore any cautions. What Vaughan discovered instead was that the organizational cultures could not easily accommodate the reservations that had been expressed. The engineers had reservations about how low temperatures might impact the O-rings but could not quantify their objections; they could not make a definitive case for not going forward, which was the best way to be heard within the paradigm in which they worked. A successful argument for aborting the launch would have had to break through various structures with long-established paths to decision-making. The decrease in technical expertise as information traveled closer to the top of the pyramid was also part of the abridgement of awareness that occurs along information chains.

Vaughan (1997) provided a typology of signals (routine, weak, strong) and argued that verbal complaints and memos in organizations are weak signals due to their informality. Her concept of “structural secrecy” (p. 238), meanwhile, is also an indicator of an awareness context. Certainly, there are other dynamics involved, but disclosure and information flow in NASA and the contractors working with it have many similarities to patterns in the data from the crises I studied. The BP well explosion and the Sago Mine disaster were shaped in some ways by profiteering and regulatory issues. But the abridgements of awareness were evident in those tragedies as they were in the lead-up to the 9/11 attacks and prior to and in the aftermath of Hurricane Katrina. The awareness context is the landscape actors must navigate. This analysis moves it from the hospital ward to a web of institutions in which networks of information and actors operate. During certain types of crises and disasters the context moves further out into the world, affecting communities and individuals, and necessitating different levels of analyses.

A Methodological Note

The brief research report in this paper is part of a larger project on awareness processes; therefore, it would take the discussion off track to address the various methodological issues inherent in building formal theory. One observation worth sharing, however, is that the notion that one can move from a substantive theory to a formal theory without new data, relying instead on extant literature in other areas, is problematic. Awareness, a concept that continues to elude social scientists, needs more fleshing out and discovery of its contours, making data such as the commission reports especially welcome. Extant literature requires unpacking based on the methods used and the nature of the data. The data underlying some of the literature are often not clear or not represented well enough to evaluate prior to integration into theory.

I also incorporated strategies that are out of the comfort zone of many classic grounded theorists but must be considered when databases become large. Although my initial coding was on paper copies of the reports, I utilized NVivo 10, not just for retrieval but for its matrices, word trees, cluster analysis with quantitative measures, and other tools that helped me look across the large documents and better account for coding patterns. An example of something that could not be done by hand was the ability to run a Jaccard coefficient, an index that reveals where coding intersects. The tool also allowed me to determine that certain words across the documents often appeared in the same places; for example, awareness, communication, and failure are tightly connected, with an index number of 1 (tight correspondence). Typically, these sections contain references to incidents of communication failures, giving strength to the conceptualization I was doing. While this extra level of accountability is not necessary for all classic grounded theorists, especially those without access to or training on NVivo, I chose to use this extra bit of auditing given the high-profile nature of the reports, the volume of the data, and as a source of reassurance for different audiences.
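
For readers unfamiliar with the measure, the arithmetic behind the index is simple set overlap: the Jaccard coefficient divides the size of the intersection of two sets by the size of their union, yielding 0 for no shared coding and 1 for complete correspondence. The following is a minimal sketch of that calculation, not NVivo's internal implementation; the function name and the paragraph identifiers are hypothetical, standing in for exported coding references.

    # Jaccard coefficient: |A intersect B| / |A union B|.
    # a and b are sets of paragraph IDs where two codes were applied
    # (hypothetical data for illustration; NVivo computes this measure itself).
    def jaccard(a, b):
        if not a and not b:
            return 0.0
        return len(a & b) / len(a | b)

    coded_awareness = {12, 47, 83, 171, 277}       # paragraphs coded "awareness"
    coded_comm_failure = {12, 47, 83, 171, 277}    # paragraphs coded "communication failure"

    print(jaccard(coded_awareness, coded_comm_failure))  # 1.0: tight correspondence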

Abridging Awareness

The typology of an awareness context in which critical information is managed across different people, departments, and organizations is relevant in all four of the circumstances studied in this paper. The emphasis differs across sites. The Sago Mine explosion had the earmarks of a crisis in which quick orientation was needed, but its pre-crisis culture was less an area of focus in the report, though some of these issues were implied in the history of citations and other problems. The pre-crisis awareness contexts are addressed more explicitly in the other three reports and contain elements of closed awareness that might have contributed to the tragedies or impacted the aftermath negatively. I use abridging awareness or the abridgement of awareness to conceptualize the mix of practices that block the flow of information and decrease awareness in the agencies and organizations under study, particularly prior to the crises.

The pre-crisis and crisis contexts bring different properties to the forefront. The pre-crisis context is the norm under which organizations and institutions operate and includes all of their communication and information practices. For example, the following two brief incidents reported in the 9/11 Commission Report are full of implications for understanding the routine awareness context prior to 9/11 as a willful disattending, a vacating of accountability found throughout the crises studied:

President Clinton appointed George Tenet as DCI in 1997, and by all accounts terrorism was a priority for him. But Tenet’s own assessment, when questioned by the Commission, was that in 2004, the CIA’s clandestine service was still at least five years away from being fully ready to play its counterterrorism role. And while Tenet was clearly the leader of the CIA, the intelligence community’s confederated structure left open the question of who really was in charge of the entire U.S. intelligence effort. (p. 93)

Moreover, the FAA’s intelligence unit did not receive much attention from the agency’s leadership. Neither Administrator Jane Garvey nor her deputy routinely reviewed daily intelligence, and what they did see was screened for them. She was unaware of a great amount of hijacking threat information from her own intelligence unit, which, in turn, was not deeply involved in the agency’s policymaking process. Historically, decisive security action took place only after a disaster had occurred or a specific plot had been discovered. (p. 83)

In the pre-crisis “normal,” people operate under a type of awareness that is often closed, but the rituals of organizational life are such that there is much pretense around knowledge in some strata. The structures in which the people in the aforementioned examples worked enabled them to push away responsibility and accountability with impunity. The abridgement of awareness comes to light when crisis hits, as communities are left with the fallout while they try to achieve awareness, sometimes to save their lives.

When disaster strikes, awareness becomes foreground and is the main concern before action is taken, rather than a tacit aspect of routine organizational life where people are often unaware of what they do not know. Temporality becomes a critical property of the awareness context as the emergency quickens. The passengers on hijacked planes during 9/11 had minutes to ascertain their situation, and had few options once they achieved some awareness. The circumstances surrounding Hurricane Katrina, however, had a longer trajectory of struggle for awareness. Warnings about the severity of the imminent hurricane, as well as the vulnerability of the levees in New Orleans, were well known; the National Hurricane Center, which accurately predicted landfall days in advance, was one of the few agencies credited with doing its job well. Yet local leaders failed to force evacuations until it was too late for many. Of the four crises examined, Katrina is the one most vividly illustrative of an awareness context with many broken nodes. It is the one case where it is not overreaching to say that a healthier structure of awareness could have resulted in a far less tragic situation. As the authors of “A Failure of Initiative” (2006), the commission report on Katrina, wrote:

Many of the problems we have identified can be categorized as ‘information gaps’—or at least problems with information-related implications, or failures to act decisively because information was sketchy at best. Better information would have been an optimal weapon against Katrina. Information sent to the right people at the right place at the right time. Information moved within agencies, across departments, and between jurisdictions of government as well. Seamlessly. Securely. Efficiently. (p. 1)

Information gaps, an in vivo code I have adapted to my work, are components in awareness contexts. Information is the currency that spurs action, or causes impasses if it is not credible. Information that is rationed and only shared among a few, or not delivered with appropriate context, can derail plans and put lives in peril, as we see happening in the data. A seamless, secure, efficient network of information flowing back and forth is an ideal expressed in the excerpt from the Katrina report, but awareness contexts have many actors with different agendas, degrees of flexibility, and competence. In reviewing the explosion of the Deepwater Horizon oil rig, which resulted in the deaths of 11 men, injuries to 16, and the release of roughly four million barrels of oil into the Gulf of Mexico, the special commission found:

BP, Transocean, and Halliburton failed to communicate adequately. Information appears to have been excessively compartmentalized at Macondo as a result of poor communication. BP did not share important information with its contractors, or sometimes internally even with members of its own team. Contractors did not share important information with BP or each other. As a result, individuals often found themselves making critical decisions without a full appreciation for the context in which they were being made (or even without recognition that the decisions were critical). (p. 123)

A particularly illustrative indicator of pre-crisis information rationing with tragic consequences in the Deepwater Horizon oil rig explosion was an advisory Transocean, the company drilling for BP, failed to share with the Deepwater Horizon team. Four months prior to the 2010 explosion at Macondo, there was a near miss on one of Transocean's rigs in the North Sea in December 2009. Gas entered a riser while the crew was conducting an operation in a manner similar to that of the crew in Louisiana. A crew had declared a previous test a success—which also occurred at Macondo—but a barrier failed and hydrocarbons rushed in, according to the commission report. The crew in the North Sea was able to shut the well before a blowout erupted, but as the commission learned, “Nearly one metric ton of oil-based mud ended up in the ocean. The incident cost Transocean 11.2 days of additional work and more than 5 million British pounds in expenses” (p. 124).

Transocean subsequently created an internal PowerPoint presentation warning that ‘[t]ested barriers can fail’ and that ‘risk perception of barrier failure was blinkered by the positive inflow test [negative test].’ The presentation noted that ‘[f]luid displacements for inflow test [negative test] and well clean up operations are not adequately covered in our well control manual or adequately cover displacements in under balanced operations.’ It concluded with a slide titled ‘Are we ready?’ and ‘WHAT IF?’ containing the bullet points: ‘[h]igh vigilance when reduced to one barrier underbalanced,’ ‘[r]ecognise when going underbalanced—heightened vigilance,’ and ‘[h]ighlight what the kick indicators are when not drilling.’ (p. 124)

Transocean sent out an “operations advisory” to its fleet in the North Sea using what the commission described as “less pointed and vivid” language than in the PowerPoint. However, the commission quotes Transocean as conceding that neither the advisory nor the PowerPoint made it to the Deepwater Horizon. In a fairly bold act of discounting awareness, Transocean took issue with suggestions that informing the Deepwater crew might have made people more cautious about the test barriers, possibly averting the disaster. Transocean argued that a different test barrier was involved in the North Sea, but the commission found the differences “cosmetic” and wrote, “The basic facts of both incidents are the same. Had the rig crew been adequately informed of the prior event and trained on its lessons, events at Macondo may have unfolded very differently” (p. 125). The heavy editing of the advisories and the restricted flow suggest that the circulation of these types of alerts within a company would be a useful site for further developing closed awareness as a spectrum when applied to organizational culture.

Achieving Situational Awareness

Situational awareness is a term most immediately associated with military operations, but it has spread to aeronautics and other fields. In essence, it is knowledge of what is going on in a given situation and what some of the moving issues might be. Former Secretary of Defense Donald Rumsfeld, in speaking to the presidential commission on 9/11, described himself as eager to establish “situational awareness” upon learning of the attacks. The term is used frequently in the 9/11 and Katrina reports to describe the challenges of orienting to those fast-changing crises; it probably appears in these documents and not the other two because of the immense difficulty of gaining situational awareness in the midst of the attacks and the hurricane, and the subsequent chaos into which thousands of people were thrown and in which many lost their lives. In the case of Katrina, there is a direct link between the everyday pre-crisis practices abridging awareness across the agencies involved and what happened when the hurricane struck. Failed infrastructure, including massive power outages, dwindling supplies, uncertainty about chains of command, equipment that did not work, and paralyzed leadership all contributed to the delays, sometimes lasting days, in understanding what was going on. The report observes:

Without sufficient working communications capability to get better situational awareness, the local, state, and federal officials directing the response in New Orleans had too little factual information to address—and, if need be, rebut—what the media were reporting. This allowed terrible situations—the evacuees’ fear and anxiety in the Superdome and Convention Center—to continue longer than they should have and, as noted, delayed response efforts by, for example, causing the National Guard to wait to assemble enough force to deal with security problems at the Convention Center that turned out to be overstated. (p. 171)

As a concept, situational awareness helps switch the context from everyday routines to the live event, where the context now expands to multitudinous actors and scenarios; beating the clock also becomes a factor. Situational awareness can be conceptualized as having two distinct phases: the initial jolt of disruption and immediate need for information triage, in which people must obtain, verify, and evaluate information they must accept or discount; and action thresholds—the point at which people, their awareness limited, may need to take a leap of faith or risk death. These phases can cycle out within minutes. Tentatively, I have conceptualized a third phase, opening awareness, which would encompass the continual response to the crisis and aftermath. It might be that the commissions assembled to create reports are part of the opening of awareness over the long term. The proposition here is not that all would be transparent. Awareness is recalibrated to move to a new level of response, though there is no guarantee the pre-crisis awareness context would change much. In fact, the commission reports contain a lot of material suggesting that the agencies involved had failed to learn from past lessons or were slow to implement them. However, conceiving of situational awareness as a cyclical subcore helps link the pre-crisis context, the immediate crisis, and the aftermath.

A few examples from the crises provide indicators to explore situational awareness unfolding. The Sago Mine disaster was especially painful for the country to witness. At one point, after families had waited anxiously to hear whether the miners had been rescued, the governor and media reported that all but one of the 13 trapped miners had survived. But the celebrations were short-lived. There had been a communication mix-up: only one of the 13 miners survived. The so-called “fog” associated with war impedes awareness in these early moments. Stories from 9/11 of people attempting to evacuate the towers but being told to remain in place are indicators of the confusion and misdirection that make verification so difficult.

The 911 system remained plagued by the operators’ lack of awareness of what was occurring. Just as in the North Tower, callers from below and above the impact zone were advised to remain where they were and wait for help. The operators were not given any information about the inability to conduct rooftop rescues and therefore could not advise callers that they had essentially been ruled out. This lack of information, combined with the general advice to remain where they were, may have caused civilians above the impact not to attempt to descend, although Stairwell A may have been passable. (National Commission on Terrorist Attacks Upon the United States, 2004, p. 295)

Escalating contingency explains the ways in which the limited amount of awareness is outstripped by the fast pace of events. Those in the midst of disaster often found themselves replacing one unworkable plan with another that was too little, too late. Yet also in need of better understanding are the action thresholds that cause some people to move forward. Some of the risks—running in the same direction as the rest of the crowd, as people did during the 9/11 attacks, or going back into the mines to save brother miners—are instinctual and hence easier to explain. But situations where people are paralyzed into inaction for stretches of time need closer examination. From Katrina, we know that there was much procrastination and desperation as people realized they were on their own as the floodwaters rose. At hospitals where the elderly and infirm had limited mobility, staff had to make the difficult decision to simply leave the sickest to die. The commission reports include examples of heroism and personal initiative, such as the doctor who, with help from police, broke into a pharmacy to get medicine to help victims. But much of the Katrina story is one of tragedy that many people in government let happen due to the closed awareness context they built.

Awareness Context as Culture

Awareness contexts make up the culture of organizations, revealing how such entities communicate within and outside of their walls. The communication patterns are deeply embedded and difficult to steer in counter directions. Unlike the medical professionals in the hospital observed by Glaser and Strauss, who deliberately managed what patients knew, actors in these organizations often operate under knowledge blackouts: they do not know what they do not know or need to know. Many are also incompetent, vacating accountability with the assistance of the organizational structure. Rather than a strictly closed system, the pretense type of awareness identified by Glaser and Strauss (1964, 1965) is worth exploring to capture the ways in which people perpetuate knowledge gaps willingly. These steps are among my next courses of action. Any organization would have a business-as-usual awareness context that could be mapped, particularly around crisis or less threatening disruptions. Yet, its history aside, the awareness context would need to earn its way anew.

This article is an attempt to share some ongoing work on awareness processes and, more immediately, an effort to suggest that the awareness context is underutilized and ripe for development. Grounded theorists and researchers in general often delay projects for want of data. Increasingly, government websites make available reports and inquiries into high-profile disasters and other crises; these reports could be useful for any number of projects. The large numbers of documents that exist on various topics lend themselves to solo or group projects. Imagine an international team of grounded theorists taking on a topic and coming up with a theory with grab that sets the world straight on a public problem. That is a flight of fantasy to some degree, but I use it to suggest that our toolbox can bring a lot of light, awareness, out there.

References

Ekins, R. (1997). Male femaling: A grounded theory approach to cross-dressing and sex-changing. New York, NY: Routledge.

Glaser, B. G. & Strauss, A. L. (1964). Awareness contexts and social interaction. American Sociological Review, 29, 669-679. doi:10.2307/2091417

Glaser, B. G. & Strauss, A. L. (1965). Awareness of dying. Chicago, IL: Aldine Publishing.

Martin, V. B. (2008). Attending the news: A grounded theory about a daily regimen. Journalism: Theory, Practice & Criticism, 9, 76-94. doi:10.1177/1464884907084341

Martin, V. B. (2011). The power of an enduring concept. In V. B. Martin & A. Gynnild (Eds.), Grounded theory: The philosophy, method, and work of Barney Glaser (pp. 297-308). Boca Raton, FL: Brown Walker Press.

Nathaniel, A. & Andrews, T. (2010). The modifiability of grounded theory. The Grounded Theory Review, 9(1), 65-67. Retrieved from http://groundedtheoryreview.com

National Commission on the BP Deepwater Horizon Oil Spill and Offshore Drilling. (2011). Deep water: The Gulf oil disaster and the future of offshore drilling. Retrieved from http://www.gpo.gov/fdsys/pkg/GPO-OILCOMMISSION/pdf/GPO-OILCOMMISSION.pdf

National Commission on Terrorist Attacks Upon the United States. (2004). The 9/11 Commission Report. New York, NY: Norton. Retrieved from http://www.9-11commission.gov/report/911Report.pdf

Select Bipartisan Committee to Investigate the Preparation for and Response to Hurricane Katrina. (2006). A failure of initiative: The final report of the Select Bipartisan Committee to Investigate the Preparation for and Response to Hurricane Katrina. Retrieved from http://katrina.house.gov/full_katrina_report.htm

United States Department of Labor/Mine Safety and Health Administration. (2007). Report of investigation: Fatal underground coal mine explosion, January 2, 2006, Sago Mine. Retrieved from http://www.msha.gov/FATALS/2006/Sago/sagoreport.asp

Vaughan, D. (1997). The Challenger launch decision: Risky technology, culture, and deviance at NASA. Chicago, IL: The University of Chicago Press.
