Visualising Deteriorating Conditions

By Tom Andrews, RN, B.Sc. (Hons), M.Sc., Ph.D. & Heather Waterman, RN, B.Sc. (Hons), Ph.D.

Abstract
The research aims were to investigate the difficulties ward staff experienced in detecting deterioration and how these were resolved. The emphasis within the literature tends to be on identifying premonitory signs that may be useful in predicting deterioration. Change in respiratory rate is the most consistent of these (Fieselmann et al. 1993; Sax and Charlson 1987; Schein et al. 1990; Smith and Wood 1998) but, in common with other signs, it lacks sensitivity and specificity. The sample consisted of 44 nurses, doctors (interns) and health care support workers from a general medical and a surgical ward. Data were collected by means of non-participant observation and interviews, using grounded theory as originated by Glaser and Strauss (1967) and Glaser (1978). As data were collected, the constant comparative method and theoretical sensitivity were used as outlined in grounded theory. A core category of “visualising deteriorating conditions” emerged, together with its sub-core categories of “intuitive knowing”, “baselining” and “grabbing attention”. The main concern in visualising deteriorating conditions is to ensure that patients suspected of deterioration are successfully referred to medical staff. The aim is to convince those who can treat or prevent further deterioration to intervene. Through intuitive knowing, nurses pick up that patients have changed in a way that requires a medical assessment. To make the referral more credible, nurses attempt to contextualise any changes in patients by baselining (establishing baselines). Finally, with the backup of colleagues, nurses refer patients by providing as much persuasive information as possible in a way that grabs attention. The whole process is facilitated by knowledge and experience, together with mutual trust and respect.
Background
Mortality from shock of whatever aetiology remains depressingly high, and avoidable components are contributing to physiological deterioration (McQuillan et al. 1998), often resulting in cardiorespiratory arrest (Rosenberg et al. 1993). Of all patients undergoing resuscitation, 75% will not survive more than a few days (George et al. 1989), with a survival rate to hospital discharge of 10% to 15% (Peterson et al. 1991; Schultz et al. 1996). Of the 9% of patients discharged from hospital having survived cardiopulmonary resuscitation, 4.3% were in a vegetative state, signifying severe neurological damage (Franklin and Mathew 1994). In an effort to detect shock early, a number of parameters have been measured. Blood pressure, heart rate, respiratory rate, temperature, conscious level, shock index, central venous pressure, blood gases, blood lactate, pulmonary artery blood pressure and cardiac index all correlate poorly with physiological deterioration and severity of shock (Rady et al. 1994). Early detection of physiological deterioration remains elusive. A further difficulty is that there are over two hundred normal physiological reflexes that affect the pulse and respiratory rate (Shoemaker et al. 1988). Current emphasis in the literature is on the early detection of physiological deterioration, either through premonitory signs such as changes in respiratory rate (Fieselmann et al. 1993; Franklin and Mathew 1994; Goldhill et al. 1999; Sax and Charlson 1987; Schein et al. 1990) or, more recently, an early warning score (Department of Health 2000; McArthur-Rouse 2001). The latter attaches a score to changes in such variables as blood pressure, pulse rate, respiratory rate and temperature as a means of detecting early signs of physiological deterioration. The greater the score, the greater the risk of physiological deterioration. To date these variables lack sensitivity and specificity.
The current study is an attempt to redress the continued emphasis on physiological variables by exploring...

Grounded Theory and Heterodox Economics

By Frederic S. Lee, Ph.D.

Abstract
The dominant theory in the discipline of economics, known as neoclassical economics, is being challenged by an upstart, known as heterodox economics. The challengers face many obstacles, the most significant of which is the actual creation of an alternative economic theory. However, heterodox economists have not settled on what the methodology of theory creation should be. The aim of this paper is to advocate the method of grounded theory as the best set of guidelines for theory creation. In addition, I shall argue that the grounded theory method results in the creation of heterodox economic theories that are historical in structure, content and explanation.

Grounded Theory and Heterodox Economics
The dominant theory in the discipline of economics, known as neoclassical economics, is being challenged by an upstart, known as heterodox economics. Heterodox economics can be understood in two ways. The first is as a collective term for many different approaches to economic analysis, such as radical and Marxian economics, Post Keynesian economics, institutional economics, feminist economics, and social economics. Each of these approaches rejects various methodological and theoretical aspects of mainstream economics, including supply and demand curves, equilibrium, marginal products, the deductivist approach to theory creation, methodological individualism and the optimality of markets. Because the different approaches utilize somewhat different theoretical arguments and methods of theory creation, there has been little progress over the last forty years towards developing an encompassing theoretical alternative to mainstream theory. But in recent years this fragmentation among the heterodox approaches has declined as heterodox economists have taken positive steps towards developing a coherent synthesis.
This activity has generated the second meaning for heterodox economics: that of referring to the development of a coherent theory that is an alternative to and replacement for mainstream theory. This alternative theory is based on the view that the discipline of economics should be concerned with explaining the process that provides the flow of goods and services required by society to meet the needs of those who participate in its activities. Heterodox economists believe that any explanation or theory of the social provisioning process must be grounded in the real world of actual historical events, must incorporate radical uncertainty and social individuals, and must tell a causal analytical story. Consequently, they reject the method of theory creation and development utilized by mainstream economists, which is based on positivism, empirical realism, and deductivism. Numerous suggestions for an alternative method of theory creation have been raised by heterodox economists, but none have been widely accepted; and without a widely accepted method, progress towards developing an alternative heterodox theory will be slow indeed. The aim of this paper is to overcome this roadblock by advocating the method of grounded theory as the best set of guidelines for the creation of heterodox economic theory. In addition, I shall argue that the grounded theory method results in the creation of heterodox economic theories that are historical in structure, content and explanation. Thus, the first section of this paper will delineate the method of grounded theory. This is followed, in the second section, by a discussion of three methodological issues (the nature of data, the role of case studies, and mathematics and models) as they relate to the grounded theory method. The final section concludes the paper with a brief discussion of the historical nature of grounded economic theories.
The Method of Grounded Theory To develop a theory that analytically explains causally related, historically contingent economic events, the critical realist...

The Grounded Theory Bookshelf

By Vivian B. Martin, Ph.D.

Bookshelf will provide critical reviews and perspectives on books on theory and methodology of interest to grounded theorists. This issue includes a review of Heaton’s Reworking Qualitative Data, of special interest for some of its references to grounded theory as a secondary analysis tool; and Goulding’s Grounded Theory: A practical guide for management, business, and market researchers, a book that attempts to explicate the method and presents a grounded theory study that falls a little short of the mark of a fully elaborated theory.

Reworking Qualitative Data, Janet Heaton (Sage, 2004). Paperback, 176 pages, $29.95. Hardcover also available.

Unlike quantitative research, where secondary analysis of data is common, qualitative research has yet to understand or take advantage of the possibilities of secondary analysis. Janet Heaton’s book focuses more on the hurdles to qualitative secondary analysis — the ethical and legal issues, as well as the operational challenges of analyzing interviews one did not conduct or witness — than on providing protocols. But of special interest to grounded theorists are the possibilities grounded theory might offer for secondary analysis. Heaton does not launch such an argument; however, in the book’s preface, she notes that Barney Glaser — yes, the co-developer of grounded theory — provided some of the first discussion in the literature of the possibilities of secondary analysis. She quotes from a 1962 Social Problems article in which Glaser writes:

To be sure, secondary analysis is not limited to quantitative data. Observation notes, unstructured interviews and documents can also be usefully analyzed. In fact, some field workers may be delighted to have their notes, long buried in their files, reanalyzed from another point of view. Man is a data-gathering animal. (Glaser, 1962: 74)
Grounded theorists would run into some of the same hurdles as other researchers viewing qualitative materials for which they could not go back to interviewees and seek elaboration, though grounded theory’s limited concern with full coverage might decrease such hurdles. Heaton does cite some secondary analysis projects for which grounded theory was invoked as the method for re-use. However, the main issue addressed in the book is the limited number of secondary analyses in general. The “secondary analysis of qualitative data remains an enigma” (viii), she writes. Heaton provides a literature review of secondary studies, though they are primarily in the health and social care literature. Importantly, calls for re-use of data have been explicit in these areas, and funding from the Economic and Social Research Council in the UK supported the initial literature review of the health studies. Heaton provides a typology to discuss secondary analyses thus far, but she acknowledges that “secondary analysis” is a vague term, and many studies that appear to be secondary analyses do not make it explicit. Secondary analyses, according to Heaton, include (p. 38):

Supra analysis: transcends the original topic for which the data were collected.
Supplementary analysis: expands on some aspects of the original study through more in-depth investigation.
Re-analysis: verifies or corroborates original premises.
Amplified analysis: combines data from two or more studies for comparison.
Assorted analysis: combines secondary data with primary research and/or naturalistic data.

Most of the secondary analyses Heaton examined involved researchers going back to their own data. She notes that, although some researchers espouse the idea of making data available to others for secondary analysis, many have not taken the next step to make such data accessible. Nonetheless, Heaton finds encouragement in the increase in archives of qualitative data, and she...

Remodeling Grounded Theory

By Barney G. Glaser, Ph.D., Hon. Ph.D., with the assistance of Judith Holton

Abstract
This paper outlines my concerns with Qualitative Data Analysis’ (QDA) numerous remodelings of Grounded Theory (GT) and their subsequent eroding impact. I cite several examples of the erosion and summarize essential elements of classic GT methodology. It is hoped that the article will clarify my concerns with the continuing enthusiastic but misunderstood embrace of GT by QDA methodologists and serve as a preliminary guide to novice researchers who wish to explore the fundamental principles of GT.

Introduction
The difference between the particularistic, routine, normative data we all garner in our everyday lives and scientific data is that the latter is produced by a methodology. This is what makes it scientific. This may sound trite, but it is just the beginning of many complex issues. Whatever methodology may be chosen to make ensuing research scientific has many implicit and explicit problems. It implies a certain type of data collection, the pacing and timing of data collection, a type of analysis and a specific type of research product. In the case of qualitative data, the explicit goal is description. The clear issue articulated in much of the literature regarding qualitative data analysis (QDA) methodology is the accuracy, truth, trustworthiness or objectivity of the data. This worrisome accuracy of the data focuses on its subjectivity, its interpretative nature, its plausibility, the data voice and its constructivism. Achieving accuracy is always worrisome with a QDA methodology. These are a few of the problems of description. Other QDA problems include the pacing of data collection, the volume of data, the procedure and rigor of data analysis, the generalizability of unit findings, the framing of the ensuing analysis and the product.
These issues and others are debated at length in the qualitative research literature. Worrisome accuracy of qualitative data description continually concerns qualitative researchers and their audiences. I have addressed these problems at length in “The Grounded Theory Perspective: Conceptualization Contrasted with Description” (Glaser, 2001). In this paper I will take up the conceptual perspective of classic Grounded Theory (GT). (In some of the research literature, classic GT methodology has also been termed Glaserian GT, although I personally prefer the term “classic” as recognition of the methodology’s origins.) The conceptual nature of classic GT renders it abstract of time, place and people. While grounded in data, the conceptual hypotheses of GT do not entail the problems of accuracy that plague QDA methods. The mixing of QDA and GT methodologies has the effect of downgrading and eroding the GT goal of conceptual theory. The result is a default remodeling of classic GT into just another QDA method with all its descriptive baggage. Given the ascending focus on QDA by sheer dint of the number of researchers engaged in qualitative analysis labeled as GT, the apparent merger between the two methodologies results in default remodeling to QDA canons and techniques. Conceptual requirements of GT methodology are easily lost in QDA problems of accuracy, type of data, constructivism, participant voice and data collection rigor according to positivistic representative requirements, however couched in a flexibility of approach (see Lowe, 1997). The result is a blocking of classic GT methodology and the loss of its power to transcend the strictures of worrisome accuracy – the prime concern of QDA methods – in order to produce conceptual theory that explains fundamental social patterns within the substantive focus of inquiry. I will address some, but not all, of the myriad of remodeling blocks to classic...

Pluralistic dialoguing: A theory of interdisciplinary teamworking...

By Antoinette McCallin, Ph.D., M.A. (Hons), B.A., RGON

Abstract
The aim of this emerging grounded theory study was to discover the main concerns of health professionals working in interdisciplinary teams, and to explain the processes team members used to continually resolve practice problems. Data, collected from forty-four participants from seven disciplines in two teaching hospitals in New Zealand, included eighty hours each of interviewing and participant observation. In this paper the theory of pluralistic dialoguing is presented. It is argued that interdisciplinary work is possible when the team replaces the discipline focus with client-focused care and thinks differently about service delivery. Thinking cooperatively requires individual team members to dialogue with colleagues, thereby deconstructing traditional ways of thinking and reconstructing new approaches to interdisciplinary practice. Although dialoguing was an informal process occurring within clinical spaces, as the effects of health reform and restructuring intensify, teams also need to establish formal dialogue groups to facilitate team practice development and support team learning in a continually changing, fast-paced practice context.

Introduction
Over the past decade the interdisciplinary team has received mixed reviews. While the interdisciplinary team is generally seen as a means to change professional practice and foster interprofessional collaboration (Leathard, 2003; Sullivan, 1998), it is also viewed as a means to promote clinical improvement in care and the outcomes of care, thereby improving public health and quality service provision (Lax & Galvin, 2002; Manion, Lorimer & Leander, 1996). As the care needs of clients have changed, health care organisations have challenged traditional models of service delivery and endorsed the interdisciplinary team as a new model of practice that will supposedly reduce costs and improve the quality of care (Dodge, 2003).
Interdisciplinary teams are usually expected to provide efficient, effective integrated care in restructuring health organisations (De Back, 1999). While team effectiveness is important (Millward & Jeffries, 2001; Schofield & Amodeo, 1999), integrating the disciplines in practice is much more challenging. This suggests that the process of teamworking has received less attention, despite the fact that no one discipline can provide integrated care for clients with multiple needs, which often cross many disciplinary boundaries (Gillam & Irvine, 2000). The interdisciplinary team is defined as one in which clinicians from various disciplines such as medicine, nursing, occupational therapy, physiotherapy, and social work cooperate with each other, sharing leadership, assessment, goal setting, problem-solving and decision making so that care is coordinated and client outcomes optimised. While the assumption that clinicians from different disciplines will automatically integrate care effectively reflects a worthy goal, the reality may be somewhat different (Long, 2001; Masterton, 2002), suggesting that interdisciplinary team members may lack understanding of what is involved (O’Connell, 2001). Too often interdisciplinary teamwork seemingly evolves from trial-and-error learning. Indeed, Long (2001) observes that while there is longstanding general support for interdisciplinary work, many variables limit implementation in less-than-ideal environments. Long, though, urges colleagues to concentrate on the successes. In this paper one of the successes is presented: findings from an investigation into interdisciplinary teamwork in the acute care hospital (McCallin 1999a; McCallin 1999b).
The theory of pluralistic dialoguing is introduced in the hope of offering insights into interdisciplinary teamworking, explaining how health professionals from different disciplines support colleagues as they put aside disciplinary differences, thinking through and learning new ways of working cooperatively for the common good of the client. Discussion begins with a brief outline of the research topic, the approach and the findings of the research. Next, the meaning of pluralistic...