Issue no. 3, November 2008

The Constant Comparative Method of Qualitative Analysis

[This paper was originally published in Social Problems, 12 (1965), pp. 436-45, and later as Chapter V in Glaser, B.G. & Strauss, A.L. (1967). The Discovery of Grounded Theory: Strategies for qualitative research. New York: Aldine DeGruyter.]

Barney G. Glaser, Ph.D.

Currently, the general approaches to the analysis of qualitative data are these:

1. If the analyst wishes to convert qualitative data into crudely quantifiable form so that he can provisionally test a hypothesis, he codes the data first and then analyzes it. He makes an effort to code “all relevant data [that] can be brought to bear on a point,” and then systematically assembles, assesses and analyzes these data in a fashion that will “constitute proof for a given proposition.”i

2. If the analyst wishes only to generate theoretical ideas (new categories and their properties, hypotheses and interrelated hypotheses), he cannot be confined to the practice of coding first and then analyzing the data since, in generating theory, he is constantly redesigning and reintegrating his theoretical notions as he reviews his material.ii Analysis of this kind serves his purpose, but the explicit coding itself often seems an unnecessary, burdensome task. As a result, the analyst merely inspects his data for new properties of his theoretical categories, and writes memos on these properties.

We wish to suggest a third approach to the analysis of qualitative data: one that combines, by an analytic procedure of constant comparison, the explicit coding procedure of the first approach and the style of theory development of the second. The purpose of the constant comparative method of joint coding and analysis is to generate theory more systematically than allowed by the second approach, by using explicit coding and analytic procedures. While more systematic than the second approach, this method does not adhere completely to the first, which hinders the development of theory because it is designed for the provisional testing, not the discovering, of hypotheses.iii This method of comparative analysis is to be used jointly with theoretical sampling, whether for collecting new data or for working with previously collected or compiled qualitative data.

Systematizing the second approach (inspecting data and redesigning a developing theory) by this method does not supplant the skills and sensitivities required in generating theory. Rather, the constant comparative method is designed to aid the analyst who possesses these abilities in generating a theory that is integrated, consistent, plausible, close to the data, and at the same time in a form clear enough to be readily, if only partially, operationalized for testing in quantitative research. Still dependent on the skills and sensitivities of the analyst, the constant comparative method is not designed (as methods of quantitative analysis are) to guarantee that two analysts working independently with the same data will achieve the same results; it is designed to allow, with discipline, for some of the vagueness and flexibility that aid the creative generation of theory.

If a researcher using the first approach (coding all data first) wishes to discover some or all of the hypotheses to be tested, typically he makes his discoveries by using the second approach of inspection and memo-writing along with explicit coding.
By contrast, the constant comparative method cannot be used for both provisional testing and discovering theory: in theoretical sampling, the data collected are not extensive enough and, because of theoretical saturation, are not coded extensively enough to yield provisional tests, as they are in the first approach. They are coded only enough to generate, hence to suggest, theory. Partial testing of theory, when necessary, is...

Anticipatory Caring

[This paper was originally published as Sandgren, A., Thulesius, H., Petersson, K. & Fridlund, B. (2007). Doing good care: A study of palliative home nursing care. International Journal of Qualitative Studies on Health and Well-Being, 2:4, 227-235, and is reprinted here with the permission of the publisher]

Anna Sandgren, RN, MSc, PhD Candidate; Hans Thulesius, MD, PhD; Kerstin Petersson, RNT, PhD; Bengt Fridlund, RNT, PhD

Abstract

Today, more and more people die in their own homes and in nursing homes, which fundamentally affects community nursing. The aim of this study was to develop a grounded theory of palliative home nursing care; we analyzed interviews and data related to the behavior of community nurses caring for palliative cancer patients. Doing Good Care emerged as the pattern of behavior through which nurses deal with their main concern, their desire to do good care. The theory Doing Good Care involves three caring behaviors: anticipatory caring, momentary caring and stagnated caring. In anticipatory caring, which is the optimal caring behavior, nurses are doing their best or even better than necessary; in momentary caring, nurses are doing their best momentarily; and in stagnated caring, nurses are doing good, but from the perspective of what is expected of them. When nurses fail in doing good, they experience a feeling of letting the patient down, which can lead to frustration and feelings of powerlessness. Depending on the circumstances, nurses can hover between the three different caring behaviors. We suggest that healthcare providers increase the status of palliative care and make it easier for nurses to give anticipatory care by providing adequate resources and recognition.

Introduction

The demographics of dying have changed, with more people dying at home or in nursing homes. The number of hospital beds has declined and homecare has increased, and more deaths in people’s own homes are expected in the future (Burge, Lawson & Johnston, 2003; Higginson, Astin & Dolan, 1998; Socialstyrelsen, 2006). The extent of palliative care varies in different parts of Sweden (Socialstyrelsen, 2006), and fewer hospital beds increase the strain on both acute hospital care and homecare (Fürst, 2000). Acute hospital care has a high pace and a “culture of quickness” (Andershed & Ternestedt, 1997), and this high pace was found to be one explanation for why nurses suffered emotional overload while caring for palliative cancer patients in acute hospitals (Sandgren, Thulesius, Fridlund & Petersson, 2006). In contrast to the high pace of acute hospitals, the hospice philosophy has a “culture of slowness” (Andershed & Ternestedt, 1997), and it has thus been suggested that the hospice philosophy should be spread to all care settings where people are dying (Clark, 1993). At the same time, it has been proposed that palliative care should be available wherever the patient is. In addition, patients and their families should receive the same standard of care irrespective of domicile and source of service delivery (Dunne, Sullivan & Kernohan, 2005; SOU, 2001). In homecare, community nurses have a central position (Wright, 2002), but their work is in a way invisible work, predominantly conducted in patients’ homes (Goodman, Knight, Machen & Hunt, 1998; Luker, Austin, Caress & Hallett, 2000). Community nursing has been shown to offer stimulation and appreciation, especially from patients and relatives, but also a possibility for nurses to use all their professional skills (Dunne et al., 2005; Goodman et al., 1998).
However, caring for palliative cancer patients in their homes has also been shown to be stressful (Berterö, 2002; Dunne et al., 2005), emotionally...

Navigating New Experiences: A basic social process

Kara L. Vander Linden, Ed.D.

Abstract

This grounded theory study was initiated to discover the process adult learners go through when engaging in new learning experiences. Data came from 12 open-ended intensive interviews with adult learners involved in various educational endeavors. Theoretical sampling led to several additional interviews with individuals not engaged in post-secondary education but more generally in new learning experiences. The basic social process of navigating explains three cyclical stages of behaviors used to successfully traverse new experiences. The stages are Mapping, Embarking, and Reflecting. Mapping consists of three behaviors: locating, assessing one’s location in relation to the goal; surveying, gathering information; and plotting, creating a plan. Embarking involves engaging in normalizing and strategizing behaviors to guide one’s self through the experience while encountering unexpected factors that influence one’s course and progress. Techniques and approaches for Reflecting are discussed in the third stage. While the theory provides an understanding of the process and behaviors used by adult learners, it is also applicable in other settings.

Introduction

One cannot go through life without encountering new experiences. While at times people find themselves in experiences through no choice of their own, many new experiences are entered voluntarily. One such experience is adults returning to the college classroom to continue their education; today, more adults are doing so than at any other time in history. These adult learners are referred to as “nontraditional” and are characterized by “one or more of the following characteristics: not a high school graduate; did not enroll in an institution of higher education directly after high school; are attending part-time; are working full-time; or are financially independent, married, or have dependents” (Wolanin, 2003, p. 7). While adult learners have prior experience in education, many factors and conditions of adult life make the experience very different from their earlier experiences. These factors and conditions also contribute to lower retention rates. As Bosworth et al. (2007) reported, “Financially independent, working full time, with dependents and family responsibilities to juggle, and back in school after an extended time out—adult learners are at great risk of not achieving their postsecondary education goals” (p. 8). There is substantial research and there are numerous theories and models on adult education and learning. Research repeatedly categorizes the challenges faced by adult learners into four general categories: accessibility, affordability, lack of time, and other responsibilities such as family and/or job responsibilities (Merriam & Caffarella, 1999; Bosworth et al., 2007). Research has also suggested and studied strategies to increase learner retention and degree achievement. These strategies primarily address the issues of accessibility and affordability. While educational institutions are making strides in these areas, there is a dearth of research on strategies for addressing the remaining categories of lack of time and other responsibilities faced by adult learners. Although outside the control of educational institutions, these issues still affect adult learners’ success in reaching their goals.
As an instructor and mentor of adult learners, I have little control over the four categories of factors that affect the retention and degree achievement of adult learners. Despite this lack of control, part of my job is to help adult learners be successful. Often this means helping them succeed in spite of these factors. A desire to understand, from their perspective, the learning experiences of adults and the challenges they face provided the original area of interest and starting point for this study, which was conducted for my dissertation. Classic grounded theory (GT)...

Reaching Out: Network building by US non-profit welfare organizations

Chandrasekhar Commuri

Abstract

Contemporary non-profit organizations operate in a fast-changing and challenging environment. While the challenges at the sector level have been well documented, there is a gap in the literature in examining this issue at the local level. Based on interviews with non-profit executives, and using grounded theory methodology, this paper proposes that non-profits are using a reaching out strategy to deal with their most commonly experienced challenges of overwhelming complexity, distancing, and fragmentation. Reaching out involves different forms and levels of inter-organizational networking. These forms may be characterized as differentiation, symbiotic relations, and advocacy networks.

Introduction

Even though the idea of a ‘welfare state’ may conjure up images of a monolithic government bureaucracy, the modern welfare state in the United States is a collection of public, non-profit, and private organizations with cross-cutting funding and program interrelationships. Non-profits have been playing an especially important role in the welfare system since the 1980s (Salamon, 1987; Wolch, 1990; Salamon, 1995; Savas, 2000; Chambre, 1999). Acting as contractors of the state, they are delivering more services, reaching new, especially previously marginalized, clients, and introducing novel solutions to sticky social problems. They do this because they are generally more nimble and efficient than public agencies. To that extent, the welfare state has benefited from its relationship with the non-profit sector. In return, non-profit organizations have also reaped benefits from this relationship. By some estimates, over a third of the non-profit sector’s revenues come from the government (Independent Sector, 2002). Among other things, this increased funding has led to growth in the number of non-profits over the last forty years. Public policy and administration trends like ‘contracting out’ and ‘New Public Management’ (Osborne & Gaebler, 1993; Frumkin, 2002; Brooks, 2003) facilitated this growth in state support for the non-profit sector. Among other things, these trends emphasized cross-sector collaboration, the use of multiple delivery agencies in order to encourage competition and creativity, establishing service delivery close to the clients, and outcomes measurement (Savas, 1982, 2000). The operating environment for the non-profit sector changed quantitatively and qualitatively as a result. This environment provided many opportunities for the non-profit sector, but it also brought with it new problems and challenges. Non-profits found themselves in unfamiliar terrain in the legal, collaboration, personnel, and accountability areas. Therefore, along with growth and vibrancy, the sector also experienced voluntary ‘failures’ (Salamon, 1987). Examples of these are an inability to sustain the organization in the long term due to increased competition, a shortage of professional managers, a lack of accountability, the simultaneous existence of duplication and gaps in services (Chambre, 1999), and the altering of program priorities to follow changes in funding trends. Local-level organizations are the primary workhorses of the non-profit sector. They directly deliver more services and reach more clients than national organizations, and typically have fewer financial and human resources than state or national organizations.
Previous research on this topic has focused more on macro-level challenges faced by the non-profit sector as a whole than on exploring problems and challenges faced by local organizations. The literature has explored such macro-level challenges as the sector’s overreliance on government funding, the relative loss of autonomy of the sector due to such reliance on government funds, the crowding out of private charity, the lack of investment in capacity building, and the dysfunctional aspects of competition between traditional non-profits on the one hand and faith-based organizations and private corporations on the other (Salamon, 2002; Abrams & Schmitz, 1984;...

Doing Quantitative Grounded Theory: A review

Tina L. Johnson, PhD

Whenever I review materials, I do so with three eyes. One is as an educator of Ph.D. students who are just beginning their knowledge of research methodologies. Another eye is toward the needs of the Ph.D. student in the midst of crafting and defending the dissertation proposal and finished product. Finally, I view the book in light of my own educational needs: does this book provide me, as an experienced grounded theorist, with needed or new knowledge of my craft? From two of these three perspectives I view Dr. Barney Glaser’s new book Doing Quantitative Grounded Theory as potentially useful, and from the third (my own learning) I give the book a must-own rating.

It is well documented that grounded theory attracts Ph.D. students as a methodology to be applied to their dissertations. These students, however, rarely have solid experience using grounded theory and have little to no training in the method. Most will receive what training they do have from brief exposure during general qualitative research courses. This quantitative grounded theory book might be well applied to the growing number of situations where students are introduced to grounded theory methodology in a complete course. In universities where the focus is on quantitative methods, I can see this book being an asset as an initial introduction to the method. The only caution is that instructors of the course may need to assist students in working through the occasionally dense vocabulary and writing.

An additional need that Ph.D. students encounter occurs as they write and defend their dissertation proposal and finished product to committees who very often have little knowledge of the method themselves. This book’s rich inclusion of historical documentation can, in part, provide resources for the student being pushed for background knowledge and methodological defense. Dr. Glaser provides a rich discussion of his early graduate work with Lazarsfeld as well as his first stab at grounded theory (quantitative theory) while completing his dissertation. Although I have read some of the included historical information in other resources (Glaser, 1998; Glaser & Strauss, 1999; Thulesius, 2003), this background information is more expansive, as it is the focus of the first chapter and is woven throughout the discussion of, and examples from, his initial study provided in the later chapters (Glaser, 1964).

The main purpose of this book, and the perspective of most interest to the experienced grounded theory user, is to provide assistance in conducting quantitative grounded theory analysis. This instruction is not always an easy read, especially for the researcher who is not well versed in quantitative methods. The book is vocabulary rich, but it is also rich in content that, if carefully examined and coupled with (at least in this author’s case) some remediation in general multivariate statistics, can provide clear guidelines to follow in developing a grounded theory using quantitative data. Three broad topics stood out as vitally important, both in the clarification of grounded theory in general and in the specific ideas associated with branching out into the use of quantitative, especially secondary quantitative, data: the idea of data fishing to establish theoretical hypotheses; the importance of selecting data that have been collected using instruments created through some sort of grounding process; and, finally, the importance of selecting data that can be tracked over time and/or across specific grouping subcategories (Glaser, 2008). Both qualitative and quantitative grounded theory employ ‘data fishing’ (Johnston, 2006). This process, held as taboo by quantitative researchers,...

Doing Quantitative Grounded Theory: A theory of trapped travel consumption...

Mark S. Rosenbaum, Ph.D.

All is data. Grounded theorists employ this sentence in their quest to create original theoretical frameworks. Yet researchers typically interpret the word “data” to mean qualitative data or, more specifically, interview data collected from respondents. This is not to say that qualitative data is deficient; however, grounded theorists may be missing vast opportunities to create pioneering theories from quantitative data. Indeed, Glaser and Strauss (1967) argued that researchers could use qualitative and/or quantitative data to fashion original frameworks and related hypotheses, and Glaser’s (2008) recently published book, titled Doing Quantitative Grounded Theory, is an attempt to help researchers understand how to use quantitative data for grounded theory (GT).

Quantitative Grounded Theory

Glaser introduces quantitative grounded theory (QGT) by providing readers with a historical background of the methodology, which has ties to Glaser’s sociological training under Paul Lazarsfeld at Columbia University. Although some readers may question the purpose of this introductory section, they should understand that QGT is willing to forgo some empirical rigor to generate frameworks that can be empirically tested at a future time. This stance contradicts the empirical rigor that Lazarsfeld demanded and that represents the standard in the United States today. As a result, it is not surprising that, as social scientists, we continue to learn increasingly more about increasingly less. Today, we clamor for complex structural equation models that illustrate putative causal relationships between and among exogenous and endogenous variables, which are either observed or latent. These research endeavors are far from inexpensive, as researchers must spend tremendous amounts of money gathering large data sets that are sizable enough to replicate an identifiable variance-covariance matrix. I am confident that contemporary social scientists who publish in leading journals can relate to the pressures involved in collecting data and in waiting for a “God-like” approval through an RMSEA or a CFI indicator. Glaser is not denigrating empirical rigor; however, he makes readers question whether they can learn as much about the world by relying on relatively simple chi-square tests. He questions whether social scientists are wasting data, which many may perceive as meaningless based on an interpretation of model fit indexes. QGT then sets out to fashion creative models based on extant data sets, and to do so in a way that assumes the data are nonparametric but rich enough to capture a social phenomenon.

Glaser spends the next major portion of the book providing novices with thorough methodological instruction in QGT. One of the greatest challenges in accomplishing a QGT study may be obtaining a data set. The data set must contain variables that evaluate a socially relevant and interesting condition. In essence, a core category must arise from quantitative data; thus, irrelevant empirical data can never lead to a core category. Although Glaser suggests that researchers can talk to fellow researchers to obtain raw data sets, I encourage researchers to explore Internet sites such as the Centers for Disease Control and Prevention, the Roper Center for Public Research, the Pew Research Center, and the U.S. Census for data sets that capture real social phenomena. As a way to discuss QGT, I provide an actual example from one of my data sets.
The data set I selected for QGT was based on an actual project that I conducted for a 400-square-foot retail store aboard a ferry that transports people between Oahu and Maui. The trip lasts for three hours. The ferry’s management sought input regarding product assortment. On a broader level, many...
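To make the contrast above concrete (full structural equation modeling versus the “relatively simple chi-square tests” Glaser favors), the sketch below runs a chi-square test of independence on a small cross-tabulation. It is a minimal illustration only: the library call (scipy.stats.chi2_contingency) is standard Python, but the variables and counts are hypothetical and are not taken from the ferry data set or from Glaser’s book.

# Minimal sketch: a chi-square test of independence on a hypothetical cross-tab.
# Rows are traveler types, columns are product categories; all counts are invented.
from scipy.stats import chi2_contingency

observed = [
    [42, 18, 10],  # residents: snacks, souvenirs, sundries
    [55, 35, 40],  # tourists:  snacks, souvenirs, sundries
]

chi2, p, dof, expected = chi2_contingency(observed)
print(f"chi-square = {chi2:.2f}, df = {dof}, p = {p:.4f}")
# A small p-value would suggest that purchase patterns differ by traveler type,
# a relationship the analyst could then pursue with further cross-tabulations and
# conceptualization rather than treating the data set as a source of fit indexes.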