
Volume 4, Issue no. 2, March 2005

The Impact of Symbolic Interaction on Grounded Theory
Barney G. Glaser

As I stated in the introduction to chapter 9, GT is a general inductive method possessed by no discipline or theoretical perspective or data type. Yet the takeover of GT by Symbolic Interaction (SI) and all the departments and institutes that SI informs and resides in is massive and thereby replete with the remodeling of GT. The literature on qualitative methodology is massive and replete with the assertion that SI is the foundational theoretical perspective of GT. GT is reported as an SI method. That GT is a general inductive method is lost.

Beyond the Physical Realm: A Proposed Theory Regarding a Consumer's Place Experience
Mark Rosenbaum

Marketers view place as a marketing mix tool that denotes activities associated with the distribution of products and services. Thus, the discipline believes that places are alienated from consumers' lives and experiences. This article looks at the place concept anew and offers an original theory of consumers' experience in place.

Visualising Deteriorating Conditions
Tom Andrews & Heather Waterman

The research aims were to investigate the difficulties ward staff experienced in detecting deterioration and how these were resolved. The emphasis within the literature tends to be on identifying premonitory signs that may be useful in predicting deterioration. Changes in respiratory rate are the most consistent of these (Fieselmann et al. 1993; Sax and Charlson 1987; Schein et al. 1990; Smith and Wood 1998) but, in common with other signs, they lack sensitivity and specificity. The sample consisted of 44 nurses, doctors (interns) and health care support workers from a general medical and a surgical ward. Data were collected by means of non-participant observations and interviews, using grounded theory as originated by Glaser and Strauss (1967) and Glaser (1978). As data were collected, the constant comparative method and theoretical sensitivity were used as outlined in grounded theory. A core category of "visualising deteriorating conditions" emerged, together with its sub-core categories of "intuitive knowing", "baselining" and "grabbing attention". The main concern in visualising deteriorating conditions is to ensure that patients suspected of deterioration are successfully referred to medical staff. The aim is to convince those who can treat or prevent further deterioration to intervene. Through intuitive knowing, nurses pick up that patients have changed in a way that requires a medical assessment. To make the referral more credible, nurses attempt to contextualise any changes in patients by baselining (establishing baselines). Finally, with the backup of colleagues, nurses refer patients by providing as much persuasive information as possible in a way that grabs attention. The whole process is facilitated by knowledge and experience, together with mutual trust and respect.

Grounded Theory and Heterodox Economics
Frederic S. Lee

The dominant theory in the discipline of economics, known as neoclassical economics, is being challenged by an upstart, known as heterodox economics. The challengers face many obstacles, the most significant of which is the actual creation of an alternative economic theory. However, heterodox economists have not settled on what the methodology of theory creation should be. The aim of this paper is to advocate that the method of grounded theory is the best set of guidelines for theory creation. In addition, I shall argue that the grounded theory method results in the creation of heterodox economic theories that are historical in structure, content and explanation.

The Grounded Theory Bookshelf
Vivian B. Martin

The Impact of Symbolic Interaction on Grounded Theory

By Barney G. Glaser, Ph.D., Hon. Ph.D.
(Chapter 10, The Grounded Theory Perspective III: Theoretical Coding, Sociology Press, 2005)

As I stated in the introduction to chapter 9, GT is a general inductive method possessed by no discipline or theoretical perspective or data type. Yet the takeover of GT by Symbolic Interaction (SI) and all the departments and institutes that SI informs and resides in is massive and thereby replete with the remodeling of GT. The literature on qualitative methodology is massive and replete with the assertion that SI is the foundational theoretical perspective of GT. GT is reported as an SI method. That GT is a general inductive method is lost.

Sure, GT can use SI-type data and its perspective, but as a general method it can use any other type of data, even other types of qualitative data, as well as quantitative, visual, documentary and journalistic data, in any combination, and any other theoretical perspective, such as systems theory, social structural theory, structural functional theory, social organization theory, cultural theory, etc. Thus, the takeover of GT as an SI-perspective methodology is just discipline-perspective dominance, as discussed above, and nothing more. It, of course, dominates with a set of TCs (process, strategies, conditions, context, etc.) that I have considered at length in the chapters above.

Researchers, especially in nursing, just want a theoretical perspective. SI institutionalizes GT as its own! Researchers like it because it gives them an ontology (what is data) and an epistemology (a philosophy of research). The takeover becomes structurally induced by researchers, especially in nursing, since they want a theoretical perspective in advance of their research. It gives them a feeling of power, while they do not realize that the SI takeover reduces the general-method power of GT. The writers on GT as an SI method use Strauss's (my co-author of the Discovery of GT) training in SI as their legitimating source.
They ignore the roots of GT in my training in concept-indicator index construction in quantitative survey research. In the following pages, I will discuss these issues at length. Much has already been said in this book about SI and its set of TCs; this chapter just focuses and adds some ideas. The goal of this chapter, as in all the above chapters, is to free GT from this dominance so that GT analysts will have the fullest range of TCs, from any and all perspectives, at their fingertips for emergence. No one discipline with its theoretical perspective defines and possesses GT, as I discussed at length in chapter 9. Obviously many GTs use an SI perspective (as well as others), whether bounded by it or not. Earned, emergent relevance is the TC of best choice.

Sources of SI Dominance

Obviously, the impact, dominance and possession of SI on GT came from Anselm Strauss's training in SI at the University of Chicago. Many authors assert this one source of SI. Carolyn Weiner (op. cit., page 6) says:

"GT derived from the tradition of SI, this sociological stance is based on the perspective of George Herbert Mead as developed by the Chicago school of sociology and asserts that people select and interpret meanings from their environment, formed in many definitions of the situation. The individual acquires a commonality of perspective with others as they learn and develop together the symbols by which aspects of the world are identified. In other words there is a social construction of reality."

Marjorie MacDonald and Rita Schreiber (op...

Beyond the Physical Realm: A Proposed Theory Regarding a Consumer's Place Experience

By Mark Rosenbaum, Ph.D.

Abstract

Marketers view place as a marketing mix tool that denotes activities associated with the distribution of products and services. Thus, the discipline believes that places are alienated from consumers' lives and experiences. This article looks at the place concept anew and offers an original theory of consumers' experience in place.

Introduction

The concept of place is well ingrained in the marketing discipline as a basic marketing mix tool that refers to distributional and organizational activities associated with making products and services available to targeted consumers (Kotler 2000, p. 87). As a result of this conceptualization, it is not surprising that marketers perceive places as isolated from consumers' personal lives and experiences. Indeed, pundits often chastise contemporary retailers for creating an urban marketplace that represents a rendition of human alienation and is replete with impersonal, cold relationships between buyers and sellers.

This perception of place, as a mere subdivision of physical space (Sherry 2000), is especially prevalent among marketing researchers who adhere to the regional school of thought (Sheth and Garrett 1986; Sheth, Gardner, and Garrett 1988). Researchers in this school consider marketing a form of economic activity that bridges the geographic, or spatial, gaps between buyers and sellers (see Grether 1983). Consequently, these researchers are guided by a philosophy of consumption which espouses that general laws exist for predicting spatial regularities between consumers' residential locations and their selected shopping areas. Although regional researchers have been developing models since the 1930s, no encompassing marketing theory has yet emerged from their endeavors (Sheth, Gardner, and Garrett 1988).
Marketing's conceptualization of place has been unwavering since its inception in the early 1960s (McCarthy 1960); however, as the discipline entered the new millennium, Sherry (2000) suggested that all is not sanguine with it. Sherry's (1998, 2000) point of contention with the place concept is that marketers deem consumption settings, or servicescapes (Bitner 1992; Sherry 1998), to be comprised of physical elements (Turley and Milliman 2000). Thus, he believes that marketers fail to consider that places may also be comprised of intangible, symbolic realms, which may be integral to consumers' personal worlds and experiences. Rather than consider that consumers view places as points-of-exchange where they satisfy essential consumption needs, Sherry posits that places have different dimensions of meaning for consumers, based upon their personal experiences in them. In addition, he speculates that the impact of these meanings on consumer behavior ranges on a continuum from the subtle to the profound. However, like Trickster, Sherry (1998, 2000) stops conjecturing midstream, leaving future researchers with the challenge of generating a theory of consumers' being-in-place.

The goal of this article is to heed Sherry's (2000) challenge by conceiving a theory that (1) illustrates why and how consumers experience places in their lives, (2) uncovers major antecedents that impact consumers' place experience, (3) links place experience to patronizing behavior, and (4) is parsimonious, relevant, and modifiable. The theory serves as a milestone for marketing because it addresses a chasm in the marketing mix: the marketing mix, along with its consideration of place as distribution, is not entirely complete, is somewhat inconsiderate of consumers' needs, and focuses on investigating unidimensional relationships between stimuli and responses rather than on the much richer concept of exchange relationships (van Waterschoot 2000; van Waterschoot and Van den Bulte 1992).
To date, the majority of place studies in marketing have attempted to discern stimulus-response regularities between specific environmental conditions (e.g., music, crowding,...

Visualising Deteriorating Conditions

By Tom Andrews, RN, B.Sc. (Hons), M.Sc., Ph.D. & Heather Waterman, RN, B.Sc. (Hons), Ph.D.

Abstract

The research aims were to investigate the difficulties ward staff experienced in detecting deterioration and how these were resolved. The emphasis within the literature tends to be on identifying premonitory signs that may be useful in predicting deterioration. Changes in respiratory rate are the most consistent of these (Fieselmann et al. 1993; Sax and Charlson 1987; Schein et al. 1990; Smith and Wood 1998) but, in common with other signs, they lack sensitivity and specificity.

The sample consisted of 44 nurses, doctors (interns) and health care support workers from a general medical and a surgical ward. Data were collected by means of non-participant observations and interviews, using grounded theory as originated by Glaser and Strauss (1967) and Glaser (1978). As data were collected, the constant comparative method and theoretical sensitivity were used as outlined in grounded theory.

A core category of "visualising deteriorating conditions" emerged, together with its sub-core categories of "intuitive knowing", "baselining" and "grabbing attention". The main concern in visualising deteriorating conditions is to ensure that patients suspected of deterioration are successfully referred to medical staff. The aim is to convince those who can treat or prevent further deterioration to intervene. Through intuitive knowing, nurses pick up that patients have changed in a way that requires a medical assessment. To make the referral more credible, nurses attempt to contextualise any changes in patients by baselining (establishing baselines). Finally, with the backup of colleagues, nurses refer patients by providing as much persuasive information as possible in a way that grabs attention. The whole process is facilitated by knowledge and experience, together with mutual trust and respect.
Background

Mortality from shock of whatever aetiology remains depressingly high, and avoidable components contribute to physiological deterioration (McQuillan et al. 1998), often resulting in cardiorespiratory arrest (Rosenberg et al. 1993). Of all patients undergoing resuscitation, 75% will not survive more than a few days (George et al. 1989), with a survival rate to hospital discharge of 10% to 15% (Peterson et al. 1991; Schultz et al. 1996). Of the 9% of patients discharged from hospital having survived cardiopulmonary resuscitation, 4.3% were in a vegetative state, signifying severe neurological damage (Franklin and Mathew 1994).

In an effort to detect shock early, a number of parameters have been measured. Blood pressure, heart rate, respiratory rate, temperature, conscious level, shock index, central venous pressure, blood gases, blood lactate, pulmonary artery blood pressure and cardiac index all correlate poorly with physiological deterioration and severity of shock (Rady et al. 1994). Early detection of physiological deterioration remains elusive. A further difficulty is that there are over two hundred normal physiological reflexes that affect the pulse and respiratory rate (Shoemaker et al. 1988).

Current emphasis in the literature is on the early detection of physiological deterioration, either through premonitory signs such as changes in respiratory rate (Fieselmann et al. 1993; Franklin and Mathew 1994; Goldhill et al. 1999; Sax and Charlson 1987; Schein et al. 1990) or, more recently, an early warning score (Department of Health 2000; McArthur-Rouse 2001). The latter attaches a score to changes in such variables as blood pressure, pulse rate, respiratory rate and temperature as a means of detecting early signs of physiological deterioration. The greater the score, the greater the risk of physiological deterioration. To date these variables lack sensitivity and specificity.
The current study is an attempt to redress the continued emphasis on physiological variables by exploring...

Grounded Theory and Heterodox Economics

By Frederic S. Lee, Ph.D.

Abstract

The dominant theory in the discipline of economics, known as neoclassical economics, is being challenged by an upstart, known as heterodox economics. The challengers face many obstacles, the most significant of which is the actual creation of an alternative economic theory. However, heterodox economists have not settled on what the methodology of theory creation should be. The aim of this paper is to advocate that the method of grounded theory is the best set of guidelines for theory creation. In addition, I shall argue that the grounded theory method results in the creation of heterodox economic theories that are historical in structure, content and explanation.

Grounded Theory and Heterodox Economics

The dominant theory in the discipline of economics, known as neoclassical economics, is being challenged by an upstart, known as heterodox economics. Heterodox economics can be understood in two ways. The first is as a collective term for many different approaches to economic analysis, such as radical and Marxian economics, Post Keynesian economics, institutional economics, feminist economics, and social economics. Each of these approaches rejects various methodological and theoretical aspects of mainstream economics, including supply and demand curves, equilibrium, marginal products, the deductivist approach to theory creation, methodological individualism and the optimality of markets. Because the different approaches utilize somewhat different theoretical arguments and methods of theory creation, there has been little progress over the last forty years towards developing an encompassing theoretical alternative to mainstream theory. But in recent years, this fragmentation among the heterodox approaches has declined as heterodox economists have taken positive steps towards developing a coherent synthesis.
This activity has generated the second meaning of heterodox economics: that of referring to the development of a coherent theory that is an alternative to, and replacement for, mainstream theory. This alternative theory is based on the view that the discipline of economics should be concerned with explaining the process that provides the flow of goods and services required by society to meet the needs of those who participate in its activities. Heterodox economists believe that any explanation or theory of the social provisioning process must be grounded in the real world of actual historical events, must incorporate radical uncertainty and social individuals, and must tell a causal analytical story. Consequently, they reject the method of theory creation and development utilized by mainstream economists, which is based on positivism, empirical realism, and deductivism.

Numerous suggestions for an alternative method of theory creation have been raised by heterodox economists, but none has been widely accepted; and without a widely accepted method, progress towards developing an alternative heterodox theory will be slow indeed. The aim of this paper is to overcome this roadblock by advocating the method of grounded theory as the best set of guidelines for the creation of heterodox economic theory. In addition, I shall argue that the grounded theory method results in the creation of heterodox economic theories that are historical in structure, content and explanation.

Thus, the first section of this paper delineates the method of grounded theory. This is followed, in the second section, by a discussion of three methodological issues (the nature of data, the role of case studies, and mathematics and models) as they relate to the grounded theory method. The final section concludes the paper with a brief discussion of the historical nature of grounded economic theories.
The Method of Grounded Theory To develop a theory that analytically explains causally related, historically contingent economic events, the critical realist...

The Grounded Theory Bookshelf

By Vivian B. Martin, Ph.D.

Bookshelf will provide critical reviews and perspectives on books on theory and methodology of interest to grounded theorists. This issue includes a review of Heaton's Reworking Qualitative Data, of special interest for some of its references to grounded theory as a secondary analysis tool, and of Goulding's Grounded Theory: A practical guide for management, business, and market researchers, a book that attempts to explicate the method and presents a grounded theory study that falls a little short of the mark of a fully elaborated theory.

Reworking Qualitative Data, Janet Heaton (Sage, 2004). Paperback, 176 pages, $29.95. Hardcover also available.

Unlike quantitative research, where secondary analysis of data is common, qualitative research has yet to understand or take advantage of the possibilities of secondary analysis. Janet Heaton's book focuses more on the hurdles to qualitative secondary analysis (the ethical and legal issues, as well as the operational challenges of analyzing interviews one did not conduct or witness) than on providing protocols. But of special interest to grounded theorists are the possibilities grounded theory might offer for secondary analysis. Heaton does not launch such an argument; however, in the book's preface, she notes that Barney Glaser (yes, the co-developer of grounded theory) provided some of the first discussion in the literature of the possibilities of secondary analysis. She quotes from a 1962 Social Problems article in which Glaser writes:

"To be sure, secondary analysis is not limited to quantitative data. Observation notes, unstructured interviews and documents can also be usefully analyzed. In fact, some field workers may be delighted to have their notes, long buried in their files, reanalyzed from another point of view. Man is a data-gathering animal." (Glaser, 1962: 74)
Grounded theorists would run into some of the same hurdles as other researchers viewing qualitative materials for which they could not go back to interviewees and seek elaboration, though grounded theory's limited concern with full coverage might lessen such hurdles. Heaton does cite some secondary analysis projects for which grounded theory was invoked as the method for re-use. However, the main issue addressed in the book is the limited number of secondary analyses in general. The "secondary analysis of qualitative data remains an enigma" (viii), she writes.

Heaton provides a literature review of secondary studies, though they are primarily in the health and social care literature. Importantly, calls for re-use of data have been explicit in these areas, and funding from the Economic and Social Research Council in the UK supported the initial literature review of the health studies. Heaton provides a typology to discuss secondary analyses thus far, but she acknowledges that "secondary analysis" is a vague term, and many studies that appear to be secondary analyses do not make it explicit. Secondary analyses, according to Heaton, include (p. 38):

Supra analysis: transcends the original topic for which the data were collected.
Supplementary analysis: expands on some aspect of the original study through more in-depth investigation.
Re-analysis: verifies or corroborates original premises.
Amplified analysis: combines data from two or more studies for comparison.
Assorted analysis: combines secondary data with primary research and/or naturalistic data.

Most of the secondary analyses Heaton examined involved researchers going back to their own data. She notes that, although some researchers espouse the idea of making data available to others for secondary analysis, many have not taken the next step to make such data accessible. Nonetheless, Heaton finds encouragement in the increase in archives of qualitative data, and she...