Issue 1, June 2013

Editorial

Astrid Gynnild, Editor

GT constantly challenges grounded theorists to expand their skills and competencies in areas where they know little. Many researchers find that theoretical coding is possibly the most difficult task in doing grounded theory. One of the many myths is that most, or all, grounded theories are basic processes, or that they should be. As documented in Theoretical Sensitivity (Glaser, 1978) and Theoretical Coding (Glaser, 2005), there are dozens of theoretical codes and coding families available for grounded theorists to pick and choose from, depending on best fit for their particular theory.

In this issue of the Grounded Theory Review, I am delighted to publish new research by Barney Glaser. “Staying Open: The Use of Theoretical Codes in GT” is soon to be published as chapter two in Dr. Glaser’s new book, No Preconceptions: The Grounded Theory Dictum. In this chapter, Dr. Glaser discusses the consequences of theoretical preconceptions and the importance of actively studying theoretical codes to expand one’s repertoire of TCs. His message is that by constantly comparing theoretical codes, also beyond one’s field, researchers grow their mastery of TCs, which helps them open up, let go of personal and professional preconceptions, and become more sensitive to the data.

Following Barney Glaser’s often-cited advice of using and exploring the constant comparative method beyond one’s field, Glen Gatin from Brandon University in Canada has generated a beginning formal theory of Keeping Your Distance. His starting point was the changing notions of distance prompted by ICT learning and social networking online. Dr. Gatin’s theory helps explain many apparent paradoxes related to the extended openness of our time. Strategies for regulating distance are manifest in interactions between individuals and in the interactions between individuals and institutions. When we are accessible to “the whole world” wherever we are via new technologies, strategies for keeping your distance seem to be particularly important for identity formation.

Colin Griffiths from Ireland has studied the verbal and non-verbal interactions of people with severe and complex disabilities. After collecting visual micro-data using video, Griffiths spent months analyzing the videos, frame by frame, according to the GT protocol. He points out that baseline data, the fourth layer of data in grounded theory, is defined as the best description a participant can offer. In his study, baseline data constituted micro and macro behaviors such as vocalization, facial expressions, and body gestures. Griffiths discusses the strengths and challenges of collecting data from raw footage following GT procedures. He concludes that visual micro-data are well suited for uncovering and explaining patterns of non-verbal behavior.

In the next article, Gary Evans from the United Kingdom provides a “Rationale for selection of classical grounded theory methodology” based on an examination of classic grounded theory, Straussian grounded theory, constructivist grounded theory, and feminist theory. Evans argues that the answer is in the data, but in order to find out which GT approach will be a good fit, one needs to understand GT philosophy and decide which approach best matches one’s philosophy of research. Insights into the differences in coding procedures, in particular, help identify personal preferences. Writes Evans, “Learning the different methodologies is a difficult journey as terminology often sounds similar to the novice researcher, but only by exploring the differences can the researcher rationalize their own choice.”

Daniel Berry, Canada, and four colleagues have written an interesting methodological essay which demonstrates the power of a classic GT to identify what is happening in a practical situation...

Staying Open: The Use of Theoretical Codes in GT

Barney G. Glaser, PhD, Hon. PhD

Theoretical codes (TC’s) are the abstract models that emerge during the sorting of mature memos into a potential substantive theory. They conceptualize the integration of substantive codes into hypotheses of a substantive theory. The researcher is challenged to stay open to their emergence and earned relevance rather than to their preconceived forcing, which is very strong. They bring in not only their framework but also their theoretical perspective, which can easily force the data beyond emergence. For example, using a basic social process TC requires at least two stages, and there may not be a process in the sorting emergence. It may all just be dimensional or conditional. In this chapter I discuss the skill of staying open to the emergent TC. As the reader knows, there are many TC’s, and each has its requirement for use and perspective. In chapter 3 of the book, Getting Out of the Data, I will discuss more fully all the TC effects that originate in preconceptions.

Introduction

The full power of GT comes with staying open to the emergence of codes that fit with relevance when generating a GT. This power emerges especially when sorting mature memos into theoretical codes for writing up. Substantive coding comes comparatively easily and is exciting, giving the researcher the exhilarating feeling of discovery. Putting the theory together with theoretical coding seems, for many, not so easy. It can have a beguiling mystique, with forcing implications for preconceptions. As Cutliffe says from his experience: “TC usage places the most demand on researcher’s creativity.” TC’s are frequently left out of otherwise quite good GT papers, monographs and dissertations, although they are always implicit, such as range, dimension, or process. The novice GT researcher finds them hard to assimilate into his/her theory, which has to be finished for external requirements. Here I will consider several sources of difficulty with preconceptive consequences in using TC’s. Staying open to the non-forced, non-preconceived discovery and use of TC’s is the focus of this chapter. For a more extensive discussion on the emergence of TC’s during sorting, see chapters 3, 4, and 5 of my book The Grounded Theory Perspective III: Theoretical Coding (Sociology Press, 2005). I hope to add new insights in using TC’s.

Readers who are challenged in staying on the substantively abstract level of conceptualization may find this chapter even more challenging. Keeping researchers on an abstract conceptual level is hard for those trained in immediate, accurate description, such as in medicine, nursing, business, management, or social work, many of whom are attracted to GT research. Practical considerations of work easily take over. Staying open to the emergent conceptualization will actually increase their power of description, they soon learn. Getting on the TC level of abstraction does so even more.

TC’s are abstract models that integrate categories and their properties into a theory. They emerge and put a theory together when sorting mature memos. They are easily forced. Thus, staying open to their emergence is not easy for novices. Their use comes with experiencing many research studies as part of the experiential growth of doing GT and learning earned relevance with theoretical sensitivity. Remember, they are always implicit in a substantive GT, the simplest being dimensional, if one doesn’t emerge for the novice. TC’s are not to be preconceived or forced by a discipline, a supervisor, or a pet code. Pet codes with grab happen easily, such as basic social process or networking. They spread like wildfire like...

Keeping Your Distance

Glen Gatin, Brandon University

Abstract

This analysis began with inquiries into the substantive area of distance education using the classic grounded theory method. Analysis revealed a pattern of problem-solving behavior, from which the theory Keeping Your Distance emerged. The theory is an integrated set of concepts referring to the conscious and unconscious strategies that people use to regulate distance, physical and representative, in their everyday lives. Strategies are used to control physical, emotional, and psychological realities and to conserve personal energy in interactions with individuals and/or institutions. For all social interactions, people use a personalized algorithm of engagement that mitigates conditions and consequences and preserves optimal distance. Keeping Your Distance provides a theoretical starting point for considerations of the changing notions of distance. In part, these changes have been brought about by developments in the fields of Information and Communication Technology (ICT) and online social networking.

Introduction

This study began in the substantive area of distance education by analyzing the responses of people who used computer-mediated distance education as they solved problems and resolved concerns. Data were collected in face-to-face interviews as well as from institutional documents, collegial comments, casual conversation, and observational data. Glaser’s (1998) dictum that all is data was interpreted to mean not only that it is possible to use a variety of data sources, but that as many data sources as possible should be examined. Initial participants for this analysis were chosen from related groups: students, support staff, administration, and faculty involved in the distance education enterprise. After the analysis of the first three interviews, a pattern began to form; after the sixth interview, the core variable emerged. Interviews continued until the main properties were established and saturated. Early theoretical sampling looked beyond the initial groups from the distance education arena to test the generalizability of the core variable. Extant theory provided important data, particularly Moore’s (1997) Theory of Transactional Distance.

Data were coded and condensed into written memos. Memos were sorted according to analytical rules (Glaser, 1978). The most critical rule for sorting was the relationship of the memo to the core variable; if a memo was not related to the core variable or a property of the core variable, it was left out of the analysis. The analyst established rules for the determination of the core variable, the one that explains the most variation, recognizing that “the goal is not to cover all possible theoretical possibilities nor explain all variation” (p. 122). Other rules relate to the integrative fit of ideas and are “based on the assumption that the social organization of the world is integrated and the job of the grounded theorist is to discover it” (p. 123). The memos become the outline, and then the writer must merely connect and integrate the ideas into a formal theory. The theory of Keeping Your Distance emerged through at least three distinct levels of abstraction: concrete/descriptive, metaphoric/symbolic, and abstract/conceptual.

What follows is the elucidation of that theory using the “conditions and consequences” model (Glaser, 1978, p. 74). These are not findings but an integrated set of hypotheses. Illustrations and examples are drawn from data collected in this research and are provided to establish imagery and understanding. They are intended to make the theory clear and should not be considered proofs or descriptions of the process used to derive the theory. References to theoretical work by others are not necessarily intended to seek verification of this theory or to...

Using Grounded Theory to Analyze Qualitative Observational Data that is Obtained by Video Recording...

Colin Griffiths, Trinity College Dublin

Abstract

This paper presents a method for the collection and analysis of qualitative data that is derived by observation and that may be used to generate a grounded theory. Video recordings were made of the verbal and non-verbal interactions of people with severe and complex disabilities and the staff who work with them. Three dyads, each composed of a student/teacher or carer and a person with a severe or profound intellectual disability, were observed in a variety of different activities that took place in a school. Two of these recordings yielded 25 minutes of video, which was transcribed into narrative format. The nature of the qualitative micro data that was captured is described, and the fit between such data and classic grounded theory is discussed. The strengths and weaknesses of the use of video as a tool to collect data that is amenable to analysis using grounded theory are considered. The paper concludes by suggesting that using classic grounded theory to analyze qualitative data that is collected using video offers a method that has the potential to uncover and explain patterns of non-verbal interactions that were not previously evident.

Introduction

Understanding how people communicate is difficult both for those who have the experience of an intellectual disability and for those who attempt to communicate with them (Caldwell 2007). This difficulty is magnified for people with profound intellectual and multiple disability (PIMD), who are confronted with many challenges in living their daily lives. Such challenges centre on how to comprehend the world that they live in. Functionally, however, their primary practical concern is how to communicate with a complex and at times forbidding world. The research study from which this paper is derived aimed to develop a theory to explain how people with PIMD confront that primary difficulty and communicate with others. The aim of this paper is to describe the method that was used in the study. Video was the tool used to collect the data, and this approach to data collection, combined with a meticulous analysis of the videotapes, revealed the micro-behaviours that constitute the basic building blocks of dyadic communication. The progression of the data analysis process is described, from descriptions of these micro-behaviours towards the emergence of the concepts of the theory. A discussion of the arguments for and against video-taping in the context of the development of grounded theory is presented, and finally the strengths and weaknesses of the method are considered.

Background to the Study

People with profound intellectual and multiple disability (PIMD) have an intelligence quotient below 25 points (American Psychiatric Association 2000). They require virtually total care in terms of assistance with activities of daily living (Cascella, 2005); they often have accompanying secondary disabilities such as epilepsy, physical disability or mental health difficulties (Nakken and Vlaskampf, 2007); and they do not use speech, but generally interact using non-verbal communications (Hogg et al., 2001). As well as being non-verbal, people with this severe degree of disability have only a restricted capacity to communicate in any mode (Grove et al., 1999). The effect of the multiple difficulties that people with profound intellectual and multiple disability are confronted with is that they have to deal with a world in which they receive restricted sensory inputs, which they must interpret through the prism of a limited cognitive ability. Such a situation leads to communication difficulties both for the person with PIMD and for the people who do not have a disability...

A Novice Researcher’s First Walk Through the Maze of Grounded Theory: Rationalization for Classical Grounded Theory...

Gary L. Evans, Liverpool John Moores University

Abstract

For a researcher new to grounded theory, the onus to understand the methodology and its various versions can be daunting. Learning and understanding the differences between grounded theory methodologies is in large part a process of learning one’s own research philosophy, and this philosophy is often the deciding factor in methodology selection. Learning the different methodologies is a difficult journey as terminology often sounds similar to the novice researcher, but only by exploring the differences can the researcher rationalize their own choice. This paper offers the new researcher a view into the confusing world of grounded theory, where common terms are used but the secret lies in understanding the philosophy of the researcher and the topic of discovery. Glaser was correct: the answer is in the data, but you need to understand the philosophy of the method and whether it matches your philosophy of research.

Theoretical Framework

Grounded theory, developed by Barney Glaser and Anselm Strauss in the early 1960s, is a methodology for inductively generating theory (Patton, 1990). Glaser’s definition of grounded theory is “a general methodology of analysis linked with data collection that uses a systematically applied set of methods to generate an inductive theory about a substantive area” (Glaser, 1992, p. 16). While this definition is accepted by researchers, the approach and rigor in the data collection, handling, and analysis created differences between Glaser and Strauss. Strauss developed a more linear approach to the research methodology (Strauss & Corbin, 1990). Grounded theory is not new to business research, and Mintzberg emphasized the importance of grounded research for qualitative inquiry within organizational settings: “measuring in real organizational terms means first of all getting out, into real organizations. Questionnaires often won’t do. Nor will laboratory simulations… The qualitative research designs, on the other hand, permit the researcher to get close to the data, to know well all the individuals involved and observe and record what they do and say” (Mintzberg, 1979, p. 586).

As grounded theory became more popular among researchers, the substantial divide between the creators of the methodology became apparent. The two original authors reached a critical juncture on the aims, principles, and procedures associated with the implementation of the method. Two paths emerged, and these are marked by Strauss and Corbin’s 1990 publication, Basics of Qualitative Research: Grounded Theory Procedures and Techniques, to which Glaser responded harshly with accusations of distortion of the central objectives of parsimony and theoretical emergence (Glaser, 1992). Glaser’s views were supported by other grounded theory researchers who agreed that the late Strauss’s 1990 publication was an erosion of the original 1967 methodology (Stern, 1994). During the years since the opening of the debate on grounded theory, a number of researchers have firmly supported the classic grounded theory (CGT) methodology (Bowen, 2005; Clark & Lang, 2002; Davis, 1996; Efinger, Maldonado & McArdie, 2004; Holton, 2007; Schreiber, 2001).

Various scholars have put forward a range of strategies and guidelines for the coding process (Charmaz, 2006; Goulding, 2005; Partington, 2002; Patton, 2002; Strauss & Corbin, 1990, 1998). The process and methods for coding have created the highest level of debate for users of grounded theory. Some researchers have combined quantitative and qualitative forms of data collection when using grounded theory, and while nothing prohibits such a combination, the purpose needs to be clear; otherwise, a muddling of the methodology will occur (Baker, West & Stern, 1992; Wells, 1995). While the coding process is an important part of grounded theory,...

Requirements Specifications and Recovered Architectures as Grounded Theories...

Daniel M. Berry, University of Waterloo; Michael W. Godfrey, University of Waterloo; Ric Holt, University of Waterloo; Cory J. Kapser, Mobile Data Technologies; Isabel Ramos, University of Minho

Abstract

This paper describes the classic grounded theory (GT) process as a method to discover GTs to be subjected to later empirical validation. The paper shows that a well-conducted instance of requirements engineering or of architecture recovery resembles an instance of the GT process for the purpose of discovering the requirements specification or recovered architecture artifact that the requirements engineering or architecture recovery produces. Therefore, this artifact resembles a GT.

Introduction

The purpose of this paper is to show that well-conducted instances of two different activities in Software Engineering, requirements engineering (RE) and architecture recovery (AR), resemble grounded theory (GT) processes. Each verifies the power of the classic GT process, as discovered by Glaser and Strauss (1967), to identify what is happening in a practical situation, producing a working GT of the requirements or architecture of a system. The aim is to point out some striking similarities between the classic GT process and software engineers’ approaches to requirements engineering and architecture recovery, thus demonstrating how requirements engineering and architecture recovery practitioners might be producing working GTs. The purpose of requirements engineering is to use whatever data are available, from documents to spoken words, to construct a requirements specification for a software system. The purpose of architecture recovery is to use whatever data are available, from existing code and documentation to spoken words, to construct a recovered architecture for an existing software system. This paper is not trying to invent a new form of the GT process, but is simply showing, by appeal to a description of the classic GT process, that what software engineers are doing in either of these two specific cases amounts to a GT process and that the artifact produced, a requirements specification or a recovered architecture, resembles a GT.

Section 2 describes the classic GT process and its resulting working GTs. Section 3 argues that two activities in Software Engineering, Requirements Engineering and Architecture Recovery, are GT processes. Section 4 describes related work, and Section 5 concludes the paper. In what follows, an arbitrary GT process practitioner is without loss of generality assigned the male gender, and an arbitrary requirements or architectural analyst is without loss of generality assigned the female gender. Note also that architecture recovery is a major and essential component of reverse engineering, whose common acronym, “RE,” is identical to the acronym used for “requirements engineering.” However, reverse engineering includes steps that are not considered here and is thus regarded as outside the scope of this paper.

2 Grounded Theory

The classic GT process is a method for developing grounded theories (Glaser & Strauss, 1967; Glaser, 1992), each of which is a theory about a named pattern of human behavior. In the 1960s, discomfort was growing with the application of traditional statistical methods to understanding and explaining social phenomena. The GT process was developed in response to this discomfort, and its purpose is to provide a means to gather detailed empirical evidence for theory that could later be subjected to traditional statistical empirical validation using controlled experiments or other means. The GT process is an adaptive research process for finding emergent theory that could not be anticipated in advance of the research. The researcher adapts the research process based on what he has learned from the data he has seen so far...
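For readers outside software engineering, a minimal sketch may make the idea of architecture recovery more concrete. The sketch below is not from Berry et al.'s paper; it simply illustrates one narrow slice of the activity, inferring a module dependency structure from existing source code, under the assumption that the system under study is written in Python and that import statements are among the available data. The directory name legacy_system and the function recover_dependencies are hypothetical names introduced here for illustration only.

    # Illustrative sketch only: a toy "architecture recovery" step that treats
    # Python import statements as the available data and recovers a module
    # dependency graph from an existing code base. Paths and names are
    # hypothetical; a real recovery effort would also draw on documentation,
    # build scripts, and conversations with the system's developers.
    import ast
    from pathlib import Path
    from collections import defaultdict

    def recover_dependencies(src_root: str) -> dict:
        """Map each module (by file stem) to the set of modules it imports."""
        deps = defaultdict(set)
        for path in Path(src_root).rglob("*.py"):
            module = path.stem
            tree = ast.parse(path.read_text(encoding="utf-8"))
            for node in ast.walk(tree):
                if isinstance(node, ast.Import):
                    # "import x.y" depends on top-level package "x"
                    deps[module].update(alias.name.split(".")[0] for alias in node.names)
                elif isinstance(node, ast.ImportFrom) and node.module:
                    # "from x.y import z" also depends on "x"
                    deps[module].add(node.module.split(".")[0])
        return deps

    if __name__ == "__main__":
        # Hypothetical source tree; replace with the system under study.
        for module, imports in sorted(recover_dependencies("./legacy_system").items()):
            print(f"{module} -> {', '.join(sorted(imports)) or '(no imports)'}")

In the spirit of the paper's argument, such a recovered graph would be only a first set of codes: the analyst would compare it against further data, revise it, and retain only the structure that earns its relevance, much as the GT process described above revises emerging theory against incoming data.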