Volume 04

Managing Collaborative Synergy in the Crane Industry

By Keith Ng Y. N. (Ng, K.), Ph.D.

Abstract

This study explores the key factors vital to Principal-Distributor Collaboration (PDC) in the context of the crane industry in Singapore, Malaysia and Indonesia. It explains the social processes that Principals use to address differing interests throughout the course of the PDC. Applying Glaser's (1978, 1992, 1998, 2001) emergent approach to grounded theory, 150 interviews were conducted with 50 participants from these countries. The main professional concern of participants throughout the course of the PDC was the need to achieve corporate objectives, within a certain time frame, whilst also having to rely on the cooperation of key managers from the partnering firm. Key decision makers continuously resolve their professional concern through the basic social process of Managing Collaborative Synergy (MCS). The theory of MCS suggests that the way in which Principal firms manage the PDC is by giving attention to the three interdependent dimensions of Competitiveness Initiating, Confidence Building and Conformance Setting.

Background and Motivation to the Research

This study took place during the Asian Financial Crisis, at a time when the crane industry was undergoing change. Principal firms are manufacturers of cranes or crane components. Distributors are those who resell, construct and service the cranes or crane components of the Principals they represent. At the time this study was conducted, Principals were gaining an appreciation of the rewards associated with successful collaboration with Distributor firms in the pursuit of their corporate objectives. Similarly, Distributors were more alert to the benefits, in a limited market, of working in conjunction with their foreign counterparts to share risks and meet increasing customer demands. This environment of increasing cooperation between Principal and Distributor firms provided the overall context for this research study.

It is well recognised that effective collaboration with Distributors plays a prominent role in the business-to-business arena (McQuiston, 2001; Mudambi & Aggarwal, 2003), and collaborating with Distributors has been gaining popularity with Principal firms for two main reasons. First, it allows Principal firms to focus on larger accounts (Ernst & Young, 1990; Emshweiler, 1991). Second, Distributors with a home territorial advantage often have better knowledge of their local markets and are able to penetrate these markets with greater ease and success than Principal firms can (Douglas & Craig, 1989; Cavusgil & Zou, 1994). Given the prospect of mutual benefits, working with Distributors offers the possibility of reaching every segment of the business field. Therefore, an astute Principal firm will choose to work closely with its Distributors in order to stay competitive and ensure long-term corporate success (Noordewier, John & Nevin, 1990). Although there are no definitive data on the business volume for which Distributors are directly responsible, industry estimates in the United States indicate that there are 400,000 Distributors who account for as much as 50% of the upper-channel sales in business-to-business markets (Dishman, 1996). In the crane industry, 80% of crane firms in Singapore, Malaysia and Indonesia are Distributors who represent Principal firms that manufacture hoisting equipment.
Given the large number of firms using Distributors, successfully managing and improving working relationships with Distributors is of paramount significance to any Principal firm (Merkel, 2001; Ng, 2002). However, despite the large number of Principal firms employing Distributors, little appears to be understood about how Principals have gone about developing and maintaining Principal-Distributor relationships. While there are a number of models of building relationships in the business-to-business arena (such as Anderson & Narus, 1999,...

The Grounded Theory Bookshelf

Dr. Alvita Nathaniel, DSN, APRN, BC, West Virginia University

The Bookshelf provides critical reviews and perspectives on books on theory and methodology of interest to grounded theory. In this issue, Dr. Alvita Nathaniel offers a review of Barney Glaser's new book, The Grounded Theory Perspective III: Theoretical Coding, Barney G. Glaser (Sociology Press, 2005).

Not intended for a beginner, this book further defines, describes, and explicates the classic grounded theory (GT) method. Perspective III lays out various facets of theoretical coding as Glaser meticulously distinguishes classic GT from other, subsequent methods. Developed many years after Glaser's classic GT, these methods, particularly as described by Strauss and Corbin, adopt the grounded theory name and engender ongoing confusion about the very premises of grounded theory. Glaser distinguishes between classic GT and the adscititious methods in his writings, referring to remodeled grounded theory and its offshoots as Qualitative Data Analysis (QDA) models.

The GT/QDA debate is reminiscent of the schism that developed between the philosopher Charles Sanders Peirce and his benefactor, William James, at the beginning of the last century. Peirce was a brilliant philosopher and scientist. America's most prolific philosopher, Peirce originated the doctrine of pragmatism. Because Peirce's writings were at a very high level of abstraction and difficult to understand, James attempted to make them accessible to the popular academic community through his own, more concrete writings. However, James never got it quite right. Unhappy with James's failure to clarify his ideas about pragmatism, and desiring to distinguish his original ideas from those proffered by the more popular James, Peirce eventually changed the name of his own theory to pragmaticism. Unfortunately, the new name never caught on, and the theory of pragmatism continues to be popularly attributed to William James.

Like Peirce and his theory of pragmatism, Glaser remains faithful to the original premises of classic GT. He continues the battle to distinguish classic GT from QDA, viewing QDA as a rigid method with a low level of abstraction and a tendency toward preconception. He outlines in Perspective III many ways in which QDA violates the foundational ideas of GT. In particular, Glaser emphasizes that an understanding of "what is going on" in an area of concern requires openness on the part of the analyst/researcher to the natural emergence of the theoretical code. The theoretical code emerges late in the GT process as the analyst painstakingly hand sorts conceptual memos. This process requires several elements, such as the analyst's proper use of conceptual memos, openness to emergence, perspicacity, and patience. The process is hindered or derailed entirely if the theoretical code is forced through the use of a preconceived theoretical framework, a conditional matrix, discipline-specific codes, or "pet" codes.

Glaser effectively clarifies his points through critique of various writers and grounded theorists. He sorts through, point by point, the writings of grounded theory "experts" from a number of disciplines and comments on their level of understanding of the classic GT method. This discussion will be particularly helpful to Ph.D. students who are trying to learn both the fundamentals and the finer points of the classic grounded theory method. It will also be helpful as background for the Ph.D. student to use in discussions with dissertation/thesis examiners.
Many quotes from what Glaser deems to be good examples of GT are also helpful for clarification purposes. Glaser comments on elements of theories developed within a number of disciplines around the world. The words of the original writers...

The Impact of Symbolic Interaction on Grounded Theory

By Barney G. Glaser, Ph.D., Hon. Ph.D.
(Chapter 10, The Grounded Theory Perspective III: Theoretical Coding, Sociology Press, 2005)

As I stated in the introduction to chapter 9, GT is a general inductive method possessed by no discipline or theoretical perspective or data type. Yet the takeover of GT by Symbolic Interaction (SI), and all the departments and institutes that SI informs and resides in, is massive and thereby replete with the remodeling of GT. The literature on qualitative methodology is massive and replete with the assertion that SI is the foundational theoretical perspective of GT. GT is reported as an SI method. That GT is a general inductive method is lost. Sure, GT can use SI type data and its perspective, but as a general method it can use any other type of data, even other types of qualitative data, as well as quantitative, visual, document and journalistic data, in any combination, and any other theoretical perspective, such as systems theory, social structural theory, structural functional theory, social organization theory, cultural theory, etc. Thus, the takeover of GT as an SI perspective methodology is just discipline-perspective dominance, as discussed above, and nothing more. It, of course, dominates with a set of TCs (process, strategies, conditions, context, etc.) that I have considered at length in the chapters above.

Researchers, especially in nursing, just want a theoretical perspective. SI institutionalizes GT as its own! Researchers like it because it gives them an ontology (what is data) and an epistemology (a philosophy of research). The takeover becomes structurally induced by researchers, especially in nursing, since they want a theoretical perspective in advance of their research. It gives them a feeling of power, while they do not realize that the SI takeover reduces the general method power of GT. The writers on GT as an SI method use as their legitimating source Strauss's (my co-author of the discovery of GT) training in SI. They ignore the roots of GT in my training in concept-indicator index construction in quantitative survey research. In the following pages, I will discuss these issues at length. Much has already been said in this book about SI and its set of TCs. This chapter just focuses and adds some ideas. The goal of this chapter, as in all the above chapters, is to free GT from this dominance so GT analysts will have the fullest range of TCs, from any and all perspectives, at their fingertips for emergence. No one discipline, with its theoretical perspective, defines and possesses GT, as I discussed at length in chapter 9. Obviously many GTs use an SI perspective (as well as others), whether bounded or not by it. Earned, emergent relevance is the TC of best choice.

Sources of SI Dominance

Obviously, the impact, dominance and possession of SI on GT came from Anselm Strauss's training in SI at the University of Chicago. Many authors assert this one source of SI. Carolyn Weiner (op. cit., page 6) says: "GT derived from the tradition of SI, this sociological stance is based on the perspective of George Herbert Mead as developed by the Chicago school of sociology and asserts that people select and interpret meanings from their environment, formed in many definitions of the situation. The individual acquires a commonality of perspective with others as they learn and develop together the symbols by which aspects of the world are identified. In other words there is a social construction of reality." Marjorie MacDonald and Rita Schreiber (op...

Beyond the Physical Realm: A proposed theory regarding a consumer’s place experience...

By Mark Rosenbaum, Ph.D.

Abstract

Marketers view place as a marketing mix tool that denotes activities associated with the distribution of products and services. Thus, the discipline believes that places are alienated from consumers' lives and experiences. This article looks at the place concept anew and offers an original theory of consumers' experience in place.

Introduction

The concept of place is well engrained in the marketing discipline as a basic marketing mix tool that refers to the distributional and organizational activities associated with making products and services available to targeted consumers (Kotler 2000, p. 87). As a result of this conceptualization, it is not surprising that marketers perceive that places are isolated from consumers' personal lives and experiences. Indeed, pundits often chastise contemporary retailers for creating an urban marketplace that represents a rendition of human alienation and that is replete with impersonal, cold relationships between buyers and sellers. This perception of place, as a mere subdivision of physical space (Sherry 2000), is especially prevalent among marketing researchers who adhere to the regional school of thought (Sheth and Garrett 1986; Sheth, Gardner, and Garrett 1988). Researchers in this school consider marketing as a form of economic activity that bridges the geographic gap, or spatial gaps, between buyers and sellers (see Grether 1983). Consequently, these researchers are guided by a philosophy of consumption which espouses that general laws exist for predicting spatial regularities between consumers' residential location and their selected shopping areas. Although regional researchers have been developing models since the 1930s, no encompassing marketing theory has yet emerged from their endeavors (Sheth, Gardner, and Garrett 1988).

Marketing's conceptualization of place has been unwavering since its inception in the early 1960s (McCarthy 1960); however, as the discipline entered the new millennium, Sherry (2000) suggested that all is not sanguine with it. Sherry's (1998, 2000) point of contention with the place concept is that marketers deem consumption settings, or servicescapes (Bitner 1992; Sherry 1998), as being comprised of physical elements (Turley and Milliman 2000). Thus, he believes that marketers fail to consider that places may also be comprised of intangible, symbolic realms, which may be integral to consumers' personal worlds and experiences. Rather than consider that consumers view places as points of exchange where they satisfy essential consumption needs, Sherry posits that places have different dimensions of meaning for consumers, based upon their personal experiences in them. In addition, he speculates that the impact of these meanings on consumer behavior ranges on a continuum from the subtle to the profound. However, like Trickster, Sherry (1998, 2000) stops conjecturing mid-stream, leaving future researchers with the challenge of generating a theory of consumers' being-in-place.

The goal of this article is to heed Sherry's (2000) challenge by conceiving a theory that (1) illustrates why and how consumers experience places in their lives, (2) uncovers major antecedents that impact consumers' place experience, (3) links place experience to patronizing behavior, and (4) is parsimonious, relevant, and modifiable. The theory serves as a milestone for marketing as it addresses a chasm in the marketing mix.
Namely, that the marketing mix, along with its consideration of place as distribution, is not entirely complete, is somewhat inconsiderate of consumers' needs, and focuses on investigating unidimensional relationships between stimuli and responses rather than on the much richer concept of exchange relationships (van Waterschoot 2000; van Waterschoot and Van den Bulte 1992). To date, the majority of place studies in marketing have attempted to discern stimulus-response regularities between specific environmental conditions (e.g., music, crowding,...

Visualising Deteriorating Conditions

By Tom Andrews, RN, B.Sc. (Hons), M.Sc., Ph.D. & Heather Waterman, RN, B.Sc. (Hons), Ph.D.

Abstract

The research aims were to investigate the difficulties ward staff experienced in detecting deterioration and how these were resolved. The emphasis within the literature tends to be on identifying premonitory signs that may be useful in predicting deterioration. Changes in respiratory rate are the most consistent of these (Fieselmann et al. 1993; Sax and Charlson 1987; Schein et al. 1990; Smith and Wood 1998) but, in common with other signs, they lack sensitivity and specificity. The sample consisted of 44 nurses, doctors (Interns) and health care support workers from a general medical and surgical ward. Data were collected by means of non-participant observations and interviews, using grounded theory as originated by Glaser and Strauss (1967) and Glaser (1978). As data were collected, the constant comparative method and theoretical sensitivity were used as outlined in grounded theory. A core category of "visualising deteriorating conditions" emerged, together with its sub-core categories of "intuitive knowing", "baselining" and "grabbing attention". The main concern in visualising deteriorating conditions is to ensure that patients suspected of deterioration are successfully referred to medical staff. The aim is to convince those who can treat or prevent further deterioration to intervene. Through intuitive knowing, nurses pick up that patients have changed in a way that requires a medical assessment. To make the referral more credible, nurses attempt to contextualise any changes in patients by baselining (establishing baselines). Finally, with the backup of colleagues, nurses refer patients by providing as much persuasive information as possible in a way that grabs attention. The whole process is facilitated by knowledge and experience, together with mutual trust and respect.

Background

Mortality from shock of whatever aetiology remains depressingly high, and avoidable components contribute to physiological deterioration (McQuillan et al. 1998), often resulting in cardiorespiratory arrest (Rosenberg et al. 1993). Of all patients undergoing resuscitation, 75% will not survive more than a few days (George et al. 1989), with a survival rate to hospital discharge of 10% to 15% (Peterson et al. 1991; Schultz et al. 1996). Of the 9% of patients discharged from hospital having survived cardiopulmonary resuscitation, 4.3% were in a vegetative state, signifying severe neurological damage (Franklin and Mathew 1994). In an effort to detect shock early, a number of parameters have been measured. Blood pressure, heart rate, respiratory rate, temperature, conscious level, shock index, central venous pressure, blood gases, blood lactate, pulmonary artery blood pressure and cardiac index all correlate poorly with physiological deterioration and severity of shock (Rady et al. 1994). Early detection of physiological deterioration remains elusive. A further difficulty is that there are over two hundred normal physiological reflexes that affect the pulse and respiratory rate (Shoemaker et al. 1988). The current emphasis in the literature is on the early detection of physiological deterioration, either through premonitory signs such as changes in respiratory rate (Fieselmann et al. 1993; Franklin and Mathew 1994; Goldhill et al. 1999; Sax and Charlson 1987; Schein et al. 1990) or, more recently, an early warning score (Department of Health 2000; McArthur-Rouse 2001).
The latter attaches a score to changes in such variables as blood pressure, pulse rate, respiratory rate and temperature as a means of detecting early signs of physiological deterioration. The greater the score, the greater the risk of physiological deterioration. To date, these variables lack sensitivity and specificity. The current study is an attempt to redress the continued emphasis on physiological variables by exploring...
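As a brief aside for readers unfamiliar with aggregate scoring charts, the sketch below illustrates the general mechanism behind an early warning score of the kind cited above: each vital sign is placed in a band, each band carries points, and the points are summed so that a greater total suggests greater risk. The bands, point values and the band_score/early_warning_score helpers are hypothetical placeholders for illustration only; they are not the Department of Health (2000) chart or any published clinical tool.

# Illustrative sketch of an aggregate early warning score (Python).
# All thresholds and weights are hypothetical placeholders, NOT a published
# clinical scoring system; they only show the "higher score = higher risk" idea.

def band_score(value, bands):
    """Return the points for the first (low, high, points) band containing value."""
    for low, high, points in bands:
        if low <= value <= high:
            return points
    return 3  # values outside all listed bands score highest (hypothetical rule)

# (low, high, points) bands per variable; purely illustrative numbers.
RESP_RATE_BANDS = [(12, 20, 0), (9, 11, 1), (21, 24, 2)]                    # breaths/min
PULSE_BANDS     = [(51, 90, 0), (41, 50, 1), (91, 110, 1), (111, 130, 2)]   # beats/min
SYS_BP_BANDS    = [(111, 219, 0), (101, 110, 1), (91, 100, 2)]              # mmHg
TEMP_BANDS      = [(36.1, 38.0, 0), (35.1, 36.0, 1), (38.1, 39.0, 1)]       # deg C

def early_warning_score(resp_rate, pulse, systolic_bp, temperature):
    """Sum the per-variable points; a greater total suggests greater risk."""
    return (band_score(resp_rate, RESP_RATE_BANDS)
            + band_score(pulse, PULSE_BANDS)
            + band_score(systolic_bp, SYS_BP_BANDS)
            + band_score(temperature, TEMP_BANDS))

if __name__ == "__main__":
    # A raised respiratory rate, fast pulse, low blood pressure and mild fever
    # accumulate points across variables, flagging possible deterioration.
    print(early_warning_score(resp_rate=26, pulse=118, systolic_bp=95, temperature=38.4))

Under these illustrative bands the example patient accumulates 8 points (3 + 2 + 2 + 1), which is the kind of aggregate trigger that such charts use to prompt escalation; real scoring systems should be consulted directly for their actual thresholds.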

Grounded Theory and Heterodox Economics

By Frederic S. Lee, Ph.D.

Abstract

The dominant theory in the discipline of economics, known as neoclassical economics, is being challenged by an upstart, known as heterodox economics. The challengers face many obstacles, the most significant of which is the actual creation of an alternative economic theory. However, heterodox economists have not settled on what the methodology of theory creation should be. The aim of this paper is to advocate the method of grounded theory as the best set of guidelines for theory creation. In addition, I shall argue that the grounded theory method results in the creation of heterodox economic theories that are historical in structure, content and explanation.

Grounded Theory and Heterodox Economics

The dominant theory in the discipline of economics, known as neoclassical economics, is being challenged by an upstart, known as heterodox economics. Heterodox economics can be understood in two ways. The first is as a collective term for many different approaches to economic analysis, such as radical and Marxian economics, Post Keynesian economics, institutional economics, feminist economics, and social economics. Each of these approaches rejects various methodological and theoretical aspects of mainstream economics, including supply and demand curves, equilibrium, marginal products, the deductivist approach to theory creation, methodological individualism and the optimality of markets. Because the different approaches utilize somewhat different theoretical arguments and methods of theory creation, there has been little progress over the last forty years towards developing an encompassing theoretical alternative to mainstream theory. But in recent years this fragmentation among the heterodox approaches has declined as heterodox economists have taken positive steps towards developing a coherent synthesis. This activity has generated the second meaning of heterodox economics: that of referring to the development of a coherent theory that is an alternative to and replacement for mainstream theory.

This alternative theory is based on the view that the discipline of economics should be concerned with explaining the process that provides the flow of goods and services required by society to meet the needs of those who participate in its activities. Heterodox economists believe that any explanation or theory of the social provisioning process must be grounded in the real world of actual historical events, must incorporate radical uncertainty and social individuals, and must tell a causal analytical story. Consequently, they reject the method of theory creation and development utilized by mainstream economists, which is based on positivism, empirical realism, and deductivism. Numerous suggestions for an alternative method of theory creation have been raised by heterodox economists, but none has been widely accepted; and without a widely accepted method, progress towards developing an alternative heterodox theory will be slow indeed. The aim of this paper is to overcome this roadblock by advocating the method of grounded theory as the best set of guidelines for the creation of heterodox economic theory. In addition, I shall argue that the grounded theory method results in the creation of heterodox economic theories that are historical in structure, content and explanation. Thus, the first section of this paper will delineate the method of grounded theory.
This is followed, in the second section, by a discussion of three methodological issues–the nature of data, the role of case studies, and mathematics and models–as they relate to the grounded theory method. The final section concludes the paper with a brief discussion of the historical nature of grounded economic theories. The Method of Grounded Theory To develop a theory that analytically explains causally related, historically contingent economic events, the critical realist...