Grounded theory
Grounded theory (GT) is a systematic methodology in the social sciences involving the generation of theory from data.[1] It is mainly used in qualitative research, but is also applicable to quantitative data.[2] Grounded theory is a research method that operates almost in reverse from traditional research and at first sight may appear to contradict the scientific method. Rather than beginning with a hypothesis, the first step is data collection, through a variety of methods. From the data collected, the key points are marked with a series of codes, which are extracted from the text. The codes are grouped into similar concepts in order to make them more workable. From these concepts, categories are formed, which are the basis for the creation of a theory, or a reverse-engineered hypothesis. This contradicts the traditional model of research, in which the researcher chooses a theoretical framework and only then applies this model to the phenomenon to be studied.[3]
Development
Grounded theory was developed by two sociologists, Barney Glaser and Anselm Strauss. Their collaboration in research on dying hospital patients led them to write the book Awareness of Dying. In this research they developed the constant comparative method, later known as Grounded Theory.[4]
accuracy while the Glaserian method emphasizes conceptualization abstract of time, place and people. A grounded theory concept should be easy to use outside of the substantive area where it was generated.
GT nomenclature
A concept is the overall element and includes the categories which are conceptual elements standing by themselves, and properties of categories, which are conceptual aspects of
categories (Glaser & Strauss, 1967). The core variable explains most of the participants' main concern with as much variation as possible. It has the most powerful properties for picturing what is going on, but uses as few of them as needed to do so. A popular type of core variable can be theoretically modeled as a basic social process that accounts for most of the variation in change over time, context, and behavior in the studied area. "GT is multivariate. It happens sequentially, subsequently, simultaneously, serendipitously, and scheduled" (Glaser, 1998).

"All is data" is a fundamental property of GT, meaning that everything that gets in the researcher's way when studying a certain area is data. Not only interviews or observations but anything that helps the researcher generate concepts for the emerging theory is data. Field notes can come from informal interviews, lectures, seminars, expert group meetings, newspaper articles, Internet mailing lists, even television shows, conversations with friends, and so on. It is even possible, and sometimes a good idea, for a researcher with much knowledge of the studied area to interview herself, treating that interview like any other data, coding and comparing it to other data and generating concepts from it. This may sound silly, since you do not have to interview yourself to know what you know, but you do not know it on the conceptual level, and GT deals with conceptual-level data.

Open coding or substantive coding is conceptualizing on the first level of abstraction. Written data from field notes or transcripts are conceptualized line by line. At the beginning of a study everything is coded in order to find out about the problem and how it is being resolved. The coding is often done in the margin of the field notes. This phase is often tedious, since you are conceptualizing all incidents in the data, which yields many concepts.
These are compared as you code more data, merged into new concepts, and eventually renamed and modified. The GT researcher goes back and forth while comparing data, constantly modifying and sharpening the growing theory, at the same time following the build-up schedule of GT's different steps.

Strauss and Corbin (1990, 1998) also proposed axial coding, defined in 1990 as "a set of procedures whereby data are put back together in new ways after open coding, by making connections between categories." They proposed a "coding paradigm" (also discussed, among others, by Kelle, 2005) that involved "conditions, context, action/interactional strategies and consequences" (Strauss & Corbin, 1990, p. 96).

Selective coding is done after having found the core variable, or what is thought to be the core: the tentative core. The core explains the behavior of the participants in resolving their main concern. The tentative core is never wrong; it just fits the data more or less well. After you have chosen your core variable, you selectively code data with the core guiding your coding, not bothering about concepts of little importance to the core and its subcores. You also now selectively sample new data with the core in mind, which is called theoretical sampling, a deductive part of GT. Selective coding delimits the study, which makes it move fast. This is indeed encouraged while doing GT (Glaser, 1998), since GT is not concerned with data accuracy, as descriptive research is, but with generating concepts that are abstract of time, place and people. Selective coding can be done by going over old field notes or memos that were already coded at an earlier stage, or by coding newly gathered data.

Theoretical codes integrate the theory by weaving the fractured concepts into hypotheses that work together in a theory explaining the main concern of the participants. Theoretical coding means that the researcher applies a theoretical model to the data. It is important that this model is not forced beforehand but has emerged during the comparative process of GT. So the theoretical codes, just like the substantive codes, should emerge from the process of constantly comparing the data in field notes and memos.
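The open-coding and constant-comparison steps described above can be sketched in code. This is a minimal illustration with invented incidents and substantive codes (all names here are hypothetical, not from any actual GT study): each incident is tagged with codes, and grouping incidents under shared codes lets the researcher compare them and merge or rename concepts.

```python
from collections import defaultdict

# Hypothetical open-coded incidents: (field-note excerpt, substantive codes).
# The codes and excerpts are invented for illustration only.
coded_incidents = [
    ("Nurse avoids eye contact when prognosis is raised", ["avoiding_disclosure"]),
    ("Family asks doctor directly about recovery chances", ["seeking_information"]),
    ("Doctor changes subject when asked about outcome", ["avoiding_disclosure"]),
    ("Patient overhears staff discussing her chart", ["seeking_information", "awareness_context"]),
]

def group_by_code(incidents):
    """Constant-comparison aid: collect incidents under each substantive code."""
    groups = defaultdict(list)
    for text, codes in incidents:
        for code in codes:
            groups[code].append(text)
    return dict(groups)

groups = group_by_code(coded_incidents)
for code, texts in sorted(groups.items()):
    print(f"{code}: {len(texts)} incident(s)")
```

Comparing the incidents gathered under each code is what suggests merging codes into broader concepts, which software can support but not replace.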
Memoing
Theoretical memoing is "the core stage of grounded theory methodology" (Glaser 1998). "Memos are the theorizing write-up of ideas about substantive codes and their theoretically coded relationships as they emerge during coding, collecting and analyzing data, and during memoing" (Glaser 1998).

Memoing is also important in the early phase of a GT study, such as during open coding. The researcher is then conceptualizing incidents, and memoing helps this process. Theoretical memos can be anything written or drawn in the constant comparison that makes up a GT. Memos are important tools for both refining and keeping track of ideas that develop when you compare incidents to incidents and then concepts to concepts in the evolving theory. In memos you develop ideas about naming concepts and relating them to each other, and you try out the relationships between concepts in two-by-two tables, in diagrams or figures, or whatever makes the ideas flow and generates comparative power.

Without memoing, the theory is superficial and the concepts generated are not very original. Memoing works as an accumulation of written ideas into a bank of ideas about concepts and how they relate to each other. This bank contains rich parts of what will later be the written theory. Memoing is total creative freedom without rules of writing, grammar or style (Glaser 1998). The writing must be an instrument for the outflow of ideas, and nothing else. When you write memos, the ideas become more realistic, being converted from thoughts in your mind into words, and thus into ideas communicable to the afterworld.

In GT, the preconscious processing that occurs when coding and comparing is recognized. The researcher is encouraged to register ideas about the ongoing study that pop up in everyday situations, and awareness of the serendipity of the method is also necessary to achieve good results.
Sorting
In the next step memos are sorted, which is the key to formulate the theory for presentation to others. Sorting puts fractured data back together. During sorting lots of new ideas emerge, which in turn are recorded in new memos giving the memo-on-memos phenomenon. Sorting memos generates theory that explains the main action in the studied area. A theory written from unsorted memos may be rich in ideas but the connection between concepts is weak.
Writing
Writing up the sorted memo piles follows after sorting, and at this stage the theory is close to the written GT product. The different categories are now related to each other and the core variable. The theoretical density should be dosed so concepts are mixed with description in words, tables, or figures to optimize readability. In the later rewriting the relevant literature is woven in to put the theory in a scholarly context. Finally, the GT is edited for style and language and eventually submitted for publication.
her data by field-noting interviews, and soon after generates concepts that fit with the data, are relevant, and work in explaining what participants are doing to resolve their main concern.

No talk. Talking about the theory before it is written up drains the researcher of motivational energy. Talking can render either praise or criticism, and both diminish the motivational drive to write memos that develop and refine the concepts and the theory (Glaser 1998). Positive feedback makes you content with what you have, and negative feedback hampers your self-confidence. Talking about the GT should be restricted to persons capable of helping the researcher without influencing her final judgments.
Theoretically sensitive coding, that is, generating theoretically strong concepts from the data to explain the phenomenon researched; theoretical sampling, that is, deciding whom to interview or what to observe next according to the state of theory generation, which implies starting data analysis with the first interview and writing down memos and hypotheses early; and the need to compare between phenomena and contexts to make the theory strong.
Differences
Grounded theory according to Glaser emphasizes induction or emergence, and the individual researcher's creativity within a clear frame of stages, while Strauss is more interested in validation criteria and a systematic approach.
Criticism
Critiques of grounded theory have focused on its status as theory (is what is produced really 'theory'?), on the notion of 'ground' (why is an idea of 'grounding' one's findings important in qualitative inquiry, and what are they 'grounded' in?), and on the claim to use and develop inductive knowledge. These criticisms are summed up by Thomas and James.[9] These authors also suggest that it is impossible to free oneself of preconceptions in the collection and analysis of data in the way that Glaser and Strauss say is necessary. They also point to the formulaic nature of grounded theory and its lack of congruence with open and creative interpretation, which ought to be the hallmark of qualitative inquiry. They suggest that the one element of grounded theory worth keeping is the constant comparative method.

Grounded theory was developed in a period when other qualitative methods were often considered unscientific. Of all qualitative methods, it achieved the widest acceptance of its academic rigor. Thus, especially in American academia, qualitative research is often equated with grounded theory. This equation is sometimes criticized by qualitative researchers using other methodologies (for example, traditional ethnography, narratology, and storytelling).
See also: Antipositivism, Grounded Practical Theory, Phronetic social science, Positivism, Social research, Social science, Sociology.
References
1. ^ Patricia Yancey Martin & Barry A. Turner, "Grounded Theory and Organizational Research," The Journal of Applied Behavioral Science, vol. 22, no. 2 (1986), 141.
2. ^ Glaser, 1967, chapter VIII.
3. ^ G. Allan, "A critique of using grounded theory as a research method," Electronic Journal of Business Research Methods, vol. 2, no. 1 (2003).
5. ^ Kelle, Udo (2005). "Emergence" vs. "Forcing" of Empirical Data? A Crucial Problem of "Grounded Theory" Reconsidered. Forum Qualitative Sozialforschung / Forum: Qualitative Social Research [On-line Journal], 6(2), Art. 27, paragraphs 49 & 50. [1]
6. ^ Strauss, 1993, p. 12.
7. ^ Glaser & Strauss 1967.
8. ^ Glaser & Strauss 1967.
9. ^ Thomas, G. and James, D. (2006). Reinventing grounded theory: some questions about theory, ground and discovery. British Educational Research Journal, 32(6), 767-795.
External links

- Grounded Theory Online (supporting (Glaserian) GT researchers)
- Grounded Theory Review
- Sociology Press
- An Introduction to GT by the Action Research Unit, Southern Cross University Management School
Bibliography

Strauss, A. (1987). Qualitative Analysis for Social Scientists. Cambridge, England: Cambridge University Press.
Charmaz, K. (2006). Constructing Grounded Theory: A Practical Guide Through Qualitative Analysis. Thousand Oaks, CA: Sage Publications.
Clarke, A. (2005). Situational Analysis: Grounded Theory After the Postmodern Turn. Thousand Oaks, CA: Sage Publications.
Kelle, Udo (2005). "Emergence" vs. "Forcing" of Empirical Data? A Crucial Problem of "Grounded Theory" Reconsidered. Forum Qualitative Sozialforschung / Forum: Qualitative Social Research [On-line Journal], 6(2), Art. 27. [2]
Mey, G. & Mruck, K. (Eds.) (2007). Grounded Theory Reader (HSR-Supplement 19). Cologne: ZHSF. 337 pages.
Thomas, G. & James, D. (2006). Re-inventing grounded theory: some questions about theory, ground and discovery. British Educational Research Journal, 32(6), 767-795.
Goulding, C. (2002). Grounded Theory: A Practical Guide for Management, Business and Market Researchers. London: Sage.
Stebbins, Robert A. (2001). Exploratory Research in the Social Sciences. Thousand Oaks, CA: Sage.

Glaser

Glaser, B.G. & Strauss, A. (1967). The Discovery of Grounded Theory: Strategies for Qualitative Research. Sociology Press [3].
Glaser, B.G. (1978). Theoretical Sensitivity: Advances in the Methodology of Grounded Theory. Sociology Press [4].
Glaser, B.G. (1992). Basics of Grounded Theory Analysis: Emergence vs Forcing. Sociology Press [5].
Glaser, B.G. (ed.) (1993). Examples of Grounded Theory: A Reader. Sociology Press [6].
Glaser, B.G. (ed.) (1994). More Grounded Theory Methodology: A Reader. Sociology Press [7].
Glaser, B.G. (ed.) (1995). Grounded Theory 1984-1994: A Reader (two volumes). Sociology Press [8].
Glaser, B.G. (ed.) (1996). Gerund Grounded Theory: The Basic Social Process Dissertation. Sociology Press [9].
Glaser, B.G. (1998). Doing Grounded Theory: Issues and Discussions. Sociology Press [10].
Glaser, B.G. (2001). The Grounded Theory Perspective I: Conceptualization Contrasted with Description. Sociology Press [11].
Glaser, B.G. (2003). The Grounded Theory Perspective II: Description's Remodeling of Grounded Theory. Sociology Press [12].
Glaser, B.G. (2005). The Grounded Theory Perspective III: Theoretical Coding. Sociology Press.

Strauss

Strauss, A.L., Schatzman, L., Bucher, R., Ehrlich, D. & Sabshin, M. (1964). Psychiatric Ideologies and Institutions.
Glaser, B.G. & Strauss, A.L. (1967). The Discovery of Grounded Theory: Strategies for Qualitative Research.
Strauss, A.L. (1987). Qualitative Analysis for Social Scientists.
Strauss, A.L. & Corbin, J. (1990). Basics of Qualitative Research: Grounded Theory Procedures and Techniques. Sage.
Strauss, A.L. & Corbin, J. (1990). "Grounded Theory Research: Procedures, Canons and Evaluative Criteria." Zeitschrift für Soziologie, 19, S. 418 ff.
Strauss, A.L. (1993). Continual Permutations of Action.
Strauss, A.L. & Corbin, J. (1997). Grounded Theory in Practice.
Legewie, Heiner & Schervier-Legewie, Barbara (September 2004). "Forschung ist harte Arbeit, es ist immer ein Stück Leiden damit verbunden. Deshalb muss es auf der anderen Seite Spaß machen." Anselm Strauss interviewed by Heiner Legewie and Barbara Schervier-Legewie. Forum: Qualitative Social Research On-line Journal, 5(3), Art. 22. Interview as MP3 audio (English) / edited German translation of interview. Accessed May 20, 2005.
This page was last modified on 14 December 2011 at 00:34. Text is available under the Creative Commons Attribution-ShareAlike License; additional terms may apply.
///////////////////////
Content analysis
Content analysis or textual analysis is a methodology in the social sciences for studying the content of communication. Earl Babbie defines it as "the study of recorded human communications, such as books, websites, paintings and laws." According to Dr. Farooq Joubish, content analysis is considered a scholarly methodology in the humanities by which texts are studied as to authorship, authenticity, or meaning; the latter subject includes philology, hermeneutics, and semiotics. Harold Lasswell formulated the core questions of content analysis: "Who says what, to whom, why, to what extent and with what effect?" Ole Holsti (1969) offers a broad definition of content analysis as "any technique for making inferences by objectively and systematically identifying specified characteristics of messages." Kimberly A. Neuendorf (2002, p. 10) offers a six-part definition: "Content analysis is a summarising, quantitative analysis of messages that relies on the scientific method (including attention to objectivity, intersubjectivity, a priori design, reliability, validity, generalisability, replicability, and hypothesis testing) and is not limited as to the types of variables that may be measured or the context in which the messages are created or presented."
Description
In 1931, Alfred R. Lindesmith developed a methodology to refute existing hypotheses, which became known as a content analysis technique. It gained popularity in the 1960s through Glaser, who referred to it as "The Constant Comparative Method of Qualitative Analysis" in an article published in 1964-65; Glaser and Strauss (1967) referred to their adaptation of it as Grounded Theory.

The method of content analysis enables the researcher to include large amounts of textual information and systematically identify its properties, e.g. the frequencies of the most-used keywords (KWIC, "Key Word In Context"), by locating the more important structures of its communication content. Such amounts of textual information must be categorised for analysis, providing in the end a meaningful reading of the content under scrutiny. David Robertson (1976:73-75), for example, created a coding frame for a comparison of modes of party competition between British and American parties. This was developed further in 1979 by the Manifesto Research Group, aiming at a comparative content-analytic approach to the policy positions of political parties.

Since the 1980s, content analysis has become an increasingly important tool in the measurement of success in public relations (notably media relations) programs and the assessment of media profiles. In these circumstances, content analysis is an element of media evaluation or media analysis. In analyses of this type, data from content analysis is usually combined with media data (circulation, readership, number of viewers and listeners, frequency of publication). It has also been used by futurists to identify trends: in 1982, John Naisbitt published his popular Megatrends, based on content analysis in the US media.

The creation of coding frames is intrinsically related to a creative approach to variables that exert an influence over textual content.
In political analysis, these variables could be political scandals, the impact of public opinion polls, sudden events in external politics, inflation, etc. Mimetic Convergence, created by F. Lampreia Carvalho for the comparative analysis of electoral proclamations on free-to-air television, is an example of the creative articulation of variables in content analysis. The methodology describes the construction of party identities during long-term party competitions on TV, from a dynamic perspective governed by the logic of the contingent. This method aims to capture the contingent logic observed in electoral campaigns by focusing on the repetition and innovation of themes sustained in party broadcasts.

According to the post-structuralist perspective from which electoral competition is analysed, party identities ('the real') cannot speak without mediations, because there is no natural centre fixing the meaning of a party structure; it depends instead on ad-hoc articulations. There is no empirical reality outside articulations of meaning. Reality is an outcome of power struggles that unify ideas of social structure as a result of contingent interventions. In Brazil, these contingent interventions have proven to be mimetic and convergent rather than divergent and polarised, being integral to the repetition of dichotomised worldviews. Mimetic Convergence thus aims to show the process of the fixation of meaning through discursive articulations that repeat, alter and subvert the political issues that come into play. For this reason, parties are not taken as the pure expression of conflicts over the representation of interests (of different classes, religions, or ethnic groups; see Lipset & Rokkan 1967, Lijphart 1984) but as attempts to recompose and re-articulate ideas of an absent totality around signifiers gaining positivity.
Every content analysis should depart from a hypothesis. The hypothesis of Mimetic Convergence supports the Downsian interpretation that, in general, rational voters converge in the direction of uniform positions in most thematic dimensions. The hypothesis guiding the analysis of Mimetic Convergence between political parties' broadcasts is that public opinion polls on vote intention, published throughout campaigns on TV, will contribute to successive revisions of candidates' discourses. Candidates re-orient their arguments and thematic selections partly in response to the signals sent by voters. One must also consider the interference of other kinds of input on electoral propaganda, such as internal and external political crises and the arbitrary interference of private interests in the dispute. Moments of internal crisis in disputes between candidates might result from the exhaustion of a certain strategy, and these moments of exhaustion might consequently precipitate an inversion in the thematic flux.

As an evaluation approach, content analysis is considered by some to be quasi-evaluation, because content analysis judgments need not be based on value statements if the research objective is aimed at presenting subjective experiences; they can instead be based on knowledge of everyday lived experiences. Such content analyses are not evaluations. On the other hand, when content analysis judgments are based on values, such studies are evaluations (Frisbie, 1986). As demonstrated above, only a good scientific hypothesis can lead to the development of a methodology that will allow empirical description, be it dynamic or static.

Content analysis is a closely related, if not overlapping, kind of analysis, often included under the general rubric of qualitative analysis, and used primarily in the social sciences. It is a systematic, replicable technique for compressing many words of text into fewer content categories based on explicit rules of coding (Stemler 2001).
It often involves building and applying a concept dictionary or fixed vocabulary of terms on the basis of which words are extracted from the textual data for concording or statistical computation.
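The concept-dictionary step described above can be sketched in a few lines. This is a minimal, assumption-laden illustration: the two-category dictionary and the sample sentence are invented, and real dictionaries are far larger and typically hand-validated.

```python
# Hypothetical concept dictionary mapping each concept to a fixed vocabulary
# of terms. Tokens found in the text are matched against this vocabulary so
# they can be extracted for concording or statistical computation.
concept_dictionary = {
    "economy": {"inflation", "tax", "taxes", "jobs", "wages"},
    "security": {"crime", "police", "defence", "war"},
}

def extract_concepts(text):
    """Return, per concept, the dictionary words actually found in the text."""
    tokens = text.lower().split()
    found = {}
    for concept, vocabulary in concept_dictionary.items():
        # Strip trailing punctuation before matching against the vocabulary.
        hits = [t for t in tokens if t.strip(".,;:!?") in vocabulary]
        if hits:
            found[concept] = hits
    return found

print(extract_concepts("Inflation and taxes dominate, while crime falls."))
```

Simple whitespace tokenisation is used here for brevity; a real pipeline would also handle stemming and multi-word terms.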
Holsti groups fifteen uses of content analysis into three basic categories:

- make inferences about the antecedents of a communication
- describe and make inferences about the characteristics of a communication
- make inferences about the effects of a communication
He also places these uses into the context of the basic communication paradigm. The following table shows fifteen uses of content analysis in terms of their general purpose, the element of the communication paradigm to which they apply, and the general question they are intended to answer.

Uses of content analysis by purpose, communication element, and question:

Purpose: Make inferences about the antecedents of communications
- Source (Who?): answer questions of disputed authorship (authorship analysis)
- Encoding process (Why?): secure political & military intelligence; analyse traits of individuals; infer cultural aspects & change; provide legal & evaluative evidence

Purpose: Describe and make inferences about the characteristics of communications
- Channel (How?): analyse techniques of persuasion; analyse style
- Message (What?): describe trends in communication content; relate known characteristics of sources to the messages they produce; compare communication content to standards
- Recipient (To whom?): relate known characteristics of audiences to messages produced for them; describe patterns of communication

Purpose: Make inferences about the consequences of communications
- Decoding process (With what effect?): measure readability; analyse the flow of information; assess responses to communications

Note. Purpose, communication element, and question from Holsti (1969). Uses primarily from Berelson (1952) as adapted by Holsti (1969).
Krippendorff (2004) lists six questions that must be addressed in every content analysis:

1. Which data are analysed?
2. How are they defined?
3. What is the population from which they are drawn?
4. What is the context relative to which the data are analysed?
5. What are the boundaries of the analysis?
6. What is the target of the inferences?

The assumption is that the words and phrases mentioned most often are those reflecting important concerns in every communication. Therefore, quantitative content analysis starts with word frequencies, space measurements (column centimetres/inches in the case of newspapers), time counts (for radio and television time) and keyword frequencies. However, content analysis extends far beyond plain word counts: with Key Word In Context routines, for example, words can be analysed in their specific context and disambiguated. Synonyms and homonyms can be isolated in accordance with the linguistic properties of the language.
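The word-frequency and Key Word In Context steps are both straightforward to sketch. The routine below is a minimal KWIC implementation over an invented sample sentence; the window size is an arbitrary choice, and real KWIC tools add alignment, sorting, and better tokenisation.

```python
from collections import Counter

def kwic(text, keyword, window=3):
    """Show each occurrence of a keyword with a window of surrounding words,
    so the analyst can disambiguate how the word is used in context."""
    tokens = text.split()
    lines = []
    for i, token in enumerate(tokens):
        if token.lower().strip(".,;:!?") == keyword.lower():
            left = " ".join(tokens[max(0, i - window):i])
            right = " ".join(tokens[i + 1:i + 1 + window])
            lines.append(f"{left} [{token}] {right}")
    return lines

sample = "The party promised tax cuts. Critics said the tax plan was vague."

# Plain word counts, the usual quantitative starting point.
frequencies = Counter(t.strip(".,;:!?").lower() for t in sample.split())

for line in kwic(sample, "tax"):
    print(line)
```

Each printed line shows the keyword bracketed between its left and right context, which is the form concordance listings usually take.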
Qualitatively, content analysis can involve any kind of analysis in which communication content (speech, written text, interviews, images, ...) is categorised and classified. In its beginnings, using the first newspapers at the end of the 19th century, analysis was done manually by measuring the number of lines and the amount of space given to a subject. With the rise of common computing facilities like PCs, computer-based methods of analysis are growing in popularity. Answers to open-ended questions, newspaper articles, political party manifestos, medical records or systematic observations in experiments can all be subjected to systematic analysis of textual data. By having the contents of communication available in the form of machine-readable texts, the input can be analysed for frequencies and coded into categories for building up inferences.

Robert Philip Weber (1990) notes: "To make valid inferences from the text, it is important that the classification procedure be reliable in the sense of being consistent: Different people should code the same text in the same way" (p. 12). Validity, inter-coder reliability and intra-coder reliability have been the subject of intense methodological research efforts over many years (see Krippendorff, 2004).

One more distinction is between the manifest content (of communication) and its latent meaning. "Manifest" describes what an author or speaker has definitely written, while latent meaning describes what the author intended to say or write. Normally, content analysis can only be applied to manifest content, that is, the words, sentences, or texts themselves, rather than their meanings.

Dermot McKeone (1995) has highlighted the difference between prescriptive analysis and open analysis. In prescriptive analysis, the context is a closely defined set of communication parameters (e.g. specific messages, subject matter); open analysis identifies the dominant messages and subject matter within the text.
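Weber's consistency requirement, that different people should code the same text in the same way, is usually quantified. A minimal sketch, with invented codings from two hypothetical coders, is percent agreement plus Cohen's kappa, a standard chance-corrected agreement coefficient (Krippendorff's alpha is another common choice, not shown here).

```python
from collections import Counter

# Invented category assignments by two coders over six text units.
coder_a = ["econ", "econ", "crime", "health", "econ", "crime"]
coder_b = ["econ", "crime", "crime", "health", "econ", "crime"]

def percent_agreement(a, b):
    """Share of units on which the two coders assigned the same category."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def cohens_kappa(a, b):
    """Chance-corrected agreement between two coders."""
    n = len(a)
    p_o = percent_agreement(a, b)
    counts_a, counts_b = Counter(a), Counter(b)
    # Expected chance agreement, from each coder's marginal distribution.
    p_e = sum((counts_a[c] / n) * (counts_b[c] / n) for c in set(a) | set(b))
    return (p_o - p_e) / (1 - p_e)

print(round(percent_agreement(coder_a, coder_b), 3))
print(round(cohens_kappa(coder_a, coder_b), 3))
```

Kappa is lower than raw agreement because it discounts the agreement the coders would reach by guessing from their marginal category frequencies.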
A further distinction is between dictionary-based (quantitative) approaches and qualitative approaches. Dictionary-based approaches set up a list of categories derived from the frequency list of words and control the distribution of words and their respective categories over the texts. While methods in quantitative content analysis in this way transform observations of found categories into quantitative statistical data, qualitative content analysis focuses more on intentionality and its implications.
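The dictionary-based transformation into statistical data can be sketched as a text-by-category frequency table. The two-category dictionary and the two sample documents below are invented for illustration; real coding frames are built and validated against the actual corpus.

```python
# Hypothetical category dictionary for a dictionary-based (quantitative) pass.
categories = {
    "positive": {"good", "gain", "growth", "improve"},
    "negative": {"bad", "loss", "decline", "worse"},
}

def category_counts(texts):
    """Count, per text, how many tokens fall into each dictionary category,
    yielding rows suitable for statistical analysis."""
    table = []
    for text in texts:
        tokens = [t.strip(".,;:!?").lower() for t in text.split()]
        row = {cat: sum(t in vocab for t in tokens)
               for cat, vocab in categories.items()}
        table.append(row)
    return table

docs = ["Growth is good, a clear gain.", "Decline got worse, a bad loss."]
for row in category_counts(docs):
    print(row)
```

Each row is one observation; stacking the rows gives the kind of matrix on which frequency comparisons or significance tests are then run.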
References
Earl Babbie: The Practice of Social Research. 10th edition, Wadsworth, Thomson Learning Inc., ISBN 0-534-62029-9.
Bernard Berelson: Content Analysis in Communication Research. Glencoe, Ill.: Free Press, 1971 (first edition 1952).
Ian Budge, Hans-Dieter Klingemann et al.: Mapping Policy Preferences: Estimates for Parties, Electors and Governments 1945-1998. Oxford: Oxford University Press, 2001, ISBN 0-19-924400-6 (an application of content analysis methods in political science, dealing with political parties and their impact on electoral systems).
Richard Frisbie: The use of microcomputer programs to improve the reliability and validity of content analysis in evaluation. Paper presented at the Annual Meeting of the American Educational Research Association, San Francisco, April 1986.
Graneheim, Ulla Hällgren, & Lundman, Berit (2004). Qualitative content analysis in nursing research: concepts, procedures and measures to achieve trustworthiness. Nurse Education Today, 24(2), 105-112.
Ole R. Holsti: Content Analysis for the Social Sciences and Humanities. Reading, MA: Addison-Wesley, 1969.
Lampreia Carvalho, F.: 'Continuity and Innovation: Conservatism and Politics of Communication in Brazil' (Continuidade e Inovacao: Conservadorismo e Politica da Comunicacao no Brasil). Revista Brasileira de Ciencias Sociais, N. 43, 2000. ANPOCS, Brazil.
Klaus Krippendorff: Content Analysis: An Introduction to Its Methodology. 2nd edition, Thousand Oaks, CA: Sage, 2004.
Klaus Krippendorff and Mary Angela Bock (eds.): The Content Analysis Reader. Thousand Oaks, CA: Sage Publications, 2008.
Dermot McKeone: Measuring Your Media Profile. Gower Press, 1995 (a general introduction to media analysis and PR evaluation for the communications industry).
Neuendorf, Kimberly A.: The Content Analysis Guidebook. Thousand Oaks, CA: Sage Publications, 2002.
Neuendorf, Kimberly A.: The Content Analysis Guidebook Online (2002): http://academic.csuohio.edu/kneuendorf/content/
Carl W. Roberts (ed.): Text Analysis for the Social Sciences: Methods for Drawing Inferences from Texts and Transcripts. Mahwah, NJ: Lawrence Erlbaum, 1997.
Robert Philip Weber: Basic Content Analysis. 2nd ed., Newbury Park, CA: Sage, 1990 (recommended introductory reading).
Roger D. Wimmer and Joseph R. Dominick: Mass Media Research: An Introduction. 8th ed., Belmont, CA: Wadsworth, 2005.
Stemler, Steve: An Overview of Content Analysis (2001): http://pareonline.net/getvn.asp?v=7&n=17
This page was last modified on 23 October 2011 at 17:17. Text is available under the Creative Commons Attribution-ShareAlike License.
/////////////////////////////////////
Ground-penetrating radar (GPR) is a geophysical method that uses radar pulses to image the subsurface. This nondestructive method uses electromagnetic radiation in the microwave band (UHF/VHF frequencies) of the radio spectrum and detects the signals reflected from subsurface structures. GPR can be used in a variety of media, including rock, soil, ice, fresh water, pavements and structures. It can detect objects, changes in material, and voids and cracks.[1]

GPR transmits high-frequency (usually polarized) radio waves into the ground. When the wave hits a buried object or a boundary between materials with different dielectric constants, the receiving antenna records variations in the reflected return signal. The principles involved are similar to those of reflection seismology, except that electromagnetic energy is used instead of acoustic energy, and reflections appear at boundaries with different dielectric constants rather than different acoustic impedances.

The depth range of GPR is limited by the electrical conductivity of the ground, the transmitted centre frequency and the radiated power. As conductivity increases, penetration depth decreases, because the electromagnetic energy is more quickly dissipated into heat, causing a loss of signal strength at depth. Higher frequencies do not penetrate as far as lower frequencies, but give better resolution. Optimal depth penetration is achieved in ice, where it can reach several hundred metres. Good penetration is also achieved in dry sandy soils or massive dry materials such as granite, limestone and concrete, where the depth of penetration can be up to 15 m. In moist and/or clay-laden soils and soils with high electrical conductivity, penetration is sometimes only a few centimetres.

Ground-penetrating radar antennas are generally in contact with the ground for the strongest signal; however, air-launched GPR antennas can be used above the ground.
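The dependence of penetration on conductivity and permittivity described above can be made concrete with the standard low-loss approximations from electromagnetics. The sketch below is illustrative only (the function names and sample soil values are assumptions, not from the article): attenuation α ≈ (σ/2)·√(μ₀/ε) in the low-loss limit, and the normal-incidence amplitude reflection coefficient at a boundary between two non-magnetic dielectrics.

```python
import math

EPS0 = 8.854e-12          # vacuum permittivity, F/m
MU0 = 4 * math.pi * 1e-7  # vacuum permeability, H/m

def attenuation(sigma, eps_r):
    """Low-loss attenuation constant in nepers/m.

    sigma: ground conductivity in S/m; eps_r: relative permittivity.
    Valid when conduction losses are small (sigma << omega * eps).
    """
    return (sigma / 2.0) * math.sqrt(MU0 / (EPS0 * eps_r))

def penetration_depth(sigma, eps_r):
    """Depth (m) at which the field amplitude falls to 1/e."""
    return 1.0 / attenuation(sigma, eps_r)

def reflection_coefficient(eps_r1, eps_r2):
    """Normal-incidence amplitude reflection at a dielectric boundary."""
    n1, n2 = math.sqrt(eps_r1), math.sqrt(eps_r2)
    return (n1 - n2) / (n1 + n2)

# Dry sand (sigma ~ 1 mS/m, eps_r ~ 5) vs. wet clay (sigma ~ 100 mS/m, eps_r ~ 25):
print(penetration_depth(0.001, 5))    # roughly 12 m
print(penetration_depth(0.1, 25))     # roughly 0.3 m
print(reflection_coefficient(5, 25))  # strong return at a sand/clay boundary
```

The sample conductivities and permittivities are typical textbook values; the resulting depths are consistent with the figures quoted above (metres in dry sand, only centimetres in wet, conductive soil).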
Cross-borehole GPR has developed within the field of hydrogeophysics into a valuable means of assessing the presence and amount of soil water.
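One way such an assessment can work (a sketch of one common workflow; the article itself does not specify a method): a cross-borehole first-arrival travel time yields a velocity, the velocity yields a relative permittivity via v = c/√εr, and an empirical petrophysical relation such as Topp's equation converts permittivity to volumetric water content. The function names and sample numbers below are illustrative assumptions.

```python
import math

C = 299_792_458.0  # speed of light in vacuum, m/s

def eps_r_from_crosshole(separation_m, travel_time_ns):
    """Relative permittivity from a straight-ray cross-borehole travel time."""
    v = separation_m / (travel_time_ns * 1e-9)  # average velocity, m/s
    return (C / v) ** 2

def topp_water_content(eps_r):
    """Volumetric water content via Topp's empirical equation (mineral soils)."""
    return (-5.3e-2 + 2.92e-2 * eps_r
            - 5.5e-4 * eps_r ** 2 + 4.3e-6 * eps_r ** 3)

# Boreholes 5 m apart, first-arrival travel time 60 ns:
eps_r = eps_r_from_crosshole(5.0, 60.0)
print(round(eps_r, 1))                      # ~12.9
print(round(topp_water_content(eps_r), 2))  # ~0.24, i.e. 24% water by volume
```

Topp's relation is calibrated for mineral soils; organic or clay-rich soils generally need a site-specific petrophysical model.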
Applications
[Figure: Ground-penetrating radar survey of an archaeological site in Jordan.]

GPR has many applications in a number of fields. In the Earth sciences it is used to study bedrock, soils, groundwater, and ice. Engineering applications include nondestructive testing (NDT) of structures and pavements, locating buried structures and utility lines, and studying soils and bedrock. In environmental remediation, GPR is used to define landfills, contaminant plumes, and other remediation sites, while in archaeology it is used for mapping archaeological features and cemeteries. GPR is used in law enforcement for locating clandestine graves and buried evidence. Military uses include detection of mines, unexploded ordnance, and tunnels.

Before 1987, the Frankley Reservoir in Birmingham, England, was leaking 540 litres of drinking water per second. In that year GPR was used successfully to isolate the leaks.[2]

Borehole radars utilizing GPR are used to map structures from a borehole in underground mining applications. Modern directional borehole radar systems are able to produce three-dimensional images from measurements in a single borehole.
Another main application of ground-penetrating radar is locating underground utilities, since GPR can generate 3D underground images of pipes, power lines, sewage and water mains.
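Turning a recorded echo into a depth estimate for a buried pipe relies on the two-way travel time and the wave velocity v = c/√εr. A minimal sketch, with assumed function names and illustrative values:

```python
import math

C = 299_792_458.0  # speed of light in vacuum, m/s

def wave_velocity(eps_r):
    """EM wave velocity in a low-loss medium of relative permittivity eps_r."""
    return C / math.sqrt(eps_r)

def depth_from_echo(two_way_time_ns, eps_r):
    """Depth of a reflector from the two-way travel time of its echo."""
    one_way_s = two_way_time_ns * 1e-9 / 2.0
    return wave_velocity(eps_r) * one_way_s

# A 40 ns echo in moist soil (eps_r ~ 9, so v ~ 0.1 m/ns) puts the pipe near 2 m:
print(round(depth_from_echo(40.0, 9.0), 2))  # 2.0
```

In practice εr must be calibrated, for example against a target at known depth, since any velocity error translates directly into a depth error.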
Limitations
The most significant performance limitation of GPR is in high-conductivity materials such as clay soils and soils that are salt contaminated. Performance is also limited by signal scattering in heterogeneous conditions (e.g. rocky soils). Other disadvantages of currently available GPR systems include:
- Interpretation of radargrams is generally non-intuitive to the novice.
- Considerable expertise is necessary to effectively design, conduct, and interpret GPR surveys.
- Relatively high energy consumption can be problematic for extensive field surveys.
Recent advances in GPR hardware and software have done much to ameliorate these disadvantages, and further improvement can be expected with ongoing development.
The "Mineseeker Project" seeks to design a system to determine whether landmines are present in areas using ultra wideband synthetic aperture radar units mounted on blimps.
References
1. ^ Daniels DJ (ed.) (2004). Ground Penetrating Radar (2nd ed.). Knovel (Institution of Engineering and Technology).

ETSI: Electromagnetic compatibility and Radio spectrum Matters (ERM); Code of Practice in respect of the control, use and application of Ground Probing Radar (GPR) and Wall Probing Radar (WPR) systems and equipment.
Borchert, Olaf: Receiver Design for a Directional Borehole Radar System. Dissertation, University of Wuppertal, 2008.
External links
- EUROGPR, the European GPR regulatory body
- GprMax, a GPR numerical simulator based on the FDTD method
- Short movie showing acquisition, processing and accuracy of GPR readings
- Ground Penetrating Radar Fundamentals
- Animation of sample GPR propagation
/////////////////////////////////////
Ground-controlled approach
In aviation, a ground-controlled approach (GCA) is a type of service provided by air-traffic controllers whereby they guide aircraft to a safe landing in adverse weather conditions based on radar images. Most commonly a GCA uses information from either a Precision Approach Radar (PAR, for precision approaches with vertical glide-path guidance) or an Airport Surveillance Radar (ASR, providing a non-precision surveillance radar approach with no glide-path guidance). Technically, the term GCA applies specifically to the precision radar approach with glide-path guidance.
Overview
Ground-controlled approach is the oldest air traffic technique to fully implement radar in guiding an aircraft to landing; it was used extensively during the Berlin airlift of 1948-49. It requires close communication between ground-based air traffic controllers and pilots in approaching aircraft. Only one pilot is guided at a time (at most two under certain circumstances). The controllers monitor dedicated precision approach radar systems to determine the precise course and altitude of approaching aircraft, then provide verbal instructions by radio to guide the pilots to a landing. The instructions include both the descent-rate (glide path) and heading (course) corrections necessary to follow the correct approach path. Two tracks are displayed on the GCA or Precision Approach Radar (PAR) scope: azimuth, showing the aircraft's position relative to the extended runway centerline, and elevation, showing its vertical position relative to the ideal glide path.
By following both tracks, a landing aircraft will arrive precisely over the runway's touchdown zone. Controllers issue position information and/or corrections for both tracks at least every five seconds. The guidance is stopped over the approximate touchdown point, but to continue the approach to a landing, pilots must be able to see the runway environment before reaching the published "decision height," usually 200-400 ft above the runway touchdown zone and 1/4 to 3/4 mile from the touchdown point (the published minimum visibility and decision height vary depending upon approach and runway lighting, obstacles in the approach corridor, type of aircraft, and other factors).

Pilots of revenue flights must periodically demonstrate PAR approach proficiency, and GCA controllers must conduct a minimum number of such approaches each year to maintain competency. Because of their labor-intensive nature (one GCA controller is normally required for each aircraft on final approach), GCAs are no longer in widespread use at civilian airports and are being discontinued at many military bases. However, air traffic controllers at some locations in the United States are required to maintain currency in their use, and the Belgian Air Force still uses the PAR for ground-controlled approaches on a daily basis. NATO kept GCA active for a long period while civil aviation adopted the instrument landing system (ILS). Global Positioning System (GPS) based approaches that provide both lateral and vertical guidance are coming into widespread use, with approach minima as good as, or nearly as good as, those of GCA or ILS. Modern ILS and GPS approaches eliminate the possibility of human error by the controller, and can serve many aircraft at the same time.

The ground-controlled approach is useful when the approaching aircraft is not equipped with sophisticated navigation aids, and it can also be a lifesaver when an aircraft's on-board navigation aids are inoperative, as long as one communication radio works. Sometimes a PAR-based ground-controlled approach is also requested by qualified pilots dealing with an emergency on board, to lighten their workload, or to "back up" ILS or other approach guidance.

Ground-controlled approaches have been depicted in several films, including Strategic Air Command, The Big Lift, Julie, and Skyjacked. Arthur C. Clarke's novel Glide Path fictionalizes the original development of GCA.
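The descent-rate corrections a GCA controller issues follow simple glide-path geometry: the required vertical speed is the groundspeed times the tangent of the glide-path angle. A sketch of that arithmetic (function name is mine; 3° is a typical, not universal, glide-path angle):

```python
import math

FT_PER_MIN_PER_KNOT = 6076.12 / 60.0  # 1 knot is ~101.27 ft/min of horizontal travel

def descent_rate_fpm(groundspeed_kt, glide_path_deg=3.0):
    """Vertical speed (ft/min) needed to hold a constant glide path."""
    return (groundspeed_kt * FT_PER_MIN_PER_KNOT
            * math.tan(math.radians(glide_path_deg)))

# At 120 kt groundspeed on a 3 degree glide path:
print(round(descent_rate_fpm(120)))  # ~637 ft/min
```

This matches the familiar pilot rule of thumb of roughly five times the groundspeed in knots for a 3° path.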
See also
- Beam Approach Beacon System
- AN/MPN

References
- "Radar Becomes Lifeline." Popular Science, July 1946, pp. 82-84. The first detailed article for the general public on GCA radar.