Publication


Featured research published by Nancy L. Leech.


School Psychology Quarterly | 2007

An Array of Qualitative Data Analysis Tools: A Call for Data Analysis Triangulation.

Nancy L. Leech; Anthony J. Onwuegbuzie

One of the most important steps in the qualitative research process is the analysis of data. The purpose of this article is to provide an understanding of the multiple types of qualitative data analysis techniques available and of the importance of utilizing more than one type of analysis, thus achieving data analysis triangulation, in order to understand phenomena more fully in school psychology research and beyond. The authors describe seven qualitative analysis tools: the method of constant comparison, keywords-in-context, word count, classical content analysis, domain analysis, taxonomic analysis, and componential analysis. They then outline when to use each type of analysis, using real qualitative data to help distinguish the various types of analyses. Furthermore, flowcharts and tables are provided to help delineate when to choose each type of analysis. Finally, the role of computer-assisted software in the qualitative data-analytic process is discussed. Use of the analyses outlined in this article should help to promote rigor in qualitative research.
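Two of the tools named above, word count and keywords-in-context (KWIC), are simple enough to illustrate concretely. The following is a minimal sketch on a made-up snippet (the text and the `kwic` helper are illustrative, not taken from the article):

```python
# Illustrative sketch of two qualitative analysis tools named above:
# word count and keywords-in-context (KWIC). The sample text is
# invented for demonstration.
from collections import Counter

text = "the teacher said the students felt supported by the teacher"
words = text.split()

# Word count: frequency of each word in the corpus.
word_count = Counter(words)
print(word_count.most_common(2))  # [('the', 3), ('teacher', 2)]

def kwic(tokens, keyword, window=2):
    """Return each occurrence of `keyword` with `window` words of context."""
    hits = []
    for i, w in enumerate(tokens):
        if w == keyword:
            hits.append(" ".join(tokens[max(0, i - window): i + window + 1]))
    return hits

print(kwic(words, "teacher"))
```

In practice these counts and concordance lines would feed into the triangulation the authors advocate, rather than stand alone as findings.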


International Journal of Social Research Methodology | 2005

On Becoming a Pragmatic Researcher: The Importance of Combining Quantitative and Qualitative Research Methodologies

Anthony J. Onwuegbuzie; Nancy L. Leech

The last 100 years have witnessed a fervent debate in the USA about quantitative and qualitative research paradigms. Unfortunately, this has led to a great divide between quantitative and qualitative researchers, who often view themselves as being in competition with each other. Clearly, this polarization has promoted purists, namely researchers who restrict themselves exclusively to either quantitative or qualitative research methods. Mono-method research is the biggest threat to the advancement of the social sciences. Indeed, as long as we stay polarized in research, how can we expect stakeholders who rely on our research findings to take our work seriously? Thus, the purpose of this paper is to explore how the debate between quantitative and qualitative research is divisive and, hence, counterproductive for advancing the social and behavioural sciences. This paper advocates that all graduate students learn to utilize and appreciate both quantitative and qualitative research. In so doing, students will develop into what we term pragmatic researchers.


The International Journal of Qualitative Methods | 2009

A Qualitative Framework for Collecting and Analyzing Data in Focus Group Research

Anthony J. Onwuegbuzie; Wendy B. Dickinson; Nancy L. Leech; Annmarie Gorenc Zoran

Despite the abundance of published material on conducting focus groups, scant specific information exists on how to analyze focus group data in social science research. Thus, the authors provide a new qualitative framework for collecting and analyzing focus group data. First, they identify types of data that can be collected during focus groups. Second, they identify the qualitative data analysis techniques best suited for analyzing these data. Third, they introduce what they term as a micro-interlocutor analysis, wherein meticulous information about which participant responds to each question, the order in which each participant responds, response characteristics, the nonverbal communication used, and the like is collected, analyzed, and interpreted. They conceptualize how conversation analysis offers great potential for analyzing focus group data. They believe that their framework goes far beyond analyzing only the verbal communication of focus group participants, thereby increasing the rigor of focus group analyses in social science research.


Journal of the American Academy of Child and Adolescent Psychiatry | 2003

Measures of clinical significance.

Helena C. Kraemer; George A. Morgan; Nancy L. Leech; Jeffrey A. Gliner; Jerry J. Vaske; Robert J. Harmon

Behavioral scientists are interested in answering three basic questions when examining the relationships between variables (Kirk, 2001). First, is an observed result real or should it be attributed to chance (i.e., statistical significance)? Second, if the result is real, how large is it (i.e., effect size)? Third, is the result large enough to be meaningful and useful (i.e., clinical or practical significance)? In this last column in the series, we treat clinical significance as equivalent to practical significance. Judgments by the researcher and the consumers (e.g., clinicians and patients) regarding clinical significance consider factors such as clinical benefit, cost, and side effects. Although there is no formal statistical test of clinical significance, researchers suggest using one of three types of effect size measures to assist in interpreting clinical significance. These include the strength of association between variables (r family effect size measures), the magnitude of the difference between treatment and comparison groups (d family effect size measures), and measures of risk potency. In this paper, we review the d and r effect size measures and five measures of risk potency: odds ratio, risk ratio, relative risk reduction, risk difference, and number needed to treat. Finally, we review a relatively new effect size, AUC (which for historical reasons irrelevant to the current discussion stands for area under the receiver operating characteristic [ROC] curve), that integrates many of the others and is directly related to clinical significance. Each of these measures, however, has limitations that require the clinician to be cautious about interpretation. Guidelines are offered to facilitate the interpretation and understanding of clinical significance.
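The five risk-potency measures listed in the abstract are all derived from the same 2 x 2 outcome table. The sketch below computes each one from hypothetical counts (the numbers are invented for illustration, not data from the paper):

```python
# Hypothetical 2x2 outcome table (illustrative numbers only):
# treatment group: 10 poor outcomes out of 100 participants
# control group:   25 poor outcomes out of 100 participants
treated_bad, treated_n = 10, 100
control_bad, control_n = 25, 100

p_t = treated_bad / treated_n  # risk of a poor outcome under treatment
p_c = control_bad / control_n  # risk of a poor outcome under control

risk_ratio = p_t / p_c                              # 0.40
odds_ratio = (p_t / (1 - p_t)) / (p_c / (1 - p_c))  # ~0.33
relative_risk_reduction = 1 - risk_ratio            # 0.60
risk_difference = p_c - p_t                         # 0.15
number_needed_to_treat = 1 / risk_difference        # ~6.7

print(risk_ratio, odds_ratio, relative_risk_reduction,
      risk_difference, number_needed_to_treat)
```

Note how the measures answer different clinical questions: the risk difference and number needed to treat speak most directly to practical impact (about 7 patients must be treated to prevent one poor outcome in this example), while the odds ratio and risk ratio describe relative effect.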


Journal of Mixed Methods Research | 2007

Toward a Unified Validation Framework in Mixed Methods Research

Amy B. Dellinger; Nancy L. Leech

The primary purpose of this article is to further discussions of validity in mixed methods research by introducing a validation framework to guide thinking about validity in this area. To justify the use of this framework, the authors discuss traditional terminology and validity criteria for quantitative and qualitative research, as well as present recently published validity terminology for mixed methods research. The authors discuss the rationale for their framework and how it unifies thinking about validity in mixed methods research. Finally, they discuss how the framework can be used.


Measurement and Evaluation in Counseling and Development | 2003

The Meaning of Validity in the New Standards for Educational and Psychological Testing: Implications for Measurement Courses

Laura D. Goodwin; Nancy L. Leech

The treatment of validity in the newest edition of the Standards for Educational and Psychological Testing (Standards; American Educational Research Association, American Psychological Association, & National Council on Measurement in Education, 1999) is quite different from its coverage in earlier editions of the Standards and in most measurement textbooks. The view of validity in the 1999 Standards is discussed, and suggestions for instructors of measurement courses are offered.


Journal of Mixed Methods Research | 2010

Evaluating Mixed Research Studies: A Mixed Methods Approach

Nancy L. Leech; Amy B. Dellinger; Kim B. Brannagan; Hideyuki Tanaka

The purpose of this article is to demonstrate application of a new framework, the validation framework (VF), to assist researchers in evaluating mixed research studies. Based on an earlier work by Dellinger and Leech, a description of the VF is delineated. Using the VF, three studies from education, health care, and counseling fields are evaluated. The three mixed research studies differed in design and implementation. Elements of the VF were examined and evaluated for each study, and a picture of the quality of each study was captured textually. In presenting the VF and its potential for practical application in evaluating mixed research studies, pragmatic researchers can use this tool to increase the quality of their evaluations of mixed research studies.


International Journal of Qualitative Methods - ARCHIVE | 2008

Interviewing the Interpretive Researcher: A Method for Addressing the Crises of Representation, Legitimation, and Praxis

Anthony J. Onwuegbuzie; Nancy L. Leech; Kathleen M. T. Collins

In this article the authors outline five types of debriefing and introduce a new type of debriefing, namely, that of debriefing the interpretive researcher. Next they present eight main areas accompanied by example questions to guide the interviewer when debriefing the researcher. They also present five authenticity criteria developed by Guba and Lincoln (1989) and include possible interview questions to document the degree to which the researcher has met these criteria. Finally, using Miles and Huberman's (1994) framework, they illustrate how displays such as matrices can be used to collect, analyze, and interpret debriefing interview data as well as leave an audit trail.


International Journal of Multiple Research Approaches | 2009

Mixed data analysis: Advanced integration techniques

Anthony J. Onwuegbuzie; John R. Slate; Nancy L. Leech; Kathleen M. T. Collins

The purpose of this paper is to provide a coherent and inclusive framework for conducting mixed analyses. First, we present a two-dimensional representation for classifying and organizing both qualitative and quantitative analyses. This representation involves reframing qualitative and quantitative analyses as either variable-oriented or case-oriented analyses, yielding a 2 (qualitative analysis phase vs. quantitative analysis phase) × 2 (variable-oriented analysis vs. case-oriented analysis) mixed analysis grid. We present a comprehensive list of specific qualitative (e.g. method of constant comparison) and quantitative (e.g. multiple regression) analyses that fit under each of the four cells. Next, we provide an even more comprehensive framework that incorporates a time dimension (i.e. process/experience-oriented analyses), yielding a 2 (qualitative analysis phase vs. quantitative analysis phase) × 2 (particularistic vs. universalistic; variable-oriented analysis) × 2 (intrinsic case vs. instrumental case; case-oriented analysis) × 2 (cross-sectional vs. longitudinal; process/experience-oriented analysis) model. Examples from published studies are presented for each of these two representations. We contend that these two representations can help mixed researchers – both novice and experienced researchers alike – not only classify qualitative, quantitative and mixed research, but, more importantly, can help them both design their mixed analyses, as well as analyze their data coherently and make meta-inferences that have interpretive consistency.
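The 2 × 2 grid described above amounts to a lookup from (analysis phase, orientation) to candidate analyses. A minimal sketch, with the caveat that only "method of constant comparison" and "multiple regression" are placed by the abstract itself; the other cell entries are illustrative assumptions:

```python
# Sketch of the 2 x 2 mixed analysis grid described above.
# Keys: (analysis phase, orientation). Only "constant comparison"
# and "multiple regression" are named in the abstract; the other
# entries are illustrative placeholders, not the paper's placements.
mixed_analysis_grid = {
    ("qualitative", "variable-oriented"): ["method of constant comparison"],
    ("qualitative", "case-oriented"): ["within-case narrative analysis"],   # assumed
    ("quantitative", "variable-oriented"): ["multiple regression"],
    ("quantitative", "case-oriented"): ["cluster analysis"],                # assumed
}

def cell(phase, orientation):
    """Return the example analyses filed under one cell of the grid."""
    return mixed_analysis_grid[(phase, orientation)]

print(cell("quantitative", "variable-oriented"))  # ['multiple regression']
```

The extended 2 × 2 × 2 × 2 model would simply lengthen the key tuple with the time dimension (cross-sectional vs. longitudinal) and the case/variable subtypes.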


International Journal of Multiple Research Approaches | 2007

Conducting mixed analyses: A general typology

Anthony J. Onwuegbuzie; John R. Slate; Nancy L. Leech; Kathleen M. T. Collins

In this article, we provide a typology of mixed analysis techniques, the Mixed Analysis Matrix, which helps researchers select a data analysis technique given (a) the types of data collected (i.e. quantitative or qualitative, or quantitative and qualitative) and (b) the types of analysis used (i.e. quantitative or qualitative, or quantitative and qualitative), yielding a 2 × 2 representation involving four cells that each contain specific analytical techniques, with two of these cells containing a total of 15 mixed analysis techniques. Furthermore, we describe the fundamental principle of mixed analysis, describe the steps in a mixed analysis, and delineate the rationale and purpose for conducting mixed analyses. For each technique, readers are directed to published studies that serve as illustrative examples. Outlining the mixed analysis techniques available to researchers will hopefully increase awareness of the choices for analyzing data from mixed studies.

Collaboration

Nancy L. Leech's top co-authors:

Margarita Bianco (University of Colorado Denver)
Jeffrey A. Gliner (University of Colorado Denver)
Robert J. Harmon (University of Colorado Denver)
Bryn Harris (University of Colorado Denver)
John R. Slate (Sam Houston State University)