
Publication

Featured research published by Constance V. Hines.


Educational and Psychological Measurement | 2005

The Quality of Factor Solutions in Exploratory Factor Analysis: The Influence of Sample Size, Communality, and Overdetermination.

Kristine Y. Hogarty; Constance V. Hines; Jeffrey D. Kromrey; John M. Ferron; Karen R. Mumford

The purpose of this study was to investigate the relationship between sample size and the quality of factor solutions obtained from exploratory factor analysis. This research expanded upon the range of conditions previously examined, employing a broad selection of criteria for the evaluation of the quality of sample factor solutions. Results showed that when communalities are high, sample size tended to have less influence on the quality of factor solutions than when communalities are low. Overdetermination of factors was also shown to improve the factor analysis solution. Finally, decisions about the quality of the factor solution depended upon which criteria were examined.
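The communality effect described above can be illustrated with a small simulation. The sketch below is a numpy-only toy, not the authors' design: it assumes a single common factor, eight variables, and a simple first-principal-axis loading estimate, and compares loading-recovery error under high (.64) and low (.16) communalities at the same sample size.

```python
import numpy as np

def loading_error(loading, n=50, p=8, reps=200, seed=0):
    """Mean absolute error of recovered loadings for a one-factor model."""
    rng = np.random.default_rng(seed)
    true = np.full(p, loading)
    uniq = np.sqrt(1.0 - loading ** 2)  # uniqueness SD, so each variable has unit variance
    errs = []
    for _ in range(reps):
        f = rng.standard_normal((n, 1))                       # common factor scores
        x = f @ true[None, :] + uniq * rng.standard_normal((n, p))
        r = np.corrcoef(x, rowvar=False)                      # sample correlation matrix
        vals, vecs = np.linalg.eigh(r)                        # eigenvalues in ascending order
        lam = np.sqrt(vals[-1]) * vecs[:, -1]                 # first principal axis as loading estimate
        lam *= np.sign(lam.sum())                             # resolve sign indeterminacy
        errs.append(np.mean(np.abs(lam - true)))
    return float(np.mean(errs))

err_high = loading_error(0.8)  # communality .64 per variable
err_low = loading_error(0.4)   # communality .16 per variable
```

With these settings the low-communality condition shows clearly larger recovery error at the same n, consistent with the abstract's finding that sample size matters more when communalities are low.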


American Educational Research Journal | 1985

Teacher Clarity and Its Relationship to Student Achievement and Satisfaction

Constance V. Hines; Donald R. Cruickshank; John J. Kennedy

Relationships between the clarity behaviors of teachers and the dual outcome measures of student achievement and satisfaction were examined. Relatively reliable measures of clarity (both of a low-inference and high-inference nature) on 32 preservice teachers who taught the same lesson within a small-group laboratory setting were generated by (a) trained observers, (b) participating students, and (c) the teachers themselves. The high and relatively low-inference measures of teacher clarity correlated highly, and both were significantly and positively related to postinstructional measures of student achievement and student satisfaction. A number of specific clarity behaviors have been identified that appear to be strongly and directly linked to desirable student outcomes.


Educational and Psychological Measurement | 1995

Use of Empirical Estimates of Shrinkage in Multiple Regression: A Caution.

Jeffrey D. Kromrey; Constance V. Hines

Empirical techniques to estimate the shrinkage of the sample R2 have been advocated as alternatives to analytical formulae. Although such techniques may be appropriate for estimating the coefficient of cross-validation, they do not provide accurate estimates of the population multiple correlation. The accuracy of four empirical techniques (simple cross-validation, multi-cross-validation, jackknife, and bootstrap) was investigated in a Monte Carlo study. Random samples of size 20 to 200 were drawn from a pseudopopulation of actual field data. Regression models were investigated with population coefficients of determination ranging from .04 to .50 and with numbers of regressors ranging from 2 to 10. Substantial statistical bias was evident when the shrunken R2 values were used to estimate the population squared multiple correlation. Researchers are advised to avoid the empirical techniques when the parameter of interest is the population coefficient of determination rather than the coefficient of cross-validation.
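The distinction this abstract draws, that shrinkage formulas target the population R2 while empirical cross-validation targets the coefficient of cross-validation, can be sketched in a small Monte Carlo. The code below is an illustrative toy, not the study's design: the population, sample size, number of regressors, and the use of the Ezekiel adjustment are assumptions for the demonstration.

```python
import numpy as np

def shrinkage_demo(rho2=0.25, n=40, p=6, reps=500, seed=1):
    """Compare mean sample R2, Ezekiel-adjusted R2, and simple cross-validated R2."""
    rng = np.random.default_rng(seed)
    # Coefficients chosen so the population squared multiple correlation equals rho2
    # (unit error variance, independent standard-normal regressors).
    beta = np.full(p, np.sqrt(rho2 / (p * (1.0 - rho2))))
    r2s, adj, cv = [], [], []
    for _ in range(reps):
        X = rng.standard_normal((n, p))
        y = X @ beta + rng.standard_normal(n)
        Xd = np.column_stack([np.ones(n), X])
        b = np.linalg.lstsq(Xd, y, rcond=None)[0]
        resid = y - Xd @ b
        r2 = 1.0 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))
        r2s.append(r2)
        adj.append(1.0 - (1.0 - r2) * (n - 1) / (n - p - 1))  # Ezekiel formula
        # Simple cross-validation: score the fitted weights on a fresh sample.
        Xn = rng.standard_normal((n, p))
        yn = Xn @ beta + rng.standard_normal(n)
        pred = np.column_stack([np.ones(n), Xn]) @ b
        cv.append(np.corrcoef(pred, yn)[0, 1] ** 2)
    return float(np.mean(r2s)), float(np.mean(adj)), float(np.mean(cv))

mean_r2, mean_adj, mean_cv = shrinkage_demo()
```

The three means separate as the abstract predicts: the raw R2 overestimates the population value, the Ezekiel adjustment lands near it, and the cross-validated R2 falls below it, because cross-validation estimates a different parameter.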


Psychometrika | 2004

Selection of Variables in Exploratory Factor Analysis: An Empirical Comparison of a Stepwise and Traditional Approach.

Kristine Y. Hogarty; Jeffrey D. Kromrey; John M. Ferron; Constance V. Hines

The purpose of this study was to investigate and compare the performance of a stepwise variable selection algorithm to traditional exploratory factor analysis. The Monte Carlo study included six factors in the design: the number of common factors; the number of variables explained by the common factors; the magnitude of factor loadings; the number of variables not explained by the common factors; the type of anomaly evidenced by the poorly explained variables; and sample size. The performance of the methods was evaluated in terms of selection and pattern accuracy, and bias and root mean squared error of the structure coefficients. Results indicate that the stepwise algorithm was generally ineffective at excluding anomalous variables from the factor model. The poor selection accuracy of the stepwise approach suggests that it should be avoided.


Journal of Experimental Education | 1996

Estimating the coefficient of cross-validity in multiple regression: A comparison of analytical and empirical methods

Jeffrey D. Kromrey; Constance V. Hines

In predictive applications of multiple regression, interest centers on the estimation of the population coefficient of cross-validation rather than the population multiple correlation. The accuracy of three analytical formulas for shrinkage estimation (Ezekiel, Browne, and Darlington) and four empirical techniques (simple cross-validation, multicross-validation, jackknife, and bootstrap) was investigated in a Monte Carlo study. Random samples of size 20 to 200 were drawn from a pseudopopulation of actual field data. Regression models were investigated with population coefficients of determination ranging from .04 to .50 and with numbers of regressors ranging from 2 to 10. For all techniques except the Browne formula and multicross-validation, substantial statistical bias was evident when the shrunken R2 values were used to estimate the coefficient of cross-validation. In addition, none of the techniques examined provided unbiased estimates with sample sizes smaller than 100, regardless of the number of ...
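Two of the analytical formulas named above have simple closed forms. The sketch below implements the Ezekiel adjustment (which targets the population squared multiple correlation) and the commonly quoted form of Browne's formula (which targets the squared coefficient of cross-validity); treat the exact variant of Browne's formula as an assumption, since the paper itself is the authority on which version it tested.

```python
def ezekiel_r2(r2, n, p):
    """Ezekiel/Wherry shrinkage formula: estimates the population rho^2
    from a sample R^2 with n cases and p regressors."""
    return 1.0 - (1.0 - r2) * (n - 1) / (n - p - 1)

def browne_cv2(r2, n, p):
    """Browne's formula: estimates the squared coefficient of cross-validity,
    i.e., how well the sample weights will predict in the population."""
    rho2 = max(ezekiel_r2(r2, n, p), 0.0)  # plug in the shrunken estimate
    return ((n - p - 3) * rho2 ** 2 + rho2) / ((n - 2 * p - 2) * rho2 + p)
```

For example, a sample R2 of .30 with n = 100 and p = 5 shrinks to roughly .26 as an estimate of the population R2, and to roughly .24 as an estimate of cross-validity, illustrating that the cross-validity coefficient is always the smaller target.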


Teacher Education and Special Education | 1996

Creating and Using a Multiparadigmatic Knowledge Base for Restructuring Teacher Education in Special Education: Technical and Philosophical Issues

Jeffrey D. Kromrey; Constance V. Hines; James L. Paul; Hilda Rosselli

Important challenges that researchers face in the creation and use of a knowledge base for restructuring teacher education in the field of special education are reviewed and analyzed. Such challenges arise both from the changing intellectual context within which research operates and from the fundamental reform efforts currently underway throughout the educational community. Promising avenues for research to meet these challenges are delineated and the critical roles of multiparadigmatic research communities are highlighted. Mechanisms for the development, maintenance, and on-going operations of such communities are suggested, and potential caveats and pitfalls to their success are described.


Journal of Applied School Psychology | 2015

Development and Initial Validation of a Scale Measuring the Beliefs of Educators Regarding Response to Intervention

Jose M. Castillo; Robert F. Dedrick; Kevin M. Stockslager; Amanda L. March; Constance V. Hines; Sim Yin Tan

This article presents information on the development and initial validation of the 16-item Response to Intervention (RTI) Beliefs Scale. The scale is designed to measure the extent to which educators working in schools hold beliefs consistent with the tenets of RTI. The authors administered the instrument to 2,430 educators in 62 elementary schools in the fall of 2007 and 2,443 educators in 68 elementary schools in the spring of 2008. Exploratory, single-level confirmatory, and multilevel confirmatory factor analysis procedures were used to examine construct validity. Results supported a correlated 3-factor model (Academic Abilities and Performance of Students with Disabilities, Data-Based Decision Making, and Functions of Core and Supplemental Instruction) at both the school and educator levels of analysis. Furthermore, the factor scores derived from the model demonstrated significant, positive relations to RTI implementation. Reliability estimates for two of the three factor scores exceeded .70. Implications for research on educator beliefs and implementation of RTI as well as implications for school psychologists supporting RTI implementation are discussed.


Assessment for Effective Intervention | 2016

Measuring Educators’ Perceptions of Their Skills Relative to Response to Intervention: A Psychometric Study of a Survey Tool

Jose M. Castillo; Amanda L. March; Kevin M. Stockslager; Constance V. Hines

The Perceptions of RtI Skills Survey is a self-report measure that assesses educators’ perceptions of their data-based problem-solving skills—a critical element of many Response-to-Intervention (RtI) models. Confirmatory factor analysis (CFA) was used to evaluate the underlying factor structure of this tool. Educators from 68 (n = 2,397) and 60 (n = 1,961) schools in a southeastern state participated during the spring of 2008 and spring of 2010, respectively. Results supported a correlated three-factor model with the following dimensions: Perceptions of RtI Skills Applied to Academic Content, Perceptions of RtI Skills Applied to Behavior Content, and Perceptions of Data Display Skills. Internal consistency estimates for all factors exceeded .90. In addition, significant associations between factor scores and data-based problem-solving fidelity at Tiers I and II were found. Implications for educators facilitating RtI implementation are discussed.


Review of Educational Research | 2002

Editors’ Introduction: Special Issue on Standards-Based Reforms and Accountability

Kathryn M. Borman; Jeffrey D. Kromrey; Constance V. Hines; Kristine Y. Hogarty

In this special issue of Review of Educational Research, we present a collection of articles that cohere around the common theme of standards-based reform and accountability. The authors critically examine the extant literature in a time of increasing polarization of views about the desirability and consequences of the accountability movement and of educational reform in general.

In the first article, Madhabi Chatterji focuses our attention on purposes, models, and methods of inquiry. This article classifies the literature into seven categories and analyzes it with respect to content and methodology, using theoretically supported taxonomies. With the premise that past reforms were meant to be systemic, Chatterji examines the extent to which related inquiry has been guided by designs that explicitly or implicitly acknowledge the presence of a system, and evaluates the utility of designs in support of large-scale systematic changes in education.

In the second article, James P. Spillane, Brian J. Reiser, and Todd Reimer develop a cognitive framework to characterize sense-making in the implementation process. Their framework is especially relevant for recent education policy initiatives, such as standards-based reform, that press for tremendous changes in classroom instruction. The framework is premised on the assumption that if teachers and school administrators are to implement state and national standards, they must first interpret them. Their understanding of reform proposals is the basis on which they decide whether to ignore, sabotage, adapt, or adopt them. The authors argue that behavioral change includes a fundamentally cognitive component. They draw on both theoretical and empirical literatures to develop a cognitive perspective on implementation.

In the third article, Laura Desimone reviews, critiques, and synthesizes research that documents the implementation of comprehensive school reform models. She applies the theory that the more specific, consistent, authoritative, powerful, and stable a policy is, the stronger will be its implementation. The research suggests that all five of these policy attributes contribute to implementation and that, in addition, each is related to particular aspects of the process: specificity to implementation fidelity, power to immediate implementation effects, and authority, consistency, and stability to long-lasting change.

In the final article, Jian Wang and Sandra J. Odell draw on the mentoring and learning-to-teach literature, exploring how mentoring that supports novices’ learning to teach in ways consistent with the standards brings about reform, whether in preservice or induction contexts. The authors point out that the assumptions on which mentoring programs are based often do not focus on standards and that the prevailing mentoring practices are consistent with those assumptions rather than with the assumptions behind standards-based teaching. To promote mentoring that reforms teaching in ways consistent with the aims of the standards movement, policymakers must determine effective methods of educating mentors and mentoring program designers.


Reading Research and Instruction | 1993

An Investigation of Varying Reading Level Placement on Reading Achievement of Chapter I Students

Susan P. Homan; Constance V. Hines; Jeffrey D. Kromrey

The purpose of this study was to investigate the effect of reading placement (instructional level vs. above instructional level) on reading achievement of at-risk sixth grade students with reading problems. Achievement differences for students who were slow learners vs. disabled readers were also examined. The sample consisted of 304 sixth graders in a special Chapter I reading program. Data were subjected to a two-factor multivariate analysis of variance (MANOVA) procedure. Results indicated that scores of students placed one-half to one year above instructional level were not significantly different from scores of students placed at instructional reading level.

Collaboration

Top co-authors of Constance V. Hines (all at the University of South Florida):

Jeffrey D. Kromrey
Melinda R. Hess
John M. Ferron
Amanda L. March
Bryce Pride
Ha Phan
Joel Wao