
Publications


Featured research published by Tim Davey.


Archive | 2002

Practical considerations in computer-based testing

Cynthia G. Parshall; Judith A. Spray; John C. Kalohn; Tim Davey

Contents: Considerations in Computer-Based Testing; Issues in Test Administration and Development; Examinee Issues; Software Issues; Issues in Innovative Item Types; Computerized Fixed Tests; Automated Test Assembly for Online Administration; Computerized Adaptive Tests; Computerized Classification Tests; Item Pool Evaluation and Maintenance; Comparison of the Test Delivery Methods.


Archive | 2002

Computerized Adaptive Tests

Cynthia G. Parshall; Judith A. Spray; John C. Kalohn; Tim Davey

A traditional computerized adaptive test (CAT) selects items individually for each examinee based on the examinee’s responses to previous items in order to obtain a precise and accurate estimate of that examinee’s latent ability on some underlying scale. The specific items, the number of items, and the order of item presentation are all likely to vary from one examinee to another. Forms are drawn adaptively and scored in real time, and unique tests are constructed for each examinee. Scores are equated through reliance on item response theory (IRT) ability estimates.
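The select-administer-rescore loop this abstract describes can be sketched in a few lines. The sketch below is a minimal illustration only, assuming a two-parameter logistic (2PL) IRT model, maximum-information item selection, and a crude grid-based maximum-likelihood ability estimate; the item pool, function names, and parameter values are hypothetical and are not taken from the book.

```python
import math
import random

def p_correct(theta, a, b):
    # Two-parameter logistic (2PL) probability of a correct response
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def fisher_information(theta, a, b):
    # Item information at ability theta under the 2PL model
    p = p_correct(theta, a, b)
    return a * a * p * (1.0 - p)

def estimate_theta(responses):
    # Crude maximum-likelihood ability estimate over a fixed grid (-4.0 .. 4.0)
    grid = [g / 10.0 for g in range(-40, 41)]
    def log_lik(theta):
        ll = 0.0
        for (a, b), correct in responses:
            p = p_correct(theta, a, b)
            ll += math.log(p if correct else 1.0 - p)
        return ll
    return max(grid, key=log_lik)

def run_cat(pool, true_theta, test_length, rng):
    theta = 0.0                      # start at the assumed population mean
    responses = []
    available = list(pool)
    for _ in range(test_length):
        # Select the unused item that is most informative at the current estimate
        item = max(available, key=lambda ab: fisher_information(theta, *ab))
        available.remove(item)
        # Simulate the examinee's response, then re-score in real time
        correct = rng.random() < p_correct(true_theta, *item)
        responses.append((item, correct))
        theta = estimate_theta(responses)
    return theta

rng = random.Random(0)
# Hypothetical pool of 100 items: (discrimination a, difficulty b) pairs
pool = [(rng.uniform(0.8, 2.0), rng.uniform(-3.0, 3.0)) for _ in range(100)]
est = run_cat(pool, true_theta=1.0, test_length=20, rng=rng)
```

Because the next item always depends on the running ability estimate, two examinees with different response patterns see different items in a different order, which is exactly why scores must be placed on a common scale through the IRT ability estimate rather than a raw number-correct score.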


Encyclopedia of Statistics in Behavioral Science | 2005

Computer-based Testing

Tim Davey

Most broadly defined, computer-based tests (CBTs) include not just tests administered on computers or workstations but also exams delivered via telephones, PDAs, and other electronic devices. There have been three main reasons for test developers to move beyond conventional paper-and-pencil administration. The first is to change the nature of what is being measured. The second is to improve measurement precision or efficiency. The third is to make test administration more convenient for examinees, test sponsors, or both. This entry details some of the advantages of CBTs relative to conventional paper-and-pencil tests and describes some of the methods used for test administration and scoring.

Keywords: computer-based testing; tailored testing


Wiley StatsRef: Statistics Reference Online | 2014

Computer-based Testing†

Tim Davey

Most broadly defined, computer-based tests (CBTs) include not just tests administered on computers or workstations but also exams delivered via telephones, PDAs, and other electronic devices. There have been three main reasons for test developers to move beyond conventional paper-and-pencil administration. The first is to change the nature of what is being measured. The second is to improve measurement precision or efficiency. The third is to make test administration more convenient for examinees, test sponsors, or both. This entry details some of the advantages of CBTs relative to conventional paper-and-pencil tests and describes some of the methods used for test administration and scoring.

Keywords: computer-based testing; tailored testing


Archive | 2002

Considerations in Computer-Based Testing

Cynthia G. Parshall; Judith A. Spray; John C. Kalohn; Tim Davey

In recent years, many tests have begun to be administered on computer. In some cases, tests are developed for or converted to a computer-based format for no better reason than to follow the trend. Computerized exams are frequently perceived as "state of the art" or as automatically better than traditional, standardized, paper-and-pencil exams. These assumptions are clearly not accurate, and a testing program should not elect to computerize an exam without stronger reasons than these. Indeed, there are many challenges inherent in computer-based testing, and development of a computerized exam program should not be undertaken lightly. However, while computerized tests are not intrinsically better than paper-and-pencil tests, computerized test administration does offer some distinct advantages.


ETS Research Report Series | 2012

Evaluation of the e-rater® Scoring Engine for the TOEFL® Independent and Integrated Prompts

Chaitanya Ramineni; Catherine Trapani; David M. Williamson; Tim Davey; Brent Bridgeman


ETS Research Report Series | 2011

Computer-Adaptive Testing for Students With Disabilities: A Review of the Literature

Elizabeth Stone; Tim Davey


Archive | 2006

Designing Computerized Adaptive Tests

Tim Davey; Mary J. Pitoniak


Archive | 1990

Comparison of Two Logistic Multidimensional Item Response Theory Models

Judith A. Spray; Tim Davey; Mark D. Reckase; Terry A. Ackerman; James E. Carlson


ETS Research Report Series | 2012

Evaluation of the e-rater® Scoring Engine for the GRE® Issue and Argument Prompts

Chaitanya Ramineni; Catherine Trapani; David M. Williamson; Tim Davey; Brent Bridgeman
