Publications


Featured research published by Elizabeth A. Hahn.


Journal of Clinical Epidemiology | 2010

The Patient-Reported Outcomes Measurement Information System (PROMIS) developed and tested its first wave of adult self-reported health outcome item banks: 2005–2008

David Cella; William T. Riley; Arthur A. Stone; Nan Rothrock; Bryce B. Reeve; Susan Yount; Dagmar Amtmann; Rita K. Bode; Daniel J. Buysse; Seung W. Choi; Karon F. Cook; Robert F. DeVellis; Darren A. DeWalt; James F. Fries; Richard Gershon; Elizabeth A. Hahn; Jin Shei Lai; Paul A. Pilkonis; Dennis A. Revicki; Matthias Rose; Kevin P. Weinfurt; Ron D. Hays

OBJECTIVES: Patient-reported outcomes (PROs) are essential when evaluating many new treatments in health care; yet, current measures have been limited by a lack of precision, standardization, and comparability of scores across studies and diseases. The Patient-Reported Outcomes Measurement Information System (PROMIS) provides item banks that offer the potential for efficient (minimizes item number without compromising reliability), flexible (enables optional use of interchangeable items), and precise (has minimal error in estimate) measurement of commonly studied PROs. We report results from the first large-scale testing of PROMIS items.
STUDY DESIGN AND SETTING: Fourteen item pools were tested in the U.S. general population and clinical groups using an online panel and clinic recruitment. A scale-setting subsample was created reflecting demographics proportional to the 2000 U.S. census.
RESULTS: Using item response theory (graded response model), 11 item banks were calibrated on a sample of 21,133, measuring components of self-reported physical, mental, and social health, along with a 10-item Global Health Scale. Short forms from each bank were developed and compared with the overall bank and with other well-validated and widely accepted (legacy) measures. All item banks demonstrated good reliability across most of the score distributions. Construct validity was supported by moderate to strong correlations with legacy measures.
CONCLUSION: PROMIS item banks and their short forms provide evidence that they are reliable and precise measures of generic symptoms and functional reports comparable to legacy instruments. Further testing will continue to validate and test PROMIS items and banks in diverse clinical populations.
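
For readers unfamiliar with the graded response model used for calibration above: for each polytomous item it models the cumulative probability of responding at or above each category as a logistic function of the latent trait, and category probabilities are differences of adjacent cumulative probabilities. A minimal Python sketch follows; the discrimination and threshold values are illustrative only, not PROMIS calibrations.

import numpy as np

def grm_category_probs(theta, a, b):
    # Graded response model: probability of each response category.
    # theta: latent trait score; a: item discrimination;
    # b: ordered category thresholds (length K-1 for K categories).
    b = np.asarray(b, dtype=float)
    # Cumulative probability of responding in category k or higher
    p_star = 1.0 / (1.0 + np.exp(-a * (theta - b)))
    # Pad with P(>= lowest) = 1 and P(> highest) = 0, then take differences
    upper = np.concatenate(([1.0], p_star))
    lower = np.concatenate((p_star, [0.0]))
    return upper - lower

# Illustrative 5-category item (made-up parameters)
print(grm_category_probs(theta=0.5, a=1.8, b=[-1.5, -0.5, 0.4, 1.3]))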


Quality of Life Research | 2010

Measuring social health in the patient-reported outcomes measurement information system (PROMIS): item bank development and testing

Elizabeth A. Hahn; Robert F. DeVellis; Rita K. Bode; Sofia F. Garcia; Liana D. Castel; Susan V. Eisen; Hayden B. Bosworth; Allen W. Heinemann; Nan Rothrock; David Cella

Purpose: To develop a social health measurement framework, to test items in diverse populations and to develop item response theory (IRT) item banks.
Methods: A literature review guided framework development of Social Function and Social Relationships sub-domains. Items were revised based on patient feedback, and Social Function items were field-tested. Analyses included exploratory factor analysis (EFA), confirmatory factor analysis (CFA), two-parameter IRT modeling and evaluation of differential item functioning (DIF).
Results: The analytic sample included 956 general population respondents who answered 56 Ability to Participate and 56 Satisfaction with Participation items. EFA and CFA identified three Ability to Participate sub-domains. However, because of positive and negative wording, and content redundancy, many items did not fit the IRT model, so item banks do not yet exist. EFA, CFA and IRT identified two preliminary Satisfaction item banks. One item exhibited trivial age DIF.
Conclusion: After extensive item preparation and review, EFA-, CFA- and IRT-guided item banks help provide increased measurement precision and flexibility. Two Satisfaction short forms are available for use in research and clinical practice. This initial validation study resulted in revised item pools that are currently undergoing testing in new clinical samples and populations.
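
The abstract mentions evaluating differential item functioning (DIF); the specific procedure is not described here, but a common, simple screen is to compare nested logistic regression models for an item with and without a group term. The sketch below uses that approach for a dichotomous item with statsmodels; the function name and simulated data are illustrative, not from the study.

import numpy as np
import statsmodels.api as sm

def dif_logistic_screen(item_response, total_score, group):
    # Compare nested logistic models:
    #   item ~ total_score           (no DIF)
    #   item ~ total_score + group   (uniform DIF if the group term matters)
    # Returns the likelihood-ratio chi-square for adding the group term (1 df).
    X0 = sm.add_constant(np.column_stack([total_score]))
    X1 = sm.add_constant(np.column_stack([total_score, group]))
    m0 = sm.Logit(item_response, X0).fit(disp=0)
    m1 = sm.Logit(item_response, X1).fit(disp=0)
    return 2 * (m1.llf - m0.llf)

# Purely simulated example data
rng = np.random.default_rng(0)
n = 500
group = rng.integers(0, 2, n)                      # e.g., younger vs. older respondents
total = rng.normal(0, 1, n)                        # matching/conditioning score
item = (rng.random(n) < 1 / (1 + np.exp(-total))).astype(int)
print(dif_logistic_screen(item, total, group))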


Quality of Life Research | 2010

Patient-reported outcomes measurement information system (PROMIS) domain names and definitions revisions: further evaluation of content validity in IRT-derived item banks.

William T. Riley; Nan Rothrock; Bonnie Bruce; Christopher Christodolou; Karon F. Cook; Elizabeth A. Hahn; David Cella

Purpose: Content validity of patient-reported outcomes (PROs) is evaluated primarily during item development, but subsequent psychometric analyses, particularly for item response theory (IRT)-derived scales, often result in considerable item pruning and potential loss of content. After selecting items for the PROMIS banks based on psychometric and content considerations, we invited external content expert reviews of the degree to which the initial domain names and definitions represented the calibrated item bank content.
Methods: A minimum of four content experts reviewed each item bank and recommended a domain name and definition based on item content. Domain names and definitions then were revealed to the experts, who rated how well these names and definitions fit the bank content and provided recommendations for definition revisions.
Results: These reviews indicated that the PROMIS domain names and definitions remained generally representative of bank content following item pruning, but modifications to two domain names and minor to moderate revisions of all domain definitions were needed to optimize fit with the item bank content.
Conclusions: This reevaluation of domain names and definitions following psychometric item pruning, although not previously documented in the literature, appears to be an important procedure for refining conceptual frameworks and further supporting content validity.


Archives of Physical Medicine and Rehabilitation | 2015

Environmental Barriers and Supports to Everyday Participation: A Qualitative Insider Perspective From People With Disabilities

Joy Hammel; Susan Magasi; Allen W. Heinemann; David B. Gray; Susan Stark; Pamela A. Kisala; Noelle E. Carlozzi; David S. Tulsky; Sofia F. Garcia; Elizabeth A. Hahn

OBJECTIVE: To describe environmental factors that influence participation of people with disabilities.
DESIGN: Constant comparative, qualitative analyses of transcripts from 36 focus groups across 5 research projects.
SETTING: Home, community, work, and social participation settings.
PARTICIPANTS: Community-dwelling people (N=201) with diverse disabilities (primarily spinal cord injury, traumatic brain injury, and stroke) from 8 states.
INTERVENTIONS: None.
MAIN OUTCOME MEASURES: Environmental barriers and supports to participation.
RESULTS: We developed a conceptual framework to describe how environmental factors influence the participation of people with disabilities, highlighting 8 domains of environmental facilitators and barriers (built, natural, assistive technology, transportation, information and technology access, social support and attitudes, systems and policies, economics) and a transactional model showing the influence of environmental factors on participation at the micro (individual), meso (community), and macro (societal) levels. Focus group data validated some International Classification of Functioning, Disability and Health environmental categories while also bringing unique factors (eg, information and technology access, economic quality of life) to the fore. Data were used to construct items to enable people with disabilities to assess the impact of environmental factors on everyday participation from their firsthand experience.
CONCLUSIONS: Participants with disabilities voiced the need to evaluate the impact of the environment on their participation at the immediate, community, and societal levels. The results have implications for assessing environmental facilitators and barriers to participation within rehabilitation and community settings, evaluating outcomes of environmental interventions, and effecting system and policy changes to target environmental barriers that may result in societal participation disparities versus opportunities.


Cancer | 2009

Initial report of the cancer patient-reported outcomes measurement information system (PROMIS) sexual function committee: Review of sexual function measures and domains used in oncology

Diana D. Jeffery; Janice P. Tzeng; Francis J. Keefe; Laura S. Porter; Elizabeth A. Hahn; Kathryn E. Flynn; Bryce B. Reeve; Kevin P. Weinfurt

For this report, the authors described the initial activities of the Cancer Patient‐Reported Outcomes Measurement Information System (PROMIS)‐Sexual Function domain group, which is part of the National Institutes of Health Roadmap Initiative to develop brief questionnaires or individually tailored assessments of quality‐of‐life domains. Presented are a literature review of sexual function measures used in cancer populations and descriptions of the domains found in those measures. By using a consensus‐driven approach, an electronic bibliographic search was conducted for articles that were published from 1991 to 2007, and 486 articles were identified for in‐depth review. In total, 257 articles reported the administration of a psychometrically evaluated sexual function measure to individuals who were diagnosed with cancer. Apart from the University of California‐Los Angeles Prostate Cancer Index, the International Index of Erectile Function, and the Female Sexual Function Index, the 31 identified measures have not been tested widely in cancer populations. Most measures were multidimensional and included domains related to the sexual response cycle and to general sexual satisfaction. The current review supports the need for a flexible, psychometrically robust measure of sexual function for use in oncology settings and strongly justifies the development of the PROMIS‐Sexual Function instrument. When the PROMIS‐Sexual Function instrument is available publicly, cancer clinicians and researchers will have another measure with which to assess patient‐reported sexual function outcomes in addition to the few legacy measures that were identified through this review. Cancer 2009.


The Annals of Thoracic Surgery | 2008

Absence of Cognitive Decline One Year After Coronary Bypass Surgery: Comparison to Nonsurgical and Healthy Controls

Jerry J. Sweet; Eileen Finnin; Penny L. Wolfe; Jennifer L. Beaumont; Elizabeth A. Hahn; Jesse H. Marymont; Timothy A. Sanborn; Todd K. Rosengart

BACKGROUND: Cognitive decline after open-heart surgery has been the subject of a number of conflicting reports in recent years. Determination of possible cognitive impairment due to surgery or use of cardiopulmonary bypass is complicated by numerous factors, including use of appropriate comparison groups and consideration of practice effects in cognitive testing.
METHODS: Neuropsychological data were gathered from 46 healthy controls, 42 cardiac patients referred for percutaneous coronary intervention (PCI), and 43 cardiac patients referred for coronary artery bypass grafting (CABG). Fourteen cognitive function tests were utilized at baseline and at three time points after surgery (3 weeks, 4 months, 1 year). Measures showing acceptable test-retest reliability based on intraclass correlations were compared using regression-based reliable change indices.
RESULTS: No clear pattern of group differences or change at follow-up emerged. A greater percentage of CABG patients than controls worsened in seven tests (three at 1 year), but a greater percentage of PCI patients than controls also worsened in seven tests (three at 1 year). Generalized estimating equations showed only two tests (Wechsler Adult Intelligence Scale, Third Edition, Digit Symbol, and Hopkins Verbal Learning Test, Revised, Total Recall) to be significantly different between groups from baseline to 1 year. Interestingly, compared with healthy controls, more PCI patients than CABG patients worsened in the former of those two tests, whereas more PCI and CABG patients improved on the latter.
CONCLUSIONS: Using healthy controls and a relevant nonsurgical comparison group to contend with important methodological considerations, current CABG procedure does not appear to create cognitive decline.
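
The regression-based reliable change indices mentioned in the methods are typically computed by regressing follow-up scores on baseline scores within the control group and then expressing an individual patient's follow-up score as a z-score around the value that regression predicts. The exact formulation used in the study is not given in this abstract; the sketch below shows one common version with invented numbers.

import numpy as np

def reliable_change_z(baseline_ctrl, followup_ctrl, baseline_pt, followup_pt):
    # Fit follow-up ~ baseline in the control group.
    slope, intercept = np.polyfit(baseline_ctrl, followup_ctrl, 1)
    predicted_ctrl = intercept + slope * np.asarray(baseline_ctrl)
    # Standard error of estimate from the control-group residuals.
    see = np.sqrt(np.mean((np.asarray(followup_ctrl) - predicted_ctrl) ** 2))
    # z-score of the patient's observed follow-up vs. the predicted value;
    # |z| beyond roughly 1.645 is often taken as reliable change.
    predicted_pt = intercept + slope * baseline_pt
    return (followup_pt - predicted_pt) / see

# Invented control data with a small practice effect (not data from the study)
rng = np.random.default_rng(1)
base = rng.normal(50, 10, 46)
follow = base + rng.normal(2, 5, 46)
print(reliable_change_z(base, follow, baseline_pt=55, followup_pt=45))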


Quality of Life Research | 2015

Validation of the PROMIS physical function measures in a diverse US population-based cohort of cancer patients

Roxanne E. Jensen; Arnold L. Potosky; Bryce B. Reeve; Elizabeth A. Hahn; David Cella; James F. Fries; Ashley Wilder Smith; Theresa H.M. Keegan; Xiao-Cheng Wu; Lisa E. Paddock; Carol M. Moinpour

Purpose: To evaluate the validity of the Patient-Reported Outcomes Measurement Information System (PROMIS) physical function measures in a diverse, population-based cancer sample.
Methods: Cancer patients 6–13 months post-diagnosis (n = 4840) were recruited for the Measuring Your Health study. Participants were diagnosed between 2010 and 2013 with non-Hodgkin lymphoma or cancers of the colorectum, lung, breast, uterus, cervix, or prostate. Four PROMIS physical function short forms (4a, 6b, 10a, and 16) were evaluated for validity and reliability across age and race–ethnicity groups. Covariates included gender, marital status, education level, cancer site and stage, comorbidities, and functional status.
Results: PROMIS physical function short forms showed high internal consistency (Cronbach's α = 0.92–0.96), convergent validity (fatigue, pain interference, FACT physical well-being all r ≥ 0.68), and discriminant validity (unrelated domains all r ≤ 0.3) across survey short forms, age, and race–ethnicity. Known-group differences by demographic, clinical, and functional characteristics performed as hypothesized. Ceiling effects for higher-functioning individuals were identified on most forms.
Conclusions: This study provides strong evidence that PROMIS physical function measures are valid and reliable in multiple race–ethnicity and age groups. Researchers selecting specific PROMIS short forms should consider the degree of functional disability in their patient population to ensure that length and content are tailored to limit response burden.
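
The internal consistency figures quoted above are Cronbach's alpha values; alpha for a short form is a simple function of the item variances and the variance of the summed score. A self-contained sketch with toy responses (not study data):

import numpy as np

def cronbach_alpha(item_scores):
    # item_scores: (n_respondents x n_items) matrix of item responses.
    item_scores = np.asarray(item_scores, dtype=float)
    k = item_scores.shape[1]
    item_vars = item_scores.var(axis=0, ddof=1)          # per-item variances
    total_var = item_scores.sum(axis=1).var(ddof=1)      # variance of summed score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Toy responses to a hypothetical 4-item short form
scores = np.array([[4, 5, 4, 5],
                   [2, 2, 3, 2],
                   [5, 5, 5, 4],
                   [3, 2, 2, 3],
                   [1, 1, 2, 1]])
print(cronbach_alpha(scores))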


Health Psychology | 2014

New English and Spanish social health measures will facilitate evaluating health determinants.

Elizabeth A. Hahn; Darren A. DeWalt; Rita K. Bode; Sofia F. Garcia; Robert F. DeVellis; Helena Correia; David Cella

OBJECTIVE: To develop psychometrically sound, culturally relevant, and linguistically equivalent English and Spanish self-report measures of social health guided by a comprehensive conceptual model and applicable across chronic illnesses.
METHODS: The Patient-Reported Outcomes Measurement Information System (PROMIS) Social Health Workgroup implemented a mixed methods approach to evaluate earlier results (v1.0); expand and refine domain definitions and items; translate items into Spanish; and obtain qualitative feedback. Computer-based and paper/pencil questionnaire administration was conducted with a variety of U.S. respondent samples during 2009-2012. Analyses included exploratory factor analysis (EFA), confirmatory factor analysis (CFA), two-parameter logistic item response theory (IRT) modeling, evaluation of differential item functioning (DIF), and evaluation of criterion and construct validity.
RESULTS: Qualitative feedback supported the conceptualization of the Social Health domain framework (Social Function and Social Relationships subcomponents). Validation testing participants (n = 2,208 English; n = 644 Spanish) were diverse in terms of gender, age, education, and ethnicity/race. EFA, CFA, and IRT identified 7 unidimensional factors with good model fit. There was no DIF by language, and good evidence of criterion and construct validity.
CONCLUSIONS: PROMIS English and Spanish language instruments (v2.0), including computer-adaptive tests and fixed-length short forms, are publicly available for assessment of Social Function (Ability to Participate in Social Roles and Activities, and Satisfaction with Social Roles and Activities) and Social Relationships (Companionship; Emotional, Informational and Instrumental Support; and Social Isolation). Measures of social health will play a key role in applications that use ecologic (or determinants of health) models that emphasize how patients' social environments influence their health.
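
The computer-adaptive tests mentioned in the conclusion select each successive item to be maximally informative at the respondent's current trait estimate. The PROMIS banks use polytomous items, but the idea is easiest to see with dichotomous two-parameter items; the sketch below is that simplification, with made-up item parameters rather than actual calibrations.

import numpy as np

def pick_next_item(theta_hat, a, b, administered):
    # a, b: arrays of item discriminations and difficulties.
    # administered: boolean mask of items already given.
    p = 1.0 / (1.0 + np.exp(-a * (theta_hat - b)))
    info = a ** 2 * p * (1 - p)        # Fisher information of each item at theta_hat
    info[administered] = -np.inf       # never re-administer an item
    return int(np.argmax(info))        # index of the most informative unused item

# Illustrative four-item bank (parameters are not PROMIS calibrations)
a = np.array([1.2, 0.8, 1.9, 1.5])
b = np.array([-1.0, 0.0, 0.3, 1.2])
used = np.array([False, False, True, False])
print(pick_next_item(theta_hat=0.2, a=a, b=b, administered=used))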


Journal of Health Communication | 2011

Health Literacy Assessment Using Talking Touchscreen Technology (Health LiTT): A New Item Response Theory-Based Measure of Health Literacy

Elizabeth A. Hahn; Seung W. Choi; James W. Griffith; Kathleen J. Yost; David W. Baker

The importance of health literacy has grown considerably among researchers, clinicians, patients, and policymakers. Better instruments and measurement strategies are needed. Our objective was to develop a new health literacy instrument using novel health information technology and modern psychometrics. We designed Health LiTT as a self-administered multimedia touchscreen test based on item response theory (IRT) principles. We enrolled a diverse group of 619 English-speaking, primary care patients in clinics for underserved patients. We tested three item types (prose, document, quantitative) that worked well together to reliably measure a single dimension of health literacy. The Health LiTT score meets psychometric standards (reliability of 0.90 or higher) for measurement of individual respondents in the low to middle range. Mean Health LiTT scores were associated with age, race/ethnicity, education, income, and prior computer use (p < .05). We created an IRT-calibrated item bank of 82 items. Standard setting needs to be performed to classify and map items onto the construct and to identify measurement gaps. We are incorporating Health LiTT into an existing online research management tool. This will enable administration of Health LiTT on the same touchscreen used for other patient-reported outcomes, as well as real-time scoring and reporting of health literacy scores.
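
The "reliability of 0.90 or higher" criterion has a direct IRT interpretation: when the latent trait is scaled to unit variance, the standard error of a trait estimate is 1/sqrt(test information), and conditional reliability at that trait level is approximately 1 minus the squared standard error, so a test information of 10 corresponds to reliability of about 0.90. A one-line illustration (the function is illustrative, not part of Health LiTT):

import numpy as np

def conditional_reliability(test_information):
    # With the trait scaled to variance 1: SE = 1/sqrt(information),
    # and reliability at that trait level is roughly 1 - SE**2.
    se = 1.0 / np.sqrt(test_information)
    return 1.0 - se ** 2

print(conditional_reliability(10.0))   # -> 0.9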


Archives of Physical Medicine and Rehabilitation | 2015

Environmental factors item development for persons with stroke, traumatic brain injury, and spinal cord injury.

Allen W. Heinemann; Susan Magasi; Joy Hammel; Noelle E. Carlozzi; Sofia F. Garcia; Elizabeth A. Hahn; Jin Shei Lai; David S. Tulsky; David B. Gray; Holly Hollingsworth

OBJECTIVES: To describe methods used in operationalizing environmental factors; to describe the results of a research project to develop measures of environmental factors that affect participation; and to define an initial item set of facilitators and barriers to participation after stroke, traumatic brain injury, and spinal cord injury.
DESIGN: Instrument development included an extensive literature review, item classification and selection, item writing, and cognitive testing following the approach of the Patient-Reported Outcomes Measurement Information System.
SETTING: Community.
PARTICIPANTS: Content area and outcome measurement experts (n=10) contributed to instrument development; individuals (n=200) with the target conditions participated in focus groups and in cognitive testing (n=15).
INTERVENTIONS: None.
MAIN OUTCOME MEASURES: Environmental factor items were categorized in 6 domains: assistive technology; built and natural environment; social environment; services, systems, and policies; access to information and technology; and economic quality of life.
RESULTS: We binned 2273 items across the 6 domains, winnowed this pool to 291 items for cognitive testing, and recommended 274 items for pilot data collection.
CONCLUSIONS: Five of the 6 domains correspond closely to the International Classification of Functioning, Disability and Health taxonomy of environmental factors; the sixth domain, economic quality of life, reflects financial resources that affect participation, an important construct. Testing with a new and larger sample is underway to evaluate reliability, validity, and sensitivity.

Collaboration


Dive into Elizabeth A. Hahn's collaborations.

Top Co-Authors

David Cella (Northwestern University)
Jin Shei Lai (Northwestern University)
Rita K. Bode (Northwestern University)
David S. Tulsky (University of Medicine and Dentistry of New Jersey)
Susan Magasi (University of Illinois at Chicago)