
Publication


Featured research published by Maureen McEvoy.


BMC Musculoskeletal Disorders | 2005

Reliability of upright posture measurements in primary school children

Maureen McEvoy; Karen Grimmer

Background: Correct upright posture is considered to be a measure of good musculoskeletal health. Little is known about the usual variability of children's upright standing posture. The aim of this study was to assess differences between repeated measures of upright posture in a group of primary school children.

Methods: Sagittal plane photographs of usual, relaxed upright standing posture of 38 boys and girls aged 5–12 years were taken twice within an hour. Reflective markers were placed over the canthus, tragus, C7 spinous process, greater trochanter and lateral malleolus. Digitising software was used to calculate the x,y plane coordinates, from which five postural angles were calculated (trunk, neck, gaze, head on neck, lower limb). Height, weight, motor control estimates (as measured by the Brace Tests) and presence of recent pain were recorded for each child, and the association between the first test measure of posture angles and these factors was assessed using linear regression and ANOVA models. Multiple ANOVA models were applied to analyse the effect of repeated testing and of significant predictors on the angles.

Results: Four of the five postural angles (trunk, neck, head on neck, lower limb) were significantly influenced by age. As age was strongly associated with height (r² = 0.84) and moderately associated with weight and motor control (r² = 0.67 and 0.56 respectively), these developmental parameters may well explain the age effect on angles. There was no relationship between age and pain reported on either the testing day or recently, and there was no gender influence on any angle. There was no significant effect of repeated testing on any angle (ICC > 0.93). None of the hypothesized predictors were associated with differences in angles from repeated testing.

Conclusion: This study outlined the variability of relaxed upright standing posture of children aged 5–12 years, when measured twice in an hour. Age influenced the size of the angles but not the variability. While the subject numbers in this study are small, the findings provide useful information on which further studies in posture and its development in pre-adolescent children can be based.


Medical Teacher | 2010

Development and psychometric testing of a trans-professional evidence-based practice profile questionnaire

Maureen McEvoy; Marie Williams; Tim Olds

Background: Previous survey tools operationalising knowledge, attitudes or beliefs about evidence-based practice (EBP) have shortcomings in content, psychometric properties and target audience. Aims: This study developed and psychometrically assessed a self-report trans-professional questionnaire to describe an EBP profile. Methods: Sixty-six items were collated from existing EBP questionnaires and administered to 526 academics and students from health and non-health backgrounds. Principal component factor analysis revealed the presence of five factors (Relevance, Terminology, Confidence, Practice and Sympathy). Following expert panel review and pilot testing, the 58-item final questionnaire was disseminated to 105 subjects on two occasions. Test–retest and internal reliability were quantified using intra-class correlation coefficients (ICCs) and Cronbach's alpha; convergent validity against a commonly used EBP questionnaire by Pearson's correlation coefficient; and discriminative validity via analysis of variance (ANOVA) based on exposure to EBP training. Results: The final questionnaire demonstrated acceptable internal consistency (Cronbach's alpha 0.96), test–retest reliability (ICCs range 0.77–0.94) and convergent validity (Practice 0.66, Confidence 0.80 and Sympathy 0.54). Three factors (Relevance, Terminology and Confidence) distinguished EBP exposure groups (ANOVA p < 0.001–0.004). Conclusion: The evidence-based practice profile (EBP2) questionnaire is a reliable instrument with the ability to discriminate, for three factors, between respondents with differing EBP exposures.
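The internal consistency figure reported above (Cronbach's alpha 0.96) comes from a standard psychometric formula relating item variances to total-score variance. A minimal sketch in Python, using an invented item-score matrix purely for illustration (not data from the study):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                              # number of items
    item_vars = items.var(axis=0, ddof=1).sum()     # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)       # variance of total scores
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Hypothetical data: three perfectly correlated items, so alpha equals 1.
scores = np.array([[1, 2, 3], [2, 3, 4], [3, 4, 5], [4, 5, 6]], dtype=float)
alpha = cronbach_alpha(scores)
```

In practice a value like 0.96 indicates the questionnaire items vary together closely across respondents.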


BMC Medical Education | 2011

Evidence-based practice profiles of physiotherapists transitioning into the workforce: a study of two cohorts

Maureen McEvoy; Marie Williams; Tim Olds; Lucy K. Lewis; John Petkov

Background: Training in the five steps of evidence-based practice (EBP) has been recommended for inclusion in entry-level health professional training. The effectiveness of EBP education has been explored predominantly in the medical and nursing professions and more commonly in post-graduate than entry-level students. Few studies have investigated longitudinal changes in EBP attitudes and behaviours. This study aimed to assess the changes in EBP knowledge, attitudes and behaviours in entry-level physiotherapy students transitioning into the workforce.

Methods: A prospective, observational, longitudinal design was used, with two cohorts. From 2008, 29 participants were tested in their final year in a physiotherapy program, and after the first and second workforce years. From 2009, 76 participants were tested in their final entry-level and first workforce years. Participants completed an Evidence-Based Practice Profile questionnaire (EBP2), which includes self-report EBP domains [Relevance, Terminology (knowledge of EBP concepts), Confidence, Practice (EBP implementation), Sympathy (disposition towards EBP)]. Mixed model analysis with sequential Bonferroni adjustment was used to analyse the matched data. Effect sizes (ES) (95% CI) were calculated for all changes.

Results: Effect sizes of the changes in EBP domains were small (ES range 0.02 to 0.42). While most changes were not significant, there was a consistent pattern of decline in scores for Relevance in the first workforce year (ES -0.42 to -0.29) followed by an improvement in the second year (ES +0.27). Scores in Terminology improved (ES +0.19 to +0.26) in each of the first two workforce years, while Practice scores declined (ES -0.23 to -0.19) in the first year and improved minimally in the second year (ES +0.04). Confidence scores improved during the second workforce year (ES +0.27). Scores for Sympathy showed little change.

Conclusions: During the first two years in the workforce, there was a transitory decline in the self-reported practice and sense of relevance of EBP, despite increases in confidence and knowledge. The pattern of progression of EBP skills beyond these early professional working years is unknown.
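The effect sizes (ES) quoted above are standardised mean differences. The study used mixed-model analysis on matched data, so the exact computation differs, but one common effect-size statistic, Cohen's d with a pooled standard deviation, can be sketched as follows (the scores are invented for illustration):

```python
import math

def cohens_d(pre, post):
    """Cohen's d: standardised mean difference using the pooled SD."""
    n1, n2 = len(pre), len(post)
    m1, m2 = sum(pre) / n1, sum(post) / n2
    v1 = sum((x - m1) ** 2 for x in pre) / (n1 - 1)    # sample variance, pre
    v2 = sum((x - m2) ** 2 for x in post) / (n2 - 1)   # sample variance, post
    pooled_sd = math.sqrt(((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2))
    return (m2 - m1) / pooled_sd

# Hypothetical domain scores: final entry-level year vs. first workforce year.
d = cohens_d([10, 12, 14, 16], [11, 13, 15, 17])
```

Values in the 0.02–0.42 range reported above would conventionally be read as small effects.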


BMC Medical Education | 2016

Development and validation of the guideline for reporting evidence-based practice educational interventions and teaching (GREET).

Anna Phillips; Lucy K. Lewis; Maureen McEvoy; James Galipeau; Paul Glasziou; David Moher; Julie K. Tilson; Marie Williams

Background: The majority of reporting guidelines assist researchers to report consistent information concerning study design; however, they contain limited information for describing study interventions. Using a three-stage development process, the Guideline for Reporting Evidence-based practice Educational interventions and Teaching (GREET) checklist and accompanying explanatory paper were developed to provide guidance for the reporting of educational interventions for evidence-based practice (EBP). The aim of this study was to complete the final development for the GREET checklist, incorporating psychometric testing to determine inter-rater reliability and criterion validity.

Methods: The final development for the GREET checklist incorporated the results of a prior systematic review and Delphi survey. Thirty-nine items, including all items from the prior systematic review, were proposed for inclusion in the GREET checklist. These 39 items were considered over a series of consensus discussions to determine the inclusion of items in the GREET checklist. The GREET checklist and explanatory paper were then developed and underwent psychometric testing with tertiary health professional students, who evaluated the completeness of the reporting in a published study using the GREET checklist. For each GREET checklist item, the consistency (%) of agreement, both between participants and with the consensus criterion reference measure, was calculated. Criterion validity and inter-rater reliability were analysed using intra-class correlation coefficients (ICC).

Results: Three consensus discussions were undertaken, with 14 items identified for inclusion in the GREET checklist. Following further expert review by the Delphi panelists, three items were added and minor wording changes were completed, resulting in 17 checklist items. Psychometric testing for the updated GREET checklist was completed by 31 participants (n = 11 undergraduate, n = 20 postgraduate). The consistency of agreement between the participant ratings for completeness of reporting and the consensus criterion ratings ranged from 19% for item 4 (Steps of EBP) to 94% for item 16 (Planned delivery). The overall consistency of agreement for criterion validity (ICC 0.73) and inter-rater reliability (ICC 0.96) was good to almost perfect.

Conclusion: The final GREET checklist comprises 17 items which are recommended for reporting EBP educational interventions. Further validation of the GREET checklist with experts in EBP research and education is recommended.


BMC Medical Education | 2014

A systematic review of how studies describe educational interventions for evidence-based practice: stage 1 of the development of a reporting guideline

Anna Phillips; Lucy K. Lewis; Maureen McEvoy; James Galipeau; Paul Glasziou; Marilyn Hammick; David Moher; Julie K. Tilson; Marie Williams

Background: The aim of this systematic review was to identify which information is included when reporting educational interventions used to facilitate foundational skills and knowledge of evidence-based practice (EBP) training for health professionals. This systematic review comprised the first stage in the three-stage development process for a reporting guideline for educational interventions for EBP.

Methods: The review question was 'What information has been reported when describing educational interventions targeting foundational evidence-based practice knowledge and skills?' MEDLINE, Academic Search Premier, ERIC, CINAHL, Scopus, Embase, Informit health, Cochrane Library and Web of Science databases were searched from inception until October to December 2011. Randomised and non-randomised controlled trials reporting original data on educational interventions specific to developing foundational knowledge and skills of evidence-based practice were included. Studies were not appraised for methodological bias; however, reporting frequency and item commonality were compared between a random selection of studies included in the systematic review and a random selection of studies excluded because they were not controlled trials. Twenty-five data items were extracted by two independent reviewers (consistency > 90%).

Results: Sixty-one studies met the inclusion criteria (n = 29 randomised, n = 32 non-randomised). The most consistently reported items were the learner's stage of training, professional discipline and the evaluation methods used (100%). The least consistently reported items were the instructor(s)' previous teaching experience (n = 8, 13%) and student effort outside face-to-face contact (n = 1, 2%).

Conclusion: This systematic review demonstrates inconsistencies in describing educational interventions for EBP in randomised and non-randomised trials. To enable educational interventions to be replicable and comparable, improvements in the reporting of educational interventions for EBP are required. In the absence of a specific reporting guideline, there is a range of items which are reported with variable frequency. The important items for describing educational interventions for facilitating foundational knowledge and skills in EBP remain to be determined. The findings of this systematic review will be used to inform the next stage in the development of a reporting guideline for educational interventions for EBP.


BMC Medical Education | 2013

Protocol for development of the guideline for reporting evidence based practice educational interventions and teaching (GREET) statement.

Anna Phillips; Lucy K. Lewis; Maureen McEvoy; James Galipeau; Paul Glasziou; Marilyn Hammick; David Moher; Julie K. Tilson; Marie Williams

Background: There is an increasing number of studies reporting the efficacy of educational strategies to facilitate the development of knowledge and skills underpinning evidence based practice (EBP). To date there is no standardised guideline for describing the teaching, evaluation, context or content of EBP educational strategies. The heterogeneity in the reporting of EBP educational interventions makes comparisons between studies difficult. The aim of this program of research is to develop the Guideline for Reporting EBP Educational interventions and Teaching (GREET) statement and an accompanying explanation and elaboration (E&E) paper.

Methods/Design: Three stages are planned for the development process. Stage one will comprise a systematic review to identify features commonly reported in descriptions of EBP educational interventions. In stage two, corresponding authors of articles included in the systematic review and the editors of the journals in which these studies were published will be invited to participate in a Delphi process to reach consensus on items to be considered when reporting EBP educational interventions. The final stage of the project will include the development and pilot testing of the GREET statement and E&E paper.

Outcome: The final outcome will be the creation of a Guideline for Reporting EBP Educational interventions and Teaching (GREET) statement and E&E paper.

Discussion: The reporting of health research, including EBP educational research interventions, has been criticised for a lack of transparency and completeness. The development of the GREET statement will enable the standardised reporting of EBP educational research. This will provide a guide for researchers, reviewers and publishers for reporting EBP educational interventions.


BMC Medical Education | 2014

A Delphi survey to determine how educational interventions for evidence-based practice should be reported: Stage 2 of the development of a reporting guideline

Anna Phillips; Lucy K. Lewis; Maureen McEvoy; James Galipeau; Paul Glasziou; Marilyn Hammick; David Moher; Julie K. Tilson; Marie Williams

Background: Undertaking a Delphi exercise is recommended during the second stage in the development process for a reporting guideline. To continue the development of the Guideline for Reporting Evidence-based practice Educational interventions and Teaching (GREET), a Delphi survey was undertaken to determine the consensus opinion of researchers, journal editors and educators in evidence-based practice (EBP) regarding the information items that should be reported when describing an educational intervention for EBP.

Methods: A four-round online Delphi survey was conducted from October 2012 to March 2013. The Delphi panel comprised international researchers, educators and journal editors in EBP. Commencing with an open-ended question, participants were invited to volunteer information considered important when reporting educational interventions for EBP. Over three subsequent rounds participants were invited to rate the importance of each of the Delphi items using an 11-point Likert rating scale (low 0 to 4, moderate 5 to 6, high 7 to 8 and very high >8). Consensus agreement was set a priori as at least 80 per cent participant agreement. Consensus agreement was initially calculated within the four categories of importance (low to very high), prior to these four categories being merged into two (<7 and ≥7). Descriptive statistics for each item were computed, including the mean Likert scores, standard deviation (SD), range and median participant scores. Mean absolute deviation from the median (MAD-M) was also calculated as a measure of participant disagreement.

Results: Thirty-six experts agreed to participate and 27 (79%) participants completed all four rounds. A total of 76 information items were generated across the four survey rounds. Thirty-nine items (51%) were specific to describing the intervention (as opposed to other elements of study design) and consensus agreement was achieved for two of these items (5%). When the four rating categories were merged into two (<7 and ≥7), 18 intervention items achieved consensus agreement.

Conclusion: This Delphi survey has identified 39 items for describing an educational intervention for EBP. These Delphi intervention items will provide the groundwork for the subsequent consensus discussion to determine the final inclusion of items in the GREET, the first reporting guideline for educational interventions in EBP.
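The consensus rule described above (at least 80 per cent of panellists rating an item ≥7 once the categories were merged into <7 and ≥7) is simple to express in code. A minimal sketch with hypothetical panel ratings, not data from the survey:

```python
def reaches_consensus(ratings, threshold=7, agreement=0.80):
    """True if at least `agreement` of panellists rated the item
    at or above `threshold` on the 11-point (0-10) Likert scale."""
    high = sum(1 for r in ratings if r >= threshold)
    return high / len(ratings) >= agreement

# Hypothetical ratings from a ten-person panel for two items.
item_a = reaches_consensus([8, 9, 7, 10, 8, 6, 7, 9, 8, 7])  # 9/10 rated >= 7
item_b = reaches_consensus([5, 6, 7, 8, 4, 3, 7, 6, 5, 6])   # 3/10 rated >= 7
```

Merging the four rating categories into two makes the threshold easier to cross, which is consistent with the jump from 2 to 18 items achieving consensus.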


Developmental Medicine & Child Neurology | 2013

The reliability and validity of a research-grade pedometer for children and adolescents with cerebral palsy

Carol Maher; Amanda Kenyon; Maureen McEvoy; Judy Sprod

The aim of this study was to determine the reliability, validity, and optimal placement of pedometers in children with cerebral palsy (CP) who ambulate without aids.


Physiotherapy Theory and Practice | 2008

Transversus abdominis: Changes in thickness during an incremental upper limb exercise test

Maureen McEvoy; Adrian J Cowling; Ian Fulton; Marie Williams

The aim of this study was to measure transversus abdominis (TrA) during an incremental fatiguing task. Using real-time ultrasound, TrA thickness was measured in 26 healthy subjects (18–25 years, 9 male) during an unsupported upper limb exercise test (UULEX). Repeatability of changes in TrA thickness during the UULEX was established by using a test-retest process (n=9, intraclass correlation coefficient=0.62 (95% CI 0.38–0.82), standard error of measurement ∼1 (95% CI 0.87–1.08)). Using mixed model analysis with time as an independent variable, TrA thickness changed significantly throughout the UULEX (p < 0.05). Measures of TrA thickness at minutes 10, 11, and 12 were significantly greater than at baseline (p=0.006 (95% CI 0.23–1.35), 0.001 (95% CI 0.45–1.61), and <0.0001 (95% CI 0.77–2.03), respectively). Transversus abdominis was shown to be continuously and increasingly active over the 12 minutes of an incremental bilateral upper limb test in young healthy adults. As increases in TrA thickness occurred at the points of greatest postural and ventilatory demands, these findings may have implications for subjects with musculoskeletal or respiratory impairments who are often challenged by upper limb tasks.


Teaching and Learning in Medicine | 2016

How Comprehensively Is Evidence-Based Practice Represented in Australian Health Professional Accreditation Documents?: A Systematic Audit

Maureen McEvoy; Mike Crilly; Taryn Young; Jane Farrelly; Lucy K. Lewis

Phenomenon: In many developed countries, accreditation documents, which reflect the practice standards of health professions, form the basis for evaluation of education programs for meeting the requirements for registration. The 2005 Sicily statement proposed a 5-step model of training in evidence-based practice (ask, access, appraise, apply, and assess). A key recommendation was that evidence-based practice should be incorporated into entry-level health professional training and registration. No previous research has assessed the extent to which this has occurred. Approach: We undertook a systematic audit of the accreditation documents for the registered health professions in Australia. The 11 health professional disciplines included in the audit were medicine, nursing and midwifery, pharmacy, physiotherapy, dentistry, psychology, occupational therapy, optometry, podiatry, osteopathy, and chiropractic. Two investigators independently identified the occurrence of the term evidence that related to “evidence-based practice” and the occurrences of terms related to the 5 steps in the accreditation documents. Findings: Occurrence of the term evidence as it relates specifically to “evidence-based practice” ranged from 0 (pharmacy, dentistry and occupational therapy) to 8 (physiotherapy) in the accreditation documents. Overall, there were 77 occasions when terms relating to any of the 5 steps of evidence-based practice were used across all 11 accreditation documents. All 5 steps were included in the physiotherapy and psychology documents; 4 steps in medicine and optometry; 3 steps in pharmacy; 2 steps each in documents for chiropractic, osteopathy, and podiatry; and 1 step for nursing. There was no inclusion of terms relating to any of the 5 steps in the dentistry and occupational therapy documents. 
Insights: Terminology relating explicitly to evidence-based practice and to the 5 steps of evidence-based practice appears to be lacking in the accreditation documents for health professions registered in Australia. This is not necessarily reflective of the curricular content or quality, or dedication to evidence-based practice teaching. However, recognition and demand by accreditation bodies for skills in evidence-based practice may act as a driver for education providers to give greater priority to embedding this training in entry-level programs. Consequently, accreditation bodies are powerfully positioned to shape future directions, focus, and boundaries within and across professions. Future international audits of accreditation documents could provide insight into the global breadth of this phenomenon and contribute to closer scrutiny of the representation of evidence-based practice in future iterations of accreditation documents.

Collaboration


Dive into Maureen McEvoy's collaboration.

Top Co-Authors

Marie Williams

University of South Australia

Lucy K. Lewis

University of South Australia

Anna Phillips

University of South Australia


James Galipeau

Ottawa Hospital Research Institute

Julie K. Tilson

University of Southern California

Louise Wiles

University of South Australia

Tim Olds

University of South Australia

David Moher

Ottawa Hospital Research Institute

Karen Grimmer

University of South Australia
