
Publication


Featured research published by Suzan Dojeiji.


Medical Teacher | 2004

Communication skills, cultural challenges and individual support: challenges of international medical graduates in a Canadian healthcare environment

Pippa Hall; Erin Keely; Suzan Dojeiji; Anna Byszewski; Meridith Marks

Physicians require good communication skills to develop effective patient–physician relationships. Externally funded international medical graduates (IMGs) move directly from their home countries to complete residency training at the University of Ottawa, Canada. They must learn quickly how to work with patients, families and colleagues. A detailed needs assessment was designed to assess IMGs' communication skill needs through focus groups, interviews and surveys with IMGs, program directors, allied healthcare professionals and experts in communication skills. There was a high degree of consensus amongst all participants concerning specific educational needs for communication skills and training issues related to the healthcare system for externally funded IMGs. Specific recommendations include (1) English-language skills; (2) how to get things done in the hospital/healthcare system; (3) opportunities to practise specific skills, e.g. negotiating treatment; (4) an adequate support system for IMGs; (5) faculty and staff education on the cultural challenges faced by IMGs.


Academic Medicine | 2002

Can written communication skills be tested in an objective structured clinical examination format?

Erin Keely; Kathryn Myers; Suzan Dojeiji

Purpose To design and evaluate an objective structured clinical examination (OSCE) station on dictating a consult letter as part of a formative OSCE for internal medicine residents. Method A 22-minute station for the dictation of a consult letter was included in a ten-station OSCE. Two raters completed a 34-item rating scale for 36 letters. The rating scale covered content and style. The station's score was derived from an overall rating of each section of the letter (history, physical examination, impression/plan) and a global rating of the complete letter. The exam also contained a physical exam station on the same patient problem and a verbal communication station. Residents provided written feedback following the station. Results The inter-rater reliability for the station's score was .72. The generalizability coefficient for the two-rater, four-part rating scale was .79. The correlation between the consult letter score and the total exam score was .56, the highest for all ten stations. A significant correlation existed between the verbal communication station's score and the letter station's score (r = .37, p < .005), but no correlation occurred between the physical exam station's score on a similar patient problem and the letter station's score. The feedback from residents was favorable regarding the amount of information provided and the time allotted. Conclusion An OSCE station is a feasible way to examine written communication skills. Letter-writing skills appear to be distinct from knowledge (physical exam station), but somewhat linked to verbal communication skills.


BMC Medical Education | 2007

Peer assessment of outpatient consultation letters – feasibility and satisfaction

Erin Keely; Kathryn Myers; Suzan Dojeiji; Craig Campbell

Background Written correspondence is one of the most important forms of communication between health care providers, yet there is little feedback provided to specialists. The objective of this study was to determine the feasibility of, and satisfaction with, a peer assessment program on consultation letters, and to determine inter-rater reliability between family physicians and specialists. Methods A rating scale of nine 5-point Likert scale items, including specific content and style items, the educational value of the letter and an overall rating, was developed from a previously validated tool. Nine Internal Medicine specialists/subspecialists from two tertiary care centres submitted 10 letters with patient and physician identifiers removed. Two Internal Medicine specialists and 2 family physicians from the other centre rated each letter (to protect writer anonymity). A satisfaction survey was sent to each writer and rater after collation of the results. A follow-up survey was sent 6–8 months later. Results There was a high degree of satisfaction with the process and feedback. The rating scale information was felt to be useful and appropriate for evaluating the quality of consultation letters by 6/7 writers. 5/7 writers felt that the feedback they received resulted in immediate changes to their letters. Six months later, 6/9 writers indicated they had maintained changes in their letters. Raters rank-ordered letters similarly (Cronbach's alpha 0.57–0.84) but mean scores were highly variant. At site 1 there were significant differences in scoring brevity (p < 0.01) between family physician and specialist raters; whereas, at site 2 there were differences in scoring of history (p < 0.01), physical examination (p < 0.01) and educational value (p < 0.01) of the letter. Conclusion Most participants found peer assessment of letters feasible and beneficial, and long-standing changes occurred in some individuals. Family physicians and specialists appear to have different expectations on some items. Further studies on reliability and validity, with a larger sample, are required before high-stakes professional assessments include consultation letters.


Medical Teacher | 2012

Quality evaluation reports: Can a faculty development program make a difference?

Nancy L. Dudek; Meridith Marks; Timothy J. Wood; Suzan Dojeiji; Glen Bandiera; Rose Hatala; Lara Cooke; Leslie A. Sadownik

Background: The quality of medical student and resident clinical evaluation reports submitted by rotation supervisors is a concern. The effectiveness of faculty development (FD) interventions in changing report quality is uncertain. Aims: This study assessed whether faculty could be trained to complete higher quality reports. Method: A 3-h interactive program designed to improve evaluation report quality, previously developed and tested locally, was offered at three different Canadian medical schools. To assess for a change in report quality, three reports completed by each supervisor prior to the workshop and all reports completed for 6 months following the workshop were evaluated by three blinded, independent raters using the Completed Clinical Evaluation Report Rating (CCERR): a validated scale that assesses report quality. Results: A total of 22 supervisors from multiple specialties participated. The mean CCERR score for reports completed after the workshop was significantly higher (21.74 ± 4.91 versus 18.90 ± 5.00, p = 0.02). Conclusions: This study demonstrates that this FD workshop had a positive impact upon the quality of the participants’ evaluation reports suggesting that faculty have the potential to be trained with regards to trainee assessment. This adds to the literature which suggests that FD is an important component in improving assessment quality.


Medical Teacher | 2014

Twelve tips for completing quality in-training evaluation reports

Nancy L. Dudek; Suzan Dojeiji

Abstract Assessing learners in the clinical setting is vital to determining their level of professional competence. Clinical performance assessments can be documented using in-training evaluation reports (ITERs). Previous research has suggested a need for faculty development in order to improve the quality of these reports. Previous work identified key features of high-quality completed ITERs, which primarily involve the narrative comments. This aligns well with the recent discourse in the assessment literature focusing on the value of qualitative assessments. Evidence exists to demonstrate that faculty can be trained to complete higher quality ITERs. We present 12 key strategies to assist clinical supervisors in improving the quality of their completed ITERs. Higher quality completed ITERs will improve the documentation of the trainee's progress and be more defensible when questioned in an appeal or legal process.


Academic Medicine | 2016

Feedback to Supervisors: Is Anonymity Really So Important?

Nancy L. Dudek; Suzan Dojeiji; Kathleen Day; Lara Varpio

Purpose Research demonstrates that physicians benefit from regular feedback on their clinical supervision from their trainees. Several features of effective feedback are enabled by nonanonymous processes (i.e., open feedback). However, most resident-to-faculty feedback processes are anonymous, given concerns about power differentials and possible reprisals. This exploratory study investigated residents' experiences of giving faculty open feedback, and its advantages and disadvantages. Method Between January and August 2014, nine graduates of a Canadian Physiatry residency program that uses open resident-to-faculty feedback participated in semistructured interviews in which they described their experiences of this system. Three members of the research team analyzed transcripts for emergent themes using conventional content analysis. In June 2014, semistructured group interviews were held with six residents who were actively enrolled in the program as a member-checking activity. Themes were refined on the basis of these data. Results Advantages of the open feedback system included giving timely feedback that was acted upon (thus enhancing residents' educational experiences) and improved ability to receive feedback (thanks to observing modeled behavior). Although some disadvantages were noted, they were often speculative (e.g., "I think others might have felt …") and were described as outweighed by advantages. Participants emphasized the program's "feedback culture" as an open feedback enabler. Conclusions The relationship between the feedback giver and recipient has been described as influencing the uptake of feedback. Findings suggest that nonanonymous practices can enable a positive relationship in resident-to-faculty feedback. The benefits of an open system for resident-to-faculty feedback can be reaped if a "feedback culture" exists.


Medical Teacher | 2002

Writing effective consultation letters: 12 tips for teachers

Erin Keely; Suzan Dojeiji; Kathryn Myers


Academic Medicine | 1999

Development of a rating scale to evaluate written communication skills of residents.

Kathryn Myers; Erin Keely; Suzan Dojeiji; G R Norman


Canadian Journal of Neurological Sciences | 2013

The CNDR: Collaborating to Translate New Therapies for Canadians

Lawrence Korngut; Craig Campbell; Megan Johnston; Timothy J. Benstead; Angela Genge; Alex MacKenzie; Anna McCormick; Douglas Biggar; Pierre R. Bourque; Hannah Briemberg; Colleen O'Connell; Suzan Dojeiji; Joseph M. Dooley; Ian Grant; Gillian Hogan; Wendy Johnston; Sanjay Kalra; Hans D. Katzberg; Jean K. Mah; Laura McAdam; Hugh J. McMillan; Michel Melanson; Kathryn Selby; Christen Shoesmith; Garth Smith; Shannon L. Venance; Joy Wee


Archive | 2004

Pre-pregnancy Counselling: What Do Residents Write in Their Consultation Letters?

Erin Keely; Suzan Dojeiji; Kathryn A. Myers; Wylam Faught; Brigitte Bonin

Collaboration


Dive into Suzan Dojeiji's collaborations.

Top Co-Authors

Alex MacKenzie

Children's Hospital of Eastern Ontario


Angela Genge

Montreal Neurological Institute and Hospital


Anna McCormick

Children's Hospital of Eastern Ontario


Colleen O'Connell

Izaak Walton Killam Health Centre
