Publications


Featured research published by Frances Mair.


BMJ | 2002

Systematic review of cost effectiveness studies of telemedicine interventions

Pamela Whitten; Frances Mair; Alan Haycox; Carl May; Tracy Williams; Seth Hellmich

Objectives: To systematically review cost benefit studies of telemedicine. Design: Systematic review of English language, peer reviewed journal articles. Data sources: Searches of Medline, Embase, ISI citation indexes, and database of Telemedicine Information Exchange. Studies selected: 55 of 612 identified articles that presented actual cost benefit data. Main outcome measures: Scientific quality of reports assessed by use of an established instrument for adjudicating on the quality of economic analyses. Results: 557 articles without cost data categorised by topic. 55 articles with data initially categorised by cost variables employed in the study and conclusions. Only 24/55 (44%) studies met quality criteria justifying inclusion in a quality review. 20/24 (83%) restricted to simple cost comparisons. No study used cost utility analysis, the conventional means of establishing the “value for money” that a therapeutic intervention represents. Only 7/24 (29%) studies attempted to explore the level of utilisation that would be needed for telemedicine services to compare favourably with traditionally organised health care. None addressed this question in sufficient detail to adequately answer it. 15/24 (62.5%) of articles reviewed here provided no details of sensitivity analysis, a method all economic analyses should incorporate. Conclusion: There is no good evidence that telemedicine is a cost effective means of delivering health care.
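
A clarifying note, not part of the published abstract: the cost utility analysis that the review found lacking conventionally summarises “value for money” as an incremental cost-effectiveness ratio (ICER), with health effects measured in quality-adjusted life years (QALYs). For a hypothetical comparison of a telemedicine service against usual care, in LaTeX notation:

\[
\mathrm{ICER} \;=\; \frac{C_{\text{telemedicine}} - C_{\text{usual care}}}{E_{\text{telemedicine}} - E_{\text{usual care}}} \qquad \text{(incremental cost per QALY gained)}
\]

A service is normally judged to offer value for money when this ratio falls below the decision-maker's willingness-to-pay threshold per QALY.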


Implementation Science | 2009

Development of a theory of implementation and integration: Normalization Process Theory

Carl May; Frances Mair; Tracy Finch; Anne MacFarlane; Christopher Dowrick; Shaun Treweek; Tim Rapley; Luciana Ballini; Bie Nio Ong; Anne Rogers; Elizabeth Murray; Glyn Elwyn; Jane Gunn; Victor M. Montori

Background: Theories are important tools in the social and natural sciences. The methods by which they are derived are rarely described and discussed. Normalization Process Theory explains how new technologies, ways of acting, and ways of working become routinely embedded in everyday practice, and has applications in the study of implementation processes. This paper describes the process by which it was built. Methods: Between 1998 and 2008, we developed a theory. We derived a set of empirical generalizations from analysis of data collected in qualitative studies of healthcare work and organization. We developed an applied theoretical model through analysis of empirical generalizations. Finally, we built a formal theory through a process of extension and implication analysis of the applied theoretical model. Results: Each phase of theory development showed that the constructs of the theory did not conflict with each other, had explanatory power, and possessed sufficient robustness for formal testing. As the theory developed, its scope expanded from a set of observed regularities in data with procedural explanations, to an applied theoretical model, to a formal middle-range theory. Conclusion: Normalization Process Theory has been developed through procedures that were properly sceptical and critical, and which were open to review at each stage of development. The theory has been shown to merit formal testing.


BMJ | 2009

We need minimally disruptive medicine

Carl May; Victor M. Montori; Frances Mair

The burden of treatment for many people with complex chronic comorbidities reduces their capacity to collaborate in their care. Carl May, Victor Montori, and Frances Mair argue that to be effective, care must be less disruptive.


BMC Health Services Research | 2007

Understanding the implementation of complex interventions in health care: The normalization process model

Carl May; Tracy Finch; Frances Mair; Luciana Ballini; Christopher Dowrick; Martin Eccles; Linda Gask; Anne MacFarlane; Elizabeth Murray; Tim Rapley; Anne Rogers; Shaun Treweek; Paul Wallace; George Anderson; Jo Burns; Ben Heaven

Background: The Normalization Process Model is a theoretical model that assists in explaining the processes by which complex interventions become routinely embedded in health care practice. It offers a framework for process evaluation and also for comparative studies of complex interventions. It focuses on the factors that promote or inhibit the routine embedding of complex interventions in health care practice. Methods: A formal theory structure is used to define the model, and its internal causal relations and mechanisms. The model is broken down to show that it is consistent and adequate in generating accurate description, systematic explanation, and the production of rational knowledge claims about the workability and integration of complex interventions. Results: The model explains the normalization of complex interventions by reference to four factors demonstrated to promote or inhibit the operationalization and embedding of complex interventions (interactional workability, relational integration, skill-set workability, and contextual integration). Conclusion: The model is consistent and adequate. Repeated calls for theoretically sound process evaluations in randomized controlled trials of complex interventions, and policy-makers who call for a proper understanding of implementation processes, emphasize the value of conceptual tools like the Normalization Process Model.


BMC Medicine | 2010

Normalisation process theory: a framework for developing, evaluating and implementing complex interventions

Elizabeth Murray; Shaun Treweek; Catherine Pope; Anne MacFarlane; Luciana Ballini; Christopher Dowrick; Tracy Finch; Anne Kennedy; Frances Mair; Catherine O'Donnell; Bie Nio Ong; Tim Rapley; Anne Rogers; Carl May

Background: The past decade has seen considerable interest in the development and evaluation of complex interventions to improve health. Such interventions can only have a significant impact on health and health care if they are shown to be effective when tested, are capable of being widely implemented and can be normalised into routine practice. To date, there is still a problematic gap between research and implementation. The Normalisation Process Theory (NPT) addresses the factors needed for successful implementation and integration of interventions into routine work (normalisation). Discussion: In this paper, we suggest that the NPT can act as a sensitising tool, enabling researchers to think through issues of implementation while designing a complex intervention and its evaluation. The need to ensure trial procedures that are feasible and compatible with clinical practice is not limited to trials of complex interventions, and NPT may improve trial design by highlighting potential problems with recruitment or data collection, as well as ensuring the intervention has good implementation potential. Summary: The NPT is a new theory which offers trialists a consistent framework that can be used to describe, assess and enhance implementation potential. We encourage trialists to consider using it in their next trial.


Bulletin of the World Health Organization | 2012

Factors that promote or inhibit the implementation of e-health systems: an explanatory systematic review

Frances Mair; Carl May; Catherine O'Donnell; Tracy Finch; Frank Sullivan; Elizabeth Murray

OBJECTIVE To systematically review the literature on the implementation of e-health to identify: (i) barriers and facilitators to e-health implementation, and (ii) outstanding gaps in research on the subject. METHODS MEDLINE, EMBASE, CINAHL, PSYCINFO and the Cochrane Library were searched for reviews published between 1 January 1995 and 17 March 2009. Studies had to be systematic reviews, narrative reviews, qualitative metasyntheses or meta-ethnographies of e-health implementation. Abstracts and papers were double screened and data were extracted on country of origin; e-health domain; publication date; aims and methods; databases searched; inclusion and exclusion criteria; and number of papers included. Data were analysed qualitatively using normalization process theory as an explanatory coding framework. FINDINGS Inclusion criteria were met by 37 papers; 20 had been published between 1995 and 2007 and 17 between 2008 and 2009. Methodological quality was poor: 19 papers did not specify the inclusion and exclusion criteria and 13 did not indicate the precise number of articles screened. The use of normalization process theory as a conceptual framework revealed that relatively little attention was paid to: (i) work directed at making sense of e-health systems, specifying their purposes and benefits, establishing their value to users and planning their implementation; (ii) factors promoting or inhibiting engagement and participation; (iii) effects on roles and responsibilities; (iv) risk management; and (v) ways in which implementation processes might be reconfigured by user-produced knowledge. CONCLUSION The published literature focused on organizational issues, neglecting the wider social framework that must be considered when introducing new technologies.


Journal of Clinical Epidemiology | 2012

Cumulative complexity: a functional, patient-centered model of patient complexity can improve research and practice

Nathan D. Shippee; Nilay D. Shah; Carl May; Frances Mair; Victor M. Montori

OBJECTIVE To design a functional, patient-centered model of patient complexity with practical applicability to analytic design and clinical practice. Existing literature on patient complexity has mainly identified its components descriptively and in isolation, lacking clarity as to their combined functions in disrupting care or to how complexity changes over time. STUDY DESIGN AND SETTING The authors developed a cumulative complexity model, which integrates existing literature and emphasizes how clinical and social factors accumulate and interact to complicate patient care. A narrative literature review is used to explicate the model. RESULTS The model emphasizes a core, patient-level mechanism whereby complicating factors impact care and outcomes: the balance between patient workload of demands and patient capacity to address demands. Workload encompasses the demands on the patient's time and energy, including demands of treatment, self-care, and life in general. Capacity concerns ability to handle work (e.g., functional morbidity, financial/social resources, literacy). Workload-capacity imbalances comprise the mechanism driving patient complexity. Treatment and illness burdens serve as feedback loops, linking negative outcomes to further imbalances, such that complexity may accumulate over time. CONCLUSION With its components largely supported by existing literature, the model has implications for analytic design, clinical epidemiology, and clinical practice.


Annals of Family Medicine | 2011

Understanding Patients’ Experiences of Treatment Burden in Chronic Heart Failure Using Normalization Process Theory

Katie Gallacher; Carl May; Victor M. Montori; Frances Mair

PURPOSE Our goal was to assess the burden associated with treatment among patients living with chronic heart failure and to determine whether Normalization Process Theory (NPT) is a useful framework to help describe the components of treatment burden in these patients. METHODS We performed a secondary analysis of qualitative interview data, using framework analysis, informed by NPT, to determine the components of patient “work.” Participants were 47 patients with chronic heart failure managed in primary care in the United Kingdom who had participated in an earlier qualitative study about living with this condition. We identified and examined data that fell outside of the coding frame to determine if important concepts or ideas were being missed by using the chosen theoretical framework. RESULTS We were able to identify and describe components of treatment burden as distinct from illness burden using the framework. Treatment burden in chronic heart failure includes the work of developing an understanding of treatments, interacting with others to organize care, attending appointments, taking medications, enacting lifestyle measures, and appraising treatments. Factors that patients reported as increasing treatment burden included too many medications and appointments, barriers to accessing services, fragmented and poorly organized care, lack of continuity, and inadequate communication between health professionals. Patient “work” that fell outside of the coding frame was exclusively emotional or spiritual in nature. CONCLUSIONS We identified core components of treatment burden as reported by patients with chronic heart failure. The findings suggest that NPT is a theoretical framework that facilitates understanding of experiences of health care work at the individual, as well as the organizational, level. Although further exploration and patient endorsement are necessary, our findings lay the foundation for a new target for treatment and quality improvement efforts toward patient-centered care.


BMC Health Services Research | 2014

Rethinking the patient: Using Burden of Treatment Theory to understand the changing dynamics of illness

Carl May; David T. Eton; Kasey R. Boehmer; Katie Gallacher; Katherine Hunt; Sara Macdonald; Frances Mair; Christine M. May; Victor M. Montori; Alison Richardson; Anne Rogers; Nathan D. Shippee

Background: In this article we outline Burden of Treatment Theory, a new model of the relationship between sick people, their social networks, and healthcare services. Health services face the challenge of growing populations with long-term and life-limiting conditions; they have responded to this by delegating to sick people and their networks routine work aimed at managing symptoms, and at retarding – and sometimes preventing – disease progression. This is the new proactive work of patient-hood for which patients are increasingly accountable: founded on ideas about self-care, self-empowerment, and self-actualization, and on new technologies and treatment modalities which can be shifted from the clinic into the community. These place new demands on sick people, which they may experience as burdens of treatment. Discussion: As the burdens accumulate some patients are overwhelmed, and the consequences are likely to be poor healthcare outcomes for individual patients, increasing strain on caregivers, and rising demand and costs of healthcare services. In the face of these challenges we need to better understand the resources that patients draw upon as they respond to the demands of both burdens of illness and burdens of treatment, and the ways that resources interact with healthcare utilization. Summary: Burden of Treatment Theory is oriented to understanding how capacity for action interacts with the work that stems from healthcare. Burden of Treatment Theory is a structural model that focuses on the work that patients and their networks do. It thus helps us understand variations in healthcare utilization and adherence in different healthcare settings and clinical contexts.


BMC Health Services Research | 2011

Integrating telecare for chronic disease management in the community: What needs to be done?

Carl May; Tracy Finch; James Cornford; Catherine Exley; Claire Gately; Susan Kirk; K. Neil Jenkings; Janice Osbourne; A. Louise Robinson; Anne Rogers; Rob Wilson; Frances Mair

Background: Telecare could greatly facilitate chronic disease management in the community, but despite government promotion and positive demonstrations its implementation has been limited. This study aimed to identify factors inhibiting the implementation and integration of telecare systems for chronic disease management in the community. Methods: Large scale comparative study employing qualitative data collection techniques: semi-structured interviews with key informants, task-groups, and workshops; framework analysis of qualitative data informed by Normalization Process Theory. Drawn from telecare services in community and domestic settings in England and Scotland, 221 participants were included, consisting of health professionals and managers; patients and carers; social care professionals and managers; and service suppliers and manufacturers. Results: Key barriers to telecare integration were uncertainties about coherent and sustainable service and business models; lack of coordination across social and primary care boundaries; lack of financial or other incentives to include telecare within primary care services; a lack of a sense of continuity with previous service provision and self-care work undertaken by patients; and general uncertainty about the adequacy of telecare systems. These problems led to poor integration of policy and practice. Conclusion: Telecare services may offer a cost effective and safe form of care for some people living with chronic illness. Slow and uneven implementation and integration do not stem from problems of adoption. They result from incomplete understanding of the role of telecare systems and subsequent adaption and embeddedness to context, and uncertainties about the best way to develop, coordinate, and sustain services that assist with chronic disease management. Interventions are therefore needed that (i) reduce uncertainty about the ownership of implementation processes and that lock together health and social care agencies; and (ii) ensure user-centred rather than biomedical/service-centred models of care.

Collaboration


Dive into Frances Mair's collaborations.

Top Co-Authors


Carl May

University of Southampton
