Publication


Featured research published by Carl May.


Sociology | 2009

Implementing, Embedding, and Integrating Practices: An Outline of Normalization Process Theory

Carl May; Tracy Finch

Understanding the processes by which practices become routinely embedded in everyday life is a long-standing concern of sociology and the other social sciences. It has important applied relevance in understanding and evaluating the implementation of material practices across a range of settings. This article sets out a theory of normalization processes that proposes a working model of implementation, embedding, and integration in conditions marked by complexity and emergence. The theory focuses on the work of embedding and of sustaining practices within interaction chains, and helps in understanding why some processes seem to lead to a practice becoming normalized while others do not.


BMJ | 2002

Systematic review of cost effectiveness studies of telemedicine interventions

Pamela Whitten; Frances Mair; Alan Haycox; Carl May; Tracy Williams; Seth Hellmich

Objectives: To systematically review cost benefit studies of telemedicine. Design: Systematic review of English language, peer reviewed journal articles. Data sources: Searches of Medline, Embase, ISI citation indexes, and the database of the Telemedicine Information Exchange. Studies selected: 55 of 612 identified articles that presented actual cost benefit data. Main outcome measures: Scientific quality of reports, assessed by use of an established instrument for adjudicating on the quality of economic analyses. Results: 557 articles without cost data were categorised by topic. The 55 articles with data were initially categorised by the cost variables employed in the study and their conclusions. Only 24/55 (44%) studies met the quality criteria justifying inclusion in a quality review, and 20/24 (83%) of these were restricted to simple cost comparisons. No study used cost utility analysis, the conventional means of establishing the “value for money” that a therapeutic intervention represents. Only 7/24 (29%) studies attempted to explore the level of utilisation that would be needed for telemedicine services to compare favourably with traditionally organised health care, and none addressed this question in sufficient detail to answer it adequately. 15/24 (62.5%) of the articles reviewed provided no details of sensitivity analysis, a method that all economic analyses should incorporate. Conclusion: There is no good evidence that telemedicine is a cost effective means of delivering health care.


Implementation Science | 2009

Development of a theory of implementation and integration: Normalization Process Theory

Carl May; Frances Mair; Tracy Finch; Anne MacFarlane; Christopher Dowrick; Shaun Treweek; Tim Rapley; Luciana Ballini; Bie Nio Ong; Anne Rogers; Elizabeth Murray; Glyn Elwyn; Jane Gunn; Victor M. Montori

Background: Theories are important tools in the social and natural sciences, yet the methods by which they are derived are rarely described and discussed. Normalization Process Theory explains how new technologies, ways of acting, and ways of working become routinely embedded in everyday practice, and has applications in the study of implementation processes. This paper describes the process by which it was built. Methods: Between 1998 and 2008, we developed a theory. We derived a set of empirical generalizations from analysis of data collected in qualitative studies of healthcare work and organization. We then developed an applied theoretical model through analysis of those empirical generalizations. Finally, we built a formal theory through a process of extension and implication analysis of the applied theoretical model. Results: Each phase of theory development showed that the constructs of the theory did not conflict with each other, had explanatory power, and possessed sufficient robustness for formal testing. As the theory developed, its scope expanded from a set of observed regularities in data with procedural explanations, to an applied theoretical model, to a formal middle-range theory. Conclusion: Normalization Process Theory was developed through procedures that were properly sceptical and critical, and that were open to review at each stage of development. The theory has been shown to merit formal testing.


BMJ | 2009

We need minimally disruptive medicine

Carl May; Victor M. Montori; Frances Mair

The burden of treatment for many people with complex chronic comorbidities reduces their capacity to collaborate in their care. Carl May, Victor Montori, and Frances Mair argue that, to be effective, care must be less disruptive.


BMC Medicine | 2010

Normalisation process theory: a framework for developing, evaluating and implementing complex interventions

Elizabeth Murray; Shaun Treweek; Catherine Pope; Anne MacFarlane; Luciana Ballini; Christopher Dowrick; Tracy Finch; Anne Kennedy; Frances Mair; Catherine O'Donnell; Bie Nio Ong; Tim Rapley; Anne Rogers; Carl May

Background: The past decade has seen considerable interest in the development and evaluation of complex interventions to improve health. Such interventions can only have a significant impact on health and health care if they are shown to be effective when tested, are capable of being widely implemented, and can be normalised into routine practice. To date, there is still a problematic gap between research and implementation. Normalisation Process Theory (NPT) addresses the factors needed for successful implementation and integration of interventions into routine work (normalisation). Discussion: In this paper, we suggest that NPT can act as a sensitising tool, enabling researchers to think through issues of implementation while designing a complex intervention and its evaluation. The need to ensure trial procedures that are feasible and compatible with clinical practice is not limited to trials of complex interventions, and NPT may improve trial design by highlighting potential problems with recruitment or data collection, as well as by ensuring that the intervention has good implementation potential. Summary: NPT is a new theory that offers trialists a consistent framework that can be used to describe, assess, and enhance implementation potential. We encourage trialists to consider using it in their next trial.


BMC Health Services Research | 2006

A rational model for assessing and evaluating complex interventions in health care

Carl May

Background: Understanding how new clinical techniques, technologies, and other complex interventions become normalized in practice is important to researchers, clinicians, health service managers, and policy-makers. This paper presents a model of the normalization of complex interventions. Methods: Between 1995 and 2005, multiple qualitative studies were undertaken. These examined: professional-patient relationships; changing patterns of care; the development, evaluation, and implementation of telemedicine and related informatics systems; and the production and utilization of evidence for practice. Data from these studies were subjected to (i) formative re-analysis, leading to sets of analytic propositions, and (ii) a summative analysis that aimed to build a robust conceptual model of the normalization of complex interventions in health care. Results: A normalization process model that enables analysis of the conditions necessary to support the introduction of complex interventions is presented. The model is defined by four constructs: interactional workability; relational integration; skill set workability; and contextual integration. This model can be used to understand the normalization potential of new techniques and technologies in healthcare settings. Conclusion: The normalization process model has face validity in (i) assessing the potential for complex interventions to become routinely embedded in everyday clinical work, and (ii) evaluating the factors that promote or inhibit their success or failure in practice.


Bulletin of The World Health Organization | 2012

Factors that promote or inhibit the implementation of e-health systems: an explanatory systematic review

Frances Mair; Carl May; Catherine O'Donnell; Tracy Finch; Frank Sullivan; Elizabeth Murray

OBJECTIVE To systematically review the literature on the implementation of e-health to identify: (i) barriers and facilitators to e-health implementation, and (ii) outstanding gaps in research on the subject. METHODS MEDLINE, EMBASE, CINAHL, PsycINFO and the Cochrane Library were searched for reviews published between 1 January 1995 and 17 March 2009. Studies had to be systematic reviews, narrative reviews, qualitative metasyntheses or meta-ethnographies of e-health implementation. Abstracts and papers were double-screened, and data were extracted on country of origin; e-health domain; publication date; aims and methods; databases searched; inclusion and exclusion criteria; and number of papers included. Data were analysed qualitatively using normalization process theory as an explanatory coding framework. FINDINGS Inclusion criteria were met by 37 papers; 20 had been published between 1995 and 2007 and 17 between 2008 and 2009. Methodological quality was poor: 19 papers did not specify the inclusion and exclusion criteria and 13 did not indicate the precise number of articles screened. The use of normalization process theory as a conceptual framework revealed that relatively little attention was paid to: (i) work directed at making sense of e-health systems, specifying their purposes and benefits, establishing their value to users and planning their implementation; (ii) factors promoting or inhibiting engagement and participation; (iii) effects on roles and responsibilities; (iv) risk management; and (v) ways in which implementation processes might be reconfigured by user-produced knowledge. CONCLUSION The published literature focused on organizational issues, neglecting the wider social framework that must be considered when introducing new technologies.


Journal of Clinical Epidemiology | 2012

Cumulative complexity: a functional, patient-centered model of patient complexity can improve research and practice

Nathan D. Shippee; Nilay D. Shah; Carl May; Frances Mair; Victor M. Montori

OBJECTIVE To design a functional, patient-centered model of patient complexity with practical applicability to analytic design and clinical practice. Existing literature on patient complexity has mainly identified its components descriptively and in isolation, lacking clarity as to their combined functions in disrupting care or as to how complexity changes over time. STUDY DESIGN AND SETTING The authors developed a cumulative complexity model, which integrates existing literature and emphasizes how clinical and social factors accumulate and interact to complicate patient care. A narrative literature review is used to explicate the model. RESULTS The model emphasizes a core, patient-level mechanism whereby complicating factors impact care and outcomes: the balance between the patient's workload of demands and the patient's capacity to address those demands. Workload encompasses the demands on the patient's time and energy, including the demands of treatment, self-care, and life in general. Capacity concerns the ability to handle work (e.g., functional morbidity, financial/social resources, literacy). Workload-capacity imbalances comprise the mechanism driving patient complexity. Treatment and illness burdens serve as feedback loops, linking negative outcomes to further imbalances, such that complexity may accumulate over time. CONCLUSION With its components largely supported by existing literature, the model has implications for analytic design, clinical epidemiology, and clinical practice.


Annals of Family Medicine | 2011

Understanding Patients’ Experiences of Treatment Burden in Chronic Heart Failure Using Normalization Process Theory

Katie Gallacher; Carl May; Victor M. Montori; Frances Mair

PURPOSE Our goal was to assess the burden associated with treatment among patients living with chronic heart failure and to determine whether Normalization Process Theory (NPT) is a useful framework to help describe the components of treatment burden in these patients. METHODS We performed a secondary analysis of qualitative interview data, using framework analysis, informed by NPT, to determine the components of patient “work.” Participants were 47 patients with chronic heart failure managed in primary care in the United Kingdom who had participated in an earlier qualitative study about living with this condition. We identified and examined data that fell outside of the coding frame to determine if important concepts or ideas were being missed by using the chosen theoretical framework. RESULTS We were able to identify and describe components of treatment burden as distinct from illness burden using the framework. Treatment burden in chronic heart failure includes the work of developing an understanding of treatments, interacting with others to organize care, attending appointments, taking medications, enacting lifestyle measures, and appraising treatments. Factors that patients reported as increasing treatment burden included too many medications and appointments, barriers to accessing services, fragmented and poorly organized care, lack of continuity, and inadequate communication between health professionals. Patient “work” that fell outside of the coding frame was exclusively emotional or spiritual in nature. CONCLUSIONS We identified core components of treatment burden as reported by patients with chronic heart failure. The findings suggest that NPT is a theoretical framework that facilitates understanding of experiences of health care work at the individual, as well as the organizational, level. Although further exploration and patient endorsement are necessary, our findings lay the foundation for a new target for treatment and quality improvement efforts toward patient-centered care.


BMC Health Services Research | 2014

Rethinking the patient: Using Burden of Treatment Theory to understand the changing dynamics of illness

Carl May; David T. Eton; Kasey R. Boehmer; Katie Gallacher; Katherine Hunt; Sara Macdonald; Frances Mair; Christine M. May; Victor M. Montori; Alison Richardson; Anne Rogers; Nathan D. Shippee

Background: In this article we outline Burden of Treatment Theory, a new model of the relationship between sick people, their social networks, and healthcare services. Health services face the challenge of growing populations with long-term and life-limiting conditions, and they have responded by delegating to sick people and their networks routine work aimed at managing symptoms and at retarding, and sometimes preventing, disease progression. This is the new proactive work of patienthood for which patients are increasingly accountable: founded on ideas about self-care, self-empowerment, and self-actualization, and on new technologies and treatment modalities that can be shifted from the clinic into the community. These place new demands on sick people, which they may experience as burdens of treatment. Discussion: As the burdens accumulate, some patients are overwhelmed, and the likely consequences are poor healthcare outcomes for individual patients, increasing strain on caregivers, and rising demand for and costs of healthcare services. In the face of these challenges we need to better understand the resources that patients draw upon as they respond to the demands of both burdens of illness and burdens of treatment, and the ways those resources interact with healthcare utilization. Summary: Burden of Treatment Theory is oriented to understanding how capacity for action interacts with the work that stems from healthcare. It is a structural model that focuses on the work that patients and their networks do, and it thus helps us understand variations in healthcare utilization and adherence in different healthcare settings and clinical contexts.

Collaboration


Dive into Carl May's collaborations.

Top Co-Authors

Catherine Pope

University of Southampton


Linda Gask

University of Manchester


Jane Prichard

University of Southampton


Joanne Turnbull

University of Southampton


Lucy Yardley

University of Southampton
