Network


Latest external collaborations at the country level. Dive into details by clicking on the dots.

Hotspot


Dive into the research topics where Carolyn McGregor is active.

Publication


Featured research published by Carolyn McGregor.


Artificial Intelligence in Medicine | 2007

Temporal abstraction in intelligent clinical data analysis: A survey

Michael Stacey; Carolyn McGregor

OBJECTIVE: Intelligent clinical data analysis systems require precise qualitative descriptions of data to enable effective and context-sensitive interpretation to take place. Temporal abstraction (TA) provides the means to achieve such descriptions, which can then be used as input to a reasoning engine where they are evaluated against a knowledge base to arrive at possible clinical hypotheses. This paper surveys previous research into the development of intelligent clinical data analysis systems that incorporate TA mechanisms and presents research synergies and trends across the work reviewed, especially those associated with the multi-dimensional nature of real-time patient data streams. The motivation for this survey is case study based research into the development of an intelligent real-time, high-frequency patient monitoring system to detect temporal patterns within multiple patient data streams.

RESULTS: The survey was based on factors that are important for broadening research into temporal abstraction and on characteristics we believe will assume an increasing level of importance for future clinical IDA systems. These factors were: aspects of the data being abstracted, such as source domain and sample frequency; the complexity available within abstracted patterns; the dimensionality of the TA and data environment; and the knowledge and reasoning underpinning TA processes.

CONCLUSION: It is evident from the review that, for intelligent clinical data analysis systems to progress as clinical environments become increasingly data-intensive, the ability to manage multi-dimensional aspects of data at high observation and sample frequencies must be provided. In addition, the detection of complex patterns within patient data requires higher levels of TA than are presently available. The conflicting demands of computational tractability and temporal reasoning within a real-time environment present a non-trivial problem for investigation. Finally, to fully exploit the value of new knowledge learned from stored clinical data through data mining, and to enable its application to data abstraction, the fusion of data mining and TA processes becomes a necessity.
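
To make the idea of temporal abstraction concrete, the sketch below shows state-based TA: numeric samples are mapped to qualitative labels and merged into maximal intervals. The thresholds and state names are illustrative assumptions, not the mechanisms of any specific system surveyed in the paper.

```python
# Minimal sketch of state-based temporal abstraction (TA): numeric samples are
# mapped to qualitative states and merged into maximal intervals.
# Thresholds and state names are illustrative assumptions only.
from dataclasses import dataclass

@dataclass
class Interval:
    state: str      # qualitative label, e.g. "LOW", "NORMAL", "HIGH"
    start: float    # seconds from stream start
    end: float

def classify(value: float) -> str:
    # Hypothetical heart-rate bands (beats per minute).
    if value < 100:
        return "LOW"
    if value > 180:
        return "HIGH"
    return "NORMAL"

def abstract_states(samples):
    """samples: iterable of (timestamp_seconds, value) pairs, time-ordered."""
    intervals = []
    for t, v in samples:
        state = classify(v)
        if intervals and intervals[-1].state == state:
            intervals[-1].end = t          # extend the current interval
        else:
            intervals.append(Interval(state, t, t))
    return intervals

if __name__ == "__main__":
    hr = [(0, 150), (1, 152), (2, 95), (3, 92), (4, 160)]
    for iv in abstract_states(hr):
        print(iv)
```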


international conference on intelligence in next generation networks | 2015

Smart city architecture for community level services through the internet of things

Roozbeh Jalali; Khalil El-Khatib; Carolyn McGregor

Today, more than half of the world's population live in cities, and this number will jump to 70 percent by 2050. Increasing population density in urban environments demands adequate provision of services and infrastructure. This explosion in city population will present major challenges, including air pollution, traffic congestion, health concerns, and energy and waste management. Solutions to these challenges may require the integration of various Information and Communication Technologies into the fabric of the city. This paper presents an architecture for smart cities in which city management, community service providers, and citizens have access to real-time data, gathered using various sensory mechanisms, in order to analyze it and make decisions for future planning.
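
As a rough illustration of the kind of data flow such an architecture implies, the sketch below defines a minimal sensor-reading message and an in-memory hub that routes readings to interested consumers. The field names and the publish/subscribe API are assumptions for illustration, not the paper's actual design.

```python
# Illustrative sketch only: a minimal sensor-reading message and an in-memory
# ingestion hub. Field names and the publish/subscribe API are assumptions,
# not the architecture described in the paper.
import json
import time
from collections import defaultdict

def make_reading(sensor_id: str, kind: str, value: float) -> str:
    """Serialize one reading (e.g. air quality, traffic count) as JSON."""
    return json.dumps({
        "sensor_id": sensor_id,
        "kind": kind,              # e.g. "air_quality", "traffic"
        "value": value,
        "timestamp": time.time(),
    })

class CityHub:
    """Routes readings to subscribers (city management, service providers, citizens)."""
    def __init__(self):
        self.subscribers = defaultdict(list)   # kind -> list of callbacks

    def subscribe(self, kind, callback):
        self.subscribers[kind].append(callback)

    def publish(self, message: str):
        reading = json.loads(message)
        for callback in self.subscribers[reading["kind"]]:
            callback(reading)

hub = CityHub()
hub.subscribe("air_quality", lambda r: print("alert check:", r["value"]))
hub.publish(make_reading("aq-17", "air_quality", 42.0))
```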


congress on evolutionary computation | 2003

A framework for analyzing and measuring business performance with Web services

Carolyn McGregor; J. Schiefer

The Web services paradigm provides organizations with an environment to enhance B2B communications. The aim is to create modularized services that support the business processes within the organization and also those of the external entities participating in these same business processes. Current Web service frameworks do not include the functionality required to measure Web service execution performance from an organization's perspective. As such, a shift to this paradigm comes at the expense of the organization's performance knowledge, as this knowledge becomes buried within the internal processing of the Web service platform. This research introduces an approach to reclaim and improve this knowledge for the organization by establishing a framework that enables the definition of Web services from a performance measurement perspective, together with the logging and analysis of the enactment of Web services. This framework utilizes Web service concepts, DSS principles, and agent technologies to enable feedback on the organization's performance measures through the analysis of the Web services. A key benefit of this work is that the data is stored once but provides information to both the customer and the supplier of a Web service, removing the need to develop internal Web service performance monitoring.
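
The core idea of logging service enactments so that execution performance can be analyzed afterwards can be sketched very simply. The decorator, service name, and metrics below are illustrative assumptions and not the framework described in the paper.

```python
# A minimal sketch (not the paper's framework) of logging Web service
# enactments so that execution performance can be analyzed afterwards.
# The service name and metrics below are illustrative assumptions.
import functools
import statistics
import time

ENACTMENT_LOG = []   # one record per service invocation

def measured_service(name):
    """Decorator that records duration and outcome of each invocation."""
    def wrap(fn):
        @functools.wraps(fn)
        def call(*args, **kwargs):
            start = time.perf_counter()
            try:
                result = fn(*args, **kwargs)
                status = "ok"
                return result
            except Exception:
                status = "fault"
                raise
            finally:
                ENACTMENT_LOG.append({
                    "service": name,
                    "duration_s": time.perf_counter() - start,
                    "status": status,
                })
        return call
    return wrap

@measured_service("CheckInventory")
def check_inventory(sku):
    time.sleep(0.01)          # stand-in for the real service body
    return {"sku": sku, "available": True}

check_inventory("A-100")
durations = [r["duration_s"] for r in ENACTMENT_LOG if r["service"] == "CheckInventory"]
print("mean enactment time:", statistics.mean(durations))
```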


computer-based medical systems | 2011

A cloud computing framework for real-time rural and remote service of critical care

Carolyn McGregor

Critical care patients in rural, remote and some urban healthcare facilities do not have the same level of access to intensivist specialist support as patients in higher-care-level urban critical care units (CCUs). New clinical research is also demonstrating that computationally intensive analysis of physiological data streams in near real-time has the potential to detect the onset of certain conditions earlier. The provision of clinical decision support tools, in a cost-effective way, to all CCUs has the potential to reduce mortality and morbidity rates, reduce critical care patient transportation between CCUs and, in so doing, reduce healthcare costs. This research presents Artemis Cloud, a cloud computing based Software-as-a-Service and Data-as-a-Service approach for the provision of remote real-time patient monitoring and support for clinical research. This research is demonstrated using a neonatal intensive care unit case study supporting clinical research for earlier onset detection of late onset neonatal sepsis.
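
The flavour of a per-stream real-time analytic that such a hosted service might run can be sketched as a sliding-window rule. The window length and alert rule below are assumptions made for illustration; they are not Artemis Cloud's actual algorithms.

```python
# Illustrative only: a tiny real-time analytic of the kind a cloud monitoring
# service might run per patient stream. Window length and the alert rule are
# assumptions for this sketch, not Artemis Cloud's actual algorithms.
from collections import deque

class SlidingMean:
    """Maintains the mean of the last `size` samples of one physiological stream."""
    def __init__(self, size):
        self.window = deque(maxlen=size)

    def update(self, value):
        self.window.append(value)
        return sum(self.window) / len(self.window)

def monitor(stream, size=60, low=100.0):
    """Yield an alert whenever the window is full and its mean drops below `low`."""
    mean = SlidingMean(size)
    for t, value in stream:
        m = mean.update(value)
        if len(mean.window) == size and m < low:
            yield (t, f"windowed heart-rate mean {m:.1f} below {low}")

# Simulated two-hundred-sample stream with a sustained drop half-way through.
stream = [(t, 120.0 if t < 100 else 90.0) for t in range(200)]
for alert in monitor(stream, size=60, low=100.0):
    print(alert)
    break            # report the first alert only
```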


computer-based medical systems | 2009

Extending CRISP-DM to incorporate temporal data mining of multidimensional medical data streams: A neonatal intensive care unit case study

Christina Catley; Kathleen P. Smith; Carolyn McGregor; Mark Tracy

Using a Neonatal Intensive Care Unit (NICU) case study, this work investigates the current CRoss Industry Standard Process for Data Mining (CRISP-DM) approach for modeling Intelligent Data Analysis (IDA)-based systems that perform temporal data mining (TDM). The case study highlights the need for an extended CRISP-DM approach when modeling clinical systems applying Data Mining (DM) and Temporal Abstraction (TA). As the number of such integrated TA/DM systems continues to grow, this limitation becomes significant and motivated our proposal of an extended CRISP-DM methodology to support TDM, known as CRISP-TDM. This approach supports clinical investigations on multi-dimensional time series data. This research paper has three key objectives: 1) Present a summary of the extended CRISP-TDM methodology; 2) Demonstrate the applicability of the proposed model to the NICU data, focusing on the challenges associated with multi-dimensional time series data; and 3) Describe the proposed IDA architecture for applying integrated TDM.
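
The TA-then-DM flow that CRISP-TDM formalizes can be outlined in a few lines: raw multi-dimensional time series are first abstracted per hour, and the abstractions are then mined for frequently co-occurring patterns. The labels, thresholds, and mining step below are assumptions for illustration, not the methodology's prescribed techniques.

```python
# A sketch, under assumed labels and thresholds, of a TA-then-DM flow:
# raw multi-dimensional time series are abstracted per hour and the
# abstractions are then mined for co-occurring patterns.
from collections import Counter
from statistics import mean

def hourly_abstraction(samples, threshold):
    """samples: list of (hour, value). Returns {hour: 'LOW'|'NORMAL'}."""
    by_hour = {}
    for hour, value in samples:
        by_hour.setdefault(hour, []).append(value)
    return {h: ("LOW" if mean(vs) < threshold else "NORMAL") for h, vs in by_hour.items()}

def mine_patterns(abstractions_per_stream):
    """Count how often each combination of per-stream abstractions co-occurs per hour."""
    counts = Counter()
    hours = set.intersection(*(set(a) for a in abstractions_per_stream.values()))
    for h in hours:
        pattern = tuple(sorted((name, a[h]) for name, a in abstractions_per_stream.items()))
        counts[pattern] += 1
    return counts

hr = [(0, 98), (0, 97), (1, 140), (1, 142)]          # illustrative heart-rate samples
rr = [(0, 30), (0, 28), (1, 55), (1, 60)]            # illustrative respiratory-rate samples
patterns = mine_patterns({
    "HR": hourly_abstraction(hr, threshold=100),
    "RR": hourly_abstraction(rr, threshold=40),
})
print(patterns.most_common())
```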


ieee international conference on e-technology, e-commerce and e-service | 2005

A Web services based framework for the transmission of physiological data for local and remote neonatal intensive care

Carolyn McGregor; Jennifer A. Heath; Ming Wei

Premature and ill term babies born in metropolitan and regional Australia are monitored and supported by a range of medical devices within neonatal intensive care units (NICUs) or special care nurseries. Information produced by these devices comes in a range of formats, making data transmission infrastructures complex. This paper details case study based Web service framework research for the transmission of physiological data for local and remote neonatal intensive care. The framework enables real-time physiological data collected from medical monitors and ventilators attached to the baby to be encoded in XML and transmitted via a physiological log Web service. The key contribution of this research is an infrastructure that provides a mechanism for neonatologists to receive information directly from a regional hospital, thereby preventing, in some cases, the immediate need to move the baby. This paper further describes the application of that architecture to a specific pilot within the NICU at Nepean Hospital, Penrith, Australia.
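
As a rough sketch of the XML encoding step, the snippet below serializes a single physiological observation that a logging Web service could accept as its payload. The element and attribute names are illustrative assumptions, not the framework's actual schema.

```python
# Minimal sketch of encoding one physiological observation as XML for a
# logging Web service. Element and attribute names are illustrative
# assumptions, not the framework's actual schema.
import xml.etree.ElementTree as ET
from datetime import datetime, timezone

def encode_observation(patient_id: str, device: str, signal: str, value: float) -> bytes:
    obs = ET.Element("observation", attrib={"patientId": patient_id, "device": device})
    ET.SubElement(obs, "signal").text = signal
    ET.SubElement(obs, "value").text = str(value)
    ET.SubElement(obs, "timestamp").text = datetime.now(timezone.utc).isoformat()
    return ET.tostring(obs, encoding="utf-8")

# The resulting document would be sent as the payload of the physiological
# log Web service call (transport omitted here).
print(encode_observation("nicu-042", "bedside-monitor", "heart_rate", 152.0).decode())
```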


computer-based medical systems | 2012

Variability analysis with analytics applied to physiological data streams from the neonatal intensive care unit

Carolyn McGregor; Christina Catley; Andrew James

Late onset neonatal sepsis (LONS) is one clinical condition that shows promise for earlier onset detection through the analysis of physiological signals. However, current work on Heart Rate Variability (HRV) analysis does not discuss the impact of narcotics and other drugs on early identification of sepsis. We present results of a pilot retrospective data mining study of neonatal intensive care unit patients using a dataset of 30 second spot readings. We derive analytics by creating temporal abstractions of hourly summaries for HRV and respiratory rate variability (RRV). Using representative patient examples, we illustrate an analytics user interface design that shows 1) the potential in using our HRV analytics for early identification of LONS with 30 second spot readings; and 2) that based on initial pilot results, reporting analytics for HRV and RRV concurrently adds value to HRV analysis by distinguishing between patients with low HRV due to imminent sepsis and those patients with low HRV due to the presence of confounding factors such as surgery and narcotics.
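
A simple way to picture the hourly temporal abstractions is as per-hour variability summaries computed over the 30-second spot readings. Using the standard deviation of the spot readings as the variability measure is an assumption made for this sketch; the paper derives its analytics from its own HRV and RRV metrics.

```python
# Sketch only: hourly variability summaries from 30-second spot readings.
# Using the per-hour standard deviation of the spot readings as the
# variability measure is an illustrative assumption.
from statistics import pstdev

READINGS_PER_HOUR = 120          # 30-second spot readings

def hourly_variability(spot_readings):
    """spot_readings: time-ordered list of values, one per 30 seconds."""
    summaries = []
    for i in range(0, len(spot_readings) - READINGS_PER_HOUR + 1, READINGS_PER_HOUR):
        hour = spot_readings[i:i + READINGS_PER_HOUR]
        summaries.append(pstdev(hour))
    return summaries

# Reporting HRV and RRV side by side, as the paper argues, helps separate low
# HRV from imminent sepsis versus low HRV from confounders such as narcotics.
hr_spots = [150 + (i % 7) for i in range(240)]     # two illustrative hours of data
rr_spots = [45 + (i % 3) for i in range(240)]
print("hourly HRV:", hourly_variability(hr_spots))
print("hourly RRV:", hourly_variability(rr_spots))
```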


international conference of the ieee engineering in medicine and biology society | 2007

An architecture for multi-dimensional temporal abstraction and its application to support neonatal intensive care

Michael Stacey; Carolyn McGregor; Mark Tracy

Temporal abstraction (TA) provides the means to instil domain knowledge into data analysis processes and allows transformation of low-level numeric data into high-level qualitative narratives. TA mechanisms have primarily been applied to uni-dimensional data sources, equating to single patients in the clinical context. This paper presents a framework for multi-dimensional TA (MDTA) enabling analysis of data emanating from numerous patients to detect multiple conditions within the environment of neonatal intensive care. Patient agents, which perform temporal reasoning upon patient data streams, are based on the event calculus, and an active ontology provides a central knowledge core where rules are stored and agent responses accumulated, thus permitting a level of multi-dimensionality within data abstraction processes. Facilitating TA across a ward of patients offers the potential for early detection of debilitating conditions such as sepsis, pneumothorax and periventricular leukomalacia (PVL), which have been shown to exhibit advance indicators in physiological data. Preliminary prototyping of patient agents has begun with promising results, and a schema for the active rule repository is outlined.
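
The agent arrangement can be pictured as one agent per patient applying rules drawn from a shared repository and posting responses to a central store. The rules, thresholds, and condition names below are assumptions made for illustration; the paper's agents reason with the event calculus rather than plain predicates.

```python
# Illustrative sketch of the agent arrangement: one agent per patient applies
# rules from a shared repository to that patient's abstractions and posts
# responses to a central store. Rules and condition names are assumptions;
# the paper's agents use the event calculus.
RULE_REPOSITORY = [
    # (condition name, predicate over one patient's current abstractions)
    ("possible sepsis indicator", lambda a: a.get("HRV") == "LOW" and a.get("HR") == "HIGH"),
]

CENTRAL_STORE = []    # accumulated agent responses, one ward-wide list

class PatientAgent:
    def __init__(self, patient_id):
        self.patient_id = patient_id

    def evaluate(self, abstractions):
        """abstractions: dict of qualitative states for this patient's streams."""
        for condition, predicate in RULE_REPOSITORY:
            if predicate(abstractions):
                CENTRAL_STORE.append((self.patient_id, condition))

# One abstraction snapshot per patient across the ward.
agents = {pid: PatientAgent(pid) for pid in ("bed-1", "bed-2")}
agents["bed-1"].evaluate({"HRV": "LOW", "HR": "HIGH"})
agents["bed-2"].evaluate({"HRV": "NORMAL", "HR": "NORMAL"})
print(CENTRAL_STORE)
```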


IEEE Reviews in Biomedical Engineering | 2013

Implementation of Artifact Detection in Critical Care: A Methodological Review

Shermeen Nizami; James R. Green; Carolyn McGregor

Artifact detection (AD) techniques minimize the impact of artifacts on physiologic data acquired in critical care units (CCU) by assessing quality of data prior to clinical event detection (CED) and parameter derivation (PD). This methodological review introduces unique taxonomies to synthesize over 80 AD algorithms based on these six themes: 1) CCU; 2) physiologic data source; 3) harvested data; 4) data analysis; 5) clinical evaluation; and 6) clinical implementation. Review results show that most published algorithms: a) are designed for one specific type of CCU; b) are validated on data harvested only from one OEM monitor; c) generate signal quality indicators (SQI) that are not yet formalized for useful integration in clinical workflows; d) operate either in standalone mode or coupled with CED or PD applications; e) are rarely evaluated in real-time; and f) are not implemented in clinical practice. In conclusion, it is recommended that AD algorithms conform to generic input and output interfaces with commonly defined data: 1) type; 2) frequency; 3) length; and 4) SQIs. This shall promote: a) reusability of algorithms across different CCU domains; b) evaluation on different OEM monitor data; c) fair comparison through formalized SQIs; d) meaningful integration with other AD, CED and PD algorithms; and e) real-time implementation in clinical workflows.
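
One possible shape for the generic input/output interface the review recommends is sketched below: commonly defined data type, frequency, and length on input, and a signal quality indicator (SQI) on output, so AD, CED, and PD components can be chained. The interface and the toy detector are assumptions for illustration, not a standardized design.

```python
# One possible shape (an assumption, not a standardized interface) for generic
# artifact-detection input/output: defined data type, frequency, and length on
# input, and a signal quality indicator (SQI) plus cleaned samples on output.
from dataclasses import dataclass
from typing import List

@dataclass
class Segment:
    signal_type: str        # e.g. "ECG", "SpO2"
    frequency_hz: float     # sampling frequency
    samples: List[float]    # fixed-length window of raw samples

@dataclass
class ADResult:
    sqi: float              # 0.0 (all artifact) .. 1.0 (clean)
    clean: List[float]      # samples retained after artifact removal

def flatline_detector(segment: Segment, tolerance: float = 1e-6) -> ADResult:
    """Toy detector: treats runs of (near-)identical samples as artifact."""
    clean, previous = [], None
    for s in segment.samples:
        if previous is None or abs(s - previous) > tolerance:
            clean.append(s)
        previous = s
    sqi = len(clean) / len(segment.samples) if segment.samples else 0.0
    return ADResult(sqi=sqi, clean=clean)

seg = Segment("ECG", 250.0, [0.1, 0.1, 0.1, 0.4, 0.2, 0.2])
print(flatline_detector(seg))
```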


international parallel and distributed processing symposium | 2002

Business process monitoring using web services in B2B e-commerce

Carolyn McGregor; Santhosh Kumaran

Organisations are re-engineering their B2B communications to be performed through web services. Their aim is to create modularised services that support the business processes within their organisation and also those external entities that participate in these same business processes. This improvement comes at the expense of the organisation's knowledge of its performance, as this knowledge becomes buried within the internal processing of the web service platform. This research introduces an approach to reclaim and improve this knowledge for the organisation by establishing a framework that enables the definition of web services, together with the logging and analysis of the enactment of web services. This framework utilises web service concepts, DSS principles, and agent technologies to enable feedback on the organisation's performance measures through the analysis of the web services. We apply this framework to a specific case study in which suppliers participate in an inter-organisational workflow via a Private Exchange in the context of order fulfilment. A key benefit of this work is that the data is stored once but provides information both to the organisation acting as the customer and the organisation acting as the supplier. It therefore removes the need to develop internal performance monitoring tools for web services.
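
The "store once, inform both sides" idea can be sketched as a single enactment log filtered into a customer-facing view and a supplier-facing view. The field names and order-fulfilment records below are invented for illustration and are not the paper's data model.

```python
# A sketch, with invented field names, of storing enactment data once and
# serving both the customer and the supplier from it, instead of each
# organisation keeping its own monitoring store.
ENACTMENT_LOG = [
    {"order": "PO-1", "customer": "acme", "supplier": "widgetco", "step": "submitted", "duration_s": 0.4},
    {"order": "PO-1", "customer": "acme", "supplier": "widgetco", "step": "fulfilled", "duration_s": 9.1},
]

def customer_view(log, customer):
    """What the buying organisation sees: progress of its own orders."""
    return [(r["order"], r["step"]) for r in log if r["customer"] == customer]

def supplier_view(log, supplier):
    """What the supplying organisation sees: its own service performance."""
    rows = [r for r in log if r["supplier"] == supplier]
    return {"enactments": len(rows),
            "mean_duration_s": sum(r["duration_s"] for r in rows) / len(rows)}

print(customer_view(ENACTMENT_LOG, "acme"))
print(supplier_view(ENACTMENT_LOG, "widgetco"))
```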

Collaboration


Dive into Carolyn McGregor's collaborations.

Top Co-Authors

J. Mikael Eklund, University of Ontario Institute of Technology
Joanne Curry, University of Western Sydney
Jennifer Percival, University of Ontario Institute of Technology
Rishikesan Kamaleswaran, Information Technology University
Christina Catley, University of Ontario Institute of Technology
Anirudh Thommandram, University of Ontario Institute of Technology
Nadja Bressan, University of Ontario Institute of Technology
Kathleen P. Smith, University of Ontario Institute of Technology