Publication


Featured research published by Daby M. Sow.


Pervasive Computing and Communications | 2005

Sentire: a framework for building middleware for sensor and actuator networks

Joel W. Branch; John S. Davis; Daby M. Sow; Chatschik Bisdikian

Sentire is a framework for building extensible middleware for sensor and actuator networks (SANETs). Its fundamental principle is the partitioning of SANET middleware into logically related components, allowing developers working on different aspects of the middleware to share a common plug-in infrastructure in which their artifacts can interact. This paper presents an extended introduction to Sentire, followed by a practical illustration of how the framework can be used to build SANET middleware.
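The partitioning idea can be illustrated with a minimal plug-in registry, where middleware components register under a capability name and discover one another through a shared infrastructure (the names and API below are illustrative assumptions, not Sentire's actual interfaces):

```python
class PluginRegistry:
    """Minimal sketch of a shared plug-in infrastructure: middleware
    components register under a capability name and can discover each
    other through the registry."""

    def __init__(self):
        self._plugins = {}

    def register(self, capability, plugin):
        # Several plug-ins may provide the same capability.
        self._plugins.setdefault(capability, []).append(plugin)

    def lookup(self, capability):
        # Returns all plug-ins registered for a capability (empty if none).
        return list(self._plugins.get(capability, []))


# Example: a routing component discovers a registered aggregation plug-in.
registry = PluginRegistry()
registry.register("aggregation", lambda readings: sum(readings) / len(readings))
agg = registry.lookup("aggregation")[0]
```

The registry is the shared point of interaction: components developed independently only need to agree on capability names, not on each other's internals.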


International Workshop on Mobile Commerce | 2001

Enabling location-based applications

Chatschik Bisdikian; Jim Christensen; John S. Davis; Maria R. Ebling; Guerney D. H. Hunt; William F. Jerome; Hui Lei; Stephane Herman Maes; Daby M. Sow

We identify a number of factors that may hinder the commercial success of location-based applications: privacy concerns, the need to consider context beyond location, the abundance of available resources, and the constrained interfaces of mobile devices. We describe an end-to-end system architecture with integrated support for addressing these issues. In particular, the architecture includes a Secure Context Service that provides broad context information to applications and allows people to flexibly control the release of their private information; an Intelligent Service Discovery Service that allows personalized selection of physical and virtual services; and a multi-modal interaction mechanism that enables users to exploit multiple synchronized access channels to interact with an application and to switch among channels at any time. Our goals are to improve the user experience, reduce user distraction, and facilitate awareness of the physical world.


International Conference on Pattern Recognition | 2010

Localized Supervised Metric Learning on Temporal Physiological Data

Jimeng Sun; Daby M. Sow; Jianying Hu; Shahram Ebadollahi

Effective patient similarity assessment is important for clinical decision support. It enables the capture of past experience as manifested in the collective longitudinal medical records of patients to help clinicians assess the likely outcomes resulting from their decisions and actions. However, it is challenging to devise a patient similarity metric that is clinically relevant and semantically sound. Patient similarity is highly context sensitive: it depends on factors such as the disease, the particular stage of the disease, and co-morbidities. One way to discern the semantics in a particular context is to take advantage of physicians’ expert knowledge as reflected in labels assigned to some patients. In this paper we present a method that leverages localized supervised metric learning to effectively incorporate such expert knowledge to arrive at semantically sound patient similarity measures. Experiments using data obtained from the MIMIC II database demonstrate the effectiveness of this approach.
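The abstract does not specify the localized metric itself; as a rough illustration of supervised metric learning, the sketch below learns a Mahalanobis matrix from the inverse within-class covariance, so that distances shrink along directions of labeled within-class variation (this construction is an assumption for illustration, not the paper's algorithm):

```python
import numpy as np

def learn_supervised_metric(X, y, reg=1e-3):
    """Learn a Mahalanobis matrix M = inv(Sw) from labeled data, where Sw
    is the within-class scatter: directions along which same-label patients
    vary a lot contribute little to the learned distance."""
    d = X.shape[1]
    Sw = np.zeros((d, d))
    for c in np.unique(y):
        Xc = X[y == c]
        diff = Xc - Xc.mean(axis=0)
        Sw += diff.T @ diff
    Sw /= len(X)
    # Regularize so the inverse exists even for degenerate scatter.
    return np.linalg.inv(Sw + reg * np.eye(d))

def mahalanobis(x, z, M):
    """Distance under the learned metric: sqrt((x - z)^T M (x - z))."""
    diff = x - z
    return float(np.sqrt(diff @ M @ diff))
```

Labels supplied by physicians play the role of `y` here: the metric is shaped by expert judgments of which patients are alike.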


International Conference on Data Mining | 2010

A System for Mining Temporal Physiological Data Streams for Advanced Prognostic Decision Support

Jimeng Sun; Daby M. Sow; Jianying Hu; Shahram Ebadollahi

We present a mining system that predicts a patient's future health status using the temporal health-status trajectories of a set of similar patients. The main novelties of this system are its use of stream processing technology to handle incoming physiological time series data and its incorporation of domain knowledge in learning the similarity metric between patients represented by their temporal data. The proposed approach and system were tested using the MIMIC II database, which consists of physiological waveforms and accompanying clinical data for ICU patients. The study was carried out on 1500 patients from this database. In the experiments, we report the efficiency and throughput of the stream processing unit for feature extraction, the effectiveness of the supervised similarity measure in both classification and retrieval tasks compared to unsupervised approaches, and the accuracy of the temporal projections of the patient data.
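The stream-processing side can be pictured as a per-sample feature extractor over a sliding window; the sketch below is a hypothetical stand-in for such a feature extraction unit (the window size and feature set are assumptions, not the system's actual design):

```python
from collections import deque

class SlidingWindowFeatures:
    """Maintain a fixed-size window over an incoming physiological time
    series and emit summary features for each new sample, as a streaming
    feature extractor would."""

    def __init__(self, size=8):
        # deque with maxlen drops the oldest sample automatically.
        self.window = deque(maxlen=size)

    def push(self, value):
        self.window.append(value)
        w = list(self.window)
        mean = sum(w) / len(w)
        var = sum((v - mean) ** 2 for v in w) / len(w)
        return {"mean": mean, "var": var, "last": value}
```

Features emitted per sample (rather than per batch) are what let downstream similarity and prediction steps keep pace with the stream.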


ITCom 2001: International Symposium on the Convergence of IT and Communications | 2001

Using self-authentication and recovery images for error concealment in wireless environments

Ching-Yung Lin; Daby M. Sow; Shih-Fu Chang

Handling packet loss or delay in mobile and Internet environments is a challenging problem for multimedia transmission. Connection-oriented protocols such as TCP may introduce intolerable retransmission delay, while datagram-oriented protocols such as UDP may deliver only a partial representation when packets are lost. In this paper, we propose a new method that uses our self-authentication-and-recovery images (SARI) for error detection and concealment in UDP environments. The lost information in a SARI image can be approximately recovered from the embedded watermark, which carries both content-based authentication information and recovery information. Images or video frames are watermarked a priori, so no additional mechanism is needed in the networking or encoding process. Because recovery does not depend on adjacent blocks, the proposed method can restore corrupted areas even when information loss occurs over large or highly variant regions. Our experiments show the advantages of the technique in both transmission-time savings and its broad application potential.
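The recovery idea, repairing a block from embedded information rather than from its neighbors, can be sketched with a toy scheme that keeps a coarse per-block summary as the "recovery information" (a real SARI image embeds this in a watermark in other blocks; storing it alongside the image here is a deliberate simplification):

```python
import numpy as np

def embed_recovery(image, block=8):
    """Record a coarse summary (here just the mean) of every block of a
    grayscale image; this stands in for the recovery information that
    SARI embeds as a watermark."""
    h, w = image.shape
    thumbs = {}
    for i in range(0, h, block):
        for j in range(0, w, block):
            thumbs[(i, j)] = image[i:i + block, j:j + block].mean()
    return thumbs

def conceal(received, lost_blocks, thumbs, block=8):
    """Fill blocks lost in transit from their stored coarse values,
    instead of interpolating from (possibly also lost) neighbors."""
    out = received.copy()
    for (i, j) in lost_blocks:
        out[i:i + block, j:j + block] = thumbs[(i, j)]
    return out
```

Because the recovery data travels independently of the block it describes, a burst loss covering a large region can still be approximately concealed.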


International Provenance and Annotation Workshop | 2008

Advances and Challenges for Scalable Provenance in Stream Processing Systems

Archan Misra; Marion Lee Blount; Anastasios Kementsietsidis; Daby M. Sow; Min Wang

While data provenance is a well-studied topic in both database and workflow systems, supporting it within stream processing systems presents a new set of challenges. Part of the challenge is the high stream event rate and the low processing latency required by many streaming applications. For example, emerging streaming applications in healthcare and finance call for data provenance, as illustrated by the Century stream processing infrastructure that we are building to support online healthcare analytics. At any time, given an output data element (e.g., a medical alert) generated by Century, the system must be able to retrieve the input and intermediate data elements that led to its generation. In this paper, we describe the requirements behind our initial implementation of Century's provenance subsystem. We then analyze its strengths and limitations and propose a new provenance architecture to address some of these limitations. The paper also discusses the open challenges in this area.


International Conference on Mobile Systems, Applications, and Services | 2007

A time-and-value centric provenance model and architecture for medical event streams

Min Wang; Marion Lee Blount; John S. Davis; Archan Misra; Daby M. Sow

Provenance is becoming a critical requirement for healthcare IT infrastructures, especially when pervasive biomedical sensors act as sources of raw medical streams for large-scale, automated clinical decision support systems. Medical and legal requirements will oblige such systems to answer queries about the underlying data samples from which output alerts are derived, the IDs of the processing components used, and the privileges of the individuals and software components accessing the medical data. Unfortunately, existing annotation-based and process-based provenance models are designed for transaction-oriented systems and do not satisfy the unique requirements of systems processing high-volume, continuous medical streams. This paper proposes a simple but useful hybrid provenance model called Time-Value-Centric (TVC) provenance.
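A time-centric provenance query can be sketched as follows: each operator declares a time-window dependency rule, and a query maps an output alert's timestamp back to the input elements inside that window, without storing per-element lineage (the rule structure and names below are illustrative assumptions, not the paper's exact model):

```python
from dataclasses import dataclass

@dataclass
class TimeWindowRule:
    """Declares that an operator's output at time t depends on the input
    elements in [t - lookback, t] on a given input stream."""
    input_stream: str
    lookback: float

def provenance_query(rules, alert_time, streams):
    """Given an output alert's timestamp, return the (timestamp, value)
    input elements that could have contributed, per the operator's
    declared time windows."""
    result = {}
    for rule in rules:
        elems = streams[rule.input_stream]
        result[rule.input_stream] = [
            (ts, v) for ts, v in elems
            if alert_time - rule.lookback <= ts <= alert_time
        ]
    return result
```

Because the dependency is expressed once per operator rather than recorded per element, the approach sidesteps the per-tuple bookkeeping that high event rates make prohibitive.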


Runtime Verification | 2010

Visual debugging for stream processing applications

Wim De Pauw; Mihai Leţia; Bugra Gedik; Henrique Andrade; Andy L. Frenkiel; Michael Donald Pfeifer; Daby M. Sow

Stream processing is a new computing paradigm that enables continuous, fast analysis of massive volumes of streaming data. Debugging streaming applications is not trivial, since they are typically distributed across multiple nodes and handle large amounts of data. Traditional debugging techniques like breakpoints often rely on a stop-the-world approach, which may be useful for single-node applications but is insufficient for streaming applications. We propose a new visual and analytic environment to support debugging, performance analysis, and troubleshooting of stream processing applications. Our environment provides several visualization methods to study, characterize, and summarize the flow of tuples between stream processing operators. The user can interactively indicate points in the streaming application from which tuples will be traced and visualized as they flow through different operators, without stopping the application. To substantiate our discussion, we present several of these features in the context of a financial engineering application.
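The tuple-tracing idea can be sketched as operators that tag tuples with a trace id at user-selected points and log them as they flow downstream, without ever pausing the application (a hypothetical sketch, not the tool's actual API):

```python
import itertools

class TracingOperator:
    """Wraps a stream operator so that tuples entering at a user-selected
    point are tagged with a trace id and recorded as they pass through,
    while the application keeps running."""
    _ids = itertools.count()

    def __init__(self, name, fn, trace_log):
        self.name, self.fn, self.trace_log = name, fn, trace_log

    def process(self, tup, trace_id=None):
        if trace_id is None:
            # No id yet: this is the point where tracing was requested.
            trace_id = next(TracingOperator._ids)
        out = self.fn(tup)  # the operator's real work
        self.trace_log.append((trace_id, self.name, tup, out))
        return out, trace_id


# Chain two operators and trace one tuple through both.
log = []
filt = TracingOperator("filter", lambda t: t if t["price"] > 10 else None, log)
enrich = TracingOperator("enrich", lambda t: {**t, "flag": True}, log)
out, tid = filt.process({"price": 42})
out, _ = enrich.process(out, trace_id=tid)
```

The shared trace id is what lets a visualization reconstruct a single tuple's path across distributed operators.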


Measurement and Modeling of Computer Systems | 2014

A fast online learning algorithm for distributed mining of BigData

Yu Zhang; Daby M. Sow; Deepak S. Turaga; Mihaela van der Schaar

Big data analytics requires distributed mining of numerous data streams in real time. Designing such distributed mining systems raises unique challenges: online adaptation to incoming data characteristics, online processing of large amounts of heterogeneous data, and limited data access and communication capabilities between distributed learners. We propose a general framework for distributed data mining and develop an efficient online learning algorithm within it. Our framework consists of an ensemble learner and multiple local learners, each of which can access only part of the incoming data. By exploiting the correlations among the local learners' models, our algorithm optimizes prediction accuracy while requiring significantly less information exchange and computational complexity than state-of-the-art learning solutions.
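The ensemble/local-learner split can be sketched with an exponentially weighted vote over online learners that each see only part of the feature vector (the perceptron base learner and the weighting rule below are illustrative assumptions, not the paper's exact algorithm):

```python
import numpy as np

class LocalLearner:
    """Online perceptron that sees only a subset of the feature vector,
    modeling a local learner with limited data access."""

    def __init__(self, feature_idx, lr=0.1):
        self.idx = list(feature_idx)
        self.lr = lr
        self.w = np.zeros(len(self.idx))

    def predict(self, x):
        return 1 if x[self.idx] @ self.w >= 0 else -1

    def update(self, x, y):
        if self.predict(x) != y:  # perceptron: learn only on mistakes
            self.w += self.lr * y * x[self.idx]


class EnsembleLearner:
    """Exponentially weighted vote over local learners: learners that err
    are downweighted, so the ensemble leans on whichever local views of
    the data turn out to be informative."""

    def __init__(self, learners, eta=0.5):
        self.learners = learners
        self.weights = np.ones(len(learners))
        self.eta = eta

    def predict(self, x):
        votes = np.array([l.predict(x) for l in self.learners])
        return 1 if votes @ self.weights >= 0 else -1

    def update(self, x, y):
        for i, l in enumerate(self.learners):
            if l.predict(x) != y:
                self.weights[i] *= np.exp(-self.eta)
            l.update(x, y)
```

Only the local predictions and the true label cross learner boundaries, which is the kind of limited information exchange the abstract describes.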


Embedded and Real-Time Computing Systems and Applications | 2007

Century: Automated Aspects of Patient Care

Marion Lee Blount; John S. Davis; Maria R. Ebling; Ji Hyun Kim; Kyu Hyun Kim; KangYoon Lee; Archan Misra; SeHun Park; Daby M. Sow; Young Ju Tak; Min Wang; Karen Witting

Remote health monitoring offers the possibility of improving the quality of health care by enabling relatively inexpensive out-patient care. However, it also raises a new problem: the potential for data explosion in health care systems. To address this problem, remote health monitoring systems must be integrated with analysis tools that provide automated trend analysis and event detection in real time. In this paper, we present an overview of Century, an extensible framework for analyzing large numbers of remote sensor-based medical data streams.
