David L. Hicks
Aalborg University – Esbjerg
Publications
Featured research published by David L. Hicks.
ACM Transactions on Information Systems | 1993
John L. Schnase; John J. Leggett; David L. Hicks; Ron L. Szabo
Many important issues in the design and implementation of hypermedia system functionality focus on the way interobject connections are represented, manipulated, and stored. A prototypic system called HB1 is being designed to meet the storage needs of next-generation hypermedia system architectures. HB1 is referred to as a hyperbase management system (HBMS) because it supports not only the storage and manipulation of information, but also the storage and manipulation of the connectivity data that link information together to form hypermedia. Among HB1's distinctions is its use of a semantic network database system to manage physical storage. Here, basic semantic modeling concepts as they apply to hypermedia systems are reviewed, and experiences using a semantic database system in HB1 are discussed. Semantic data models attempt to provide more powerful mechanisms for structuring objects than traditional approaches do. In HB1, it was necessary to abstract interobject connectivity, behaviors, and information for hypermedia. Building on top of a semantic database system facilitated such a separation and made the structural aspects of hypermedia conveniently accessible to manipulation. This becomes particularly important in the implementation of structure-related operations such as structural queries. Our experience suggests that an integrated semantic object-oriented database paradigm appears to be superior to purely relational, semantic, or object-oriented methodologies for representing the structurally complex interrelationships that arise in hypermedia.
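To make the separation of structure from content concrete, here is a minimal sketch; HB1's actual schema is not described above, so the names (Component, Link, Hyperbase) and the query are illustrative assumptions. The point is that keeping connectivity in its own store lets a structural query run without ever touching component content.

```python
class Component:
    """A content-bearing hypermedia object (the 'data')."""
    def __init__(self, cid, content):
        self.cid = cid
        self.content = content

class Link:
    """An interobject connection (the 'structure'), stored apart from content."""
    def __init__(self, source, target, link_type):
        self.source = source      # component id
        self.target = target      # component id
        self.link_type = link_type

class Hyperbase:
    """Toy hyperbase: manages components and links in separate stores."""
    def __init__(self):
        self.components = {}      # cid -> Component
        self.links = []           # flat store of Link records

    def add_component(self, comp):
        self.components[comp.cid] = comp

    def add_link(self, link):
        self.links.append(link)

    def structural_query(self, start, link_type):
        """Find every component reachable from `start` over links of one
        type, consulting only the link store, never component content."""
        seen, frontier = set(), [start]
        while frontier:
            cid = frontier.pop()
            for link in self.links:
                if link.source == cid and link.link_type == link_type:
                    if link.target not in seen:
                        seen.add(link.target)
                        frontier.append(link.target)
        return seen
```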
ACM Conference on Hypertext | 2001
Uffe Kock Wiil; David L. Hicks; Peter J. Nürnberg
Over the past decade, hypermedia systems have become increasingly open, distributed, and modular. As a direct result, open hypermedia systems have been increasingly successful in providing middleware services, such as linking, to a large set of clients. This paper presents a new approach to service provision in open hypermedia systems based on the concept of multiple open services. The overall idea of multiple open services is to rethink the way in which services are provided to clients. The goal is to split services into components, each of which provides a general, scalable, and functionally independent (orthogonal) service. This results in a highly flexible architectural framework that can serve as a vehicle for further investigating many of the open issues relating to open hypermedia systems. The approach can be viewed as a natural next step in the evolution towards more open, distributed, and modular hypermedia systems. The concept of multiple open services is described in detail, and a proof-of-concept implementation called Construct is presented.
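The sketch below illustrates the orthogonal-services idea in miniature; Construct's actual interfaces are not reproduced here, so the request protocol and class names are assumptions made purely for illustration. Each service is small, functionally independent, and composable by clients.

```python
from typing import Protocol

class HypermediaService(Protocol):
    name: str
    def handle(self, request: dict) -> dict: ...

class LinkingService:
    """One orthogonal service: manages links, nothing else."""
    name = "linking"
    def __init__(self):
        self._links = []
    def handle(self, request):
        if request["op"] == "add_link":
            self._links.append((request["source"], request["target"]))
            return {"ok": True}
        if request["op"] == "links_from":
            return {"targets": [t for s, t in self._links
                                if s == request["source"]]}
        return {"error": "unknown op"}

class StorageService:
    """Another orthogonal service: persists objects, knows nothing of links."""
    name = "storage"
    def __init__(self):
        self._objects = {}
    def handle(self, request):
        if request["op"] == "put":
            self._objects[request["key"]] = request["value"]
            return {"ok": True}
        if request["op"] == "get":
            return {"value": self._objects.get(request["key"])}
        return {"error": "unknown op"}

# A client composes whichever services it needs from a registry.
registry = {s.name: s for s in (LinkingService(), StorageService())}
registry["storage"].handle({"op": "put", "key": "n1", "value": "node text"})
registry["linking"].handle({"op": "add_link", "source": "n1", "target": "n2"})
print(registry["linking"].handle({"op": "links_from", "source": "n1"}))
```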
ACM Transactions on Information Systems | 1998
David L. Hicks; John J. Leggett; Peter J. Nürnberg; John L. Schnase
The areas of application of hypermedia technology, combined with the capabilities that hypermedia provides for manipulating structure, create an environment in which version control is very important. A hypermedia version control framework has been designed specifically to address the version control problem in open hypermedia environments. One of the primary distinctions of the framework is its partitioning of hypermedia version control functionality into intrinsic and application-specific categories. The framework has been used as a model for the design of version control services for a hyperbase management system that provides complete version support for both data and structural entities. In addition to serving as a version control model for open hypermedia environments, the framework offers a clarifying and unifying context in which to examine the issues of version control in hypermedia.
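As a toy illustration of one central point above, that data entities and structural entities both receive full version histories, here is a minimal sketch; the framework's actual interfaces are not described in the abstract, so the store and its operations are assumptions.

```python
class VersionedStore:
    """Keeps an append-only version history per entity, whether the
    entity is a data object (a node) or a structural object (a link)."""
    def __init__(self):
        self._histories = {}   # entity id -> list of immutable versions

    def commit(self, entity_id, state):
        """Record a new version; returns its version number."""
        self._histories.setdefault(entity_id, []).append(state)
        return len(self._histories[entity_id]) - 1

    def checkout(self, entity_id, version=-1):
        """Retrieve any version (default: the latest)."""
        return self._histories[entity_id][version]

store = VersionedStore()
store.commit("node:1", {"content": "draft"})
store.commit("node:1", {"content": "revised"})
store.commit("link:a", {"source": "node:1", "target": "node:2"})  # structure versioned too
print(store.checkout("node:1", 0))   # {'content': 'draft'}
print(store.checkout("node:1"))      # {'content': 'revised'}
```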
Intelligence and Security Informatics | 2008
Nasrullah Memon; Henrik Legind Larsen; David L. Hicks; Nicholas Harkiolakis
This paper presents a novel algorithm to automatically detect the hidden hierarchy in terrorist networks. The algorithm is based on centrality measures from the social network analysis literature. The advantage of such automatic methods is their ability to detect key players in terrorist networks. We illustrate the algorithm on case studies of terrorist events that have occurred in the past. The results show great promise in detecting high-value individuals.
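The paper's exact algorithm is not reproduced in the abstract, so the following is only a generic sketch of the underlying idea, assuming networkx is available: rank nodes by a centrality score and read the ranked tiers as a candidate hierarchy. The choice of degree centrality, the tier count, and the dataset are all illustrative assumptions.

```python
import networkx as nx

def centrality_hierarchy(graph, levels=3):
    """Bucket nodes into `levels` tiers by degree centrality (descending)."""
    scores = nx.degree_centrality(graph)
    ranked = sorted(scores, key=scores.get, reverse=True)
    size = max(1, -(-len(ranked) // levels))   # ceiling division
    return [ranked[i:i + size] for i in range(0, len(ranked), size)]

# Stand-in dataset; the paper uses real terrorist-event case studies.
g = nx.karate_club_graph()
for tier, members in enumerate(centrality_hierarchy(g)):
    print(f"tier {tier}: {members}")
```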
Springer US | 2009
Nasrullah Memon; Jonathan David Farley; David L. Hicks; Torben Rosenørn
This volume presents the most current research from mathematicians and computer scientists around the world on strategies for counterterrorism and homeland security, addressed to the broader public. New mathematical and computational techniques are applied to counterterrorism and computer security problems. Topics covered include strategies for disrupting terrorist cells, border penetration and security, terrorist cell formation and growth, data analysis of terrorist activity, terrorism deterrence strategies, information security, and emergency response and planning. Since 2001, tremendous amounts of information have been gathered regarding terrorist cells and individuals potentially planning future attacks, creating a pressing need for new countermeasures; this book addresses that need. Interested readers include researchers, policy makers, politicians, and members of intelligence and law enforcement agencies.
Archive | 2010
Nasrullah Memon; Jennifer Jie Xu; David L. Hicks; Hsinchun Chen
Driven by counter-terrorism efforts, marketing analysis, and an explosion in online social networking in recent years, data mining has moved to the forefront of information science. This proposed Special Issue on Data Mining for Social Network Data will present a broad range of recent studies in social network analysis. It will focus on emerging trends and needs in the discovery and analysis of communities, of solitary and social activities, and of activities in open fora and commercial sites alike. It will also look at network modeling, infrastructure construction, dynamic growth, and evolution pattern discovery using machine learning approaches and multi-agent based simulations. The editors are three rising stars in the world of data mining, knowledge discovery, social network analysis, and information infrastructures, and are anchored by Springer author/editor Hsinchun Chen (Terrorism Informatics; Medical Informatics; Digital Government), who is one of the most prominent intelligence analysis and data mining experts in the world.
Availability, Reliability and Security | 2007
Nasrullah Memon; Kim C. Kristoffersen; David L. Hicks; Henrik Legind Larsen
This paper presents a study of structural cohesion, a concept discussed in social network analysis (SNA) that can also be used in several other important application areas, including investigative data mining for destabilizing terrorist networks. Structural cohesion is defined as the number of actors who, if removed from a group, would disconnect the group. In this paper we discuss structural cohesion concepts such as cliques, n-cliques, n-clans, and k-plexes to determine familiarity, robustness, and reachability within subgroups of the 9/11 terrorist network. Moreover, we propose a methodology for detecting critical regions in covert networks; removing or capturing the nodes in those regions will disrupt most of the network.
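For readers unfamiliar with these cohesion concepts, here is a small sketch, assuming networkx is available. Maximal cliques come straight from the library; the k-plex test is a direct translation of the definition (every member is adjacent to at least |S| - k other members). The dataset is a stand-in, not the 9/11 network analyzed in the paper.

```python
import networkx as nx

def is_k_plex(graph, nodes, k):
    """True if `nodes` induces a k-plex in `graph`: every member is
    adjacent to at least len(nodes) - k other members."""
    s = set(nodes)
    return all(len(set(graph[n]) & s) >= len(s) - k for n in s)

g = nx.les_miserables_graph()        # stand-in for a covert network
cliques = [c for c in nx.find_cliques(g) if len(c) >= 4]
print(f"{len(cliques)} maximal cliques of size >= 4")
print("largest is a 1-plex (i.e. a clique):",
      is_k_plex(g, max(cliques, key=len), 1))
```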
ACM Conference on Hypertext | 2000
Uffe Kock Wiil; Peter J. Nürnberg; David L. Hicks; Siegfried Reich
The Construct development environment is targeted at the construction of different types of hypermedia services. The primary goal of the environment is to ease the construction of component-based open hypermedia systems by providing development tools that assist the system developers in the generation of the set of services that make up a hypermedia system.
International Symposium on Metainformatics | 2003
Peter J. Nürnberg; Uffe Kock Wiil; David L. Hicks
Structural computing, in one sense, seeks to unify the notions of data and structure under a synthesized abstraction, by which data and structure become views to be applied as the need or desire arises. Indeed, one way of looking at structural computing is that the notions of data and structure are contextual, not essential. Any entity may be data to one person (application, agent, whatever) at one moment, and structure to another. Data and structure are matters of interpretation, not essence. What exactly this has bought us is discussed at length elsewhere [7,10,11].
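As a toy rendering of this "views, not essence" idea (the names below are assumptions, not the authors' formalism), one and the same entity can be read as content by one client and as connectivity by another:

```python
class Entity:
    """A single stored thing: neither inherently data nor structure."""
    def __init__(self, payload, refs=()):
        self.payload = payload       # raw content
        self.refs = list(refs)       # ids this entity points at

def data_view(entity):
    """Interpret the entity as data: expose its content only."""
    return entity.payload

def structure_view(entity):
    """Interpret the same entity as structure: expose connectivity only."""
    return entity.refs

e = Entity("annotation text", refs=["doc:12", "doc:47"])
print(data_view(e))        # an annotation's text, to one client
print(structure_view(e))   # the same annotation as a link, to another
```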
Advanced Data Mining and Applications | 2007
Nasrullah Memon; David L. Hicks; Henrik Legind Larsen
A new model of dependence centrality is proposed. The centrality measure is based on shortest paths between pairs of nodes. We demonstrate the measure on a small example network and compare it with betweenness centrality. We discuss how intelligence and investigation agencies could benefit from the proposed measure. In addition, we describe the investigative data mining techniques we use and compare them with traditional data mining techniques.
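The paper's exact formula is not given in the abstract, so the sketch below only captures the flavor of a shortest-path dependence measure (how often node u lies on v's shortest paths to everyone else), assuming networkx; betweenness centrality is computed alongside for comparison. The dataset is an illustrative stand-in.

```python
import networkx as nx

def dependence(graph, v, u):
    """Fraction of other nodes that v can only reach on shortest paths
    passing through u (an illustrative measure, not the paper's formula)."""
    count, total = 0, 0
    for t in graph:
        if t in (v, u):
            continue
        total += 1
        paths = list(nx.all_shortest_paths(graph, v, t))
        if all(u in p for p in paths):
            count += 1
    return count / total if total else 0.0

g = nx.krackhardt_kite_graph()       # small classic example network
print("dependence of node 0 on node 5:", dependence(g, 0, 5))
print("betweenness:", nx.betweenness_centrality(g))
```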