
Publication


Featured research published by Chris Partridge.


Hawaii International Conference on System Sciences | 2005

An Ontological Approach for Recovering Legacy Business Content

A. Daga; S. de Cesare; Mark Lycett; Chris Partridge

Legacy Information Systems (LIS) pose a challenge for many organizations. On one hand, LIS are viewed as aging systems needing replacement; on the other hand, years of accumulated business knowledge have made these systems mission-critical. Current approaches, however, are often criticized for being overly dependent on technology and for ignoring the business knowledge that resides within LIS. In this light, this paper proposes a means of capturing the business knowledge in a technology-agnostic manner and transforming it in a way that reaps the benefits of clear semantic expression - this transformation is achieved via the careful use of ontology. The approach, called Content Sophistication (CS), aims to provide a model of the business that more closely adheres to the semantics and relationships of objects existing in the real world. The approach is illustrated via an example taken from a case study concerning the renovation of a large financial system, and it results in technology-agnostic models that show improvements along several dimensions.


Ontology, Epistemology, and Teleology for Modeling and Simulation | 2013

Guidelines for Developing Ontological Architectures in Modelling and Simulation

Chris Partridge; Andrew Mitchell; Sergio de Cesare

This book is motivated by the belief that “a better understanding of ontology, epistemology, and teleology” is essential for advancing Modelling and Simulation (M&S). The next generation of M&S systems, we shall suggest, are ones where building an ontology – and an epistemology – as an integrated part of their design will enable them to reach the next level of ‘intelligence’.


Conference on Advanced Information Systems Engineering | 2013

Re-engineering Data with 4D Ontologies and Graph Databases

Sergio de Cesare; George Foy; Chris Partridge

The amount of data that is being made available on the Web is increasing. This provides business organisations with the opportunity to acquire large datasets in order to offer novel information services or to better market existing products and services. Much of this data is now publicly available (e.g., thanks to initiatives such as Open Government Data). The challenge from a corporate perspective is to make sense of the third-party data and transform it so that it can more easily integrate with existing corporate data or with datasets of a different provenance. This paper presents research-in-progress aimed at semantically transforming raw data on U.K. registered companies. The approach adopted is based on BORO (a 4D foundational ontology and re-engineering method) and the target technological platform is Neo4J (a graph database). The primary challenges encountered are (1) re-engineering the raw data into a 4D ontology and (2) representing the 4D ontology in a graph database. The paper discusses these challenges and explains the transformation process that is currently being adopted.
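The 4D re-engineering described above can be illustrated with a small sketch (the node names and properties below are hypothetical, not taken from the paper): in a 4D ontology a company is a spatio-temporal extent whose successive states are temporal parts, and this maps naturally onto a property graph of nodes and labelled edges.

```python
# Hypothetical property-graph sketch: a company and its states as 4D extents,
# linked by a "temporalPartOf" relationship (names are illustrative only).

nodes = {
    "acme": {"label": "Company", "name": "ACME Ltd"},
    "acme_2010_2015": {"label": "CompanyState", "registered_office": "London",
                       "from": 2010, "to": 2015},
    "acme_2015_2018": {"label": "CompanyState", "registered_office": "Leeds",
                       "from": 2015, "to": 2018},
}

edges = [
    ("acme_2010_2015", "temporalPartOf", "acme"),
    ("acme_2015_2018", "temporalPartOf", "acme"),
]

def states_of(company_id):
    """Return the temporal parts (states) of a company, ordered by start year."""
    parts = [src for src, rel, dst in edges
             if rel == "temporalPartOf" and dst == company_id]
    return sorted(parts, key=lambda n: nodes[n]["from"])

print(states_of("acme"))  # → ['acme_2010_2015', 'acme_2015_2018']
```

In a real Neo4J deployment the nodes and relationships would live in the database rather than in Python dictionaries; the point of the sketch is only the shape of the model, where change over time becomes extra temporal-part nodes rather than in-place updates.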


International Conference on Conceptual Modeling | 2015

Improving model quality through foundational ontologies: Two contrasting approaches to the representation of roles

Sergio de Cesare; Brian Henderson-Sellers; Chris Partridge; Mark Lycett

Several foundational ontologies have been developed recently. We examine two of these from the point of view of their quality in representing temporal changes, focusing on the example of roles. We discuss how these are modelled in two foundational ontologies: the Unified Foundational Ontology and the BORO foundational ontology. These exhibit two different approaches, endurantist and perdurantist respectively. We illustrate the differences using a running example in the university student domain, wherein one individual is not only a registered student but also, for part of that period, President of the Student Union. The metaphysical choices made by UFO and BORO lead to different representations of roles. Two key differences which affect the way roles are modelled are exemplified in this paper: (1) different criteria of identity and (2) differences in the way individual objects extend over time and possible worlds. These differences impact upon the quality of the models produced in terms of their respective explanatory power. The UFO model concentrates on the notion of validity in “all possible worlds” and is unable to accurately represent the way particulars are extended in time. The perdurantist approach is best able to describe temporal changes wherein roles are spatio-temporal extents of individuals.
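The perdurantist treatment of roles described above can be sketched in a few lines (the names and dates are hypothetical, not drawn from the paper): each role is a temporal part of one 4D individual, so holding two roles at once is simply having two overlapping temporal extents.

```python
# Hypothetical perdurantist sketch: roles as temporal parts of one individual.
# Overlapping roles are overlapping extents of the same 4D object.

person = {
    "name": "Alex",
    "temporal_parts": [
        {"role": "RegisteredStudent", "from": 2012, "to": 2016},
        {"role": "StudentUnionPresident", "from": 2014, "to": 2015},
    ],
}

def roles_at(individual, year):
    """Roles held in a given year: the temporal parts whose extent covers it."""
    return [p["role"] for p in individual["temporal_parts"]
            if p["from"] <= year < p["to"]]

print(roles_at(person, 2014))  # → ['RegisteredStudent', 'StudentUnionPresident']
print(roles_at(person, 2012))  # → ['RegisteredStudent']
```

Note that nothing about the individual changes when a role begins or ends; the model just contains different temporal parts, which is the point of contrast with an endurantist representation.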


International Journal of Intelligent Defence Support Systems | 2011

A novel ontological approach to semantic interoperability between legacy air defence command and control systems

Chris Partridge; Mike Lambert; Mike Loneragan; Andrew Mitchell; Pawel Garbacz

In common with many other government defence departments, the UK Ministry of Defence (MoD) has realised that it has a plethora of legacy systems that were procured as domain specific with little emphasis given to integration requirements. In particular, it realised that the lack of integration between a significant number of the legacy air defence command and control (AD-C2) systems meant it could not deliver the increased agility needed for joint force AD and that current approaches to integration were unlikely to resolve the problem. They realised that they needed a new approach that demonstrably worked. This paper describes a programme initiated by the MoD to address this problem through the formulation of a novel solution and its demonstration in the tactical AD-C2 environment using a sample of these existing legacy systems. It describes the ontological solution deployed to resolve the ‘hard’ semantic interoperability challenge. It outlines the physical and semantic architecture that was developed to support this approach and describes the implemented planning and collaborative execution (PACE-based) and semantic interoperability engine (SIE) solution.


Software and Systems Modeling | 2018

Formalization of the classification pattern: survey of classification modeling in information systems engineering

Chris Partridge; S. de Cesare; Andrew Mitchell; J. Odell

Formalization is becoming more common in all stages of the development of information systems, as a better understanding of its benefits emerges. Classification systems are ubiquitous, no more so than in domain modeling. The classification pattern that underlies these systems provides a good case study of the move toward formalization in part because it illustrates some of the barriers to formalization, including the formal complexity of the pattern and the ontological issues surrounding the “one and the many.” Powersets are a way of characterizing the (complex) formal structure of the classification pattern, and their formalization has been extensively studied in mathematics since Cantor’s work in the late nineteenth century. One can use this formalization to develop a useful benchmark. There are various communities within information systems engineering (ISE) that are gradually working toward a formalization of the classification pattern. However, for most of these communities, this work is incomplete, in that they have not yet arrived at a solution with the expressiveness of the powerset benchmark. This contrasts with the early smooth adoption of powerset by other information systems communities to, for example, formalize relations. One way of understanding the varying rates of adoption is recognizing that the different communities have different historical baggage. Many conceptual modeling communities emerged from work done on database design, and this creates hurdles to the adoption of the high level of expressiveness of powersets. Another relevant factor is that these communities also often feel, particularly in the case of domain modeling, a responsibility to explain the semantics of whatever formal structures they adopt. 
This paper aims to make sense of the formalization of the classification pattern in ISE and surveys its history through the literature, starting from the relevant theoretical works of the mathematical literature and gradually shifting focus to the ISE literature. The literature survey follows the evolution of ISE’s understanding of how to formalize the classification pattern. The various proposals are assessed using the classical example of classification: the Linnaean taxonomy, formalized using powersets as a benchmark for formal expressiveness. The broad conclusion of the survey is (1) that the ISE community is currently in the early stages of the process of understanding how to formalize the classification pattern, particularly in the requirements for expressiveness exemplified by powersets, and (2) that there is an opportunity to intervene and speed up the process of adoption by clarifying this expressiveness. Given the central place that the classification pattern has in domain modeling, this intervention has the potential to lead to significant improvements.
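The powerset benchmark mentioned above can be made concrete with a toy example (the tiny Linnaean-style domain below is illustrative, not from the paper): if classes are modelled extensionally, every class over a domain is an element of the domain's powerset, and subclassing is subset inclusion.

```python
from itertools import combinations

def powerset(xs):
    """All subsets of xs — the formal space in which extensional classes live."""
    xs = list(xs)
    return [set(c) for r in range(len(xs) + 1) for c in combinations(xs, r)]

# Tiny hypothetical domain of individuals:
animals = {"lion", "tiger", "whale", "salmon"}

# Classes modelled extensionally, i.e. as elements of the powerset:
mammal = {"lion", "tiger", "whale"}
felidae = {"lion", "tiger"}

assert mammal in powerset(animals)   # every class is a powerset element
assert felidae < mammal              # subclassing is proper subset inclusion
print(len(powerset(animals)))        # → 16, i.e. 2**4 candidate classes
```

The exponential size of the powerset hints at why this level of expressiveness is demanding for modelling languages rooted in database design, as the survey discusses.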


International Conference on Model Driven Engineering and Software Development | 2018

Ontology then Agentology: A Finer Grained Framework for Enterprise Modelling

Chris Partridge; Sergio de Cesare; Andrew Mitchell; Ana León; Frederik Gailly; Mesbah Khan

Data integration of enterprise systems typically involves combining heterogeneous data residing in different sources into a unified, homogeneous whole. This heterogeneity takes many forms and there are all sorts of significant practical and theoretical challenges to managing it, particularly at the semantic level. In this paper, we consider a type of semantic heterogeneity that is common in Model Driven Architecture (MDA) Computation Independent Models (CIM); one that arises due to the data’s dependence upon the system it resides in. There seems to be no relevant work on this topic in Conceptual Modelling, so we draw upon research done in philosophy and linguistics on formalizing pure indexicals – ‘I’, ‘here’ and ‘now’ – also known as de se (Latin ‘of oneself’) or the deictic centre. This reveals, firstly, that the core dependency is essential when the system is agentive, while the rest of the dependency can be designed away. In the context of MDA, this suggests a natural architectural layering, where a new concern, ‘system dependence’, is introduced and used to divide the CIM model into two parts: a system-independent ontology model and a system-dependent agentology model. We also show how this dependence complicates the integration process – but, interestingly, not reuse in the same context. We explain how this complication usually provides good pragmatic reasons for maximizing the ontology content in an ‘Ontology First’, or ‘Ontology then Agentology’, approach.
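The role of pure indexicals in system-dependent data can be sketched as follows (the record fields and system names are hypothetical, not from the paper): a record containing 'I', 'here' or 'now' only has determinate content relative to its deictic centre, the system that produced it, so integration requires resolving those indexicals against that centre.

```python
# Hypothetical sketch: resolving the pure indexicals 'I', 'here' and 'now'
# in a system-dependent (agentology) record against its deictic centre,
# yielding a system-independent record that can be integrated with others.

def resolve(record, deictic_centre):
    """Replace pure indexicals with the values fixed by the producing system."""
    substitutions = {
        "I": deictic_centre["agent"],
        "here": deictic_centre["location"],
        "now": deictic_centre["time"],
    }
    return {key: substitutions.get(value, value)
            for key, value in record.items()}

centre = {"agent": "OrderSystemA", "location": "London DC",
          "time": "2018-03-01T09:00Z"}
record = {"event": "shipment", "recorded_by": "I", "at": "here", "when": "now"}

print(resolve(record, centre))
```

Two systems recording "I shipped this here, now" produce different contents from the same form; resolving against each system's own centre is what makes the resulting data comparable, which is one way to read the paper's point that the dependency complicates integration but not reuse within the same context.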


International Conference on Conceptual Modeling | 2016

Grounding for Ontological Architecture Quality: Metaphysical Choices

Chris Partridge; Sergio de Cesare

Information systems (IS) are getting larger and more complex, becoming ‘gargantuan’. IS practices have not evolved in step to handle the development and maintenance of these gargantuan systems, leading to a variety of quality issues.


Conference on Advanced Information Systems Engineering | 2011

Ontology Mining versus Ontology Speculation

Chris Partridge

When we embed the building of an ontology into an information system development or maintenance process, then the question arises as to how one should construct the content of the ontology. One of the choices is whether the construction process should focus on the mining of the ontology from existing resources or should be the result of speculation (‘starting with a blank sheet of paper’). I present some arguments for choosing mining over speculation and then look at the implications this has for legacy modernisation.


Conference on Advanced Information Systems Engineering | 2011

Preface ODISE 2011

Sergio de Cesare; Frederik Gailly; Grant Holland; Mark Lycett; Chris Partridge

Information Systems (IS) Engineering has progressed considerably over the decades. Numerous advances, such as improved development methodologies, languages that enforce recognised software engineering principles and sophisticated CASE tools, have helped to increase the quality of IS. Despite such progress, many IS Engineering projects remain unsuccessful (e.g., they fail to meet stakeholder requirements, run excessively over budget and far beyond the deadlines initially scheduled). As the literature points out, most of these problems are due to (1) the difficulties of capturing and knowing the business requirements of a living organisational system, (2) realising such requirements in software designs and implementations and (3) maintaining an effective level of synchronicity between the needs of the living system and its information system. The causes underlying such problems are diverse and difficult to identify. Nonetheless it is plausible to assume that at the heart of such IS Engineering problems is the difficulty of conceptualising an organisational system and its real-world problem domain.

Collaboration


Dive into Chris Partridge's collaborations.

Top Co-Authors

Mark Lycett – Brunel University London
Frederik Gailly – Vrije Universiteit Brussel
Oscar Pastor – Polytechnic University of Valencia
Giancarlo Guizzardi – Free University of Bozen-Bolzano
S. de Cesare – Brunel University London
A. Daga – Brunel University London
Aseem Daga – Brunel University London