
Publication


Featured research published by Chris Burnett.


International Joint Conference on Artificial Intelligence | 2011

Trust decision-making in multi-agent systems

Chris Burnett; Timothy J. Norman; Katia P. Sycara

Trust is crucial in dynamic multi-agent systems, where agents may frequently join and leave, and the structure of the society may often change. In these environments, it may be difficult for agents to form the stable trust relationships necessary for confident interactions. Societies may break down when trust between agents is too low to motivate interactions. In such settings, agents should make decisions about who to interact with, given their degree of trust in the available partners. We propose a decision-theoretic model of trust decision making that allows controls, as well as trust, to be used to increase confidence in initial interactions. We consider explicit incentives, monitoring and reputation as examples of such controls. We evaluate our approach within a simulated, highly-dynamic multi-agent environment, and show how this model supports the making of delegation decisions when trust is low.
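The delegation idea in the abstract can be sketched as a simple expected-utility calculation. This is an illustrative sketch, not the paper's actual model: the partner names, the numbers, and the treatment of monitoring as a cost that raises success probability are all assumptions made here for clarity.

```python
# Illustrative sketch (not the paper's model): a decision-theoretic
# delegation choice where trust is a probability of success and a
# "control" such as monitoring raises that probability at a cost.

def expected_utility(p_success, benefit, loss, control_cost=0.0):
    """Expected utility of delegating to a partner."""
    return p_success * benefit - (1 - p_success) * loss - control_cost

def best_delegation(partners, benefit=10.0, loss=5.0):
    """Pick the (partner, control) pair with the highest expected utility.

    `partners` maps a name to (trust, monitored_trust, monitor_cost);
    every name and number here is hypothetical.
    """
    options = []
    for name, (trust, monitored_trust, monitor_cost) in partners.items():
        options.append((expected_utility(trust, benefit, loss),
                        name, "no control"))
        options.append((expected_utility(monitored_trust, benefit, loss,
                                         monitor_cost),
                        name, "monitoring"))
    return max(options)

partners = {
    "agent_a": (0.6, 0.8, 1.0),    # low trust; monitoring helps a lot
    "agent_b": (0.75, 0.78, 1.0),  # higher trust; monitoring adds little
}
eu, name, control = best_delegation(partners)
```

With these hypothetical numbers, monitoring is worth paying for only when it buys a large jump in success probability; a partner who is already trusted is best delegated to without a control.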


The Computer Journal | 2010

Agent Support for Policy-Driven Collaborative Mission Planning

Katia P. Sycara; Timothy J. Norman; Joseph A. Giampapa; Martin J. Kollingbaum; Chris Burnett; Daniele Masato; Mairi McCallum; Michael Strub

In this paper, we describe how agents can support collaborative planning within international coalitions, formed in an ad hoc fashion as a response to military and humanitarian crises. As these coalitions are formed rapidly and without much lead time or co-training, human planners may be required to observe a plethora of policies that direct their planning effort. In a series of experiments, we show how agents can support human planners, easing their cognitive burden by giving advice on the correct use of policies and catching possible violations. The experiments show that agents can effectively prevent policy violations with no significant extra cost.
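The policy-advice role described above can be sketched as a checker that screens a proposed action against declarative rules before it enters the plan. This is a minimal hypothetical sketch; the policies, action fields, and rule format are invented here and are not the system from the paper.

```python
# Illustrative sketch (hypothetical policies, not the paper's system):
# a planning assistant checks a proposed action against simple
# "forbidden" policy rules and flags violations before they happen.

POLICIES = [
    # (description, predicate that is True when the action violates it)
    ("no flights over region R without clearance",
     lambda a: a.get("type") == "flight" and a.get("region") == "R"
               and not a.get("clearance", False)),
    ("medical convoys require an escort",
     lambda a: a.get("type") == "convoy" and a.get("cargo") == "medical"
               and not a.get("escort", False)),
]

def check_action(action):
    """Return descriptions of any policies the action would violate."""
    return [desc for desc, violated in POLICIES if violated(action)]

violations = check_action({"type": "flight", "region": "R"})
```

An empty result means the action is policy-compliant; a non-empty list is the advice the assistant would surface to the human planner.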


ACM Transactions on Intelligent Systems and Technology | 2013

Stereotypical trust and bias in dynamic multiagent systems

Chris Burnett; Timothy J. Norman; Katia P. Sycara

Large-scale multiagent systems have the potential to be highly dynamic. Trust and reputation are crucial concepts in these environments, as it may be necessary for agents to rely on their peers to perform as expected, and learn to avoid untrustworthy partners. However, aspects of highly dynamic systems introduce issues which make the formation of trust relationships difficult. For example, they may be short-lived, precluding agents from gaining the necessary experiences to make an accurate trust evaluation. This article describes a new approach, inspired by theories of human organizational behavior, whereby agents generalize their experiences with previously encountered partners as stereotypes, based on the observable features of those partners and their behaviors. Subsequently, these stereotypes are applied when evaluating new and unknown partners. Furthermore, these stereotypical opinions can be communicated within the society, resulting in the notion of stereotypical reputation. We show how this approach can complement existing state-of-the-art trust models, and enhance the confidence in the evaluations that can be made about trustees when direct and reputational information is lacking or limited. Furthermore, we show how a stereotyping approach can help agents detect unwanted biases in the reputational opinions they receive from others in the society.
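The stereotyping mechanism described above can be sketched as learning per-feature success rates and using them as a prior for unseen partners. This is an illustrative sketch under assumptions made here (features as simple string tags, the prior as an average of per-feature rates); it is not the article's actual model.

```python
# Illustrative sketch (not the article's model): agents generalise past
# outcomes over observable features into a "stereotype", then use it as
# a prior trust estimate for never-encountered partners that share
# those features.

from collections import defaultdict

class StereotypeModel:
    def __init__(self):
        self.successes = defaultdict(int)
        self.trials = defaultdict(int)

    def observe(self, features, success):
        """Record an interaction outcome against each observable feature."""
        for f in features:
            self.trials[f] += 1
            self.successes[f] += int(success)

    def prior_trust(self, features, default=0.5):
        """Average per-feature success rate; fall back to `default`
        when no feature has been seen before."""
        rates = [self.successes[f] / self.trials[f]
                 for f in features if self.trials[f] > 0]
        return sum(rates) / len(rates) if rates else default

model = StereotypeModel()
model.observe({"org_x", "certified"}, success=True)
model.observe({"org_x"}, success=True)
model.observe({"org_y"}, success=False)

# A new, never-met partner is judged by its observable features alone.
p = model.prior_trust({"org_x", "certified"})
```

The same idea extends to stereotypical reputation: agents could exchange these per-feature statistics instead of opinions about individual partners, which is what makes the approach usable when direct experience is scarce.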


IEEE Intelligent Systems | 2014

Supporting Trust Assessment and Decision Making in Coalitions

Chris Burnett; Timothy J. Norman; Katia P. Sycara; Nir Oren

Modern multiorganizational coalitions can bring diverse sets of capabilities, assets, and information sources to bear on complex and dynamic operations. However, successfully completing these operations places demands on the trust between coalition partners. When it is necessary to rely on other partners, decision makers must be able to make rapid and effective trust assessments and decisions. Here, the authors focus on coalition information acquisition and discuss mechanisms for assessing trust and arriving at decisions about how to act when trust can be supplemented by controls. They also discuss future directions for these systems and highlight outstanding challenges.


Trust and Trustworthy Computing | 2013

TRUMP: A Trusted Mobile Platform for Self-management of Chronic Illness in Rural Areas

Chris Burnett; Peter Edwards; Timothy J. Norman; Liang Chen; Yogachandran Rahulamathavan; Mariesha Jaffray; Edoardo Pignotti

Disease self-management interventions have the potential to greatly benefit both sufferers of chronic illnesses and healthcare providers in rural areas. In this paper, we discuss our vision for a trusted platform for delivering self-management interventions in rural areas of the UK and India using second-generation mobile devices, and outline the key trust and privacy challenges in realising such an infrastructure. We illustrate our discussion with an example depression intervention scenario, highlighting some progress to date, and our plans towards realising this architecture.


Proceedings of SPIE | 2012

Trust and obfuscation

Murat Sensoy; Chatschik Bisdikian; Nir Oren; Chris Burnett; Timothy J. Norman; Mani B. Srivastava; Lance M. Kaplan

In modern coalition operations, decision makers must be capable of obtaining and fusing data from diverse sources. The reliability of these sources can vary, and, in order to protect their interests, the data they provide can be obfuscated. The trustworthiness of fused data depends on both the reliability of these sources and their obfuscation strategy. Information consumers must determine how to evaluate trust in the presence of obfuscation, while information providers must determine the appropriate level of obfuscation in order to ensure both that they remain trusted, and do not reveal any private information. In this paper, through a coalition scenario, we discuss and formalise trust and obfuscation in these contexts and the complex relationships between them.
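The tension described above can be sketched numerically: a provider coarsens its data to protect privacy, and a consumer discounts coarser reports when fusing. This is a toy sketch under assumptions made here (obfuscation as rounding to a grid, trust discounting as a simple weight); it is not the paper's formalisation.

```python
# Illustrative sketch (not the paper's formalism): a provider obfuscates
# a true value by coarsening it, and the consumer discounts the report's
# weight during fusion according to the declared obfuscation level.

def obfuscate(value, level):
    """Coarsen `value` to a grid of size `level` (level 0 = exact)."""
    if level <= 0:
        return value
    return round(value / level) * level

def fuse(reports):
    """Weighted average of (value, level) reports; coarser reports
    (higher obfuscation level) count for less."""
    weighted = [(v, 1.0 / (1.0 + lvl)) for v, lvl in reports]
    total = sum(w for _, w in weighted)
    return sum(v * w for v, w in weighted) / total

reports = [(obfuscate(37.2, 0), 0),    # exact source
           (obfuscate(37.2, 5), 5),    # coarsened to the nearest 5
           (obfuscate(37.2, 10), 10)]  # coarsened to the nearest 10
estimate = fuse(reports)
```

The provider's dilemma is visible even in this toy: a larger grid reveals less, but the consumer correspondingly trusts the report less, so its influence on the fused estimate shrinks.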


Adaptive Agents and Multi-Agent Systems | 2010

Bootstrapping trust evaluations through stereotypes

Chris Burnett; Timothy J. Norman; Katia P. Sycara


Archive | 2008

Agent Support for Mission Planning Under Policy Constraints

Chris Burnett; Daniele Masato; Mairi McCallum; Timothy J. Norman; Joseph A. Giampapa; Martin J. Kollingbaum; Katia P. Sycara


Adaptive Agents and Multi-Agent Systems | 2012

Sub-delegation and trust

Chris Burnett; Nir Oren


Archive | 2008

A Model of Human Teamwork for Agent-Assisted Search Operations

Gita Sukthankar; Katia P. Sycara; Joseph A. Giampapa; Chris Burnett

Collaboration


Dive into Chris Burnett's collaborations.

Top Co-Authors

Katia P. Sycara
Carnegie Mellon University

Nir Oren
University of Aberdeen

Gita Sukthankar
University of Central Florida