Damien Marshall
Maynooth University
Publication
Featured research published by Damien Marshall.
IEEE International Symposium on Distributed Simulation and Real-Time Applications | 2005
David J. Roberts; Damien Marshall; S. McLoone; Declan Delaney; Tomas E. Ward; R. Aspin
Human-to-human interaction across distributed applications requires that sufficient consistency be maintained among participants in the face of network characteristics such as latency and limited bandwidth. Techniques that reduce bandwidth usage can minimize network delays by lowering network traffic and thereby better exploiting the available bandwidth. However, these approaches can introduce inconsistencies at the level of human perception. Dead reckoning is a well-known technique for reducing the number of update packets transmitted between participating nodes. It employs a distance threshold to decide when to generate update packets. This paper questions the use of such a distance threshold in the context of absolute consistency and highlights a major drawback of the technique. An alternative threshold criterion based on time and distance is examined and compared to the distance-only threshold. A drawback of this proposed technique is also identified, and a hybrid threshold criterion is then proposed. However, the trade-off between spatial and temporal inconsistency remains.
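The distance-threshold mechanism underlying dead reckoning can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation; the function names and first-order prediction model are assumptions:

```python
import math

def predict(last_pos, last_vel, dt):
    """First-order dead reckoning: extrapolate from the last transmitted state."""
    return tuple(p + v * dt for p, v in zip(last_pos, last_vel))

def should_send_update(actual_pos, last_pos, last_vel, dt, threshold):
    """Send an entity-state update only when the remote extrapolation has
    drifted more than `threshold` distance units from the true position."""
    predicted = predict(last_pos, last_vel, dt)
    error = math.dist(actual_pos, predicted)
    return error > threshold
```

An entity moving on a curved path drifts away from the linear prediction, so curvature directly drives the packet rate under this criterion.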
Simulation | 2008
Dave Roberts; Rob Aspin; Damien Marshall; Seamus McLoone; Declan Delaney; Tomas E. Ward
Human-to-human interaction across distributed applications requires that sufficient consistency be maintained among participants in the face of network characteristics such as latency and limited bandwidth. The level of inconsistency arising from the network is proportional to the network delay, and thus a function of bandwidth consumption. Distributed simulation has often used a bandwidth reduction technique known as dead reckoning that combines approximation and estimation in the communication of entity movement to reduce network traffic, and thus improve consistency. However, unless carefully tuned to application and network characteristics, such an approach can introduce more inconsistency than it avoids. The key tuning metric is the distance threshold. This paper questions the suitability of the standard distance threshold as a metric for use in the dead reckoning scheme. Using a model relating entity path curvature and inconsistency, a major performance related limitation of the distance threshold technique is highlighted. We then propose an alternative time-space threshold criterion. The time-space threshold is demonstrated, through simulation, to perform better for low curvature movement. However, it too has a limitation. Based on this, we further propose a novel hybrid scheme. Through simulation and live trials, this scheme is shown to perform well across a range of curvature values, and places bounds on both the spatial and absolute inconsistency arising from dead reckoning.
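One plausible reading of a time-space criterion is an update trigger on accumulated spatial error over time, which bounds absolute inconsistency even when the error grows slowly. The sketch below is an assumption about the mechanism, not the paper's exact formulation; class and attribute names are illustrative:

```python
import math

class TimeSpaceThreshold:
    """Trigger an update when the integral of spatial error over time since
    the last update exceeds a budget (assumed mechanism, units of
    distance * time, e.g. metre-seconds)."""
    def __init__(self, budget):
        self.budget = budget
        self.accumulated = 0.0

    def step(self, actual_pos, predicted_pos, dt):
        error = math.dist(actual_pos, predicted_pos)
        self.accumulated += error * dt
        if self.accumulated > self.budget:
            self.accumulated = 0.0   # sending an update resets the integral
            return True              # send update now
        return False
```

Under this model even a small persistent error eventually forces an update, which is how the criterion caps absolute inconsistency where a pure distance threshold would not.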
ACM Crossroads Student Magazine | 2006
Damien Marshall; Tomas E. Ward; Seamus McLoone
It is the calm before the storm. By the end of 2006, Sony, Microsoft, and Nintendo will have released their new wave of gaming hardware, and the next round in the great video game battle will have begun. Capable of displaying photo-realistic images, and acting as the center of your entertainment lifestyle, these machines promise to change the face of gaming. Console games have moved away from the single-screen experiences of old to multimillion dollar epics, featuring hours and hours of cinematic action. Truly, it is an exciting time to be a gamer. While visual and audio technology advances toward real-world fidelity, human computer interaction (HCI), the methods by which users control the simulation, has not received the same degree of attention. But it now seems this aspect of the sense-of-presence problem may undergo a revolution similar to that of its audiovisual counterparts with the next generation of gaming devices. In this article, we discuss the driving forces behind these changes, several of the new devices, and what current research suggests the future may hold for today's gamer.
ACM Transactions on Multimedia Computing, Communications, and Applications | 2010
Damien Marshall; Seamus McLoone; Tomas E. Ward
A key factor determining the success of a Distributed Interactive Application (DIA) is the maintenance of a consistent shared virtual world. To help maintain consistency, a number of Information Management techniques have been developed. However, unless carefully tuned to the underlying network, they can negatively impact consistency. This work presents a novel adaptive algorithm for optimizing consistency by maximizing available bandwidth usage in DIAs. The algorithm operates by estimating bandwidth from trends in network latency and modifying data transmission rates to match the estimated value. The results presented demonstrate that this approach can help optimize consistency levels in a DIA.
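The idea of adapting the transmission rate to latency trends can be sketched as a simple controller: rising latency is read as queueing at a saturated link, so the sender backs off, while stable latency lets the rate creep back up. This is a simplified additive-increase/multiplicative-decrease sketch under assumed constants, not the paper's algorithm:

```python
class AdaptiveSendRate:
    """Adjust the entity-update rate from the trend in measured latency.
    A sharp rise is treated as congestion (back off multiplicatively);
    stable or improving latency earns an additive increase. All constants
    are illustrative assumptions."""
    def __init__(self, rate_hz=20.0, min_hz=1.0, max_hz=60.0):
        self.rate = rate_hz
        self.min_hz, self.max_hz = min_hz, max_hz
        self.prev_latency = None

    def observe(self, latency_ms):
        if self.prev_latency is not None:
            if latency_ms > self.prev_latency * 1.1:   # latency trending up
                self.rate = max(self.min_hz, self.rate * 0.5)
            else:                                      # stable or improving
                self.rate = min(self.max_hz, self.rate + 1.0)
        self.prev_latency = latency_ms
        return self.rate
```

The appeal of latency as the feedback signal is that it is observable at the application layer without any cooperation from the network.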
IEEE International Symposium on Distributed Simulation and Real-Time Applications | 2004
Damien Marshall; Declan Delaney; Seamus McLoone; Tomas E. Ward
As Distributed Interactive Applications (DIAs) become increasingly prominent in the video game industry, they must scale to accommodate progressively more users and maintain a globally consistent worldview. However, network constraints, such as bandwidth, limit the amount of communication allowed between users. Several methods of reducing network communication packets, while maintaining consistency, exist. These include dead reckoning and the hybrid strategy-based modelling approach. This latter method combines a short-term model such as dead reckoning with a long-term strategy model of user behaviour. By employing the strategy that most closely represents user behaviour, a reduction in the number of network packets that must be transmitted to maintain consistency has been shown. In this paper, a novel method for constructing multiple long-term strategies using dead reckoning and polygons is described. Furthermore, the algorithms are implemented in an industry-proven game engine known as Torque. A series of experiments is executed to investigate the effects of varying the spatial density of strategy models on the number of packets that need to be transmitted to maintain the global consistency of the DIA. The results show that increasing the spatial density of strategy models allows a higher consistency to be achieved with fewer packets using the hybrid strategy-based model than with pure dead reckoning. In some cases, the hybrid strategy-based model completely replaces dead reckoning as a means of communicating updates.
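The hybrid strategy-based idea of preferring a long-term behavioural model over per-update extrapolation can be sketched as model selection over candidate paths. Here the polygon strategies are abstracted as waypoint lists and the fit measure is a simple index-wise mean distance; both are illustrative assumptions, not the paper's method:

```python
import math

def path_error(observed, strategy):
    """Mean distance between observed positions and a strategy's
    waypoints (paired index-wise for simplicity)."""
    return sum(math.dist(o, s) for o, s in zip(observed, strategy)) / len(observed)

def choose_model(observed, strategies, max_error):
    """Pick the long-term strategy that best explains recent movement;
    fall back to short-term dead reckoning if none fits well enough."""
    best = min(strategies, key=lambda s: path_error(observed, s))
    if path_error(observed, best) <= max_error:
        return ("strategy", best)      # remote side simulates the strategy locally
    return ("dead-reckoning", None)    # revert to per-update extrapolation
```

When a strategy fits, only its identifier need cross the network rather than a stream of state updates, which is where the packet reduction comes from.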
IEEE Virtual Reality Conference | 2006
Damien Marshall; Dave Roberts; Declan Delaney; Seamus McLoone; Tomas E. Ward
Collaboration and competition are important aspects of Networked Virtual Environments (NVEs). Both require a certain level of consistency in order for the interaction to be fruitful and compelling. However, finite network bandwidth and communication delay are key factors affecting this aspect of interactivity. A popular method for mitigating their effects on dynamic entities is the IEEE DIS standard dead reckoning mechanism [1].
IEEE International Symposium on Distributed Simulation and Real-Time Applications | 2006
Damien Marshall; Seamus McLoone; David J. Roberts; Declan Delaney; Tomas E. Ward
Dead reckoning is widely employed as an entity update packet reduction technique in distributed interactive applications (DIAs). Such techniques reduce network bandwidth consumption and thus limit the effects of network latency on the consistency of networked simulations. A key component of the dead reckoning method is the underlying error threshold metric, as this directly determines when an entity update packet is to be sent between local and remote users. The most common metric is the spatial threshold, which is simply based on the distance between a local user's actual position and their predicted position. Other recently proposed metrics include the time-space threshold and the hybrid threshold, both of which are summarised within. This paper investigates the issue of user movement in relation to dead reckoning and each of the threshold metrics. In particular, the relationship between the curvature of movement, the various threshold metrics and absolute consistency is studied. Experimental live trials across the Internet allow a comparative analysis of how users behave when different threshold metrics are used with varying degrees of curvature. The presented results provide justification for the use of a hybrid threshold approach when dead reckoning is employed in DIAs.
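A hybrid threshold can be read as an OR of the two criteria: update when the instantaneous spatial error breaches the spatial threshold, or when the error-time integral breaches a time-space budget. The combination rule and all names below are an assumed sketch, not the paper's definition:

```python
import math

class HybridThreshold:
    """Send an update when either the instantaneous spatial error exceeds
    the spatial threshold, or the integral of error over time exceeds the
    time-space budget (assumed combination of the two criteria)."""
    def __init__(self, spatial, budget):
        self.spatial = spatial      # distance units, e.g. metres
        self.budget = budget        # distance * time, e.g. metre-seconds
        self.accumulated = 0.0

    def step(self, actual_pos, predicted_pos, dt):
        error = math.dist(actual_pos, predicted_pos)
        self.accumulated += error * dt
        if error > self.spatial or self.accumulated > self.budget:
            self.accumulated = 0.0   # an update resets the integral
            return True
        return False
```

The spatial term bounds the worst-case momentary divergence while the budget term bounds how long any divergence can persist, which matches the bounds on spatial and absolute inconsistency described above.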
IEEE International Symposium on Distributed Simulation and Real-Time Applications | 2006
Damien Marshall; Seamus McLoone; Declan Delaney; Tomas E. Ward
Collaboration within a distributed interactive application (DIA) requires that a high level of consistency be maintained between remote hosts. However, this can require large amounts of network resources, which can negatively affect the scalability of the application and also increase network latency. Predictive models, such as dead reckoning, provide a sufficient level of consistency whilst reducing network requirements. Dead reckoning traditionally operates with a spatial error threshold metric. In previous work, it was shown how the use of the spatial threshold could result in potentially unbounded local absolute inconsistency. To remedy this, a novel time-space threshold was proposed that placed bounds on local absolute inconsistency. However, use of the time-space threshold could result in unacceptably large spatial inconsistency. A hybrid approach that combined both error threshold measures was then shown to place bounds on both levels of inconsistency. However, choosing suitable threshold values for use within the hybrid scheme has been problematic, as no direct comparisons can be made between the two threshold metrics. In this paper, a novel comparison scheme is proposed. Under this approach, an error threshold look-up table is generated, based on entity speed and equivalent inconsistency measures. Using this look-up table, it is shown how the performance of comparable thresholds is equal on average, from the point of view of network packet generation. These error thresholds are then employed in a hybrid threshold scheme, which is shown to improve overall consistency in comparison to the previous solution of simply using numerically equal threshold values.
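To see why the two metrics are hard to compare directly, note their units differ: a spatial threshold is a distance, a time-space budget is a distance-time product. A look-up table can bridge them under a growth model. The conversion below assumes the spatial error grows linearly at the entity's speed, so threshold s is breached at t = s/v, by which time the error-time integral is s²/(2v); this model and both function names are illustrative assumptions, not the paper's derivation:

```python
def equivalent_budget(spatial_threshold, speed):
    """Illustrative conversion: with error growing linearly at rate `speed`,
    the spatial threshold s is breached at t = s/speed, and the error-time
    integral at that instant is s**2 / (2*speed). Using that value as the
    time-space budget gives, under this model, the same average update rate."""
    return spatial_threshold ** 2 / (2.0 * speed)

def build_lookup(spatial_threshold, speeds):
    """Tabulate equivalent time-space budgets across a range of entity speeds."""
    return {v: equivalent_budget(spatial_threshold, v) for v in speeds}
```

Note the equivalent budget shrinks as speed grows: fast entities reach the spatial limit sooner, so they accumulate less error-time before an update fires.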
irish signals and systems conference | 2004
Damien Marshall; Aaron McCoy; Declan Delaney; Seamus McLoone; Tomas E. Ward
China-Ireland International Conference on Information and Communications Technologies (CIICT 2007) | 2007
Damien Marshall; Brian Mooney; Seamus McLoone; Tomas E. Ward