J.J. Collins
University of Limerick
Publications
Featured research published by J.J. Collins.
international workshop on principles of software evolution | 2011
Anne Meade; Jim Buckley; J.J. Collins
Large legacy systems that have been in use for several decades need to evolve in order to take advantage of new technological advances. One such development is the emergence of multi-core processors and parallel platforms. However, the evolution of code written for single-core platforms into code that can take advantage of multi-core technology is challenging. The aim of this research is to explore the challenges that parallel programmers face in the evolution of existing software to exploit multi-core and parallel architectures. A review of the current literature was conducted and ten frequently reported challenges were identified. Raising awareness of the potential challenges that practitioners may face when evolving sequential code to exploit multi-core platforms leaves them better prepared for future evolution. The research community can use these results to develop a research agenda for designing and developing solutions to address these challenges.
international conference on machine learning and applications | 2009
Haoming Xu; J.J. Collins
Localization is the accurate estimation of a robot's current position and is critical for map building. Odometry modeling is one of the main approaches to solving the localization problem, the other being a sensor-based correspondence solver. Currently, few robot positioning systems support calibration of odometry errors in both feature-rich indoor and landmark-poor outdoor environments. To achieve good performance in various environments, the mobile robot has to be able to learn to localize in unknown environments and to reuse previously computed environment-specific localization models. This paper presents a method combining the standard back-propagation technique and a feed-forward neural network model for odometry calibration for both synchronous and differential drive mobile robots. This novel method is compared with a generic localization module and an optimization-based approach, and is found to minimize odometry error because of its nonlinear input-output mapping ability. Experimental results demonstrate that the neural network approach incorporating Bayesian Regularization provides improved performance and relaxes constraints in the UMBmark method.
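A minimal sketch of the kind of calibration network described above, assuming synthetic training data and a one-hidden-layer architecture (the paper's actual network, dataset, and Bayesian Regularization variant are not reproduced here):

# Minimal sketch (not the paper's implementation): a one-hidden-layer
# feed-forward network trained with plain back-propagation to learn a
# correction from raw odometry increments to ground-truth increments.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training data: raw odometry increments (dx, dy, dtheta)
# and the corresponding ground-truth increments measured externally.
raw = rng.uniform(-1.0, 1.0, size=(200, 3))
truth = raw @ np.array([[0.97, 0.00, 0.02],      # simulated systematic errors
                        [0.00, 1.03, -0.01],
                        [0.01, 0.00, 0.95]]) + 0.01 * rng.standard_normal((200, 3))

# Network parameters: 3 inputs -> 8 hidden units (tanh) -> 3 outputs (linear).
W1 = 0.1 * rng.standard_normal((3, 8)); b1 = np.zeros(8)
W2 = 0.1 * rng.standard_normal((8, 3)); b2 = np.zeros(3)
lr = 0.05

for epoch in range(2000):
    h = np.tanh(raw @ W1 + b1)           # forward pass
    pred = h @ W2 + b2
    err = pred - truth                   # mean-squared-error gradient
    dW2 = h.T @ err / len(raw); db2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1.0 - h ** 2)   # back-propagate through tanh
    dW1 = raw.T @ dh / len(raw); db1 = dh.mean(axis=0)
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print("final MSE:", float(np.mean((np.tanh(raw @ W1 + b1) @ W2 + b2 - truth) ** 2)))

In practice the ground-truth increments would come from an external reference (for example, the benchmark runs used to measure odometry error) rather than a synthetic error model.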
european conference on software process improvement | 2010
Murat Yilmaz; Rory V. O'Connor; J.J. Collins
We introduce the novel concept of applying economic mechanism design to the software development process, and aim to find ways to adjust the incentives and disincentives of the software organization to align them with the motivations of the participants, in order to maximize the delivered value of a software project. We envision a set of principles for designing processes that allow people to be self-motivated while constantly working toward project goals. The resulting economic mechanism will rely on game-theoretic principles (i.e., Stackelberg games) for leveraging the incentives, goals and motivation of the participants in the service of project and organizational goals.
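The abstract only outlines the mechanism-design idea; as an illustration of a Stackelberg (leader-follower) interaction, and not the authors' actual model, the toy sketch below has an organization commit to an incentive level, a developer best-respond with an effort level, and the organization choose its incentive by anticipating that response (all payoff functions and coefficients are hypothetical):

# Toy illustration (not the authors' mechanism): a leader (organization)
# commits to an incentive level; a follower (developer) then picks the
# effort that maximizes their own payoff; the leader anticipates this
# best response when choosing the incentive.
import numpy as np

incentives = np.linspace(0.0, 1.0, 101)   # leader's strategy space
efforts = np.linspace(0.0, 1.0, 101)      # follower's strategy space

def follower_payoff(effort, incentive):
    # Developer values the incentive in proportion to effort, minus a
    # convex cost of effort (coefficients are made up for illustration).
    return incentive * effort - 0.6 * effort ** 2

def leader_payoff(effort, incentive):
    # Organization values delivered effort, minus the cost of the incentive.
    return 1.0 * effort - 0.5 * incentive

best = None
for s in incentives:
    # Follower's best response to incentive s, found by enumeration.
    e_star = efforts[np.argmax([follower_payoff(e, s) for e in efforts])]
    u = leader_payoff(e_star, s)
    if best is None or u > best[0]:
        best = (u, s, e_star)

print("leader payoff %.3f at incentive %.2f, induced effort %.2f" % best)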
international symposium on empirical software engineering | 2005
A. Le Gear; Jim Buckley; J.J. Collins; K. O'Dea
Software reflexion modelling is a useful technique for assisting the understanding of large software systems. However, the technique relies heavily upon available documentation and domain knowledge to begin the process. We propose a technique called software reconnexion that uses a reuse perspective of software, containing core elements of the subject system, to prompt the user during the early iterations of the reflexion modelling process, thus reducing the technique's dependency upon documentation and domain knowledge. We provide a large, ecologically valid case study to demonstrate our technique and show, in the absence of documentation and with only limited domain knowledge, how an automatically generated reuse perspective of software can be effectively used in conjunction with reflexion modelling to aid the design recovery and comprehension of an unfamiliar system.
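Reflexion modelling's comparison step is not spelled out in the abstract; the sketch below, using entirely hypothetical module names, mappings, and dependencies, shows the core idea: lifting extracted source-level dependencies to a high-level model and classifying each edge as a convergence, divergence, or absence:

# Minimal sketch of the reflexion-model comparison step (hypothetical
# names; not the tool described in the paper).

# Hypothesized high-level model: allowed module-to-module dependencies.
hypothesized = {("UI", "Core"), ("Core", "Storage")}

# Mapping from source-level entities (e.g. files or classes) to modules.
mapping = {"MainWindow": "UI", "Scheduler": "Core",
           "Database": "Storage", "Logger": "Core"}

# Dependencies extracted from the source code (caller -> callee).
extracted = {("MainWindow", "Scheduler"),   # UI -> Core
             ("Scheduler", "Database"),     # Core -> Storage
             ("Database", "Logger")}        # Storage -> Core (unexpected)

# Lift source dependencies to the module level, dropping intra-module edges.
lifted = {(mapping[a], mapping[b]) for a, b in extracted if mapping[a] != mapping[b]}

convergences = hypothesized & lifted        # predicted and found
divergences = lifted - hypothesized         # found but not predicted
absences = hypothesized - lifted            # predicted but not found

print("convergences:", convergences)
print("divergences: ", divergences)
print("absences:    ", absences)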
international conference on machine learning and applications | 2013
Salaheddin Alakkari; J.J. Collins
A study is presented on face detection using Principal Component Analysis as a paradigm for generating a compact representation of the human face. The study focuses on the contribution of individual eigenfaces in the face-space to classification, in order to extract a minimum encoding for very low resolution images. The fourth, sixth, and seventh eigenfaces are identified as being particularly critical for classification, with the lowest-order eigenface having a significant discriminatory contribution.
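As a hedged illustration of the eigenface encoding discussed above (synthetic data; the study's dataset, resolution, and classifier are not reproduced), the sketch below computes eigenfaces via SVD and projects a probe face onto a chosen subset of components, e.g. the fourth, sixth, and seventh:

# Minimal eigenface sketch (synthetic data, not the study's dataset):
# compute principal components of a set of face vectors and project a
# probe image onto a selected subset of eigenfaces.
import numpy as np

rng = np.random.default_rng(1)
n_faces, h, w = 40, 16, 16                 # very low resolution face images
faces = rng.random((n_faces, h * w))       # rows are flattened face images

mean_face = faces.mean(axis=0)
centred = faces - mean_face

# Eigenfaces are the right singular vectors of the centred data matrix.
U, S, Vt = np.linalg.svd(centred, full_matrices=False)
eigenfaces = Vt                            # shape: (n_components, h*w)

# Project a probe face onto selected eigenfaces: components 3, 5, 6 in
# zero-based indexing, i.e. the 4th, 6th, and 7th eigenfaces.
selected = [3, 5, 6]
probe = faces[0]
weights = eigenfaces[selected] @ (probe - mean_face)
reconstruction = mean_face + weights @ eigenfaces[selected]

print("weights on selected eigenfaces:", np.round(weights, 3))
print("reconstruction error:", float(np.linalg.norm(probe - reconstruction)))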
high performance computing and communications | 2013
Anne Meade; Deva Kumar Deeptimahanti; Michael Johnston; Jim Buckley; J.J. Collins
Parallelizing serial software systems so that they can run in a High Performance Computing (HPC) environment presents many challenges to developers. In particular, the extant literature suggests the task of decomposing large-scale data applications is particularly complex and time-consuming. In order to take stock of the state of practice of data decomposition in HPC, we conducted a two-phase study. First, using focus group methodology, we conducted an exploratory study at a software laboratory with an established track record in HPC. Based on the findings of this first phase, we designed a survey to assess the state of practice among experts in this field around the world. Our study shows that approximately 75% of parallelized applications use some form of data decomposition. Furthermore, data decomposition was found to be the most challenging phase in the parallelization process, consuming approximately 40% of the total time. A key finding of our study is that experts do not use any of the available tools and formal representations, and in fact are not aware of them. We discuss why existing tools have not been adopted in industry and, based on our findings, provide a number of recommendations for future tool support.
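As a simple, hypothetical illustration of what data decomposition involves (not a tool or technique surveyed in the study), the sketch below computes a 1-D block decomposition of a global index range across a number of processes, the kind of index bookkeeping that is otherwise done by hand:

# Hypothetical illustration of 1-D block data decomposition: split a
# global index range [0, n) as evenly as possible across `ranks`
# processes and report each rank's local slice.
def block_decompose(n, ranks):
    base, extra = divmod(n, ranks)
    bounds = []
    start = 0
    for r in range(ranks):
        size = base + (1 if r < extra else 0)   # spread the remainder
        bounds.append((r, start, start + size))
        start += size
    return bounds

for rank, lo, hi in block_decompose(n=1000, ranks=6):
    print(f"rank {rank}: rows [{lo}, {hi}) -> {hi - lo} rows")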
IFIP World Computer Congress, TC 2 | 2004
Seamus Galvin; J.J. Collins; Chris Exton; Finbar McGurren
One of the key reasons why ADLs have yet to be adopted commercially on a large scale is shortcomings in their ability to describe adequate interface specifications. An interface specification that is vague, lacking in detail, too style-focused, or too language-specific results in an ADL description with a restricted scope of use. This paper demonstrates how an XML-based ADL (xADL 2.0) can be extended to model detailed, meaningful interface specifications, and these are used as part of a simple prototype to demonstrate how they form an integral part of an architectural description, paying particular attention to interface-level constraints. The approach is based on the principle that an ADL's interface modeling features should provide sufficient flexibility to allow them to reflect stakeholders' interface concerns at all stages in the lifecycle.
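xADL 2.0 itself is defined by XML schemas rather than executable code; purely as an illustration of the kind of interface description argued for above, the sketch below assembles an interface specification with an operation signature and an interface-level constraint. The element and attribute names here are hypothetical and do not follow the real xADL 2.0 schemas or the paper's extension:

# Illustrative only: build an XML fragment describing a component
# interface with an operation signature and an interface-level
# constraint. Element names are hypothetical, not real xADL 2.0.
import xml.etree.ElementTree as ET

iface = ET.Element("interface", id="IAccountService", direction="provided")

sig = ET.SubElement(iface, "signature", name="withdraw")
ET.SubElement(sig, "parameter", name="amount", type="Money")
ET.SubElement(sig, "returnType", type="Receipt")

constraint = ET.SubElement(iface, "constraint", kind="precondition")
constraint.text = "amount > 0 and amount <= accountBalance"

ET.indent(iface)                       # pretty-print (Python 3.9+)
print(ET.tostring(iface, encoding="unicode"))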
Journal of Systems and Software | 2017
Anne Meade; Deva Kumar Deeptimahanti; Jim Buckley; J.J. Collins
Highlights: Multi-core programming is becoming increasingly important. Data decomposition is a key challenge during parallelization for multi-core CPUs. We conduct a multi-method study to better understand data decomposition. We derive a set of ten key requirements for tools to support parallelization. State-of-the-art tooling does not satisfy these requirements. Context: Multi-core architectures are becoming increasingly ubiquitous and software professionals are seeking to leverage the capabilities of distributed-memory architectures. The process of parallelizing software applications can be very tedious and error-prone, in particular the task of data decomposition. Empirical studies investigating the complexity of data decomposition and communication are lacking. Objective: Our objective is threefold: (i) to gain an empirically based understanding of the task of data decomposition as part of the parallelization of software applications; (ii) to identify key requirements for tools to assist developers in this task; and (iii) to assess the current state of the art. Methods: Our empirical investigation employed a multi-method approach, using an interview study, a participant-observer case study, a focus group study, and a sample survey. The investigation involved collaborations with three industry partners: IBM's High Performance Computing Center, the Irish Centre for High-End Computing (ICHEC), and JBA Consulting. Results: This article presents data decomposition as one of the most prevalent tasks in parallelizing applications for multi-core architectures. Based on our studies, we identify ten key requirements for tool support to help HPC developers in this area. Our evaluation of the state of the art shows that none of the extant tools implements all ten requirements. Conclusion: While there is a considerable body of research in the area of HPC, few empirical studies exist that explicitly focus on the challenges faced by practitioners in this area; this research aims to address that gap. The empirical studies in this article provide insights that may help researchers and tool vendors to better understand the needs of parallel programmers.
Software Quality Assurance | 2016
Michael English; Jim Buckley; J.J. Collins
Modularity is at the core of software quality. It is an attribute that reflects the complexity of software systems and their ability to evolve. In previous metric-based research, modularity has been predominantly assessed at the class level, but this level seems inappropriate for the large-scale software systems of today due to information overload. More recently, work has begun to focus on the assessment of modularity at higher levels of abstraction for these types of software systems. In moving to assess such systems at the module rather than the class level, the first question that arises is how to define the nature of a module. In previous research, the concept of a module has many definitions, some of which are ambiguous. In this chapter we investigate whether metrics for higher-level abstractions can help to inform on the composition of High Level Modules (HLMs). Another interesting question is whether class-level modularity metrics in object-oriented systems reflect module-level modularity metrics; in other words, do relationships exist between metrics extracted at different levels of abstraction in systems? This chapter probes these two issues by reviewing the relevant literature and performing a preliminary empirical study that aims to identify candidate HLMs and assesses the ability of modularity metrics at that level to inform on modularity issues at lower levels of abstraction in the system. It proposes a simple metric-based characterization of HLMs and suggests that metric correlations, at different levels of abstraction, do exist.
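As a small illustration of the cross-level comparison the chapter investigates (hypothetical classes, modules, and dependencies, not the chapter's metric suite or dataset), the sketch below counts outgoing dependencies as a crude coupling metric at class level and aggregates it to candidate high-level modules:

# Hypothetical illustration: count outgoing dependencies (a crude
# coupling metric) at class level, then aggregate to candidate
# high-level modules (HLMs) to compare the two levels of abstraction.
from collections import defaultdict

class_to_module = {"Order": "sales", "Invoice": "sales",
                   "Customer": "crm", "Campaign": "crm"}

# Class-level dependency edges (caller -> callee).
class_deps = [("Order", "Invoice"), ("Order", "Customer"),
              ("Invoice", "Customer"), ("Campaign", "Customer")]

class_coupling = defaultdict(int)
module_coupling = defaultdict(int)
for src, dst in class_deps:
    class_coupling[src] += 1
    m_src, m_dst = class_to_module[src], class_to_module[dst]
    if m_src != m_dst:                      # only cross-module edges count
        module_coupling[m_src] += 1

print("class-level coupling: ", dict(class_coupling))
print("module-level coupling:", dict(module_coupling))

A study like the one described would then examine whether rankings or correlations of such metrics agree across the two levels of abstraction.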
international conference on control, automation, robotics and vision | 2010
Haoming Xu; J.J. Collins
This paper presents a novel scan matching approach, entitled Fuzzy Map Matching, for mobile robot localization that extracts low-level features in the form of line segments from perceptual channels, which are then matched to a map given a priori. Multiple candidate matches, which are iteratively refined, are supported through the use of fuzzy logic. This probability-based fuzzy model of scan matching is used to filter out alignment combinations that have low probability, in order to reduce computational complexity. In addition, the initial pose of the robot does not have to be known, as a result of the support for multiple hypotheses with respect to potential correspondences. Incomplete line segments that result from incomplete scans, noisy sensors, or occlusion do not present a problem, as features in observation space are grown during the correspondence phase of the algorithm. This approach does not impose a heavy demand on computational resources, and is significantly less resource-hungry than probabilistic approaches. Initial results demonstrate that the algorithm performs well in real-world environments.
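The paper's membership functions and refinement loop are not given in the abstract; as a hedged sketch with made-up segments and thresholds, the code below scores candidate correspondences between observed and map line segments with a simple fuzzy membership over orientation and midpoint-distance differences, discarding low-probability matches:

# Minimal sketch (hypothetical membership functions, not the paper's
# algorithm): fuzzily match observed line segments to map segments
# using orientation and midpoint-distance differences, and discard
# low-probability candidate correspondences.
import math

def triangular(x, width):
    """Triangular fuzzy membership: 1 at x = 0, falling to 0 beyond `width`."""
    return max(0.0, 1.0 - abs(x) / width)

# Segments are (midpoint_x, midpoint_y, orientation_radians).
map_segments = [(0.0, 2.0, 0.00), (3.0, 0.0, 1.57), (5.0, 5.0, 0.78)]
observed = [(0.2, 1.9, 0.05), (2.8, 0.3, 1.50)]

THRESHOLD = 0.4
for i, (ox, oy, oth) in enumerate(observed):
    for j, (mx, my, mth) in enumerate(map_segments):
        dist = math.hypot(ox - mx, oy - my)
        dth = abs(oth - mth)
        # Combine the two memberships with a fuzzy AND (minimum).
        score = min(triangular(dist, width=1.0), triangular(dth, width=0.5))
        if score >= THRESHOLD:              # prune low-probability matches
            print(f"observation {i} ~ map segment {j}: membership {score:.2f}")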