Nazri Kama
Universiti Teknologi Malaysia
Publications
Featured research published by Nazri Kama.
Australian Software Engineering Conference | 2013
Mehran Halimi Asl; Nazri Kama
Modern software development is iterative and encourages frequent interaction between the software development team and the stakeholders of the software. These interactions generate change requests as the requirements gradually evolve to meet the stakeholders' expectations or due to sudden shifts in circumstances. During development, however, software developers have to assess the impact of these change requests against the incomplete status of software artefacts. This raises an important question when considering a change request: how to estimate the size of the change impact required by a requirement change, given inconsistent states of artefacts across the project. This paper therefore introduces a new Change Impact Size Estimation (CISE) approach for the software development phase. In addition, a prototype tool has been developed to support the implementation and evaluation of the approach. A case study evaluation corroborated the functionality and accuracy of the approach for estimating change impact size.
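A minimal sketch of the general idea, not the authors' CISE algorithm: an impacted class contributes to the estimated change impact size in proportion to how much of it has already been developed, so partially developed classes weigh less than completed ones. The class names, sizes, and development-progress values below are hypothetical.

# Illustrative sketch only: weight each impacted class by its development progress,
# so partially developed classes contribute less to the estimated change impact size.
def estimate_change_impact_size(impacted_classes):
    """impacted_classes: list of (name, estimated_class_size, dev_progress in [0, 1])."""
    total = 0.0
    for name, size, progress in impacted_classes:
        total += size * progress  # only the developed portion is affected by the change
    return total

# Hypothetical example: three classes touched by a change request.
impacted = [
    ("LoginController", 120, 1.0),   # fully developed
    ("SessionManager",   80, 0.5),   # partially developed
    ("AuditLogger",      60, 0.0),   # not yet developed
]
print(estimate_change_impact_size(impacted))  # 160.0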
Asia-Pacific Software Engineering Conference | 2012
Nazri Kama; Faizul Azli
Software undergoes changes at all stages of the software development process. Accepting too many changes causes expense and delay, while rejecting changes may cause customer dissatisfaction. One of the inputs that helps software project management decide whether to accept or reject a change is a reliable prediction of its impact. Change impact analysis is one of the methods that can provide this predictive information. Many current impact analysis approaches have been developed for the software maintenance phase. These approaches assume that all classes in the class artifact are completely developed, and the class artifact is used as the source of analysis since it represents the final user requirements. However, these assumptions are not practical for impact analysis in the software development phase, where some classes in the class artifact are still under development or only partially developed, which leads to inaccurate results. This paper presents a novel impact analysis approach for the software development phase. The significant achievements of the approach are demonstrated through an extensive experimental validation using several case studies. The experimental analysis shows an improvement in accuracy over current impact analysis results.
IET Software | 2016
Sufyan Basri; Nazri Kama; Saiful Adli; Faizura Haneem
Change effort estimation is needed not only in the software maintenance phase but also in the software development phase. Many techniques have been developed to estimate the effort required for a particular change request, and impact analysis is one of them. One main challenge of this technique from a software development perspective is the existence of inconsistent states of software artefacts, i.e. some classes are completely developed while others are only partially developed. This research therefore proposes a new change effort estimation model that overcomes this challenge using a combination of static and dynamic analysis techniques. The results of this research are two-fold: (i) a new change effort estimation model using static and dynamic analysis techniques for the software development phase; and (ii) a demonstration of the significant achievements of the approach via extensive experimental validation using several case studies.
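As an illustration of combining static and dynamic analysis (a sketch under assumed inputs, not the model published in the paper): a static dependency graph yields a candidate impact set by reachability from the changed class, a dynamic execution trace filters that set to classes actually exercised at run time, and the filtered set is converted into an effort figure using an assumed hours-per-class rate.

# Sketch: static reachability over a class-dependency graph, filtered by a dynamic trace.
from collections import deque

def static_impact_set(dependency_graph, changed_class):
    """dependency_graph: dict mapping a class to the classes that depend on it."""
    seen, queue = {changed_class}, deque([changed_class])
    while queue:
        current = queue.popleft()
        for dependent in dependency_graph.get(current, []):
            if dependent not in seen:
                seen.add(dependent)
                queue.append(dependent)
    return seen

def estimated_effort(impact_set, dynamic_trace, hours_per_class=4.0):
    # Keep only classes observed in the dynamic execution trace (hypothetical weighting).
    confirmed = impact_set & set(dynamic_trace)
    return len(confirmed) * hours_per_class

graph = {"Order": ["Invoice", "Shipping"], "Invoice": ["Report"]}  # hypothetical dependencies
impacted = static_impact_set(graph, "Order")  # {'Order', 'Invoice', 'Shipping', 'Report'}
print(estimated_effort(impacted, ["Order", "Invoice", "Report"]))  # 12.0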
Symposium on Information and Communication Technology | 2016
Sufyan Basri; Nazri Kama; Faizura Haneem; Saiful Adli Ismail
In any software development life cycle, requirement and software changes are inevitable. One of the factors that influences the effectiveness of the change acceptance decision is the accuracy of the effort prediction for requirement changes. Two classes of models are widely used to predict rework effort for requirement changes: algorithmic and non-algorithmic models. The algorithmic model is known for its formal and structured way of prediction and is best suited to the Traditional software development methodology, while the non-algorithmic model is widely adopted in Agile software projects because it is easier to apply and requires less work for effort prediction. Nevertheless, none of the existing effort prediction models for requirement changes has been shown to suit both the Traditional and the Agile software development methodology. This paper therefore proposes an algorithmic effort prediction model for requirement changes that uses a change impact analysis method and is applicable to both Traditional and Agile software development methodologies. The proposed model uses a selected current change impact analysis method for the software development phase. It is evaluated through an extensive experimental validation using case studies of six real Traditional and Agile software projects. The evaluation results confirm a significant accuracy improvement of the proposed model over existing approaches for both Traditional and Agile methodologies.
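For readers unfamiliar with the distinction, an algorithmic model computes effort from a formula over measurable size and cost drivers, whereas a non-algorithmic model relies on expert judgement or analogy. The snippet below is a generic COCOMO-style illustration of the algorithmic style only; it is not the model proposed in the paper, and the constants are placeholders.

# Generic COCOMO-style algorithmic estimate: effort grows as a power of size,
# scaled by a product of effort multipliers. Constants here are illustrative only.
def algorithmic_effort(size_kloc, exponent=1.05, coefficient=2.8, multipliers=(1.0,)):
    effort = coefficient * (size_kloc ** exponent)
    for m in multipliers:
        effort *= m
    return effort  # person-months

print(round(algorithmic_effort(10, multipliers=(1.1, 0.9)), 1))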
International Conference on Software and Computer Applications | 2018
Jalal Shah; Nazri Kama
Software Change Effort Estimation (SCEE) is required not only in the software maintenance phase but also in the software development phase. Many methods have been developed to estimate the effort for software requirement changes. Function Point Analysis (FPA) is one of these methods and is used to estimate the effort for software requirement changes during the software maintenance phase, when software artifacts are in a consistent state. One main challenge for this method from a software development perspective is the presence of inconsistent states of software artifacts, i.e. some classes are completely developed, some are partially developed, and some are not developed at all. Hence, this research proposes a new SCEE model that overcomes this challenge by combining the Change Impact Analysis (CIA) technique with the Function Point Analysis (FPA) method. The new model, Function Point Analysis for Software Development Phase (FPA-SDP), can help software project managers to: (1) predict the impact of change requests on software artifacts and (2) understand the actual conditions of inconsistent states of software artifacts during the software development phase.
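Standard Function Point Analysis counts five function types (external inputs, external outputs, external inquiries, internal logical files, external interface files), weights each count by complexity, and sums the result into unadjusted function points. The sketch below shows that textbook calculation; it is not the FPA-SDP model itself, and the counts are hypothetical.

# Textbook unadjusted function point (UFP) calculation with the standard IFPUG weights.
WEIGHTS = {  # function type -> (low, average, high) complexity weights
    "EI":  (3, 4, 6),    # external inputs
    "EO":  (4, 5, 7),    # external outputs
    "EQ":  (3, 4, 6),    # external inquiries
    "ILF": (7, 10, 15),  # internal logical files
    "EIF": (5, 7, 10),   # external interface files
}
COMPLEXITY = {"low": 0, "average": 1, "high": 2}

def unadjusted_function_points(counts):
    """counts: list of (function_type, complexity, number_of_functions)."""
    return sum(WEIGHTS[ftype][COMPLEXITY[level]] * n for ftype, level, n in counts)

# Hypothetical change request touching a handful of functions.
counts = [("EI", "average", 3), ("EO", "high", 1), ("ILF", "low", 2)]
print(unadjusted_function_points(counts))  # 4*3 + 7*1 + 7*2 = 33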
Proceedings of the 7th International Conference on Software and Information Engineering | 2018
Jalal Shah; Nazri Kama; Saiful Adli Ismail
It is important to know the actual size and complexity of software before predicting the effort required to implement it. The two most common methods used for software size estimation are: (i) Source Lines of Code (SLOC) and (ii) Function Point Analysis (FPA). Estimating the size of software with the SLOC method is only possible once coding is complete, whereas estimating software size with the FPA method is possible in the early phases of the Software Development Life Cycle (SDLC). However, one main challenge from the viewpoint of the software development phase is the presence of inconsistent states of software artifacts, i.e. some classes are completely developed, some are partially developed, and some are not developed yet. Therefore, this research applies the newly developed Function Point Analysis for Software Development Phase (FPA-SDP) model in an empirical study to overcome this challenge. The results of the FPA-SDP model can help software project managers to: (i) know the inconsistent states of software artifacts and (ii) estimate the actual size of a change request, together with its complexity level, for the software development phase.
International Conference on Research and Innovation in Information Systems | 2017
Faizura Haneem; Rosmah Ali; Nazri Kama; Sufyan Basri
The management of datasets scattered across multiple data sources has led to data quality issues in organizations. Master Data Management (MDM) has been used to resolve this issue by providing "a single reference of truth" to reduce data redundancy in an organization. To the best of our knowledge, there is a lack of studies reviewing the progress of MDM research. This paper therefore intends to fill that gap by conducting a systematic literature review to summarize the progress of the MDM research domain. We also synthesize the data quality issues arising in the management of multiple data sources and how MDM aims to resolve them. Our literature search strategy used relevant keyword searches across nine (9) databases covering journals, proceedings, books, book chapters, and industry research. The initial search returned seven hundred and seventy-seven (777) articles, of which three hundred and forty-seven (347) relevant articles were selected for analysis. The review shows that MDM research is currently on a slope of enlightenment and hence remains relevant to explore. MDM is not just a technology; it is an approach that combines processes, data governance, and technical implementation to resolve data quality issues in multi-source data management, such as duplication, inaccuracy, and inconsistency of information.
International Conference on Research and Innovation in Information Systems | 2017
Faizura Haneem; Rosmah Ali; Nazri Kama; Sufyan Basri
A Systematic Literature Review (SLR) is a structured way of reviewing existing research produced by earlier researchers. Applying the right data analysis technique during the SLR evaluation stage gives the researcher insight towards achieving the SLR objective. This paper presents how descriptive analysis and text analysis can be applied to achieve one of the common SLR objectives, which is to study the progress of a specific research domain. These techniques are demonstrated by synthesizing the progress of the Master Data Management research domain. Using the descriptive analysis technique, this study identified the distribution of related works by year, source, and publication type. Meanwhile, text analysis shows the common terms and topics of interest in Master Data Management research, which are 1) master data, 2) data quality, 3) business intelligence, 4) business process, 5) data integration, 6) big data, 7) data governance, 8) information governance, 9) data management and 10) product data. It is hoped that other researchers will be able to replicate these analysis techniques when performing SLRs in other research domains.
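A minimal sketch of the text analysis step described here: tokenize the collected titles or abstracts, drop stop words, and rank the most frequent terms to surface common topics. The sample strings and the stop-word list are placeholders, not the study's dataset.

# Simple term-frequency text analysis over a set of abstracts (placeholder data).
import re
from collections import Counter

STOP_WORDS = {"the", "of", "and", "a", "in", "for", "on", "to", "is"}

def top_terms(documents, k=5):
    tokens = []
    for doc in documents:
        tokens += [w for w in re.findall(r"[a-z]+", doc.lower()) if w not in STOP_WORDS]
    return Counter(tokens).most_common(k)

abstracts = [
    "Master data management and data quality in the enterprise",
    "Data governance for master data and big data integration",
]
print(top_terms(abstracts))  # most frequent terms first, e.g. ('data', 5), ('master', 2), ...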
4th Asia-Pacific Requirements Engineering Symposium, APRES 2017 | 2017
Jalal Shah; Nazri Kama
Software goes through changes at all stages of the Software Development Life Cycle (SDLC). Accepting a large number of changes may increase the time and cost of the software, while rejecting changes may raise customer dissatisfaction. An effective change acceptance process helps the software project manager decide whether to accept or reject these changes, and software effort estimation is one of the methods that supports an efficient change acceptance decision. Several software effort estimation techniques have been introduced to date, and Function Point Analysis (FPA) is one of them. The FPA method measures the size and complexity of software by quantifying the functionality that the system provides to its users. Many studies have highlighted that the FPA method is applied in the early phases of the SDLC rather than during the software development phase. During the software development phase, software artifacts are in inconsistent states, so it is challenging for the software project manager to estimate the effort required for a change request. In this paper we apply the FPA method in a case study of requirement changes during the software development phase and highlight the main concerns of using FPA for such changes.
IEEE International Colloquium on Information Science and Technology | 2016
Nur Azaliah A. Bakar; S. Harihodin; Nazri Kama
Enterprise Architecture (EA) is a vital element for an organisation to ensure the viability of its functions. However, many organisations face problems in EA implementation due to the complexity of EA frameworks, the rigidity of business functions, and chaotic IT structures. Various suggestions for better EA implementation in previous studies have yet to be verified by EA practitioners in real-case scenarios. This paper therefore aims to measure the influential factors in the EA implementation process from both expert and practitioner perspectives. The EA implementation model was formulated from 27 factors in six categories (IP-Internal Process, LG-Learning and Growth, AS-Authority Support, CS-Cost, TC-Technology and TM-Talent Management), gathered from previous studies and case studies of Malaysian Public Sector organisations. To measure this model, a survey questionnaire was administered to selected EA experts and practitioners with the intention of identifying any differences between the theoretical and practical aspects of EA implementation. The findings reveal no significant difference in the level of agreement between the EA experts and practitioners except for three factors: IP6-Rules and Process, AS5-Political Influence and CS1-Financial Resources. Hence, it can be concluded that both experts and practitioners share the same opinion on the factors that influence the EA implementation process.
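The abstract does not state which statistical test was used; as a generic illustration of comparing agreement levels between two respondent groups on a single factor, a non-parametric test such as the Mann-Whitney U test can be applied to Likert-scale ratings. The ratings below are made up.

# Illustrative group comparison on one factor using a Mann-Whitney U test
# (the actual test and data used in the study are not specified here).
from scipy.stats import mannwhitneyu

experts =       [4, 5, 4, 3, 5, 4]   # hypothetical Likert ratings for one factor
practitioners = [3, 3, 4, 2, 3, 4]

stat, p_value = mannwhitneyu(experts, practitioners, alternative="two-sided")
print(f"U={stat}, p={p_value:.3f}")  # p < 0.05 would suggest a significant difference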