Publication


Featured research published by William R. Nichols.


International Conference on Software and System Process | 2014

Initial evaluation of data quality in a TSP software engineering project data repository

Yasutaka Shirai; William R. Nichols; Mark Kasunic

To meet critical business challenges, software development teams need data to effectively manage product quality, cost, and schedule. The Team Software Process (TSP) provides a framework that teams use to collect software process data in real time, using a defined disciplined process. This data holds promise for use in software engineering research. We combined data from 109 industrial projects into a database to support performance benchmarking and model development. But is the data of sufficient quality to draw conclusions? We applied various tests and techniques to identify data anomalies that affect the quality of the data in several dimensions. In this paper, we report some initial results of our analysis, describing the amount and the rates of identified anomalies and suspect data, including incorrectness, inconsistency, and credibility. To illustrate the types of data available for analysis, we provide three examples. The preliminary results of this empirical study suggest that some aspects of the data quality are good and the data are generally credible, but size data are often missing.


Empirical Software Engineering and Measurement | 2014

Classification of project team patterns for benchmarking

Yasutaka Shirai; William R. Nichols

Background: Empirical software engineering data supports both research and benchmarking. Our repository is a bottom-up collection of developers' daily tracking of effort, product size, defects, and development phase activity. We observed that the structuring of teams in the development cycle may not only depend upon project size, life cycle phase, and development strategy, but may also be a significant factor for segmenting performance data. Aim: We use this data to identify patterns of staffing and team organization over time during project execution, and we investigate the use of this characteristic for benchmarking. Method: We combined data from 89 industrial project development cycles into a database. Each cycle contains a team executing a plan within a development cycle. Using the project team name, developer identification, active dates, and development phases, we associated sub-projects into larger project groupings. From these groups we identified organizing patterns of size, number of teams, and skill sets that evolved longitudinally. Results: We identified six project patterns. Each pattern grouping includes between six and twelve reported projects, consisting of two or three collections of sub-project groups. We demonstrate that project pattern as an attribute is associated with some significant differences in project duration. Conclusions: The development efforts in our repository fit into a small number of patterns, each of which had characteristics presumably chosen by their teams to achieve their business goals; these patterns appear to differ qualitatively and quantitatively. As we develop our benchmarks, we will use the patterns to show how performance data may vary for different development project structures.


Archive | 2009

The Personal Software Process (PSP) Body of Knowledge, Version 2.0

Marsha Pomeroy-Huff; Robert Cannon; Timothy A. Chick; Julia Mullaney; William R. Nichols


Archive | 2010

Team Software Process (TSP) Body of Knowledge (BOK)

Watts S. Humphrey; Timothy A. Chick; William R. Nichols; Marsha Pomeroy-Huff


Archive | 2009

Deploying TSP on a National Scale: An Experience Report from Pilot Projects in Mexico

William R. Nichols; Rafael Salazar


Archive | 2012

Results of SEI Line-Funded Exploratory New Starts Projects

Len Bass; Nanette Brown; Gene M. Cahill; William Casey; Sagar Chaki; Corey Cohen; Dionisio de Niz; David French; Arie Gurfinkel; Rick Kazman; Edwin J. Morris; Brad A. Myers; William R. Nichols; Robert L. Nord; Ipek Ozkaya; Raghvinder S. Sangwan; Soumya Simanta; Ofer Strichman; Peppo Valetto


Archive | 2013

TSP Performance and Capability Evaluation (PACE): Customer Guide

William R. Nichols; Mark Kasunic; Timothy A. Chick


Archive | 2010

Using TSP Data to Evaluate Your Project Performance

Shigeru Sasao; William R. Nichols; James McCurley


Proceedings of the Annual Meeting of the Cognitive Science Society | 2007

Decision Making Using Learned Causal Structures

William R. Nichols; David Danks


COMPSAC Workshops | 2016

Measuring Software Assurance

Robert J. Ellison; William R. Nichols; Carol Woody

Collaboration


Top co-authors of William R. Nichols and their affiliations.

Top Co-Authors

Timothy A. Chick

Carnegie Mellon University


James McHale

Carnegie Mellon University


Mark Kasunic

Carnegie Mellon University


Robert Cannon

Carnegie Mellon University


Carol Woody

Software Engineering Institute


Julia Mullaney

Carnegie Mellon University


Mike Konrad

Carnegie Mellon University


Robert J. Ellison

Carnegie Mellon University
