Please note: This PhD seminar will take place in DC 2564.
Belen Bonilla, PhD candidate
David R. Cheriton School of Computer Science
Supervisor: Professor Daniel Berry
There is a debate regarding how the requirements for a computer-based system (CBS) should be identified. Some argue that it should be done upfront, before implementation begins, to avoid the higher costs of fixing requirement defects later, while others argue that requirements should be identified and implemented incrementally, to avoid the waste of analyzing requirements that may later change or never be implemented. Berry et al. believe that the arguments for each approach are correct, but that each concerns a different kind of requirement: (1) scope determineD (D) or (2) scope determininG (G), respectively. They claim that D requirements should be identified upfront, while G requirements can be identified incrementally.
This D-vs-G categorization, along with findings from past studies indicating that a large majority of defects in CBSs arise from missing D requirements, leads to the following question: Does a modified agile method (MAM), in which time in each sprint is taken to identify all and only the D requirements of the sprint's scope before beginning that scope's implementation, result in a faster and less buggy implementation of the CBS than a normal agile method (NAM), in which implementation of a sprint's scope begins as soon as that scope is identified? Addressing this question through controlled experiments is difficult, because a real-life application of either of these full-lifecycle methods is not experiment-sized, and mining project data is not possible, because MAM has only been proposed and has not yet been used in any real project.
In this talk, I present a different approach to address this question through a controlled experiment using experiment-sized chunks of MAM and NAM that can serve as proxies of the entire full-lifecycle methods. I also describe the experiment designed to compare these chunks, including the variables, hypotheses, experimental design, subjects, experimental procedure, instrumentation, threats to validity, and data analysis plan.