13th International Conference on Evaluation and Assessment in Software Engineering
20th-21st April 2009
|Time|Sunday 19th April 2009|
Evening reception and buffet meal (college bar open afterwards)
Holgate Room in Grey College
|Time|Monday 20th April 2009|
|8:00-9:15|Breakfast & Registration|
How to Impact Software Engineering Practice Through Empirical Research
Session 1 - 'Industry Related Studies'
Empirical Support for Two Refactoring Studies Using Commercial C# Software
Investigating the Use of Chronological Splitting to Compare Software Cross-company and Single-company Effort Predictions: A Replicated Study
Empirical Validation of a Requirements Engineering Process Guide
Session 2 - 'Methodology'
Building an Expert-based Web Effort Estimation Model using Bayesian Networks
Does an 80:20 rule apply to Java coupling?
Session 3 - 'Quality'
An Evaluation of Quality Checklist Proposals—A participant-observer case study
Preliminary Reporting Guidelines for Experience Papers
Factors Explaining External Quality in 54 Case Studies of Software Development Projects
|18:30|Reception and Conference Dinner (Pennington Room)|
|Time|Tuesday 21st April 2009|
Informed Engineering Management: Determining the Role for Empirical Assessment
Session 4 - 'SLRs 1'
Using Systematic Reviews and Evidence-Based Software Engineering with Masters’ Students
Harmfulness of Code Duplication: A Structured Review of the Evidence
An assessment of published evaluations of requirements management tools
Session 5 - 'SLRs 2'
A Literature Review of Expert Problem Solving using Analogy
Evaluation of Variability Management Approaches: A Systematic Review
A Quality Checklist of Technology-Centred Testing Studies
Using Process Mining Metrics to Measure Noisy Process Fidelity