EASE 2009

13th International Conference on Evaluation and Assessment in Software Engineering

20th-21st April 2009

Program

Time Sunday 19th April 2009
7pm         Evening reception and buffet meal (college bar open afterwards)
            Holgate Room, Grey College

Time Monday 20th April 2009
8:00-9:15   Breakfast & Registration
9:15-9:30   Opening session (David Budgen)
9:30-10:45  Keynote Address: How to Impact Software Engineering Practice Through Empirical Research (Magne Jørgensen)

10:45-11:15 Coffee
11:15-12:45 Session 1 - 'Industry Related Studies'

  Empirical Support for Two Refactoring Studies Using Commercial C# Software
  M Gatrell, S Counsell, T Hall

  Investigating the Use of Chronological Splitting to Compare Software Cross-company and Single-company Effort Predictions: A Replicated Study
  Emilia Mendes, Chris Lokan

  Empirical Validation of a Requirements Engineering Process Guide
  Jörg Leuser, Nicolas Porta, Armin Bolz, Alexander Raschke

12:45-14:00 Lunch
14:00-15:15 Session 2 - 'Methodology'

  Reference-based search strategies in systematic reviews
  Mats Skoglund, Per Runeson

  Building an Expert-based Web Effort Estimation Model using Bayesian Networks
  Emilia Mendes, Carmel Pollino, Nile Mosley

  Does an 80:20 rule apply to Java coupling?
  Asma Mubarak, Steve Counsell, Rob Hierons

15:30-15:45 Coffee
16:00-17:30 Session 3 - 'Quality'

  An Evaluation of Quality Checklist Proposals - A participant-observer case study
  Barbara Kitchenham, Pearl Brereton, David Budgen, Zhi Li

  Preliminary Reporting Guidelines for Experience Papers
  David Budgen, Cheng Zhang

  Factors Explaining External Quality in 54 Case Studies of Software Development Projects
  Chris Thomson, Mike Holcombe

18:30       Reception and Conference Dinner (Pennington Room)

Time Tuesday 21st April 2009
8:00-9:00   Breakfast
9:00-9:15   Introduction
9:15-10:45  Keynote Address: Informed Engineering Management: Determining the Role for Empirical Assessment (John McDermid)

10:45-11:15 Coffee
11:15-12:45 Session 4 - 'SLRs 1'

  Using Systematic Reviews and Evidence-Based Software Engineering with Masters’ Students
  Briony Oates, Graham Capper

  Harmfulness of Code Duplication: A Structured Review of the Evidence
  Wiebe Hordijk, Maria Laura Ponisio, Roel Wieringa

  An assessment of published evaluations of requirements management tools
  Austen Rainer, Sarah Beecham, Cei Sanderson

12:45-14:00 Lunch
14:00-15:30 Session 5 - 'SLRs 2'

  A Literature Review of Expert Problem Solving using Analogy
  Carolyn Mair, Miriam Martincova, Martin Shepperd

  Evaluation of Variability Management Approaches: A Systematic Review
  Lianping Chen, Muhammad Ali Babar, Ciaran Cawley

  A Quality Checklist of Technology-Centred Testing Studies
  Barbara Kitchenham, Andrew J Burn, Zhi Li

  Using Process Mining Metrics to Measure Noisy Process Fidelity
  Chris Thomson, Marian Gheorghe

15:30       Closing session
15:45       Coffee