An empirical comparative evaluation of requirements engineering methods

Abstract

Requirements Engineering (RE) is a relatively young discipline, yet many advances have been achieved during the last decades. In particular, numerous RE approaches have been proposed in the literature with the aim of understanding a given problem (e.g. information systems development) and establishing a knowledge base that is shared between domain experts and developers (i.e. a requirements specification). However, there is a growing concern for empirical validations that assess RE proposals and statements. This paper addresses the assessment of the quality of functional requirements specifications, using the Method Evaluation Model (MEM) as a theoretical framework. The MEM distinguishes between the actual efficacy and the perceived efficacy of a method. To assess the actual efficacy of RE methods, the conceptual model quality framework by Lindland et al. can be applied; in this paper, we focus on the completeness and granularity of requirements models and extend this framework by defining four new metrics (e.g. degree of functional encapsulation completeness with respect to a reference model, number of functional fragmentation errors). To assess the perceived efficacy, conventional questionnaires can be used. A laboratory experiment with master students was carried out in order to compare, using the proposed metrics, two RE methods: Use Cases and Communication Analysis. With respect to actual efficacy, the results indicate greater model quality (in terms of completeness and granularity) when Communication Analysis guidelines are followed. With respect to perceived efficacy, Use Cases was perceived to be slightly easier to use than Communication Analysis, whereas Communication Analysis was perceived to be more useful for determining the proper granularity of business processes. The paper discusses these results and highlights some key issues for future research in this area.

References

  1. España S, Condori-Fernández N, González A, Pastor Ó (2009) Evaluating the completeness and granularity of functional requirements specifications: a controlled experiment. In: 17th IEEE international requirements engineering conference (RE’09), Atlanta, GA, USA. IEEE, New York, pp 161–170
  2. Boehm B, McClean RK, Ufrig DB (1975) Some experience with automated aids to the design of large-scale reliable software. IEEE Trans Softw Eng 1(1):125–133
  3. Wieringa RJ, Heerkens JMG (2004) Evaluating the structure of research papers: a case study. In: 2nd international workshop in comparative evaluation of requirements engineering, Kyoto, Japan. IEEE Press, New York, pp 41–50
  4. Wieringa RJ, Heerkens JMG (2006) The methodological soundness of requirements engineering papers: a conceptual framework and two case studies. Requir Eng 11(4):295–307
  5. Höfer A, Tichy WF (2007) Status of empirical research in software engineering. In: Empirical software engineering issues. Critical assessment and future directions. LNCS, vol 4336. Springer, Berlin, pp 10–19
  6. Iivari J, Kerola P (1983) A sociocybernetic framework for the feature analysis of information systems development methodologies. In: Olle TW, Sol HG, Tully CJ (eds) Information systems methodologies: a feature analysis. North-Holland, Amsterdam, pp 87–139
  7. Dieste O, Lopez M, Ramos F (2008) Updating a systematic review about selection of software requirements specification techniques. In: 11th workshop on requirements engineering (WER 2008), Barcelona, Spain
  8. Siau K, Rossi M (1998) Evaluation of information modeling methods—a review. In: 31st Hawaii international conference on systems science (HICSS 1998), Kohala Coast, USA, vol 5. IEEE, New York, pp 314–322
  9. Wohlin C, Runeson P, Höst M, Ohlsson MC, Regnell B, Wesslén A (2000) Experimentation in software engineering: an introduction. Kluwer, Dordrecht
  10. Cockburn A (2000) Writing effective use cases. Addison-Wesley, Reading
  11. España S, González A, Pastor Ó (2009) Communication Analysis: a requirements elicitation approach for information systems. In: 21st international conference on advanced information systems, Amsterdam, The Netherlands. LNCS. Springer, Berlin
  12. Wieringa RJ (1996) Requirements engineering: frameworks for understanding. Wiley, New York
  13. Dobing B, Parsons J (2006) How UML is used. Commun ACM 49(5):109–113
  14. Simons AJH (1999) Use cases considered harmful. In: Technology of object-oriented languages and systems, Nancy, France. IEEE Computer Society, Los Alamitos, pp 194–203
  15. Dano B, Briand H, Barbier F (1997) A use case driven requirements engineering process. In: Third IEEE international symposium on requirements engineering (RE’97), Annapolis, MD. IEEE, New York
  16. Rolland C, Achour CB (1998) Guiding the construction of textual use case specifications. Data Knowl Eng J 25(1–2):125–160
  17. Constantine LL, Lockwood LAD (1999) Software for use: a practical guide to the models and methods of usage-centered design. Addison-Wesley, Reading
  18. Cockburn A (1997) Structuring use cases with goals. J Object-Oriented Program
  19. Phalp KT, Vincent J, Cox K (2007) Improving the quality of use case descriptions: empirical assessment of writing guidelines. Softw Qual Control 15(4):383–399
  20. Cox K, Phalp K, Shepperd M (2001) Comparing use case writing guidelines. In: 7th international workshop on requirements engineering: foundations for software quality (REFSQ’2001), Interlaken, Switzerland
  21. Pastor O, González A, España S (2007) Conceptual alignment of software production methods. In: Conceptual modelling in information systems engineering. Springer, Heidelberg, pp 209–228
  22. González A, España S, Pastor O (2008) Towards a communicational perspective for enterprise information systems modelling. In: IFIP WG 8.1 working conference on the practice of enterprise modeling, Stockholm, Sweden. LNBIP. Springer, Berlin
  23. González A, España S, Pastor O (2009) Unity criteria for business process modelling: a theoretical argumentation for a software engineering recurrent problem. In: 3rd international conference on research challenges in information science (RCIS 2009), Fes, Morocco. IEEE, New York
  24. Moody DL (2003) The Method Evaluation Model: a theoretical model for validating information systems design methods. In: Proceedings of the 11th European conference on information systems (ECIS 2003), Naples, Italy, 16–21 June 2003
  25. Yadav SB, Bravoco RR, Chatfield AT, Rajkumar TM (1988) Comparison of analysis techniques for information requirement determination. Commun ACM 31(9):1090–1097
  26. Lindland OI, Sindre G, Sølvberg A (1994) Understanding quality in conceptual modeling. IEEE Softw 11(2):42–49
  27. Moody DL, Sindre G, Brasethvik T, Sølvberg A (2003) Evaluating the quality of information models: empirical testing of a conceptual model quality framework. In: 25th international conference on software engineering (ICSE 2003), Portland, USA, pp 295–305
  28. Moody DL, Sindre G, Brasethvik T, Sølvberg A (2002) Evaluating the quality of process models: empirical testing of a quality framework. In: 21st international conference on conceptual modeling. Springer, Berlin
  29. Larman C (1997) Applying UML and patterns. Prentice Hall, New York (see p 53)
  30. Övergaard G, Palmkvist K (2004) Use Cases: patterns and blueprints. Addison-Wesley, Reading (see Part V)
  31. Kulak D, Guiney E (2000) Use cases: requirements in context. Addison-Wesley, Reading, pp 94–95
  32. Langlois RN (2002) Modularity in technology and organization. J Econ Behav Organ 49(1):19–37
  33. Reijers H, Mendling J (2008) Modularity in process models: review and effects. In: 6th international conference on business process management (BPM 2008), Milan, Italy. LNCS, vol 5240. Springer, Berlin, pp 20–35
  34. Rescher N (1977) Methodological pragmatism: systems-theoretic approach to the theory of knowledge. Basil Blackwell, Oxford
  35. Davis FD, Bagozzi RP, Warshaw PR (1989) User acceptance of computer technology: a comparison of two theoretical models. Manag Sci 35(8):982–1003
  36. Davis AM, Overmyer S, Jordan K, Caruso J, Dandashi F, Dinh A, Kincaid G, Ledeboer G, Reynolds P, Sitaram P, Ta A, Theofanos M (1993) Identifying and measuring quality in a software requirements specification. In: 1st international software metrics symposium, pp 141–152
  37. Pohl K (1994) The three dimensions of requirements engineering: a framework and its applications. In: 5th international conference on advanced information systems engineering, Paris, France. Pergamon, New York
  38. Krogstie J, Sindre G, Jorgensen H (2006) Process models representing knowledge for action: a revised quality framework. Eur J Inf Syst 15(1):91–102
  39. Falkenberg E, Hesse W, Lindgreen P, Nilsson B, Oei JLH, Rolland C, Stamper RK, Van Assche F, Verrijn-Stuart A, Voss K (1998) FRISCO: a framework of information systems concepts. IFIP WG 8.1 report
  40. Krogstie J, Lindland OI, Sindre G (1995) Towards a deeper understanding of quality in requirements engineering. In: 7th international conference on advanced information systems engineering. Springer, Berlin, pp 82–95
  41. Moody DL, Shanks GG (1994) What makes a good data model? Evaluating the quality of entity relationship models. In: 13th international conference on the entity-relationship approach, Manchester, UK. Springer, Berlin, pp 94–111
  42. Schuette R, Rotthowe T (1998) The guidelines of modeling—an approach to enhance the quality in information models. In: Conceptual modeling (ER’98), Singapore. LNCS, vol 1507. Springer, Berlin, pp 240–254
  43. Schuette R (1999) Architectures for evaluating the quality of information models—a meta and an object level comparison. In: 18th international conference on conceptual modeling (ER 1999). Springer, Berlin, pp 490–505
  44. Shanks GG, Darke P (1997) Quality in conceptual modelling: linking theory and practice. In: Asia-Pacific conference on information systems (PACIS 1997), Brisbane, pp 805–814
  45. Moody DL (2005) Theoretical and practical issues in evaluating the quality of conceptual models: current state and future directions. Data Knowl Eng 55(3):243–276
  46. Lockemann PC, Mayr HC (1986) Information system design: techniques and software support. Information processing, vol 86. North-Holland, Amsterdam
  47. Zave P, Jackson M (1997) Four dark corners of requirements engineering. ACM Trans Softw Eng Methodol 6(1):1–30
  48. Jacobson I (2004) Use cases—yesterday, today, and tomorrow. Softw Syst Model 3(3):210–220
  49. Kroll P (2004) Dr. Process: how many use cases should you have in a system? IBM Rational developerWorks documentation. Accessed 02-2009
  50. Juristo N, Moreno A (2001) Basics of software engineering experimentation. Kluwer, Boston
  51. Achour CB, Rolland C, Maiden NAM, Souveyet C (1999) Guiding use case authoring: results of an empirical study. In: IEEE symposium on requirements engineering. IEEE, Los Alamitos
  52. Cox K, Phalp K (2000) Replicating the CREWS use case authoring guidelines experiment. Empir Softw Eng 5(3):245–267
  53. Anda B, Sjøberg DIK, Jørgensen M (2001) Quality and understandability of use case models. In: 15th European conference on object-oriented programming (ECOOP 2001). LNCS, vol 2072. Springer, Berlin, pp 402–428
  54. Fortuna M, Werner C, Borges M (2007) Um modelo integrado de requisitos com casos de uso [An integrated requirements model with use cases]. In: Workshop de ingeniería de requisitos y ambientes software (IDEAS 2007)
  55. Jacobson I (1987) Object-oriented development in an industrial environment. In: Conference on object-oriented programming systems, languages and applications, Orlando, FL, USA. ACM, New York, pp 183–191
  56. OMG (2009) OMG unified modeling language (OMG UML), superstructure, V2.2. http://www.omg.org/cgi-bin/doc?formal/09-02-02. Accessed 02-2010
  57. Basili V, Rombach HD (1988) The TAME project: towards improvement-oriented software environments. IEEE Trans Softw Eng 14(6):758–773
  58. Garson D (1998) Scales and standard measures. North Carolina State University. Updated September 2008. http://www2.chass.ncsu.edu/garson/pa765/standard.htm
  59. Runeson P (2003) Using students as experiment subjects—an analysis on graduate and freshmen student data. In: 7th international conference on empirical assessment in software engineering, pp 95–102

Author information

Correspondence to Sergio España.

Additional information

This paper revises and extends previous work that has been published in the 17th IEEE International Requirements Engineering Conference (RE’09) [1].

Cite this article

España, S., Condori-Fernandez, N., González, A. et al. An empirical comparative evaluation of requirements engineering methods. J Braz Comput Soc 16, 3–19 (2010) doi:10.1007/s13173-010-0003-5

Keywords

  • Experiment
  • Requirements specification
  • Use Cases
  • Communication Analysis
  • Perceived usefulness
  • Perceived ease of use
  • Method Evaluation Model
  • Conceptual model quality framework