Abstract
[Context and motivation] When a software-based system evolves, its requirements continuously change. This affects the acceptance tests, which must be adapted accordingly in order to maintain the quality of the evolving system. [Question/problem] In practice, requirements and acceptance test documents are not always aligned with each other, nor with the actual system behavior. Such inconsistencies may introduce software quality problems, unintended costs and project delays. [Principal ideas/results] To keep evolving requirements and their associated acceptance tests aligned, we are developing an approach called GuideGen that automatically generates guidance in natural language on how to modify impacted acceptance tests when a requirement is changed. We evaluated GuideGen using real-world data from three companies. For 262 non-trivial requirements changes, we generated guidance on how to change the affected acceptance tests and evaluated the quality of this guidance with seven experts. The correctness of the guidance produced by our approach ranged from 67% to 89% of all changes across the three evaluated data sets. We further found that our approach performed better for agile requirements than for traditional ones. [Contribution] Our approach facilitates the alignment of acceptance tests with the actual requirements and improves communication between requirements engineers and testers.
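The paper's implementation is not reproduced in this excerpt. As a rough, hypothetical illustration of the general idea described in the abstract (detect which sentences of a requirement changed, then flag acceptance-test steps that resemble them so a tester can review them), here is a minimal Python sketch. All function names, the word-overlap similarity metric, and the 0.2 threshold are our own assumptions for illustration, not GuideGen's actual design.

```python
import difflib


def changed_sentences(old_req: str, new_req: str) -> list:
    """Return sentences of the new requirement that are added or rewritten."""
    old = [s.strip() for s in old_req.split(".") if s.strip()]
    new = [s.strip() for s in new_req.split(".") if s.strip()]
    matcher = difflib.SequenceMatcher(a=old, b=new)
    changed = []
    for tag, _, _, j1, j2 in matcher.get_opcodes():
        if tag in ("replace", "insert"):
            changed.extend(new[j1:j2])
    return changed


def similarity(a: str, b: str) -> float:
    """Crude lexical similarity: word-overlap (Jaccard) ratio of two sentences."""
    wa, wb = set(a.lower().split()), set(b.lower().split())
    return len(wa & wb) / max(len(wa | wb), 1)


def guidance(old_req: str, new_req: str, test_steps: list,
             threshold: float = 0.2) -> list:
    """For each changed requirement sentence, suggest which test steps to review."""
    messages = []
    for sentence in changed_sentences(old_req, new_req):
        for i, step in enumerate(test_steps, start=1):
            if similarity(sentence, step) >= threshold:
                messages.append(
                    f"Step {i} may be affected by the changed requirement "
                    f"sentence: '{sentence}'. Please review and update it."
                )
    return messages
```

A real approach would replace the word-overlap heuristic with proper NLP (parsing, semantic similarity), but the pipeline shape, diffing requirement versions and matching changed parts against test steps, is the same.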
Notes
- 1. This is a heuristic value, which yielded excellent performance in our evaluation; cf. Sect. 4.2.
- 2. https://docs.google.com/forms/d/1vLJYFIjmtLjzC60e2iT3JLbs9ST8LmOOhO9kotfrBwo/edit. For confidentiality reasons, the file does not contain the real data from our data sets, but only the example shown in this paper.
Acknowledgements
We thank our experts and their companies for investing time and effort into the evaluation of our approach. This work was partially funded by the Swiss National Science Foundation under grant 200021-157004/1.
Copyright information
© 2018 Springer International Publishing AG, part of Springer Nature
Cite this paper
Hotomski, S., Ben Charrada, E., Glinz, M. (2018). Keeping Evolving Requirements and Acceptance Tests Aligned with Automatically Generated Guidance. In: Kamsties, E., Horkoff, J., Dalpiaz, F. (eds) Requirements Engineering: Foundation for Software Quality. REFSQ 2018. Lecture Notes in Computer Science, vol 10753. Springer, Cham. https://doi.org/10.1007/978-3-319-77243-1_15
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-77242-4
Online ISBN: 978-3-319-77243-1
eBook Packages: Computer Science (R0)