Abstract
The detection of behavioral design patterns is more accurate when a dynamic analysis is performed on the candidate instances identified statically. Such a dynamic analysis requires monitoring the candidate instances at run-time through the execution of a set of test cases. However, defining such test cases manually is a time-consuming task, especially when the number of candidate instances is high and they include many false positives. In this paper, we present the results of an empirical study assessing the effectiveness of dynamic analysis based on automatically generated test cases in behavioral design pattern detection. The study considered three behavioral design patterns, namely State, Strategy, and Observer, and three publicly available software systems, namely JHotDraw 5.1, QuickUML 2001, and MapperXML 1.9.7. The results show that dynamic analysis based on automatically generated test cases improves the precision of design pattern detection tools based on static analysis alone. As expected, this improvement in precision comes at the expense of recall, so we also compared the results achieved with automatically generated test cases against the more expensive but more accurate results achieved with manually written test cases. This comparison allowed us to highlight the costs and benefits of automating dynamic analysis for design pattern detection.