The aim of this research paper is to evaluate and compare three automated software testing tools to determine their usability and effectiveness. Software testing is a crucial part of software development, and automating the testing process is vital to its success. As increasingly complex systems are built, testing throughout the software development life cycle is essential to the success of the software. To enable comparison of the automated testing tools, a multi-partitioned metric suite was developed to guide the tool evaluation process. A selected web application was first tested manually and then tested using each of the three automated testing tools. The automated testing included the development of scripts that not only save time and resources when applications are updated, but also speed up the process when regression testing is necessary. An important contribution of this research is the metric suite itself, which facilitates the comparison and selection of a desired tool for automated testing. Inferences, implications, and results are presented and discussed.