
How to Expand Test Automation for Interactive Conversations


A client producing interactive educational conversations was expanding their use of test automation and building capabilities by actively training their entire QA team on their automation frameworks. Before continuing further with their investment in automation, they wanted an independent assessment of their automation frameworks and practices.

Assessing Current Automation

The firm used multiple automation tools – Selenium (via its Ruby bindings), RSpec, and Capybara. Both the experienced automation engineers and the QA testers newly trained in test automation were creating automated tests with these tools. SPR assessed the tests created by both groups. The assessment began with an on-site, in-depth discussion of the technologies, repository layout, approach to test creation, and vision for future automation. Following the discussion, the test repositories were shared with the SPR architect to complete the assessment.

Automation Best Practices

Using a representative sampling of the automated tests, the SPR architect evaluated the tests against automation best practices. These practices included:

  • Page Object Model (POM), to structure the UI test automation projects. Using the POM reduces code duplication and simplifies the test writing process.
  • Environment properties passed through System Environment Variables, allowing for test suites to be executed in any environment without making a code change.
  • Test data read from a file or fed into each test via a data provider, reducing the maintenance effort of updating data and easily allowing different data sets in each environment.
  • DRY principles (Don’t Repeat Yourself) to avoid replicating code and decrease maintenance by using inheritance and creating methods that can be referenced.
  • Minimizing use of fixed sleep calls, which add unnecessary time to test execution.
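The first, second, and fourth practices above can be sketched together in Ruby. This is a minimal, illustrative Page Object Model: in the client's stack the driver would be a Capybara/Selenium session, but here a stub driver stands in so the structure can be shown without a browser. All class names, method names, and the `APP_BASE_URL` variable are hypothetical, not taken from the client's code base.

```ruby
# Stand-in for a Capybara session; real page objects would delegate to
# Capybara's `page` instead. Records calls so the sketch is self-contained.
class StubDriver
  attr_reader :visited, :filled

  def initialize
    @visited = []
    @filled  = {}
  end

  def visit(url)
    @visited << url
  end

  def fill_in(field, with:)
    @filled[field] = with
  end
end

# Base page: shared behavior lives here once (DRY); subclasses inherit it.
class BasePage
  # Environment-specific base URL comes from an environment variable, so
  # the same suite runs against any environment without a code change.
  BASE_URL = ENV.fetch("APP_BASE_URL", "http://localhost:3000")

  def initialize(driver)
    @driver = driver
  end

  def visit
    @driver.visit("#{BASE_URL}#{path}")
    self # return self so page actions can be chained
  end
end

# One page object per screen; locators and actions live here, not in the
# tests themselves, which reduces duplication across the suite.
class LoginPage < BasePage
  def path
    "/login"
  end

  def login(username, password)
    @driver.fill_in("username", with: username)
    @driver.fill_in("password", with: password)
    self
  end
end

driver = StubDriver.new
LoginPage.new(driver).visit.login("qa_user", "secret")
```

A test then reads as a sequence of page actions (`LoginPage.new(driver).visit.login(...)`) with no locators or URLs of its own, which is what makes the tests cheap to maintain when the UI changes.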

Specific findings and recommendations based on these best practices were documented and presented for the frameworks and code bases under review.

Insights Gained

The client gained independent confirmation that the automation practices they suspected needed improvement did, in fact, need it. They also gained insight into additional best practices they should adopt. Because the experienced automation engineers have strong coding capabilities and a solid understanding of test automation, they can collaborate with, cross-train, and mentor the QA team in applying these best practices.