Software Developer in Test: Dropping Manual Regression Testing

Today we’re going to take a closer look at how we’re transforming our testing processes by implementing automation standards and embedding automated test plans into the development process.

Early last year, here at True Engineering we formulated a corporate strategy of “Shared Engineering”, under which we build up a shared base of approaches and standards and unify our workflows. We developed uniform requirements for project management, as well as for the architecture, deployment, and support of our products.

To optimize the testing process, with the main goal of reducing the time-to-market (TTM) of our products, we adopted the SDET (software developer in test) concept. It implies that a QA engineer accompanies a product throughout its lifecycle rather than joining only at the very end to test it. This approach gives us not only shorter testing cycles but also continuous quality control.

How do we bring SDET into our teams’ work?

From our perspective, SDET is a set of functions, ranging from drawing up a general testing strategy to writing automated tests and, where required, manual testing. These functions can be assigned to a single specialist, but we have chosen to distribute them within our teams.

General requirements for the testing process across all teams

  • A regression checklist in Azure DevOps that is built into the CI/CD pipeline;

  • A set of autotests covering the regression checklist (see the sketch after this list);

  • Run reports structured around the regression checklist;

  • A checklist drawn up before development of a feature starts;

  • Autotests developed against the checklist;

  • Metrics of test plan coverage by autotests.
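
For illustration, here is a minimal sketch of what an autotest tied to a regression checklist item could look like. It assumes JUnit 5 and a hypothetical @TestCase annotation carrying an Azure DevOps test case ID; the class names, the annotation, and the IDs are invented for this example and are not part of our actual codebase.

    import org.junit.jupiter.api.Test;
    import java.lang.annotation.Retention;
    import java.lang.annotation.RetentionPolicy;
    import static org.junit.jupiter.api.Assertions.assertEquals;

    // Hypothetical marker annotation: links an autotest to an Azure DevOps test case ID
    // so a reporting step can map run results back to the regression checklist.
    @Retention(RetentionPolicy.RUNTIME)
    @interface TestCase {
        int id();
    }

    class CheckoutRegressionTests {

        @Test
        @TestCase(id = 4711) // assumed checklist item "Order total includes VAT"
        void orderTotalIncludesVat() {
            Order order = new Order(100_00, 0.20); // 100.00 in cents, 20% VAT
            assertEquals(120_00, order.totalWithVat());
        }
    }

    // Minimal domain stub so the example compiles on its own.
    record Order(long netCents, double vatRate) {
        long totalWithVat() {
            return Math.round(netCents * (1 + vatRate));
        }
    }

With a convention like this, every autotest result can be traced back to a specific line of the regression checklist, which is what the checklist-based reports rely on.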

Customizing Azure DevOps Pipelines

Azure DevOps Pipelines can run automated tests and produce a report on them. However, it was unclear how to align that report with a checklist and generate it in the form we needed. Taking that into account, we developed a custom extension for Azure DevOps that associates autotests with test cases.
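
Our extension runs inside Azure DevOps, so its code is not shown here. As a rough illustration of the association idea only, the sketch below builds on the hypothetical @TestCase annotation from the previous example and uses reflection to collect a test-method to test-case-ID mapping of the kind such a report needs.

    import java.lang.reflect.Method;
    import java.util.LinkedHashMap;
    import java.util.Map;

    // Illustration only: collects a test-method -> test-case-ID mapping by reading the
    // hypothetical @TestCase annotation, so run results can be matched to checklist items.
    public class TestCaseAssociationExporter {

        public static Map<String, Integer> collect(Class<?> testClass) {
            Map<String, Integer> mapping = new LinkedHashMap<>();
            for (Method method : testClass.getDeclaredMethods()) {
                TestCase testCase = method.getAnnotation(TestCase.class);
                if (testCase != null) {
                    mapping.put(testClass.getSimpleName() + "." + method.getName(), testCase.id());
                }
            }
            return mapping;
        }

        public static void main(String[] args) {
            collect(CheckoutRegressionTests.class)
                    .forEach((test, id) -> System.out.println(test + " -> test case " + id));
        }
    }

In practice the extension handles this association inside Azure DevOps; the sketch only shows the underlying mapping it relies on.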

How our teams implement SDET principles in their work

  • Set up a test environment and Azure DevOps pipelines;

  • Define the types of autotests used for coverage: unit tests, integration tests, and scenario tests (see the sketch after this list);

  • Learn to write each type of test according to our standards;

  • Make a plan for writing automated tests that cover key functions;

  • Set up autotest reports based on test plans;

  • Develop a roadmap and plan for implementing SDET in the project, agree on it with the customer, and include the corresponding tasks in the sprints;

  • Select and configure a tool to measure code coverage (SonarQube, JaCoCo, or Cobertura). This is only relevant for unit tests, since the task here is to control and maintain the required degree of code coverage.
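
As a rough illustration of the test types mentioned above, here is a minimal sketch that separates them with JUnit 5 tags so a pipeline stage can run them selectively, for instance unit tests on every push and scenario tests nightly. The class, method, and tag names are illustrative rather than a prescribed standard.

    import org.junit.jupiter.api.Tag;
    import org.junit.jupiter.api.Test;
    import static org.junit.jupiter.api.Assertions.assertTrue;

    // Illustrative split of the three autotest types by JUnit 5 tags.
    class PricingTests {

        @Test
        @Tag("unit")        // pure logic, no I/O
        void discountNeverExceedsOrderTotal() {
            assertTrue(Pricing.discount(50_00, 0.30) <= 50_00);
        }

        @Test
        @Tag("integration") // talks to a real or containerized dependency
        void priceListLoadsFromDatabase() {
            // body omitted in this sketch
        }

        @Test
        @Tag("scenario")    // end-to-end flow through the public API or UI
        void customerCanCompleteCheckout() {
            // body omitted in this sketch
        }
    }

    // Minimal stub so the unit test compiles.
    class Pricing {
        static long discount(long totalCents, double rate) {
            return Math.round(totalCents * rate);
        }
    }

Tag-based filtering is what lets the same test suite feed both the quick CI/CD checks and the fuller regression runs.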

Covering code with autotests is a process that can go on indefinitely, so it has to be managed deliberately. We always weigh the amount of coverage needed against the criticality of the functions to be covered and the cost of covering them. First, not every feature is a good candidate for autotests. Features with monotonous, clearly described logic are time-consuming to test manually but ideal for automation, whereas small tweaks, such as changing the color of a single button, are easier to check by hand.

Second, our experience shows that the level of automation should be quite high, but it can only approach 100% on products that are not evolving. In an environment where the market and business respond almost instantly to innovation, products need to be flexible and proactive. TTM and the ability to change the product to meet new ideas and challenges play a key role here.
