40% Fewer Bugs through Manual and Auto QA Standardization for an IoT Product

  • Duration: 1 year 6 months
  • Industries: Telecom Industry; IT Industry; IoT & Geofencing
  • Services: Test Automation Services; Manual Testing Services
  • Software Categories and Types: B2B Solutions
  • Technical Expertise: Test Automation; Internet of Things (IoT), Geofencing, Smart City
  • DevOps Expertise: CI/CD Basics
  • Technologies: Git; Apache Maven; TestLink; Selenide; pgAdmin; JUnit / TestNG; Gurux DLMS/COSEM Director; Databases; Swagger; IntelliJ IDEA; Project Management, Collaboration and Bug Tracking; Jira; CI/CD Automation Servers; JSON Web Token (JWT); Selenium WebDriver; Java; Frontend; TeamCity; Test Management Tools; IDE; Build and Dependency Platforms and Tools; Allure Framework; Backend; Testing Frameworks; JavaScript; JSON; PostgreSQL; JavaScript Libraries; Development Infrastructure and Tools; Source Code Management (SCM); Grafana; Standards and Protocols; Test Automation; Chrome DevTools; REST Assured; CI/CD and DevOps; Languages, Protocols, APIs, Network Tools; Network Tools; PuTTY; Gson; Software Engineering and Management Tools; Confluence; Awaitility; Java Libraries; log4js; Lombok; DLMS (Device Language Message Specification); Apache POI
Project team
  • Team size (3):
  • 1 Project Manager
  • 1 Manual QA Engineer
  • 1 QA Automation Engineer

Project summary

The project involved the development of a service for interaction with utility metering platforms. The service processes readings received from metering devices and provides remote access to them. With its help, platform users can receive data from metering devices, find out how much energy has been consumed, monitor cost dynamics, remotely control equipment, and change the settings of metering devices.

The service is a RESTful system consisting of a set of microservices. The microservices form a layer between end users and metering devices, allowing users to receive data from metering devices and control them via the HTTPS, DLMS, and IEC-104 protocols.

You can get more information about the technical features of the project in the previously published material.

JazzTeam responsibilities

At the beginning of the project, our company provided a service that included MVP development and professional project management. Over time, when the MVP grew into a full-fledged product, we offered the customer additional services to ensure product quality. JazzTeam specialists were tasked with establishing the manual testing process for the product and then adding automated testing to the project. This work included the following tasks:

  • To study the product requirements and create sets of test cases in the TestLink system covering up to 40% of the product functionality.
  • To develop test documentation: a test plan, test cases, checklists, authorization matrices depending on the user’s role.
  • To develop a testing plan for the transition to a new version of the product, which includes regression testing of bugs and tasks in the development environment, and to conduct smoke testing in the production environments of the customer’s clients when a new version of the product is released.
  • To record all bugs in the Jira bug tracking system.
  • To keep test documentation up to date as new functionality is created and customer requirements change.
  • To develop a framework for product autotesting from scratch to continuously monitor critical product functionality and track changes.

Black box testing

Continuous black box testing and UI testing were implemented to check that the user interface complies with the standards and regulations defined by the customer. Such checks consist of validating the user interface objects that are displayed to the user when they interact with the application services. For this, various test design techniques are used:

  • equivalence classes (used to build test data when preparing Excel report downloads and when checking the creation of new ports for connecting to devices);
  • boundary conditions (used when downloading the defect analysis and when enabling/disabling time synchronization, proactive transmission, and additional collection);
  • domain analysis (used when checking whether metering devices were added);
  • pairwise testing (used when downloading Excel reports by reading type and metering device information in a particular service, and when checking the settings of the scheduler, devices, and additional collection).

Such testing helps maintain product quality, reveals bugs without waiting for regression runs, and covers most user actions as well as the interaction of services and components.
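For illustration, below is a minimal JUnit 5 sketch of how a boundary-condition check, such as the port-creation case from the list above, could look if automated. The valid port range and the validation logic are hypothetical and serve only to demonstrate the technique.

```java
// A minimal sketch of boundary-condition test data for port creation.
// The assumed valid range (1024..65535) and isValidPort() are hypothetical.
import static org.junit.jupiter.api.Assertions.assertEquals;

import org.junit.jupiter.params.ParameterizedTest;
import org.junit.jupiter.params.provider.CsvSource;

class PortBoundaryTest {

    // Hypothetical validator: accepts ports in the assumed range 1024..65535.
    static boolean isValidPort(int port) {
        return port >= 1024 && port <= 65535;
    }

    @ParameterizedTest
    @CsvSource({
            "1023, false",  // just below the lower boundary
            "1024, true",   // lower boundary
            "65535, true",  // upper boundary
            "65536, false"  // just above the upper boundary
    })
    void portBoundaries(int port, boolean expected) {
        assertEquals(expected, isValidPort(port));
    }
}
```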

Integration testing

After preparing and organizing regular black box testing, we implemented integration testing. It detects bugs that occur when microservices interact with each other, as well as bugs that occur when the user connects to metering devices through microservices. This testing uses the GuruxDirector tool, an open-source solution for working with DLMS devices, which allows reading objects from electricity, gas, or water meters compatible with the DLMS/COSEM protocol (an object is the main element of a metering device’s information structure: it contains all parameters and data and has a unique logical name). GuruxDirector helps detect bugs related to the interaction of several microservices when reading objects from metering devices.

REST API testing

Another distinctive part of the quality assurance services was the introduction of REST API testing to check functions that are not available in the UI and to run additional checks on the requests sent to metering devices. These checks confirm that the microservices interact with each other reliably and that data is received through the microservices using HTTP methods.
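Below is a minimal Rest Assured sketch of such an API check; the endpoint path, response field names, and JWT handling are assumptions made for illustration rather than the product’s real API.

```java
// A hedged sketch of a REST API check with Rest Assured.
// The /api/v1/devices/{id}/readings endpoint and response fields are hypothetical.
import static io.restassured.RestAssured.given;
import static org.hamcrest.Matchers.equalTo;
import static org.hamcrest.Matchers.notNullValue;

public class DeviceReadingsApiCheck {

    /** Verifies that readings are returned for a device over a JWT-protected endpoint. */
    public static void checkReadingsAreReturned(String baseUri, String jwtToken, String deviceId) {
        given()
                .baseUri(baseUri)
                .header("Authorization", "Bearer " + jwtToken)   // JWT is used on the project
        .when()
                .get("/api/v1/devices/{id}/readings", deviceId)  // hypothetical endpoint
        .then()
                .statusCode(200)
                .body("deviceId", equalTo(deviceId))             // hypothetical response fields
                .body("readings", notNullValue());
    }
}
```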

Moreover, during the development of a new authorization microservice, the manual QA engineer not only tested the new microservice, but also drew up a document describing the updated authorization service and created various types of authorization matrices depending on the user’s role. This information was placed in the customer’s Confluence and added to the technical requirements for the product.

Regression and smoke testing

As part of release activities, regression and smoke testing were introduced on the project.

To carry out these types of testing, the following features were agreed upon and implemented:

  • The first feature is the use of the Grafana tool to check the functionality and stable operation of the system. During testing, Grafana is used to check client connections, requests to services, the receipt of readings, the dynamics of data exchange, and the presence of errors in the operation of services.
  • The second feature is sending queries to the database using the pgAdmin tool, which simplifies the testing of tasks and bug fixes, as well as the execution of test cases (for example, checking the automatic deletion of messages).

These types of testing help to make sure that, after a new version is released, the main functionality of the application works according to the requirements and has no side effects caused by the new changes, which makes it possible to find bugs on various production installations before end users encounter them.

Automated testing

Our company started implementing automated testing from scratch. Jointly with the customer, a list of critical checks to be covered by autotests was developed and approved. We collected all the necessary information, developed a work plan, decomposed the tasks, and estimated the work hours for the initial stage.

The initial task was to develop a framework for the automated testing of critical product functionality. We selected test cases that cover UI operation, downloading and checking the generated Excel reports, and checking the operation of metering devices through the GuruxDirector tool. To automate these test cases, an Excel file client and a Gurux client that simulates the work of the GuruxDirector tool were created.

Developing a client for working with Excel files

The Excel client is based on the Apache POI Java library, which allows reading Microsoft Office documents, in particular MS Excel files. When a report is requested in the application, it is downloaded as a .zip archive. The Excel client unzips the archive and extracts the Excel file. Next, a Workbook object and an Excel Sheet object are created, and each sheet row is read line by line into a Java Map data structure (Map<Integer, List<String>>). After that, the data from this structure is distributed into the desired report object. To obtain the final result, the data in the created report is compared with the data from the database. The check is considered passed if the data match and there are no duplicate metering devices in the report.
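A simplified sketch of that flow is shown below; it assumes a single .xlsx entry inside the downloaded archive and reads only the first sheet, so it illustrates the approach rather than the project’s actual client.

```java
// Sketch of the Excel client: unzip the downloaded report archive and read each
// sheet row into a Map<Integer, List<String>>, as described above.
import java.io.IOException;
import java.io.InputStream;
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.zip.ZipEntry;
import java.util.zip.ZipInputStream;

import org.apache.poi.ss.usermodel.Cell;
import org.apache.poi.ss.usermodel.DataFormatter;
import org.apache.poi.ss.usermodel.Row;
import org.apache.poi.ss.usermodel.Sheet;
import org.apache.poi.ss.usermodel.Workbook;
import org.apache.poi.xssf.usermodel.XSSFWorkbook;

public class ExcelReportClient {

    /** Reads the first .xlsx entry of the downloaded .zip archive into rows. */
    public Map<Integer, List<String>> readReport(InputStream zippedReport) throws IOException {
        try (ZipInputStream zip = new ZipInputStream(zippedReport)) {
            ZipEntry entry;
            while ((entry = zip.getNextEntry()) != null) {
                if (entry.getName().endsWith(".xlsx")) {
                    return readRows(new XSSFWorkbook(zip));
                }
            }
        }
        throw new IOException("No .xlsx file found in the report archive");
    }

    /** Reads every row of the first sheet, cell by cell, into Map<rowIndex, cellValues>. */
    private Map<Integer, List<String>> readRows(Workbook workbook) throws IOException {
        Map<Integer, List<String>> rows = new LinkedHashMap<>();
        DataFormatter formatter = new DataFormatter();
        try (workbook) {
            Sheet sheet = workbook.getSheetAt(0);
            for (Row row : sheet) {
                List<String> cells = new ArrayList<>();
                for (Cell cell : row) {
                    cells.add(formatter.formatCellValue(cell));
                }
                rows.put(row.getRowNum(), cells);
            }
        }
        return rows;
    }
}
```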

Development of a client for working with the Gurux library

The GuruxDLMS client is developed on the basis of the Gurux library and goes through the following steps to obtain data:

  1. Establishment of a connection with the metering device.
  2. Creation of an object for reading specific values from the metering device (for example, time value, network frequency, phase voltage, daily readings log, etc.).
  3. Reading data from the metering device in accordance with the created object.
  4. Termination of connection with the metering device.

Within the Gurux client, a test case was automated using multi-threading to check the simultaneous operation of several metering devices.
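The sketch below outlines both parts: the four-step reading flow behind a hypothetical DlmsMeterClient wrapper (its method names mirror the steps above and are not the real Gurux API) and a multi-threaded check that reads the same object from several meters simultaneously.

```java
// A sketch of the reading flow and the multi-threaded check. DlmsMeterClient is a
// hypothetical wrapper around the Gurux library; only the overall flow is shown.
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class MeterReadingCheck {

    /** Hypothetical wrapper over the Gurux DLMS client. */
    public interface DlmsMeterClient extends AutoCloseable {
        void connect() throws Exception;                      // step 1: establish the connection
        Object readObject(String obisCode) throws Exception;  // steps 2-3: create the object and read it
        @Override
        void close() throws Exception;                        // step 4: terminate the connection
    }

    /** Reads one value (e.g. the clock object "0.0.1.0.0.255") from a single meter. */
    public static Object readValue(DlmsMeterClient meter, String obisCode) throws Exception {
        meter.connect();
        try {
            return meter.readObject(obisCode);
        } finally {
            meter.close();
        }
    }

    /** Reads the same object from several meters in parallel, one thread per meter. */
    public static List<Object> readFromAllMeters(List<DlmsMeterClient> meters, String obisCode)
            throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(meters.size());
        try {
            List<Future<Object>> futures = new ArrayList<>();
            for (DlmsMeterClient meter : meters) {
                futures.add(pool.submit(() -> readValue(meter, obisCode)));
            }
            List<Object> results = new ArrayList<>();
            for (Future<Object> future : futures) {
                results.add(future.get()); // the check fails if any meter could not be read
            }
            return results;
        } finally {
            pool.shutdown();
        }
    }
}
```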

Automation of work with Grafana

Another non-trivial but very interesting automated testing task was the development of a framework for monitoring the customer’s product using data collected by the Prometheus monitoring system.

The Grafana tool is used to visualize the data from the monitoring system. The customer set the task of creating autotests that verify selected product metrics over a certain period of time against specified conditions. Before starting to develop the framework, the QA automation engineer studied the customer’s environment, the form in which the metric data is stored, and how data for a specific metric can be obtained for a period of time. The framework monitors the performance of the product by sending REST API requests with the Rest Assured library: a query is sent to the Prometheus time series database to retrieve data on specific metrics for a specific period of time. The test is considered passed if the metric values meet the specified conditions.
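A minimal sketch of one such check is given below. It queries the standard Prometheus HTTP API (/api/v1/query_range) with Rest Assured; the PromQL expression and the threshold are placeholders rather than the customer’s real metrics.

```java
// Sketch of a metric check: query Prometheus over a time range with Rest Assured
// and verify that every sampled value satisfies the condition.
import static io.restassured.RestAssured.given;

import java.util.List;

import io.restassured.response.Response;

public class PrometheusMetricCheck {

    /** Returns true if every sampled value of the metric satisfies the condition. */
    public static boolean metricStaysAbove(String prometheusUrl, String promQl,
                                           long startEpochSec, long endEpochSec,
                                           String step, double threshold) {
        Response response = given()
                .baseUri(prometheusUrl)
                .queryParam("query", promQl)       // e.g. a placeholder expression such as "up"
                .queryParam("start", startEpochSec)
                .queryParam("end", endEpochSec)
                .queryParam("step", step)          // e.g. "30s"
                .when()
                .get("/api/v1/query_range")
                .then()
                .statusCode(200)
                .extract().response();

        // Each sample is a [timestamp, "value"] pair in the first result series.
        List<List<Object>> samples = response.jsonPath().getList("data.result[0].values");
        return samples.stream()
                .map(sample -> Double.parseDouble(sample.get(1).toString()))
                .allMatch(value -> value >= threshold);
    }
}
```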

Technologies used

Databases: PostgreSQL (pgAdmin).

Backend: Java 11, Swagger, Grafana, Apache POI, Log4j2, Lombok, Gson.

Test Automation: JUnit 5, Selenide, Rest Assured, Allure, Awaitility.

CI/CD and DevOps: Maven, TeamCity.

Languages, Protocols, APIs, Network Tools: Chrome DevTools, JWT, PuTTY, Gurux DLMS/COSEM Director.

Software Engineering and Management Tools: TestLink, Jira, Confluence, Git, IntelliJ IDEA.

Project features

  • The customer’s product is constantly used by end customers. Product quality and stable performance are the main priorities on the project.
  • Specific tools are used for different types of testing.
  • Smoke testing is carried out on two production installations with different amounts of metering devices and different load levels.
  • Various types of testing are performed to check for and detect possible deviations in the operation of the product.
  • A framework with autotests was created to monitor the operation of the product using Prometheus.

Results of service provision

  • The process of manual and automated testing was organized and successfully implemented on the project.
  • Technical documentation was created for the testing process:
      • A test plan was created and described in detail;
      • 6 test suites were created:
          • GUI testing (~400 test cases);
          • Regression testing (~150 test cases);
          • Smoke testing (~150 test cases);
          • REST API testing (~50 test cases);
          • Checklist for testing the authorization service (~150 test cases);
          • Integration testing (~50 test cases).
  • An authorization matrix for certain types of users was developed.
  • The level of coverage of GUI functionality with test cases is 95%.
  • A framework for black box autotesting of the application was written from scratch.
  • A client for test cases that use the Gurux utility was developed; it covers 62.5% of the existing manual test cases related to working with Gurux. A client for test cases in which Excel reports are generated and checked was also developed.
  • Test cases that check the critical functionality of the UI component were automated.
  • A framework to monitor the operation of the application using data collected by Prometheus was written from scratch.
  • Release notes dedicated to new versions of the product are regularly created and posted in the customer’s Confluence.

Company’s achievements on the project

  • Product quality is ensured continuously on the project. The testing process is tightly linked to the development process. Releases include only versions that have passed regression testing. The stability of the product in the production environment is maintained through smoke testing.
  • Every night, all autotests are run automatically on the TeamCity build server. In the morning, the QA automation engineer sends the customer a detailed report in the chat containing the number of passed and failed test cases, along with a description of the reasons for the failures. This information helps to find problems in the critical functionality of the product and fix them quickly.
  • JazzTeam worked closely with the customer on the implementation of a new authorization microservice, which made it possible to take into account all the requirements for user roles and to create role matrices for testing.
  • Our team ensured a seamless transfer of knowledge and approaches between the project participants: many different technologies and tools were used, but all best practices were collected in one place.
