Provided services
Test Automation Services, Manual Testing Services
Client
Our client is an IoT data service provider with a base station network. With extensive expertise in IoT ecosystems, the client delivers reliable and high-performance solutions that facilitate automation, optimize resource consumption, and enhance operational efficiency for businesses and municipalities.
Product
The client’s newly developed product is designed for interaction with and management of utility metering platforms. It processes readings received from metering devices and provides remote access to them.
The solution allows users to receive data from metering devices about the amount of energy consumed, monitor the dynamics of costs, remotely control equipment, and change the settings of metering devices.
The solution is based on a RESTful system consisting of a set of microservices that serve as a layer between end users and metering devices. These microservices allow data reception and control of metering devices via HTTPS, DLMS, and IEC-104 protocols.

Challenge
Our client had previously had a negative experience with an outsourcing company and was quite sceptical about manual testing services. However, the product’s functionality kept expanding, and the system was unstable and required professional support.
Recurring bugs in production drew negative feedback from end users, which made the client consider solutions that could help.
Project Context
Our team had previously participated in the development of one of the client’s products, which included implementing the MVP and providing professional project management services.
JazzTeam engineers had demonstrated excellence on that project, which is why the client approached us for qualified and immediate support in maintaining product operation and connectivity. We proposed organizing the quality assurance process on the project in order to dispel the customer’s doubts about manual and automated testing.
JazzTeam Challenge
JazzTeam experts needed to establish a manual testing process for the product and then add automated testing to the project.
Establishing these processes involved the following tasks that our team needed to accomplish for effective work on the project:
- Product requirements analysis and the creation of sets of test cases in the TestLink system covering up to 80% of the product functionality.
- Development and regular updating of the test documentation: a test plan, test cases, checklists, and authorization matrices for different user roles.
- Development of a testing plan for migration to a new version of the product, including regression testing of bugs and tasks in the development environment.
- Smoke testing in the production environment after a product release.
- Development of a product test automation framework from scratch to continuously monitor critical product functionality and track changes.
Solution
JazzTeam’s QA experts teamed up with software engineers to establish a structured manual testing process for the product and later integrate automated testing into the project.
Process Improvement
Our team covered various types of manual and automated testing to establish a structured and effective testing process and enhance product quality.
We implemented continuous black box testing and UI testing to check the user interface’s compliance with the customer’s established standards and regulations.
Such checks include validating user interface objects directly displayed to the user during the interaction with the application services.
The team used various test design techniques, such as:
- equivalence classes (used to build test data for downloading Excel reports when checking the creation of new ports for connecting to devices);
- boundary conditions (used when downloading the defect analysis and when enabling/disabling time synchronization, proactive transmission, and additional collection; see the sketch below);
- domain analysis (used when checking if metering devices were added);
- pairwise testing (used for downloading Excel reports by reading type and metering device information in a given service, and for checking the settings of the scheduler, devices, and additional collection).
Such testing helps maintain product quality, find bugs without waiting for regression tests, and check most user actions and the interaction of services and components.
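For illustration, below is a minimal JUnit 5 sketch of the boundary-conditions technique applied to a device connection port field; the PortValidator class and the 1–65535 range are our assumptions for the example, not the client’s actual implementation.

```java
import static org.junit.jupiter.api.Assertions.assertEquals;

import org.junit.jupiter.params.ParameterizedTest;
import org.junit.jupiter.params.provider.CsvSource;

class PortBoundaryTest {

    // Hypothetical validator; the product's real validation logic is not shown in this case study.
    static class PortValidator {
        boolean isValid(int port) {
            return port >= 1 && port <= 65535;
        }
    }

    // Boundary-condition test data: values just below, on, and just above both boundaries.
    @ParameterizedTest
    @CsvSource({
            "0, false",
            "1, true",
            "2, true",
            "65534, true",
            "65535, true",
            "65536, false"
    })
    void portBoundariesAreValidatedCorrectly(int port, boolean expected) {
        assertEquals(expected, new PortValidator().isValid(port));
    }
}
```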
After preparing and organizing regular black box testing, we implemented integration testing.
It helped us to detect bugs in microservices interaction and in user connection to metering devices through microservices.
During this testing we used the GuruxDirector tool, an open-source solution for working with DLMS devices. It allows reading objects from electricity, gas, or water meters compatible with the DLMS/COSEM protocol (an object is the main element of a metering device’s information structure; it contains all parameters and data and has a unique logical name).
The introduction of REST API testing on the project allowed our team to check the operation of functions not available on the UI and conduct additional tests to check requests sent to metering devices.
This way, we ensured that microservices’ interactions with each other were stable and that data was received through microservices using HTTP methods.
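As an illustration of such a check, expressed here with Rest Assured (which we later adopted in the automated framework), the following sketch assumes a hypothetical base URI, endpoint path, and response fields rather than the product’s actual API:

```java
import static io.restassured.RestAssured.given;
import static org.hamcrest.Matchers.equalTo;
import static org.hamcrest.Matchers.notNullValue;

import org.junit.jupiter.api.Test;

class MeteringApiTest {

    // Hypothetical service address; the client's real endpoints are not disclosed in this case study.
    private static final String BASE_URI = "https://metering.example.com";

    @Test
    void readingsAreReturnedForExistingDevice() {
        given()
                .baseUri(BASE_URI)
                .auth().oauth2(System.getenv("API_TOKEN")) // JWT-based authorization is assumed
        .when()
                .get("/api/devices/{deviceId}/readings", "42")
        .then()
                .statusCode(200)
                .body("deviceId", equalTo("42"))
                .body("readings", notNullValue());
    }
}
```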
Our manual QA engineer not only tested the new microservice but also drew up a document describing the updated authorization service and created various types of authorization matrices depending on the user’s role.
Regression and smoke testing were carried out on the project as part of release activities. Together with the client, we agreed upon the following practices and implemented them during the testing process:
First, we decided to use the Grafana tool to check the functionality and stability of system operation. It allowed us to verify client connection processes, service creation requests, the receipt of readings, the display of data exchange dynamics, and the presence of errors in service operation.
Second, we used the pgAdmin tool to send queries to the databases. It simplified the testing of tasks and bug fixes, as well as the execution of test cases (e.g., checking the automatic deletion of messages).
All this helped ensure that, after a new version was released, the application’s main functionality worked according to the requirements and without side effects caused by recent changes. It allowed us to find bugs on various production installations before end users did.
The automated testing process was built from scratch in close cooperation with the client. At the initial stage, we created a list of critical checks to be covered, collected all the necessary information, developed a work plan, decomposed the tasks, and provided a work-hours estimate.
The core task was to develop a framework for automated testing of critical product functionality. We selected test cases to cover key aspects, including UI operation, downloading and verifying reports generated in Excel, and checking the operation of metering devices using the GuruxDirector tool.
In order to automate these test cases, JazzTeam experts developed dedicated clients – one for working with Excel files and another for simulating the GuruxDirector tool.
Development Approaches
The Excel client, developed on the basis of the Java Apache POI library, allows reading Microsoft Office documents, particularly MS Excel.
All requested reports in the client’s product are downloaded as a .zip archive. The Excel client unzips the archive and extracts the Excel file, creates Workbook and Sheet objects, and reads each row from each sheet, building a Map data structure (Map<Integer, List<String>> in Java). The data from this structure is then transferred to the desired report object and compared with the database. The check is considered successful if the data matches and there are no duplicate metering devices in the report.
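A condensed sketch of this flow might look as follows; the class and method names are ours for illustration, and the comparison with the database is left out:

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.zip.ZipEntry;
import java.util.zip.ZipInputStream;

import org.apache.poi.ss.usermodel.Cell;
import org.apache.poi.ss.usermodel.DataFormatter;
import org.apache.poi.ss.usermodel.Row;
import org.apache.poi.ss.usermodel.Sheet;
import org.apache.poi.ss.usermodel.Workbook;
import org.apache.poi.ss.usermodel.WorkbookFactory;

// Hypothetical helper illustrating the described flow: unzip the downloaded archive,
// open the Excel report with Apache POI, and collect every row into a Map.
class ExcelReportReader {

    Map<Integer, List<String>> readReport(Path zipArchive) throws IOException {
        Map<Integer, List<String>> rows = new LinkedHashMap<>();
        DataFormatter formatter = new DataFormatter();

        try (InputStream in = Files.newInputStream(zipArchive);
             ZipInputStream zip = new ZipInputStream(in)) {
            ZipEntry entry;
            while ((entry = zip.getNextEntry()) != null) {
                if (!entry.getName().endsWith(".xlsx")) {
                    continue; // skip anything in the archive that is not the Excel report
                }
                byte[] excelBytes = zip.readAllBytes(); // reads the current zip entry only
                try (Workbook workbook = WorkbookFactory.create(new ByteArrayInputStream(excelBytes))) {
                    int rowIndex = 0; // rows are numbered sequentially across sheets for simplicity
                    for (Sheet sheet : workbook) {
                        for (Row row : sheet) {
                            List<String> cells = new ArrayList<>();
                            for (Cell cell : row) {
                                cells.add(formatter.formatCellValue(cell));
                            }
                            rows.put(rowIndex++, cells);
                        }
                    }
                }
            }
        }
        return rows; // afterwards compared with the expected data from the database
    }
}
```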
The GuruxDLMS client is developed on the basis of the Gurux library. To obtain data, it goes through the following steps:
- Establishment of a connection with the metering device.
- Creation of an object for reading specific values from the metering device (e.g., time value, network frequency, phase voltage, daily readings log, etc.).
- Reading data from the metering device in accordance with the created object.
- Termination of connection with the metering device.
The Gurux client framework helped us automate a test case that uses multi-threading to check the simultaneous operation of multiple metering devices.
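A simplified sketch of this flow, with a hypothetical DlmsReader interface standing in for the Gurux-based client (the real Gurux API calls are omitted), shows how several meters can be polled in parallel; the clock OBIS code 0.0.1.0.0.255 and port 4059 are conventional DLMS values used here purely for illustration:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

// Hypothetical wrapper around the Gurux-based client; the method names are illustrative,
// not the actual Gurux DLMS/COSEM API.
interface DlmsReader extends AutoCloseable {
    void connect(String host, int port) throws Exception; // step 1: establish the connection
    String read(String logicalName) throws Exception;     // steps 2-3: create the object and read it
    @Override
    void close() throws Exception;                         // step 4: terminate the connection
}

class ConcurrentMeterCheck {

    interface DlmsReaderFactory {
        DlmsReader create();
    }

    /** Reads the clock object from several meters simultaneously, one thread per meter. */
    static List<String> readClockFromAllMeters(List<String> hosts, DlmsReaderFactory factory)
            throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(hosts.size());
        try {
            List<Callable<String>> tasks = new ArrayList<>();
            for (String host : hosts) {
                tasks.add(() -> {
                    try (DlmsReader reader = factory.create()) {
                        reader.connect(host, 4059);          // 4059 is the conventional DLMS/COSEM port
                        return reader.read("0.0.1.0.0.255"); // OBIS code of the clock object
                    }
                });
            }
            List<String> results = new ArrayList<>();
            for (Future<String> future : pool.invokeAll(tasks)) {
                results.add(future.get());                   // the check fails if any meter returned an error
            }
            return results;
        } finally {
            pool.shutdown();
        }
    }
}
```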
Another non-trivial yet interesting task in automated testing was developing a framework for monitoring the operation of the customer’s product using data collected by the Prometheus monitoring system. We used the Grafana tool, which allows the client to monitor data in real time and automatically detect critical alerts, displaying them on the system’s dashboard.
According to our client’s requirements, we needed to create autotests verifying that selected product metrics met certain conditions over a given period of time. To achieve this, our QA automation engineer studied the customer’s environment, the form in which the metric data is stored, and the most effective way to obtain time-based data for a specific metric.
The framework monitors product performance via REST API requests made with the Rest Assured library. A query is sent to the Prometheus time-series database to retrieve data on specific metrics over a given period. The test is considered passed if the metric values meet the required conditions.
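A hedged sketch of such a check, assuming a Prometheus instance at a hypothetical address, an illustrative metric name, and an illustrative threshold:

```java
import static io.restassured.RestAssured.given;
import static org.hamcrest.Matchers.equalTo;
import static org.junit.jupiter.api.Assertions.assertTrue;

import java.time.Instant;
import java.util.List;

import org.junit.jupiter.api.Test;

class PrometheusMetricTest {

    // Hypothetical Prometheus address, metric name, and threshold; the client's real metrics are not disclosed.
    private static final String PROMETHEUS = "https://prometheus.example.com";
    private static final String METRIC = "meter_readings_received_total";
    private static final double MIN_EXPECTED_VALUE = 1.0;

    @Test
    void metricStaysAboveThresholdForLastHour() {
        long end = Instant.now().getEpochSecond();
        long start = end - 3600; // the last hour

        // Prometheus HTTP API: /api/v1/query_range returns values as [timestamp, "value"] pairs
        List<List<Object>> points =
                given()
                        .baseUri(PROMETHEUS)
                        .queryParam("query", METRIC)
                        .queryParam("start", start)
                        .queryParam("end", end)
                        .queryParam("step", 60)
                .when()
                        .get("/api/v1/query_range")
                .then()
                        .statusCode(200)
                        .body("status", equalTo("success"))
                        .extract()
                        .jsonPath()
                        .getList("data.result[0].values");

        for (List<Object> point : points) {
            double value = Double.parseDouble(point.get(1).toString());
            assertTrue(value >= MIN_EXPECTED_VALUE,
                    "Metric " + METRIC + " dropped below the expected threshold");
        }
    }
}
```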
Testimonial

"Lartech Telecom expresses its gratitude to JazzTeam for the cooperation and recommends JazzTeam's engineers as highly qualified.
We would like to note that the team took a proactive approach and was results-oriented from the very first days of the project. The team's immersion in a new subject area was accomplished promptly and without any difficulties.
Agile practices, such as daily Scrum meetings, weekly sprints, and results presentations, were implemented in the project from the very beginning. This allowed us to keep all communications at a high level, focus on the outcome, respond in a timely way to challenges and changes in external circumstances, get over pitfalls, and adjust plans and processes.
All the work was completed in full compliance with the requirements and within the specified timeframe.
We wish JazzTeam success in its professional activities."
Result
By seamlessly integrating the testing process into the development workflow, we achieved system stability and predictability while ensuring consistent product quality and control.
A framework with automated tests developed by JazzTeam now helps our client monitor product performance and detect potential issues in the early stages.
All autotests run automatically on the TeamCity build server every night. In the morning, the client receives a detailed report on passed and failed test cases with the reasons for failures. This helps identify problems in the product’s critical functionality and fix them quickly.
Our team facilitated a seamless knowledge transfer among project participants as many different technologies and tools were used. All best practices are now collected in one place for our client’s engineering team.
As a result of our work, the client’s product maintains high quality and stable performance. The overall user experience of the solution improved, and user satisfaction metrics rose by 65%.
JazzTeam Achievement
JazzTeam experts organized and successfully implemented the process of manual and automated testing with all necessary technical documentation, including:
- A detailed test plan;
- Six test suites (containing approximately 600, 900, 150, 130, 200, and 50 test cases);
- An authorization matrix for different user roles.
We implemented several testing approaches on this project, including smoke testing on two production installations with different loads and metering device counts.
By achieving 95% test case coverage of GUI functionality, we ensured greater reliability of the user interface.
Several frameworks were written from scratch, such as one for black box auto testing and another to monitor the application’s operation using data collected by Prometheus.
Also, specifically for this project, we developed a custom client based on the Gurux library, covering 62.5% of the existing manual test cases. Additionally, we created a client for testing Excel report generation and validation.
Test cases that check the critical functionality of the UI component are now automated.
Our experts regularly created release notes and posted this information in the client’s Confluence, providing clear communication and transparency throughout the project. This approach kept the client informed and helped to build a strong and trustful relationship.
Technologies
Databases: PostgreSQL (pgAdmin).
Backend: Java 11, Swagger, Grafana, Apache POI, Log4j2, Lombok, Gson.
Test Automation: JUnit 5, Selenide, Rest Assured, Allure, Awaitility.
CI/CD and DevOps: Maven, TeamCity.
Languages, Protocols, APIs, Network Tools: DevTools, JWT, PuTTY, Gurux DLMS/COSEM Director.
Software Engineering and Management Tools: TestLink, Jira, Confluence, Git, IntelliJ IDEA.