Project summary
The system provides finely customizable monitoring of competitors’ stores and automatically optimizes product prices in real time according to configurable rules.
For instance, you can set a rule so that your store’s prices are always lower than your competitors’ by a certain amount, say $1 or 1%. The system will then monitor your competitors’ prices regularly and promptly undercut them on similar products.
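A minimal sketch of how such an underbid rule could be evaluated is shown below; the class and field names (UnderbidRule, OffsetType) are illustrative assumptions, not the project’s actual domain model.

```java
import java.math.BigDecimal;
import java.math.RoundingMode;

// Illustrative sketch only: names and structure are assumptions, not the project's real model.
public class UnderbidRule {

    public enum OffsetType { FIXED, PERCENT }

    private final OffsetType type;
    private final BigDecimal offset; // e.g. 1.00 for FIXED ($1), 1 for PERCENT (1%)

    public UnderbidRule(OffsetType type, BigDecimal offset) {
        this.type = type;
        this.offset = offset;
    }

    // Price to set, given the lowest competitor price the crawler has found.
    public BigDecimal apply(BigDecimal lowestCompetitorPrice) {
        if (type == OffsetType.FIXED) {
            return lowestCompetitorPrice.subtract(offset);
        }
        BigDecimal factor = BigDecimal.ONE.subtract(offset.movePointLeft(2)); // 1% -> 0.99
        return lowestCompetitorPrice.multiply(factor).setScale(2, RoundingMode.HALF_UP);
    }
}
```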
Competitors’ websites are scanned by a dedicated application (the crawler), which collects the necessary information and saves the data on every product it finds to the system database. The crawler can be run either on a schedule or on a client’s request, and it supports both full-store scans and scans of selected products.
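As a rough illustration, a scheduled full-store scan and an on-demand product scan could both be triggered from the web application as sketched below; CrawlerClient and the cron expression are assumptions, not the project’s actual interface.

```java
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Component;

// Hypothetical wrapper around the crawler's API; not the project's actual interface.
interface CrawlerClient {
    void scanAllStores();
    void scanProduct(long productId);
}

// Requires @EnableScheduling on a configuration class for the cron trigger to fire.
@Component
public class ScanScheduler {

    private final CrawlerClient crawlerClient;

    public ScanScheduler(CrawlerClient crawlerClient) {
        this.crawlerClient = crawlerClient;
    }

    // Scheduled run: full scan of every tracked competitor store each night at 03:00.
    @Scheduled(cron = "0 0 3 * * *")
    public void nightlyFullScan() {
        crawlerClient.scanAllStores();
    }

    // On-demand run: scan a single product when the client requests it from the web application.
    public void scanProduct(long productId) {
        crawlerClient.scanProduct(productId);
    }
}
```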
While scanning a marketplace (for example, Amazon, eBay, or PriceRunner), the system automatically finds competitors that sell products analogous to yours. It informs the user not only about existing competitors but also about newly discovered ones, after which the user can add a new competitor to the system and configure price rules for the affected products.
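In the simplest case, the matching could be based on a shared product identifier; the sketch below assumes EAN/UPC-style identifiers, while real marketplace matching is typically fuzzier.

```java
import java.util.HashSet;
import java.util.List;
import java.util.Set;
import java.util.stream.Collectors;

// Sketch of flagging a scanned store as a competitor when its catalog overlaps with the user's.
// Matching by shared identifiers (e.g. EAN/UPC) is an assumption made for illustration.
public class CompetitorDetector {

    public static boolean isCompetitor(List<String> ownProductIds, List<String> scannedProductIds) {
        Set<String> own = new HashSet<>(ownProductIds);
        return scannedProductIds.stream().anyMatch(own::contains);
    }

    // Products that both stores sell; candidates for price rules.
    public static List<String> overlappingProducts(List<String> ownProductIds, List<String> scannedProductIds) {
        Set<String> own = new HashSet<>(ownProductIds);
        return scannedProductIds.stream().filter(own::contains).collect(Collectors.toList());
    }
}
```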
The system provides charts and graphs that make it easy to analyze the current situation and to build a business development strategy based on the collected data and its history of changes.
System components
The system consists of two independent applications: a web application and a crawler.
The web application is the part the user interacts with: it lets the user set up a dashboard, add competitors, define price regulation rules for products, and control product scanning.
The backend is implemented in Java using Spring. MySQL serves as the database, and Apache Solr provides fast access to the data. The application is divided into modules that can be scaled out to separate servers; RabbitMQ is used for communication between the modules.
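The sketch below shows, with Spring AMQP, how one module could publish a scan request that another module consumes over RabbitMQ; the exchange, routing key, and queue names are made up for illustration.

```java
import org.springframework.amqp.rabbit.annotation.RabbitListener;
import org.springframework.amqp.rabbit.core.RabbitTemplate;
import org.springframework.stereotype.Service;

// Publishes a scan request from the web application module; names are illustrative only.
@Service
public class ScanRequestPublisher {

    private final RabbitTemplate rabbitTemplate;

    public ScanRequestPublisher(RabbitTemplate rabbitTemplate) {
        this.rabbitTemplate = rabbitTemplate;
    }

    public void requestScan(long storeId) {
        rabbitTemplate.convertAndSend("scan.exchange", "scan.request", String.valueOf(storeId));
    }
}

// A consumer in another module picks the message up from its bound queue.
@Service
class ScanRequestListener {

    @RabbitListener(queues = "scan.request.queue")
    public void onScanRequest(String storeId) {
        // Trigger the crawler for the given store (details omitted).
    }
}
```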
The frontend is developed with ExtJS and Bootstrap.
A KeyCloak server provides authentication and access control.
The crawler is a separate application that scans online stores. It is developed in Scala using the Apache Spark framework.
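A minimal sketch of distributing page downloads with Spark is shown below. It uses Spark’s Java API purely to keep the examples in a single language (the production crawler is written in Scala), and fetchPage is a placeholder for a real HTTP call.

```java
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

import java.util.Arrays;
import java.util.List;

// Sketch: fetch product pages in parallel with Spark, then hand them to downstream parsing.
public class CrawlerSketch {

    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("store-crawler-sketch").setMaster("local[*]");
        try (JavaSparkContext sc = new JavaSparkContext(conf)) {
            // Product page URLs would normally come from the store's sitemap or category pages.
            List<String> urls = Arrays.asList(
                    "https://example-store.com/product/1",
                    "https://example-store.com/product/2");

            // Distribute the downloads across the cluster.
            JavaRDD<String> pages = sc.parallelize(urls).map(CrawlerSketch::fetchPage);

            // Parsing and saving to the database are omitted here.
            System.out.println("Fetched " + pages.count() + " pages");
        }
    }

    private static String fetchPage(String url) {
        // Placeholder: a real crawler would issue an HTTP request and return the HTML body.
        return "<html>...</html>";
    }
}
```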
Parsing scripts are provided for stores built on the most common CMSs. These scripts analyze the store structure, extract product information, and save it to the database. For new and non-standard CMSs, individual parsing rules can be defined using a predefined template.
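Such an individual rule could, for example, boil down to a pair of CSS selectors applied to the product page; the sketch below uses jsoup as an assumed HTML parser, and the rule format is purely illustrative.

```java
import org.jsoup.Jsoup;
import org.jsoup.nodes.Document;

// Sketch of a selector-based parsing rule for a non-standard CMS. The rule format,
// field names, and the use of jsoup are assumptions made for illustration only.
public class SelectorParsingRule {

    private final String nameSelector;   // e.g. "h1.product-title"
    private final String priceSelector;  // e.g. "span.price"

    public SelectorParsingRule(String nameSelector, String priceSelector) {
        this.nameSelector = nameSelector;
        this.priceSelector = priceSelector;
    }

    // Extracts product name and price from a downloaded product page.
    // Null checks and price normalization are omitted for brevity.
    public ParsedProduct parse(String html) {
        Document doc = Jsoup.parse(html);
        String name = doc.selectFirst(nameSelector).text();
        String price = doc.selectFirst(priceSelector).text();
        return new ParsedProduct(name, price);
    }

    public static final class ParsedProduct {
        public final String name;
        public final String price;

        public ParsedProduct(String name, String price) {
            this.name = name;
            this.price = price;
        }
    }
}
```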
Our team was responsible for developing new high-load system components, refactoring legacy code, and covering the application with unit tests.
Technologies
Backend: Java EE, Hibernate, JPA, Spring, KeyCloak, MySQL, Solr, RabbitMQ, Jackson, Apache Tomcat.
Frontend: ExtJS, jQuery, JavaScript, Bootstrap.
Crawler: JavaScript, Spark, Scala.
Tools: Jira, HipChat, GitHub, Jenkins, Maven.
Screenshots
Project features
- Collaboration within a distributed, international team.
- Quick onboarding onto the project and short, fast development stages.
- Frequently changing functional requirements.
- Work on a product operating in a high-risk market.
- The need to dive into the specifics of how the major global trading platforms operate.
Project results
- The existing functionality was stabilized.
- New functionality was implemented.
- Database operations under high load were optimized.
The company’s achievements during the project:
- The project was successfully completed under dynamically changing requirements.
- We initiated and took part in developing the database architecture.