
Lead Data Quality Engineer Minsk, Belarus

Lead Data Quality Engineer Description

Job #: 46058
EPAM is committed to providing our global team of 36,700+ EPAMers with inspiring careers from day one. EPAMers lead with passion and honesty and think creatively. Our people are the source of our success and we value collaboration, try to always understand our customers’ business, and strive for the highest standards of excellence. In today’s new market conditions we continue to support operations for hundreds of clients around the world remotely, with the vast majority of our teams working from home. No matter where you are located, you’ll join a dedicated, diverse community that will help you discover your fullest potential.


Currently we are looking for a Lead Data Quality Engineer for our Minsk office to make our team even stronger.


Our client's expertise and experience in marketing analytics, packaging, supply chain management and logistics help customers optimize their marketing activity and address supply chain challenges big and small. We offer data-based insights and services that combine a strategic, long-term view with an unmatched commitment to execution.

You will:
• Work directly with the customer, developing the test plan and test strategy to ensure delivery quality
• Partner with BAs to understand business case scenarios and how they are reflected in the incoming data files/fields, then derive test cases and prioritize them to fit the timeline

You will analyze and cover testing needs in 2 teams:

• Inventory: a new network-wide inventory solution. Key functionality: calculates days of supply from forecasted customer demand; provides visibility into detailed historical, current and forward inventory positions for each product at every location; generates near-real-time alerts (minimum allowed shelf life remaining, current and predicted safety stock, etc.). This inventory solution will be offered as a multi-tenant, flexible SaaS plan with no upfront technology licensing costs.
Technology scope:
Scala/Akka policy engine, Spark Structured Streaming, Kafka
Python, Azure Data Factory and general ETL and data processing
Terraform, Azure infrastructure automation scripting, Docker and Kubernetes

• Promotions: the team builds its part of the system from scratch, integrating with new and existing systems. The main goal of the project is to help users plan the delivery of products for the duration of promotions and events. The tool allows the user to create an advertising event, build a graphic model, and project the impact of all event parameters on the potential level of demand for the advertised goods.
Technology scope:
NodeJS + Koa 2, React + Redux, PostgreSQL + Sequelize, TypeScript


Requirements

  • 3–5 years of experience
  • Proficiency in SQL (able to develop queries for validation)
  • Experience in end-to-end and regression testing
  • Experience designing, executing, or documenting test plans
  • Well-versed in the concepts of data ingestion, data cleansing, data transformation, data warehousing, and data aggregation/visualization
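As an illustration only (not part of the role description), the kind of validation query referenced above typically cross-checks a loaded target table against its source. This minimal sketch uses Python's built-in sqlite3 with hypothetical `src_orders`/`tgt_orders` tables; in this role the same checks would run against the project's actual data stores:

```python
import sqlite3

# In-memory database with hypothetical source/target tables for illustration
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_orders (order_id INTEGER, qty INTEGER);
    CREATE TABLE tgt_orders (order_id INTEGER, qty INTEGER);
    INSERT INTO src_orders VALUES (1, 10), (2, 5), (3, NULL);
    INSERT INTO tgt_orders VALUES (1, 10), (2, 5), (3, NULL);
""")

# Check 1: row counts must match after the load
src_count = conn.execute("SELECT COUNT(*) FROM src_orders").fetchone()[0]
tgt_count = conn.execute("SELECT COUNT(*) FROM tgt_orders").fetchone()[0]
assert src_count == tgt_count, f"Row count mismatch: {src_count} vs {tgt_count}"

# Check 2: no source rows missing from the target (anti-join)
missing = conn.execute("""
    SELECT COUNT(*) FROM src_orders s
    LEFT JOIN tgt_orders t ON s.order_id = t.order_id
    WHERE t.order_id IS NULL
""").fetchone()[0]
assert missing == 0, f"{missing} source rows missing from target"

print("validation passed")
```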

Nice to have

  • Experience in applying automation to data projects
  • Experience reading and writing stored procedures for data projects
  • Scripting skills such as Python
  • Experience with the Azure ecosystem

We offer

  • Experience exchange with colleagues all around the world
  • Competitive compensation depending on experience and skills
  • Regular assessments and salary reviews
  • Social package: medical care, sports, family care
  • Free English classes
  • Opportunities for self-realization
  • Friendly team and enjoyable working environment
  • Flexible working schedule
  • Corporate and social events