QA Engineer with ETL Experience (Part-time)
Erbis is a software development company with offices in the US, UK, Poland, and Ukraine. The majority of our customers come from the EU and the US. We undertake project implementation on our side or augment clients’ in-house teams. Our team currently consists of 100+ IT professionals with expertise across domains. We help enterprises and SMBs create software solutions that make the world a better place :)
Our Client:
A US-based, data-driven business that serves colleges and universities. They help educational institutions drive their success with data transparency, both across departments on campus and across the higher education marketplace. Our client leverages real-time, student record-level data to visualize opportunities and provide analytics that highlight areas for improvement.
Requirements:
At least 5 years of experience in a QA role
Strong understanding of ETL tools (e.g., Azure Data Factory, Informatica, Talend, SSIS) and DWH concepts
Proficiency in SQL for querying, data validation, and performance testing (a small example of such validation checks follows this list)
Python knowledge
Experience with Databricks
Experience with test automation frameworks (e.g., Selenium, JUnit) or custom automation for data workflows
Experience with Microsoft Azure
Knowledge of data governance and data quality best practices
Ability to document, execute, and maintain functional test cases and other test artefacts such as test data, data validation scripts, and harness scripts
Knowledgeable about requirements analysis (impact analysis, traceability matrix, functional & non-functional requirements)
Experience working with Azure DevOps
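For illustration, here is a minimal sketch of the kind of SQL data validation checks referenced above, written in Python. The table and column names (stg_orders, order_id, amount) are hypothetical, and the standard-library sqlite3 module stands in for the actual warehouse connection purely to keep the example self-contained; in practice the same queries would run against the client’s DWH.

```python
import sqlite3

def run_checks(conn: sqlite3.Connection) -> dict:
    """Run basic data validation queries and return a check-name -> failing-row-count map."""
    checks = {
        # The primary key must never be NULL after the load.
        "null_keys": "SELECT COUNT(*) FROM stg_orders WHERE order_id IS NULL",
        # The primary key must be unique.
        "duplicate_keys": (
            "SELECT COUNT(*) FROM ("
            " SELECT order_id FROM stg_orders GROUP BY order_id HAVING COUNT(*) > 1"
            ") AS dups"
        ),
        # Business rule: order amounts must be non-negative.
        "negative_amounts": "SELECT COUNT(*) FROM stg_orders WHERE amount < 0",
    }
    return {name: conn.execute(sql).fetchone()[0] for name, sql in checks.items()}

if __name__ == "__main__":
    # Tiny in-memory dataset with deliberate defects so each check fires once.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE stg_orders (order_id INTEGER, amount REAL)")
    conn.executemany(
        "INSERT INTO stg_orders VALUES (?, ?)",
        [(1, 10.0), (2, -5.0), (2, 3.0), (None, 7.0)],
    )
    for name, failures in run_checks(conn).items():
        status = "OK" if failures == 0 else f"{failures} failing row(s)"
        print(f"{name}: {status}")
```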
Responsibilities:
Collaborate with DWH/ETL developers to understand business requirements, technical specifications, and data transformation processes
Design, develop, and execute test strategies, test plans, and test cases for validating data integrity, transformation rules, and data quality
Perform ETL testing to ensure the accuracy, consistency, and reliability of data as it moves through ETL processes
Validate data migration, integration, and transformation processes within the DWH environment
Use SQL queries and Databricks notebooks to validate source-to-target data mapping and ensure data completeness, accuracy, and consistency (a notebook-style sketch follows this list)
Identify, track, and document defects and inconsistencies in DWH/ETL processes, and work with developers to resolve them
Develop and implement automated testing frameworks for continuous integration (CI) in data workflows
Monitor and assess data quality metrics, providing feedback and reports to key stakeholders
Conduct regression testing when new features are added or changes are made to the data pipelines
Collaborate with business analysts and stakeholders to ensure that data requirements are met
Provide recommendations for performance improvements and optimization of ETL processes
Stay updated with emerging trends in data quality assurance, data warehousing, and ETL technologies
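Purely as an illustration of the source-to-target checks mentioned above, here is a notebook-style PySpark sketch. It assumes a Databricks/PySpark environment and hypothetical table names (raw.orders as the source, dwh.fact_orders as the target); the actual comparison logic used on the project may differ.

```python
from pyspark.sql import SparkSession, DataFrame

def compare_source_to_target(source: DataFrame, target: DataFrame, key_columns: list) -> None:
    """Report row-count differences and rows that do not match between source and target."""
    src_count, tgt_count = source.count(), target.count()
    print(f"source rows: {src_count}, target rows: {tgt_count}, diff: {src_count - tgt_count}")

    # Rows present on one side but missing (or transformed differently) on the other.
    missing_in_target = source.exceptAll(target)
    extra_in_target = target.exceptAll(source)
    print(f"missing in target: {missing_in_target.count()}, "
          f"unexpected in target: {extra_in_target.count()}")

    # Spot-check a few mismatched keys to attach to the defect report.
    missing_in_target.select(*key_columns).show(10, truncate=False)

# In a Databricks notebook the `spark` session already exists; getOrCreate() keeps the
# sketch runnable elsewhere. Table and column names below are placeholders.
spark = SparkSession.builder.getOrCreate()
source_df = spark.table("raw.orders").select("order_id", "customer_id", "amount")
target_df = spark.table("dwh.fact_orders").select("order_id", "customer_id", "amount")
compare_source_to_target(source_df, target_df, key_columns=["order_id"])
```

Note that exceptAll is a multiset difference, so rows duplicated by a faulty load show up in the comparison as well as rows that were dropped or transformed incorrectly.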
Our perks and benefits:
100% remote work: choose your own working mode and working hours
Paid Vacation days: 24 working days
Paid Sick days: 15 working days
The opportunity to dedicate up to 10% of working hours to self-education and personal development
Option to cooperate through a B2B contract for flexibility and business autonomy