Duration:
12 months to start
Job Description
- The ETL and Data Warehouse Test Engineer is responsible for designing, developing, and executing test plans and test cases to ensure the quality and accuracy of data flows and data transformations in our data warehouse environment.
- This role focuses on validating ETL processes, data integrity, and data warehouse components using both automated and manual testing techniques.
- The ideal candidate has hands-on experience with the Azure and Snowflake platforms and Talend, and a strong understanding of regression testing to support continuous integration and deployment in a cloud-based environment.
- Test Planning and Strategy: Develop and execute comprehensive test plans, test cases, and test scripts for ETL processes, data warehouse structures, and data integrations.
- ETL and Data Warehouse Testing: Conduct end-to-end testing of ETL processes to ensure data accuracy, completeness, and correct transformations in line with business requirements.
- Regression Testing: Design and perform regression testing to validate existing functionality and data pipelines, ensuring the stability of the data environment across updates.
- Platform Expertise: Use Azure Data Services (Azure Data Factory, Azure Synapse Analytics, etc.) and the Snowflake platform to validate data pipelines, transformations, and warehousing solutions.
- Data Management Lifecycle: Perform tasks spanning the full lifecycle of data management, including but not limited to defining completeness, accuracy, and consistency specifications by data element; writing scripts and developing tools to monitor quality; and defining and implementing controls for key data quality measures.
- Automation and Scripting: Create and maintain automated test scripts and frameworks for ETL processes and data validation, leveraging tools such as Talend for ETL job automation.
- Data Quality and Integrity Checks: Perform data quality, integrity, and accuracy checks, identifying and troubleshooting data discrepancies and implementing corrective actions (an illustrative sketch of such checks appears at the end of this description).
- Collaboration: Work closely with data engineers, data architects, and business analysts to understand requirements, data flows, and business logic, ensuring thorough testing coverage.
- Reporting and Documentation: Document and report defects, issues, and enhancements; track progress on defect resolution and update stakeholders on testing status.
- Strong understanding of and experience with development activities across all aspects of the Software Development Life Cycle (SDLC) and Data Vault methodologies
- Excellent problem-solving and critical thinking skills
- Effective communication skills, with the ability to explain technical concepts to developers, product managers, and business partners
- Knowledge of and experience with database design principles, including referential integrity, normalization, and indexing, to support application development
- Bachelor's or master's degree in a quantitative field such as Computer Science and Information Systems, Database Management, Big Data, Data Engineering, Data Science, Applied Math, etc.
- 5+ years of professional data test engineering experience.
- 3+ years of experience working with large datasets and a variety of data sources.
- Experience with the Data Vault 2.0 methodology
- Experience working in virtualized cloud environments, including cloud-based IaaS/SaaS/PaaS solutions
- Experience with Power BI and SQL
- Must be able to remain in a stationary position for a majority of the workday.
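For illustration only, below is a minimal sketch of the kind of source-to-target validation queries the data quality and integrity checks above involve. The table names (stg.customer_src, dw.dim_customer) and the customer_id business key are hypothetical placeholders; real checks would be derived from the project's mapping specifications and data quality rules.

    -- Completeness: reconcile source and target row counts (hypothetical tables)
    SELECT
        (SELECT COUNT(*) FROM stg.customer_src) AS source_rows,
        (SELECT COUNT(*) FROM dw.dim_customer)  AS target_rows;

    -- Integrity: business keys present in the warehouse but missing from the source
    SELECT t.customer_id
    FROM dw.dim_customer AS t
    LEFT JOIN stg.customer_src AS s
      ON s.customer_id = t.customer_id
    WHERE s.customer_id IS NULL;

    -- Accuracy: business keys duplicated in the target dimension
    SELECT customer_id, COUNT(*) AS occurrences
    FROM dw.dim_customer
    GROUP BY customer_id
    HAVING COUNT(*) > 1;

In practice, checks like these would typically be parameterized and folded into an automated regression suite rather than run by hand.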