
| Job Title: | Data Engineer |
|---|---|
| Company: | Interfront |
| Location: | Pretoria, Gauteng (Hybrid) |
| Position Type: | Contract |
| Contract Duration: | 6 – 9 months |
| Closing Date: | 12 November 2025 |
| Available Positions: | 2 |
| Work Level: | Mid-Level |
| Salary: | Market Related |
| Reference Number: | Recruit198-199 |
Interfront Data Engineer Contract Role – Pretoria, Gauteng
About the Role
Interfront is seeking a skilled Data Engineer for a hybrid contract position based in Pretoria. This role offers the opportunity to work on end-to-end data engineering projects, including designing ETL/ELT pipelines, managing feature stores for machine learning models, and building data warehouses to support analytics and business intelligence initiatives. The position is open to persons with disabilities.
Key Responsibilities
Data Engineering & Pipeline Management
- Design, build, and optimize T-SQL stored procedures, functions, and scripts for high-volume data processing.
- Develop, deploy, and monitor ETL/ELT workflows using SQL Server Agent, SSIS, Azure Data Factory, or Airflow.
- Perform data cleansing, transformation, and preparation for BI and machine learning workflows.
- Engineer reusable feature store tables per entity/tax type to support ML models and operational scoring.
- Model and maintain data warehouse structures (3NF, star, snowflake schemas) with proper documentation.
- Prepare curated and scored datasets for downstream consumption in Power BI dashboards and analytics platforms.
- Maintain audit, telemetry, and job tracking tables for reliability, restartability, and monitoring (see the sketch after this list).
- Support production pipelines and optimize query performance using indexing, tuning, and profiling.
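For illustration only, the workflow and job-tracking bullets above could look roughly like the following Python sketch. It assumes pyodbc against SQL Server; the connection string, the dbo.usp_load_sales procedure, and the dbo.etl_job_log table are hypothetical names introduced for the example, not part of this posting.

```python
# Minimal sketch of a restartable ETL step with job tracking (hypothetical names).
# Assumes: SQL Server reachable via pyodbc, a stored procedure dbo.usp_load_sales that
# returns a row count, and an audit table dbo.etl_job_log(job_name, run_date, status,
# rows_loaded, message).
import pyodbc
from datetime import date

CONN_STR = "DRIVER={ODBC Driver 17 for SQL Server};SERVER=myserver;DATABASE=dw;Trusted_Connection=yes"

def already_loaded(cursor, job_name: str, run_date: date) -> bool:
    """Check the job log so a rerun can skip work that already succeeded (restartability)."""
    cursor.execute(
        "SELECT COUNT(*) FROM dbo.etl_job_log WHERE job_name = ? AND run_date = ? AND status = 'SUCCESS'",
        job_name, run_date,
    )
    return cursor.fetchone()[0] > 0

def run_load(job_name: str, run_date: date) -> None:
    with pyodbc.connect(CONN_STR) as conn:
        cursor = conn.cursor()
        if already_loaded(cursor, job_name, run_date):
            return  # checkpoint hit: nothing to do
        try:
            # The stored procedure does the heavy T-SQL work and returns a row count.
            cursor.execute("EXEC dbo.usp_load_sales @run_date = ?", run_date)
            rows = cursor.fetchone()[0]
            cursor.execute(
                "INSERT INTO dbo.etl_job_log (job_name, run_date, status, rows_loaded, message) "
                "VALUES (?, ?, 'SUCCESS', ?, NULL)",
                job_name, run_date, rows,
            )
            conn.commit()
        except pyodbc.Error as exc:
            conn.rollback()
            cursor.execute(
                "INSERT INTO dbo.etl_job_log (job_name, run_date, status, rows_loaded, message) "
                "VALUES (?, ?, 'FAILED', 0, ?)",
                job_name, run_date, str(exc),
            )
            conn.commit()
            raise

if __name__ == "__main__":
    run_load("load_sales", date.today())
```

The checkpoint query is what makes the job restartable: a failed run date can be re-executed without duplicating a load that already succeeded.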
Data Quality, Governance, and Compliance
- Implement and monitor data validation, reconciliation, and QA frameworks (a reconciliation sketch follows this list).
- Enforce data security, privacy, and compliance standards according to corporate and regulatory guidelines.
- Support data governance initiatives, including lineage documentation and adherence to EDM policies.
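As a hedged illustration of the validation and reconciliation bullet above, the sketch below compares row counts and a summed measure between a staging table and a warehouse fact table. The tables stg.sales and dw.fact_sales, the connection string, and the chosen checks are hypothetical and would be adapted to the actual QA framework.

```python
# Hedged sketch of a source-vs-target reconciliation check (hypothetical table names).
# Compares row counts and a summed measure between a staging table and a warehouse fact table.
import pyodbc

CONN_STR = "DRIVER={ODBC Driver 17 for SQL Server};SERVER=myserver;DATABASE=dw;Trusted_Connection=yes"

CHECKS = {
    "row_count": ("SELECT COUNT(*) FROM stg.sales",
                  "SELECT COUNT(*) FROM dw.fact_sales"),
    "amount_total": ("SELECT COALESCE(SUM(amount), 0) FROM stg.sales",
                     "SELECT COALESCE(SUM(amount), 0) FROM dw.fact_sales"),
}

def reconcile() -> dict:
    """Run each paired query and report source/target values plus whether they match."""
    results = {}
    with pyodbc.connect(CONN_STR) as conn:
        cursor = conn.cursor()
        for name, (source_sql, target_sql) in CHECKS.items():
            cursor.execute(source_sql)
            source_value = cursor.fetchone()[0]
            cursor.execute(target_sql)
            target_value = cursor.fetchone()[0]
            results[name] = {
                "source": source_value,
                "target": target_value,
                "match": source_value == target_value,
            }
    return results

if __name__ == "__main__":
    for check, outcome in reconcile().items():
        status = "OK" if outcome["match"] else "MISMATCH"
        print(f"{check}: {status} (source={outcome['source']}, target={outcome['target']})")
```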
Collaboration and Cross-Functional Support
- Collaborate with data analysts, data scientists, software engineers, and business stakeholders to translate requirements into scalable solutions.
- Deliver accessible, well-documented datasets for reporting and analytics.
- Participate in all phases of the SDLC, including requirements gathering, design, development, testing, deployment, and maintenance.
Requirements
- Tertiary qualification in Computer Science, Information Systems, Data Engineering, Analytics, Mathematics, or Statistics, or a Matric with 6–8 years of relevant experience.
- Proven experience with SQL Server, including advanced T-SQL development, ETL/ELT workflow design, and performance tuning.
- Experience building and maintaining data warehouses, feature stores, and reusable data products.
- Hands-on experience with SQL Server Agent, SSIS, Azure Data Factory, or Airflow.
- Familiarity with cloud-based architectures, preferably Azure, and version control systems like Git.
- Exposure to BI tools such as Power BI for reporting and analytics enablement.
- Strong knowledge of data security, privacy, governance, and compliance standards.
- Programming skills in SQL; Python or R is a plus.
Key Skills and Competencies
- Advanced T-SQL and SQL Server development.
- ETL/ELT pipeline design and orchestration.
- Data modeling (3NF, star, snowflake schemas).
- Feature store development for ML and operational scoring (a sketch follows this list).
- Data validation, QA, and reconciliation.
- Data governance, lineage, and compliance adherence.
- Workflow orchestration with checkpoint/rollback and job tracking.
- Performance optimization of queries and pipelines.
- Collaboration and communication across technical and business teams.
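To give the feature-store competency some shape, here is a rough Python sketch that aggregates transaction history into one row per entity and tax type and writes it to a feature table. The source table dw.fact_declarations, the target table ml.feature_entity_tax, and the chosen aggregates are hypothetical illustrations, not a description of Interfront's systems.

```python
# Hedged sketch of building a per-entity feature table for scoring (hypothetical names).
# Aggregates transaction history into one row per entity/tax type and persists the result.
import pandas as pd
import pyodbc

CONN_STR = "DRIVER={ODBC Driver 17 for SQL Server};SERVER=myserver;DATABASE=dw;Trusted_Connection=yes"

def build_entity_features() -> pd.DataFrame:
    """One row per entity and tax type: simple aggregates a scoring model could consume."""
    with pyodbc.connect(CONN_STR) as conn:
        transactions = pd.read_sql(
            "SELECT entity_id, tax_type, amount, filing_date FROM dw.fact_declarations", conn
        )
    features = (
        transactions.groupby(["entity_id", "tax_type"])
        .agg(
            declaration_count=("amount", "size"),
            total_amount=("amount", "sum"),
            avg_amount=("amount", "mean"),
            last_filing_date=("filing_date", "max"),
        )
        .reset_index()
    )
    return features

def write_features(features: pd.DataFrame) -> None:
    """Persist the features; in practice the table would be versioned and documented."""
    rows = list(features.itertuples(index=False, name=None))
    with pyodbc.connect(CONN_STR) as conn:
        cursor = conn.cursor()
        cursor.executemany(
            "INSERT INTO ml.feature_entity_tax (entity_id, tax_type, declaration_count, "
            "total_amount, avg_amount, last_filing_date) VALUES (?, ?, ?, ?, ?, ?)",
            rows,
        )
        conn.commit()

if __name__ == "__main__":
    write_features(build_entity_features())
```

Keeping features in reusable, documented tables like this is what lets the same aggregates serve both ML models and operational scoring, per the responsibilities listed earlier.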
Why Join Interfront?
This contract role provides exposure to high-impact data engineering projects in enterprise environments, hybrid working flexibility, and collaboration with multi-disciplinary teams. The successful candidate will gain hands-on experience in advanced SQL, ETL pipelines, data warehouses, feature stores, and cloud-based solutions.
How to Apply
Submit your application through the Interfront recruitment portal, quoting reference number Recruit198-199, before the closing date of 12 November 2025. Ensure your CV highlights relevant SQL Server, ETL, data warehouse, and cloud experience.



