Intermediate Data Engineer at Ignition Group

Ignition Group
Full-time
On-site
Description

Data Pipeline Development and Optimisation


Build, maintain, and optimise robust, production-grade data pipelines using Snowflake and modern orchestration tools.
Implement data transformations, ingestion, and integration across multiple sources (CRM, WFM, Telephony, Digital).
Apply Snowflake best practices for warehouse design, schema structuring, and performance tuning.


Automation and Orchestration


Integrate Cortex Agents or similar automation tools to streamline data workflows.
Implement and maintain ETL/ELT orchestration using tools such as Airflow or dbt.
Implement automated workflows, data tests, and monitoring.
Automate repeatable processes to enhance reliability and scalability.


Data Quality, Lineage, and Governance


Apply data quality checks, lineage tracing, and validation rules to ensure data accuracy and consistency.
Document data processes, pipelines, and source-to-target mappings for transparency and maintenance.
Implement role-based access control (RBAC) and lineage documentation.
Support adherence to organisational data governance and compliance standards.


Data Modelling and Analytics Enablement


Design and maintain analytical data models and structures for business reporting and dashboards.
Work closely with analysts and business partners to translate KPIs into well-defined data structures.
Ensure data is structured to support contact centre KPIs such as AHT, FCR, NPS, occupancy, and SLA.


Collaboration and Technical Contribution


Collaborate cross-functionally with Senior Data Engineers and Data Architects to improve performance and reliability.
Participate in peer reviews, design sessions, and technical documentation activities.
Contribute to continuous improvement initiatives and knowledge sharing across the team.


Continuous Learning


Stay current with advancements in data engineering, automation, and cloud-based data technologies.
Proactively learn and apply new frameworks and performance optimisation techniques.


Requirements


Bachelor's degree in Computer Science, Information Systems, or Data Science.
Certifications in Snowflake, Python, or data engineering technologies advantageous.


Experience


3-5 years of professional experience in data engineering.
Proven track record of delivering and maintaining production-grade data pipelines.
Demonstrated ability to optimise data processes and integrate automation.