About the role
You will work as a key member of a collaborative data engineering team, responsible for building, maintaining, and uplifting core GCP data platforms. You will take ownership of defined deliverables, contribute to reliable and scalable solutions, and help drive best practices across documentation, observability, and deployment. You will work closely with engineers, vendor partners, and business stakeholders to ensure data enhancements and platform improvements are delivered safely, effectively, and on schedule.

Key responsibilities
• Deliver CI enhancements and resolve problem tickets, with accountability for quality outcomes
• Coordinate daily run tasks and collaborate with onshore and offshore partners to support business-critical data services
• Support and improve production delivery pipelines, ensuring changes are released safely and align with operational requirements
• Champion documentation standards and contribute to reusable guidance that supports long-term platform resilience
• Participate in cross-training and knowledge sharing to build collective capability and reduce dependency on individuals
• Improve observability through Cloud Logging, Cloud Monitoring, and alerting for production retail loads, stock feeds, and pricing updates
• Communicate technical concepts clearly to non-technical stakeholders to support informed decision making

Technical skills
Core platform: GCP
• BigQuery: writing and tuning queries to support data warehouse solutions
• Cloud Functions, Cloud Run, and Cloud Storage for scalable serverless workloads
• Airflow / Cloud Composer: building and maintaining orchestration pipelines (DAGs)
• Python: writing readable ETL scripts and integrations with GCP services
• Strong SQL capability with attention to performance and data integrity
• Experience with CI/CD practices using GitHub Actions or Azure DevOps
• Understanding of cloud integration patterns and production support principles

What you will bring
• Four or more years' experience as a Data Engineer delivering data warehouse and integration solutions
• Strong GCP knowledge, specifically BigQuery, Cloud Functions, Cloud Storage, Dataflow, and Composer
• Solid understanding of data modelling, data structures, and system design for reliable data delivery
• Experience working in structured IT delivery environments, managing tasks and prioritising effectively
• Proven communication skills, engaging multiple stakeholders across technology and business teams
• Ability to capture requirements, articulate the business outcome, and deliver to an agreed definition of done
• Commitment to continuous improvement, with awareness of emerging cloud and data engineering practices
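For candidates gauging the expected level, "Python, writing readable ETL scripts" combined with "pricing updates" points at transformation code of roughly the following shape. This is only an illustrative sketch: the field names (`sku`, `price`) and cleaning rules are assumptions for the example, not part of the role description.

```python
from decimal import Decimal, InvalidOperation
from typing import Iterable, Iterator

def transform_pricing_rows(rows: Iterable[dict]) -> Iterator[dict]:
    """Clean raw pricing-update records before loading them to a warehouse table.

    Drops rows with a missing SKU or an unparsable price; normalises the
    SKU to upper case and the price to a Decimal with two places.
    (Field names and rules are hypothetical, for illustration only.)
    """
    for row in rows:
        sku = (row.get("sku") or "").strip().upper()
        if not sku:
            continue  # reject rows without a product identifier
        try:
            price = Decimal(str(row.get("price"))).quantize(Decimal("0.01"))
        except (InvalidOperation, TypeError):
            continue  # reject rows with malformed prices
        yield {"sku": sku, "price": price}
```

In practice a function like this would sit inside an Airflow/Composer task or Cloud Function, reading from Cloud Storage and loading to BigQuery; the readability and explicit error handling shown here are the qualities the role calls out.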