Key Responsibilities

- Build powerful data pipelines using Azure Data Factory, pulling from on-prem systems (Authority), APIs (CAMMS), and flat files (CSV, JSON).
- Manage a modern data lake architecture (Bronze → Silver → Gold) with Azure Data Lake Gen2 and Delta Lake.
- Use Azure Databricks to turn raw data into clean, high-quality assets ready for reporting and advanced analytics.
- Develop fast, reliable data models in Databricks and Power BI that scale with business growth.
- Enable self-service analytics and reporting by delivering trusted, well-governed datasets.
- Drive DevOps practices: implement CI/CD pipelines, version control (Git), and automated deployments.
- Own and enforce data quality standards: accuracy, completeness, and consistency are your thing.
- Be the go-to data enabler for the broader Data & Insights team.
- Look after the Azure tech stack: optimise performance, reduce costs, and keep it humming.
- Support audit and compliance initiatives by remediating data issues with the Team Leader.

Required Experience

- A degree in Computer Science, Data Engineering, or a related field.
- 5 years of hands-on experience designing and building modern data platforms on Azure.
- Expertise with Azure Data Factory, Databricks, Delta Lake, and Data Lake Storage Gen2.
- Strong Python and SQL skills: you can code, transform, and optimise with ease.
- Experience working with on-prem databases, APIs, and a range of file formats (CSV, XLS, JSON).
- A solid understanding of CI/CD pipelines and DevOps in a data context.
- Familiarity with Azure Purview and Unity Catalog for data governance.
- A solutions-first mindset.

Why Join?

- The chance to work on end-to-end data solutions across a full Azure ecosystem.
- Autonomy to build smart systems and contribute your ideas.
- A supportive environment where innovation, quality, and data-driven thinking are the norm.
- Flexible working, friendly team culture, and a clear roadmap of exciting projects.