Senior Data Engineer required to support the development of an Azure Databricks platform for a leading education business.

Responsibilities:
- Design and develop data pipelines using Databricks and the Lakehouse architecture
- Optimise and maintain data workflows, ensuring data quality and integrity
- Performance tuning and monitoring
- Set up and manage CI/CD pipelines
- Notebook development using Python and/or PySpark
- Adhere to best practices for data engineering, including governance and security

Requirements:
- Extensive experience building data pipelines in an Azure Databricks environment
- Well-versed in PySpark and SQL
- Experience with Delta Lake

Click on the 'Apply' button or email your resume to sahuja@fourquarters.com.au