The Role

An ASX-200 organisation is seeking an experienced Data Engineer to design, build and operate resilient, scalable data platforms and pipelines at enterprise scale. You will transform complex, high-volume data into trusted, high-quality datasets underpinning analytics, reporting and machine-learning initiatives. Working closely with data scientists, analysts, product and engineering teams, as well as senior stakeholders, you'll deliver production-grade data solutions that directly enable data-driven business outcomes.

Key Responsibilities

- Design, build and optimise scalable ETL/ELT pipelines across cloud data platforms
- Develop robust data models in data warehouses and data lakes (e.g. Databricks / Delta Lake)
- Build and automate ingestion using Microsoft Fabric (Dataflows Gen2, Power Query, low-code integrations)
- Integrate data from APIs, relational systems and ERP platforms (including SAP ECC6 / S/4HANA)
- Implement monitoring, alerting, data quality checks and cost/performance optimisation
- Contribute to data governance, lineage, documentation and platform standards
- Collaborate across analytics, AI/ML, product and engineering teams

Experience

- 5 years' experience in data engineering or similar roles
- Strong Python and SQL skills
- Hands-on experience with cloud data platforms (Azure, AWS or GCP)
- Experience with Databricks, data lakes and modern ELT tooling (e.g. dbt)
- Exposure to ERP / supply chain data (SAP ECC6, S/4HANA highly regarded)
- Solid engineering practices: Git, CI/CD, monitoring, documentation
- Ability to communicate clearly with both technical and non-technical stakeholders

What's on Offer

- North Sydney location with hybrid working (2-3 days WFH)
- Modern data stack with genuine scale and complexity
- Opportunity to influence and evolve a core enterprise data platform

Please apply now for interview slots this week.