- 6-month contract role with possible extension
- Fantastic pay rate: up to $1000/day + super
- Work location: Hybrid (WFH / Parramatta office)
- Great opportunity to work in a supportive work environment

Purpose of the Role

This role requires expertise in data ingestion, transformation, and processing using SSIS, Synapse Pipelines, and Spark, along with proficiency in Master Data Management. You will also be responsible for ensuring the data quality, performance, and scalability of data solutions, and for contributing to the development of Power BI reports within Microsoft Fabric. Our client is currently transitioning to a modern Lakehouse architecture using Synapse Analytics, while maintaining and enhancing their existing SQL-based data warehouse.

Key Responsibilities:
- Design, develop, and maintain ETL/ELT pipelines for ingesting data from various source systems into the SQL Dedicated Pool and Synapse Lakehouse platforms.
- Develop and optimise data transformation logic using T-SQL and Spark (Python) within Synapse Pipelines.
- Develop Spark notebooks for complex data transformations and processing within the Synapse environment.
- Implement data quality checks and validation rules to ensure data accuracy and consistency.
- Build and deploy Synapse Pipelines for orchestrating data ingestion, transformation, and loading processes in Azure Synapse Analytics / Microsoft Fabric.
- Collaborate with data architects to implement and maintain the data model and metadata management.
- Work with Power BI developers to provide curated, performant datasets for reporting and analytics within Microsoft Fabric; develop analytics reports if required.
- Understand and implement medallion architecture principles (Bronze, Silver, Gold layers) within the data platforms.
- Participate in code reviews and contribute to team best practices.
- Monitor and troubleshoot data pipeline performance and resolve data-related issues in non-production environments.
- Document data pipelines, data models, and other technical artefacts.
- Develop batch integration processes with Profisee for Master Data Management (MDM), including defining matching and survivorship rules for golden record creation when required.

Essential Criteria:
- 5 years of experience in data engineering or a similar role.
- Strong understanding of data warehousing concepts, dimensional modelling, and ETL/ELT principles.
- Expertise in the following is essential:
  - Master Data Management (MDM) concepts and tools, preferably Profisee.
  - T-SQL, with experience working with SQL Server and/or Azure Synapse Dedicated SQL Pools.
  - Azure Synapse Analytics, including Synapse Pipelines and Spark Pools.
  - Spark (Python) for data processing and transformation.
  - Familiarity with Microsoft Fabric components (Data Factory, Synapse Data Engineering, Power BI).
- Experience with Azure Data Lake Storage Gen2 and CI/CD pipelines for data engineering is preferred.
- Experience developing and deploying SSIS packages would be beneficial.

If this sounds like you, please submit your resume by clicking the 'Apply Now' button.

About Us

At easyA, we connect skilled professionals with opportunities that make an impact. As authorised suppliers to multiple government and corporate organisations across NSW, ACT, QLD, and the Federal Government, we specialise in providing expert talent for critical projects. When you work with easyA, you benefit from our strong relationships with contractors and clients alike, ensuring smooth and transparent recruitment processes tailored to your needs.