This is a Data Platform Engineer role with ResMed based in Sydney, NSW, AU.

Role Seniority: mid-level, senior

More about the Data Platform Engineer role at ResMed

Global Technology Solutions (GTS) at ResMed is a division dedicated to creating innovative, scalable, and secure platforms and services for patients, providers, and people across ResMed. The primary goal of GTS is to accelerate well-being and growth by transforming the core, enabling patient, people, and partner outcomes, and building future-ready operations. The GTS strategy focuses on aligning goals and promoting collaboration across all organizational areas. This includes fostering shared ownership, developing flexible platforms that can easily scale to meet global demands, and implementing global standards for key processes to ensure efficiency and consistency.

Data Platform Engineer

We're looking for a hands-on Data Platform Engineer who thrives on turning messy, complex data into trusted, production-grade pipelines and platforms. You'll work across the full stack of data engineering: writing code, crafting SQL, building pipelines, automating releases, and ensuring everything runs reliably in AWS. You'll own what you build, and you'll help make data easier, faster, and safer for teams to use.
What you'll do

- Build, optimise, and operate reliable data pipelines and core platform capabilities
- Develop high-quality Python and SQL solutions for ingestion, transformation, and validation
- Enable safe, repeatable releases through CI/CD automation and strong engineering practices
- Collaborate through Git-based workflows (PRs, reviews, shared standards)
- Improve reliability with monitoring, alerting, and observability fundamentals
- Partner with stakeholders to deliver scalable, high-impact data solutions

What you'll bring

We're looking for candidates with experience as a Data Engineer, Platform Engineer, or Software Engineer, with hands-on delivery in production environments, including:

- Python and SQL for data pipeline development
- Snowflake or similar large-scale analytical platforms
- dbt or similar data transformation tools
- CI/CD tooling: GitHub Actions, Jenkins, or similar
- Infrastructure-as-code: Terraform, AWS CloudFormation
- Docker, with some exposure to Kubernetes or ECS
- AWS services: DMS, S3, Lambda, IAM, CloudWatch
- Git/GitHub: pull requests, code reviews, collaborative workflows
- Monitoring and observability fundamentals

Good to have

- Workflow orchestration: Dagster or Airflow
- Streaming / event-driven systems: Kafka or Kinesis
- Secrets management and cloud security best practices: AWS Secrets Manager, IAM least-privilege
- Log aggregation and observability platforms: Datadog or Grafana
- ML/AI workflow support or feature pipeline experience
- Experience in healthcare, regulated, or large-scale enterprise environments

ResMed is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees. Joining us is more than saying "yes" to making the world a healthier place. It's discovering a career that's challenging, supportive and inspiring, where a culture driven by excellence helps you not only meet your goals but also create new ones.
We focus on creating a diverse and inclusive culture, encouraging individual expression in the workplace, and we thrive on the innovative ideas this generates. If this sounds like the workplace for you, apply now! We commit to responding to every applicant.

No matter what you come in knowing, you'll be learning new things all the time, and the ResMed team will be there to support your growth. Please consider applying even if you don't meet 100% of what's outlined.

Key Responsibilities

- Building reliable data pipelines
- Developing high-quality solutions
- Collaborating through Git-based workflows

Key Strengths

- Python
- SQL
- CI/CD automation
- Workflow orchestration
- Streaming systems
- Monitoring platforms

A final note: this is a role with ResMed, not with Hatch.