This is a Data Engineer - Clearing Risk Quantitative Development role with ASX, based in Sydney, NSW, AU.

Role Seniority: mid level, senior

More about the Data Engineer - Clearing Risk Quantitative Development role at ASX

ASX: Powering Australia's financial markets

Why join the ASX?

When you join ASX, you're joining a company with a strong purpose – to power a stronger economic future by enabling a fair and dynamic marketplace for all. In your new role, you'll be part of a leading global securities exchange with a strong brand. We are known for being a trusted market operator and an exciting data hub. Want to know why we are a great place to work? Click the link to learn more: www.asx.com.au/about/careers/a-great-place-to-work

We are more than a securities exchange! The ASX team brings together talented people from a diverse range of disciplines. We run critical market infrastructure, with 1 in 3 people employed within technology, yet we also have a uniquely varied mix of roles across disciplines such as operations, program delivery, financial products, investor engagement, risk and compliance.

We're proud to foster a workplace where diversity is celebrated and inclusion is part of our everyday culture. Our employee-led networks champion LGBTIQ inclusion, promote gender equality, accessibility and wellbeing, inspire giving and volunteering, and celebrate cultural and religious events, creating a sense of belonging for all. As an AWEI Bronze employer and a member of the Champions of Change Coalition for gender equality, we're committed to a fair and inclusive workplace where everyone can thrive.

Your Team

The Clearing Risk Quantitative Development Team is a small team of developers with a range of technical and quantitative skills. We are responsible for developing and maintaining software tools and applications to quantify, evaluate and monitor the clearing risks faced by ASX Central Counterparties. We work closely with Quantitative Analysts, Risk Control, Data Analytics and Model Validation to accurately capture and understand complex mathematical models, algorithms and reporting requirements.

Your responsibilities

- This is a hands-on role: design, build and maintain reliable data pipelines and platforms to support analytics and business use cases
- Synthesise and communicate complex requirements, including logic and data flows from quantitative models and production tools, and translate these into structured, testable, maintainable software solutions
- Contribute to maintenance and production support of existing models, tools and scripts
- Help drive uplift and continuous improvement of our processes, applications and deployment frameworks
- Work across the entire software development lifecycle, from gathering requirements to planning, coding, testing, deployment and maintenance
- Apply a continuous improvement mindset to standards, methods and processes

Your experience and qualifications

Must have

- Extensive experience working with a variety of AWS services for ingestion, curation, storage, orchestration, data management, analytics and distribution
- Good knowledge of data modelling and best practices in structuring data within a data warehousing environment
- Experience with data governance and management standards
- Experience with real-time streaming services such as Confluent Kafka and AWS
- Demonstrated experience developing or supporting a modern (preferably AWS) data ecosystem, for example Airflow, AWS Glue, Apache Iceberg, Athena, Redshift and DataZone
- Experience in .NET/C#, with a strong grasp of software design patterns, scalable architectures and engineering best practices
- Ability to contribute to DevSecOps practices to achieve high levels of automation across complex data pipelines
- Good understanding of vanilla financial market products (exchange-traded options, futures, equities and OTC swaps), including trade lifecycle, pricing and risk management concepts
- Demonstrated ability to produce clear, accurate and maintainable technical documentation
- Strong collaboration skills, with a proven ability to work effectively in cross-functional, team-based environments

Nice to have

- Tertiary qualifications in a quantitative discipline such as informatics, mathematics, physics, engineering or quantitative finance
- Experience with infrastructure-as-code tools such as Terraform to support cloud-native deployments
- Experience designing and implementing real-time data streaming solutions such as Confluent Kafka and AWS-based services

We make hiring decisions based on your skills, capabilities and experience, and how you'll help us to live our values. We encourage you to apply even if you don't meet all the criteria of this role. If you need any adjustments during the application or interview process to help you present your best self, please let us know at careers@asx.com.au.

At ASX Group, our diverse workforce is essential to build and maintain a fair and dynamic marketplace. We support flexible working and offer hybrid working options. Even if our roles are advertised as full-time, we encourage you to apply if you are interested in part-time or other flexible working arrangements.

We will arrange for successful candidates to have background checks, including reference and police checks, completed as part of the onboarding process. To be considered for this position, candidates must be legally authorised to work in Australia on a permanent basis without any restrictions.

Before we jump into the responsibilities of the role: no matter what you come in knowing, you'll be learning new things all the time, and the ASX team will be there to support your growth. Please consider applying even if you don't meet 100% of what's outlined.

Key Responsibilities

- Designing and maintaining data pipelines
- Synthesising requirements
- Continuous improvement

Key Strengths

- ☁️ AWS services
- Data modelling
- Collaboration skills
- Tertiary qualifications
- Infrastructure-as-code tools
- Real-time data streaming solutions

A Final Note: This is a role with ASX, not with Hatch.