About Dataro

Dataro is an ethically minded SaaS startup using machine learning to help not-for-profits raise more money and do more good. Our platform powers fundraising for organisations around the world, helping them run smarter campaigns and improve donor engagement using data-driven insights. If you want to build meaningful technology with real social impact - while working in a modern, supportive engineering culture - we'd love to meet you.

The Role

We're looking for a Data Engineer to help build and maintain the data pipelines that power Dataro's platform. Every day we ingest and process hundreds of millions of records from a wide range of fundraising and CRM systems. You'll work across our Python backend and AWS infrastructure to ensure data flows reliably from our customers' systems through to our predictive models and analytics products. This is a hands-on role where you'll build integrations, solve data problems, and grow your skills - while shipping meaningful improvements week to week.
What You'll Do

- Build, extend, and maintain data pipelines that process and normalise large volumes of records
- Develop and support integrations across fundraising, CRM, and adjacent platforms
- Troubleshoot and resolve data issues across ingestion, transformation, and delivery pipelines
- Write clean, well-tested code and participate in thoughtful code reviews
- Collaborate with data scientists and engineers to ensure Dataro's models run effectively
- Help shape engineering best practices

What You'll Bring

- 2 years of professional experience in data engineering or backend software engineering
- Strong Python and SQL skills, and comfort working with large datasets (PostgreSQL preferred)
- Experience building and supporting data pipelines, including deployment, monitoring, and troubleshooting
- Experience with AWS or an equivalent cloud platform
- Proven experience with Git, CI/CD pipelines, and automated testing
- Clear communication and the ability to work well across disciplines

Nice to Have (But Not Required)

No single person will have all of these - they're opportunities to grow.

- Experience with DuckDB, Athena, or similar analytical query engines
- Familiarity with serverless architectures (Lambda, Serverless Framework)
- Docker, containerisation, or infrastructure-as-code tooling
- Experience in the not-for-profit sector

Why You'll Love Working With Us

- Work on socially meaningful technology that directly helps charities raise more money
- A small but high-calibre engineering team - real autonomy, real ownership
- A modern data stack (Python, serverless AWS, Postgres, DuckDB, S3, Athena, etc.)
- We want smart engineers who understand how software works at a deep level - and who aren't afraid to use modern AI tools to ship better features faster
- Flexible working arrangements (WFH or office in Sydney)
- A supportive, transparent, mission-driven culture

If this sounds like you, we'd love to hear from you. Apply here.