Data Engineer - GCP at Ingrity
Location: Sydney, NSW, AU
Seniority: Junior to Mid-Level

About the role
We are looking for a Junior to Mid-Level GCP Data Engineer to join our Data Engineering team. This is a fantastic opportunity to work with cloud technologies, contribute to building scalable data pipelines, and collaborate with both technical teams and senior stakeholders.

Key Responsibilities
- Assist in designing, developing, deploying, and maintaining data pipelines and ETL processes on Google Cloud Platform (GCP) using Cloud Composer (Apache Airflow), dbt, and Python.
- Support CI/CD pipelines (Buildkite preferred) to ensure reliable, automated code deployment.
- Manage Git repositories, including pull requests, and link changes to Jira service requests.
- Contribute to Slowly Changing Dimension (SCD) Type 2 processes for historical data updates.
- Schedule and orchestrate workloads to optimize performance, scalability, and reliability.
- Follow data governance and security best practices to ensure compliance and data integrity.
- Collaborate with analysts and data scientists to enable access to structured and unstructured data.
- Document changes clearly in Atlassian Confluence and present deployment updates to the Change Advisory Board (CAB) when required.

Required Experience and Skills
- 1–3 years of experience in data engineering, ETL, or cloud-based data platforms.
- Hands-on experience with GCP services such as Cloud Composer, BigQuery, and Pub/Sub.
- Familiarity with dbt (including Python models) for building ETL models and tests.
- Understanding of CI/CD processes, preferably using Buildkite or similar tools.
- Experience managing Git repositories and working with Jira to track changes.
- Knowledge of SCD Type 2 and basic data modeling concepts.
- Exposure to orchestration and scheduling tools (Apache Airflow preferred).
- Familiarity with other cloud platforms (AWS, Azure) is acceptable if you can apply similar concepts.
Soft Skills
- Comfortable communicating with senior leaders.
- Able to explain technical approaches to non-technical stakeholders.
- Confident summarizing changes for documentation and deployment purposes.
- Comfortable presenting technical changes to the CAB.

Preferred Qualifications
- Programming skills in Python, Java, or Scala.
- Familiarity with containerization technologies such as Docker or Kubernetes is a plus.
- Bachelor's degree in Computer Science, Engineering, or a related field.

If you're a Junior to Mid-Level GCP Data Engineer passionate about building data pipelines and contributing to data-driven decision-making, we'd love to hear from you. Please submit your resume highlighting your relevant experience and achievements.

No matter what you come in knowing, you'll be learning new things all the time, and the Ingrity team will be there to support your growth. Please consider applying even if you don't meet 100% of what's outlined above.

A Final Note: This is a role with Ingrity, not with Hatch.