- Design, build, and maintain scalable ELT pipelines in our modern data stack (Cloud Composer, Airflow, dbt, BigQuery, GitLab).
- Develop and maintain robust data models that power analytics, reporting, and product features, ensuring they follow best practices for performance and maintainability.
- Manage and optimise data infrastructure, including Cloud Composer environments and related GCP resources, in close collaboration with our infrastructure team.
- Implement orchestration, testing, and monitoring to ensure data quality, reliability, and traceability across the data warehouse.
- Troubleshoot pipeline and environment errors quickly and effectively.
- Collaborate closely with backend and frontend engineers (our backend is mostly in Java) to ensure seamless integration of data models and pipelines into the product.
- Contribute to architectural decision-making, e.g. by writing and reviewing RFCs, applying data engineering design patterns, and following architectural best practices.
- Promote documentation and knowledge sharing, ensuring our data systems and processes are transparent and accessible across the company.
- Mentor and support other engineers, fostering a culture of growth and continuous improvement within the team.