Our client is seeking a Data Engineer to join their high-performing data team. The Data Engineer will play a crucial role in designing, developing, and maintaining data pipelines and infrastructure on Google Cloud Platform to support the organisation's data-driven initiatives.

ROLE:
- Design and develop scalable, reliable, and efficient data pipelines using GCP technologies such as BigQuery, Cloud Dataflow, Cloud Storage and Pub/Sub.
- Build, optimise and schedule data workflows to ensure data quality, integrity and security throughout the entire data lifecycle.
- Perform data modelling, schema design and performance tuning to support efficient data storage and retrieval.
- Collaborate with cross-functional teams, including data scientists, analysts, Tableau developers and other engineers, to understand data requirements and implement solutions that meet business needs.
- Stay up to date with the latest advancements in cloud tools and technologies, and recommend new tools and frameworks to enhance data engineering capabilities.

REQUIREMENTS:
- Bachelor's degree in computer science, data science or a related field.
- Solid understanding of data modelling, ETL/ELT processes and data integration techniques, including data engineering principles.
- Proficiency in SQL/BigQuery and Python, with experience writing efficient and scalable code.
- Exposure to GCP services such as BigQuery, Cloud Dataflow, Cloud Storage and Cloud Composer (or Apache Airflow and similar orchestration tools).
- Understanding of version control systems (e.g. Git) and CI/CD practices.

SALARY: $145k