Benefits

- 12-month contract
- Competitive day rate
- Enterprise Data Platform: 3-year program of work

Responsibilities

- Implement data cleaning and transformation logic within Databricks using dbt Core, Spark SQL, and Python, following medallion architecture principles.
- Collaborate with data analysts, data scientists, data stewards, and data modellers to understand business needs and translate them into efficient technical data models.
- Develop and maintain data models in the gold layer using dbt Core to create consumable datasets for reporting and analysis.
- Collaborate with data analysts and business stakeholders to ensure the new platform meets their needs.
- Implement best practices for data security, governance, and scalability within the data pipelines.

Requirements

- Minimum 5 years of experience as a Data Engineer or in a similar role in a mature data warehouse environment.
- Proven commercial experience with cloud-based platforms, preferably Microsoft Azure and Databricks.
- Experience building and maintaining data pipelines using tools such as Azure Data Factory (ADF) and Fivetran.
- Proficiency in data modelling concepts and experience with dbt Core.
- Commercial experience working with Delta Lake on Databricks for reliable data storage and efficient data processing.
- Familiarity with medallion architecture for data warehousing.
- Excellent problem-solving and analytical skills.
- Effective communication and collaboration skills.
- Experience working with CI/CD pipelines for data engineering workloads.

If interested, please apply by uploading your CV and we will be in touch. Thanks.