6-month contract role with possible extension up to 1-2 years
Attractive daily pay rate
Work Location: Hybrid (you can be based in Melbourne, Brisbane or Sydney)

Purpose of the Role
You will be responsible for ensuring fit-for-purpose data warehouse solutions are built for use across the Bank, utilising best-practice ETL standards to ensure the data is trusted by key business consumers.

Key Responsibilities:
- Design, build and manage data pipelines for data structures, encompassing data transformation, data models, schemas, metadata and workload management.
- Work with heterogeneous datasets to build and optimise data pipelines, pipeline architectures and integrated datasets using traditional data integration technologies, including ETL/ELT, data replication/CDC, message-oriented data movement, and API design and access, as well as emerging data ingestion and integration technologies such as stream data integration, CEP and data virtualisation.
- Develop according to business requirements and provide third-level operational support of the data warehouse and other related data analytics support systems in production and non-production environments as required.
- Work with solution designers/architects, analysts, project managers, business SMEs, vendors and contractors to design, build and deploy new data solutions or enhancements to existing data solutions.
- Interpret business requirements to determine the solution required to implement new functionality within the data warehouse.
- Actively contribute to optimising existing ETL processes, data integration and data preparation flows, and help move them into production.
- Transfer data warehouse knowledge to operational support teams and recommend tools and techniques that assist with troubleshooting, automation and remediation of operational issues and processes.
- Apply agile methodologies, DevOps techniques and, increasingly, DataOps principles to data pipelines to improve the communication, integration, reuse and automation of data flows between data managers and consumers across the organisation.

Essential Criteria:
- 3 to 5 years in a similar role is desirable.
- A bachelor's or master's degree in computer science, statistics, applied mathematics, data management, information systems, information science or a related quantitative field is required. Certifications in cloud and data technologies will be advantageous.
- 3 years of experience across the design, development and maintenance of complex data warehouses/data lakes.
- Demonstrated experience implementing big data solutions in Azure using Spark/Databricks is essential, along with proven experience in distributed technologies such as Spark, Azure Data Lake, ADF and Azure DevOps, and in building data solutions and data pipelines.
- Expertise in the configuration, integration and data aspects of banking applications is desirable.

If this sounds like you, please submit your resume by clicking the 'Apply Now' button.

About Us
At easyA, we connect skilled professionals with opportunities that make an impact. As authorised suppliers to multiple government and corporate organisations across NSW, ACT, QLD, and the Federal Government, we specialise in providing expert talent for critical projects. When you work with easyA, you benefit from our strong relationships with contractors and clients alike, ensuring smooth and transparent recruitment processes tailored to your needs.