Senior DataOps Engineer - Kafka & Integration
Location: Sydney
Employment Type: Permanent - Hybrid

About the Role:
We're looking for a Senior DataOps Engineer to take ownership of Kafka-based distributed streaming platforms and messaging systems. In this role, you'll be responsible for the reliability, scalability, and governance of streaming infrastructure across cloud and on-prem environments. This is a great opportunity for someone with a strong background in DataOps, automation, and observability to play a key role in shaping data integration strategy.

What You'll Do:
- Deploy, configure, and manage Kafka-based streaming platforms in a hybrid cloud/on-prem environment.
- Oversee platform observability, monitoring, and performance tuning using tools such as Grafana, the ELK stack, and Prometheus.
- Manage distributed streaming, API gateways, and managed file transfer systems to ensure high availability and security.
- Define and implement best practices for platform onboarding, governance, automation, and maintenance.
- Work closely with Data & Integration teams to deliver robust streaming and integration solutions.
- Develop standard operating procedures (SOPs), resolve incidents, and analyse system performance.
- Automate infrastructure management and deployment processes to improve efficiency and reduce downtime.

What We're Looking For:
- Proven experience deploying and operating Kafka in production.
- Strong background in cloud services (AWS, Azure, or GCP) and automation.
- Hands-on experience managing messaging and integration platforms (e.g., TIBCO EMS) and frameworks such as Java Spring Boot.
- Expertise in monitoring, observability, and incident response.
- Strong problem-solving skills and experience delivering scalable, high-performance solutions.
- Financial services experience (preferred, but not required).

If you're interested, please send your CV to mattmlansonpartners.com and we can set up a chat.