This is an exciting opportunity for a Senior Data and Integration Engineer to contribute to the successful delivery of the Digital Transformation initiatives of a large organisation. This role involves configuring and developing integration solutions using MuleSoft, Airflow, and other technologies to enhance student support and administration.

Responsibilities:

System Configuration: Configure and customise connectors, workflows, and APIs within MuleSoft, Airflow, and Kafka to meet specific business needs.

API Development: Develop, test, and maintain APIs for seamless data exchange between various systems, including Snowflake, using MuleSoft, Airflow, AppFlow, Kafka, and similar tools for efficient API integration.

Data Warehousing: Use your expertise in Apache Airflow and Snowflake (preferable) to build scalable and reliable data pipelines, including using DBT to transform data in the Snowflake data warehouse.

Integration Support: Provide ongoing support for integration platforms such as Apache Airflow and MuleSoft (CloudHub and Flex Gateway) within IT operations.

Collaboration: Work with cross-functional IT support teams on incident resolution and troubleshooting.

User Support: Provide day-to-day support to business users and other system users, troubleshooting issues and providing solutions in a timely manner.

Performance Optimisation: Monitor and optimise the performance of integration solutions, ensuring high availability and scalability.

Requirement Analysis: Collaborate with business and IT stakeholders to gather and understand integration requirements and business needs, translating them into technical specifications.

Continuous Improvement: Stay up to date on the latest features, functionality, and best practices for MuleSoft, Airflow, AppFlow, and Kafka, and provide recommendations for system enhancements and innovations.

Project Delivery: Design and develop integration solutions to support the project.
Ensure efficient communication with project and business stakeholders.

Requirements:

At least 3 years of experience in integration development, with specific expertise in Apache Airflow, FastAPI, MuleSoft (CloudHub and Flex Gateway), Kafka, Kubernetes, container technology, AWS AppFlow, and CI/CD.

Proficiency in Python and SQL.

Strong understanding of integration patterns such as batch processing, event-driven, API, and pub/sub, along with knowledge of the advantages and limitations of different technologies in various contexts.

Solid understanding of data warehouse and data engineering technologies, including DBT and Snowflake.

Familiarity with the Salesforce platform and AWS services.

Strong communication and collaboration capabilities.

Experience with CI/CD pipelines using Git.

Relevant certifications in MuleSoft, Airflow, Snowflake, and AWS would be advantageous.

Apply for this role with your updated resume at the earliest opportunity. Please contact Urvi Thacker if you have any queries.