Senior Data Engineer – Data Operations

Overview:
An esteemed organization is seeking a highly skilled Senior Data Engineer to join its Data Operations Team. This pivotal role involves designing, building, and maintaining scalable, cost-effective data pipelines and environments within a cloud-based Enterprise Data Lake (AWS). The position ensures seamless data integration and flow across various data platforms, supporting critical data operations and advanced analytics initiatives.

Key Responsibilities:
- Design, develop, and maintain robust data pipelines within the Enterprise Data Lake (AWS).
- Ensure seamless integration and data flow between data lakes, warehouses (both legacy and new), and other platforms.
- Support data ingestion, transformation, governance, and automation while ensuring high data quality, security, and performance.
- Collaborate with Data Science teams to support advanced analytics and machine learning projects.
- Implement best practices for data governance, security, and compliance, including access control, encryption, and privacy considerations.

Contract Details:
- Duration: Initial six-month engagement, with potential for extension.
- Reporting Line: Reports directly to the Director of Data Operations.
- Team Structure: No direct reports.

Qualifications:
- AWS Certified Data Engineer – Associate (preferred).
- Additional AWS certifications such as Developer – Associate, Solutions Architect – Associate, or SysOps Administrator – Associate are advantageous.

Experience and Expertise:
- Minimum of 5 years of data engineering experience, with a strong focus on AWS.
- Proficiency with AWS services, including S3, Glue, Lambda, Step Functions, and Athena.
- Advanced Python and SQL skills for data processing and transformation.
- Hands-on experience with AWS CDK for infrastructure as code and automation.
- Proven experience in data governance, security, and compliance.
- Familiarity with working in cross-functional development teams within large, complex environments.
Key Skills and Knowledge:
- Strong problem-solving and debugging skills, particularly in data pipeline performance optimization.
- Experience with DevOps tools such as GitHub Actions for CI/CD automation.
- Ability to communicate complex data engineering concepts clearly to stakeholders.
- Strong collaboration skills, with the ability to work both independently and within a team.
- Proficiency with data formats including YAML, JSON, Parquet, and XML.
- Knowledge of the Workday SOAP API is a plus.
- Experience with dbt and Snowflake is highly desirable.

Work Environment:
The Data Operations Team works in an agile environment, managing multiple projects simultaneously. The team values collaboration and effective communication to achieve project goals. Flexible working arrangements include designated in-office days and opportunities for remote work.

Why Apply?
This role offers the chance to contribute to impactful data projects within a highly collaborative environment. The successful candidate will have the opportunity to leverage cutting-edge cloud technologies, support advanced analytics initiatives, and shape data operations best practices.