10 January 2023
A great opportunity for Cloud Engineers to step into a hybrid Cloud and Data role, running an enterprise data lake that serves many users and is hosted across cloud and on-premises servers.
- Degree holder majoring in Business, Finance, Computer Science or equivalent;
- Experience and hands-on skills in Terraform, Jenkins, AWS CodePipeline, AWS CodeBuild, SQL and Python;
- Experience in cloud application architecture and administration in AWS/GCP;
- At least 5 years of work experience in large scale infrastructure, ETL deployment, managing pipelines;
- Team player, self-motivated and resourceful;
- Numerically inclined with a strong analytical mind;
- Basic experience in ETL design, implementation and maintenance;
- Prior experience in the Insurance domain;
- AWS certification.
- Collaborating with data engineering and machine learning teams to optimize the data infrastructure for better reliability, maintainability, and scalability;
- Using tools such as AWS Glue, Spark, Airflow, Tableau, and Power BI to design, build, maintain, and improve data infrastructure on the cloud;
- Developing solutions to enhance data delivery capabilities, data quality monitoring, and the data pipeline lifecycle;
- Administering cloud applications such as AWS Glue, SageMaker, Lake Formation, and an Iceberg lakehouse;
- Managing the regression testing suite and continuous integration and deployment pipelines.
If you are interested in this role, click the “Apply to this job” button below, or write in with your CV to Brandon Koh Kim Leong at firstname.lastname@example.org, quoting the job title.