9 January 2023
Looking for a highly technical backend Data Engineer with several years of experience in AWS and its services to build hybrid cloud data solutions.
Requirements:
- Bachelor’s degree, preferably in a related field (e.g. Computer Science, Information Technology);
- Possess 3-5 years of experience in Data Engineering, including ETL, data modelling, data architecture and building data lakes;
- Good hands-on experience in SQL and Python programming;
- Good experience with data management processes and the data lifecycle;
- Good experience using containers for Data Engineering workloads;
- Hands-on experience with AWS platform services, including Athena, Glue (PySpark), RDS for PostgreSQL, S3 and Airflow;
- Good exposure to ETL and data integration tools (e.g. SQL-based tooling);
- Strong aptitude for numbers and comfortable handling large volumes of data;
- Meticulous and organised with good interpersonal and communication skills.
Good to have:
- Experience in financial industry;
- Experience with an AWS hybrid data lake.
Responsibilities:
- Design and build production ETL data pipelines, from ingestion to consumption, within a big data architecture;
- Design and implement data engineering, ingestion and curation functions on AWS cloud;
- Analyse, re-architect and re-platform the on-premises data warehouse to a hybrid data platform on AWS;
- Design, build and operationalise enterprise data solutions and applications using AWS analytics services and tools such as Spark/Python and Airflow/Lambda;
- Assess the current-state data platform and create a transition path to the AWS cloud;
- Work closely with development, infrastructure and data centre teams to define CI/CD processes;
- Engage with stakeholders to maintain standards.
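For candidates gauging what "ingestion to consumption" means in practice, below is a toy, standard-library-only sketch of the extract-transform-load pattern this role centres on. All dataset, column and table names are invented for illustration; in the actual role the equivalent steps would run across S3 (ingest), Glue PySpark (curate) and Athena (consume), orchestrated by Airflow.

```python
# Toy end-to-end ETL sketch: ingest -> curate -> consume.
# Stand-ins: CSV string for S3, pure Python for Glue PySpark,
# in-memory SQLite for Athena. Names are hypothetical.
import csv
import io
import sqlite3

RAW_CSV = """trade_id,amount_sgd
1,1200.50
2,880.00
3,310.25
"""

def extract(raw):
    """Ingest: parse raw CSV rows (stand-in for reading from S3)."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows):
    """Curate: cast types and keep only trades at or above a threshold."""
    typed = [(int(r["trade_id"]), float(r["amount_sgd"])) for r in rows]
    return [t for t in typed if t[1] >= 500.0]

def load(rows):
    """Consume: load curated rows into a queryable table (stand-in for Athena)."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE curated_trades (trade_id INTEGER, amount_sgd REAL)")
    conn.executemany("INSERT INTO curated_trades VALUES (?, ?)", rows)
    return conn

conn = load(transform(extract(RAW_CSV)))
total = conn.execute("SELECT SUM(amount_sgd) FROM curated_trades").fetchone()[0]
print(total)  # 2080.5 (trade 3 filtered out by the threshold)
```

The three functions deliberately mirror the pipeline stages named in the responsibilities above, so each stage can be tested and swapped out independently.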
If you are interested in this role, click the “Apply to this job” button below, or write in with your CV to Brandon Koh Kim Leong at firstname.lastname@example.org, quoting the job title.