16 November 2022
An exciting opportunity for a technical Data Engineering role with a leading financial institution, supporting its data operations.
- Bachelor's degree, preferably in a related field (Finance, Statistics, Business Analysis, Computer Science);
- Minimum 3-5 years of Data Engineering experience, such as ETL, Data Modelling, Data Architecture and building data lakes;
- Good hands-on experience in SQL and Python programming;
- Good experience in data management processes and lifecycle;
- Good experience using containers for Data Engineering workloads;
- Hands-on experience on the AWS platform, including AWS Athena, Glue (PySpark), RDS PostgreSQL, S3 and Airflow (an illustrative sketch follows this list);
- Good exposure to ETL and Data Integration tools (e.g. SQL);
- Strong aptitude for numbers and comfortable handling large volumes of data;
- Meticulous and organised, with good interpersonal and communication skills;
- Experience in the financial industry.
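The requirements above ask for hands-on SQL, Python and AWS work. Purely as an illustration of that kind of task, the sketch below uses boto3 to run an Athena query over S3-backed data; the region, database, table and output bucket are hypothetical placeholders, not details from this posting.

```python
# Illustrative sketch only: query S3-backed data through Athena with boto3.
# All names (region, database, table, result bucket) are hypothetical.
import time

import boto3

athena = boto3.client("athena", region_name="ap-southeast-1")

# Submit a SQL query against a hypothetical Glue catalog table.
execution = athena.start_query_execution(
    QueryString="SELECT trade_date, COUNT(*) AS n FROM trades GROUP BY trade_date",
    QueryExecutionContext={"Database": "analytics_db"},
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
)
query_id = execution["QueryExecutionId"]

# Poll until the query finishes.
while True:
    status = athena.get_query_execution(QueryExecutionId=query_id)
    state = status["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(1)

# Print the result rows (header row included) if the query succeeded.
if state == "SUCCEEDED":
    results = athena.get_query_results(QueryExecutionId=query_id)
    for row in results["ResultSet"]["Rows"]:
        print([col.get("VarCharValue") for col in row["Data"]])
```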
Responsibilities:
- Design, build and operationalize enterprise data solutions and applications using AWS analytics services such as Spark/Python and Airflow/Lambda;
- Work closely with development, infrastructure and data centre teams to define CI/CD processes;
- Analyze, re-architect and re-platform the on-prem data warehouse into a data platform on AWS;
- Design and build production ETL data pipelines, from ingestion to consumption, on a big data architecture (see the illustrative Airflow sketch after this list);
- Design and implement data engineering, ingestion and curation functions on AWS cloud;
- Assess the current-state data platform and create a transition path to the AWS cloud.
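As a rough illustration of the ingestion-to-consumption pipeline work described above, a minimal Airflow DAG might be structured as follows; the DAG ID, task names and placeholder logic are hypothetical and not this institution's actual pipeline.

```python
# Illustrative sketch only: an ingest -> transform -> publish flow in Airflow.
# DAG ID, schedule, task names and all placeholder logic are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def ingest_to_s3():
    # Land raw source extracts in an S3 staging prefix (placeholder logic).
    print("Copy raw extracts to s3://example-raw-zone/trades/")


def transform_with_spark():
    # In practice this step might trigger an AWS Glue (PySpark) job or EMR step.
    print("Run PySpark transformation into the curated zone")


def publish_to_postgres():
    # Load curated data into RDS PostgreSQL for downstream consumption.
    print("Upsert curated tables into RDS PostgreSQL")


with DAG(
    dag_id="example_trades_etl",
    start_date=datetime(2022, 11, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    ingest = PythonOperator(task_id="ingest", python_callable=ingest_to_s3)
    transform = PythonOperator(task_id="transform", python_callable=transform_with_spark)
    publish = PythonOperator(task_id="publish", python_callable=publish_to_postgres)

    # Run the three stages strictly in sequence.
    ingest >> transform >> publish
```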
If you are interested in this role, click on the “Apply to this job” button below, or write in with your CV to Brandon Koh Kim Leong at firstname.lastname@example.org, quoting the job title.