10 August 2022
An exciting opportunity to take on a technical Data Engineering role with a leading financial institution, supporting its data operations.
Requirements:
- Bachelor's degree, preferably in a relevant field (Finance, Statistics, Business Analysis, Computer Science);
- Minimum 3-5 years of Data Engineering experience;
- Good hands-on experience in SQL programming;
- Good experience with data management processes and the data lifecycle;
- Good experience using containers for Data Engineering workloads;
- Strong experience in Big Data technologies such as Hadoop, Spark, BigQuery, Redshift, Presto, or KSQL;
- Good experience with Cloud technologies;
- Good exposure to ETL and Data Integration tools (e.g. SQL);
- Strong aptitude for numbers and comfortable handling large volumes of data;
- Meticulous and organised with good interpersonal and communication skills.
- Experience in the financial industry.
Responsibilities:
- Responsible for developing contextual data sets using SQL for Enterprise Analytics;
- Responsible for building and managing sub-second/real-time data pipelines;
- Running containerized ETL workflows at scale;
- Conducting assessment and examination of data from the source to ensure consistency;
- Assisting IT Subject Matter Experts in creating and reviewing data transformation logic;
- Ensuring documentation of processes, analytics design, measure definitions, data integration, and development;
- Responsible for identifying and escalating any data quality issues to the respective teams;
- Guiding Data Analysts and Data Scientists in writing efficient queries and workloads.
If you are interested in this role, click the “Apply to this job” button below, or write in with your CV to Brandon Koh Kim Leong at email@example.com, quoting the job title.