28 July 2020

If you are an experienced and passionate Data Engineer who enjoys working in a highly entrepreneurial environment, this is the opportunity for you. This is an excellent career opportunity to manage, design, develop and implement a big data lake platform across different countries.

Mandatory Skill(s)

  • Bachelor's degree in IT, Computer Science or Engineering;
  • Hands-on experience with Big Data technologies such as Hadoop distributions (Hortonworks), Hive, HBase, Spark, Pig, Sqoop, Kafka and Spark Streaming;
  • Experience using Microsoft Azure technologies such as Data Lake Storage Gen2, Data Factory, Databricks, Azure ML and Synapse;
  • At least 2-3 years of solid hands-on development experience with ETL tools to transform complex data structures;
  • At least 1-3 years of big data programming experience in Python, Java, R or Scala;
  • Experience in creating data models (relational and/or data warehouse), and in data mart design and implementation;
  • Strong knowledge in various database technologies (Synapse / SQL);
  • Good understanding of data analytics and data visualization tools such as Power BI, QlikSense or Tableau;
  • Ability to communicate and present technical information in a clear and unambiguous manner;
  • Strong ability to work independently and cooperate with diverse teams in a multi-stakeholder environment;
  • Strong sense of work ownership, high affinity with anything data and a desire for constant improvements.

Desirable Skill(s)

  • Prior experience in the insurance domain.

Responsibilities

  • Design, develop, document and implement end-to-end data pipelines and data integration processes, both batch and real-time;
  • Perform data analysis, data profiling, data cleansing, data lineage, data mapping and data transformation;
  • Develop ETL / ELT jobs and workflows, and deploy data solutions;
  • Monitor, recommend, develop and implement ways to improve data quality, including reliability, efficiency and cleanliness, and to optimize and fine-tune processes;
  • Recommend, execute and deliver best practices in data management and data lifecycle processes, including modular development of data processes, coding and configuration standards, error handling and notification standards, auditing standards, and data archival standards;
  • Prepare test data, create and execute test plans, test cases and test scripts;
  • Collaborate with different stakeholders to understand data needs, gather requirements and implement data solutions to deliver business goals;
  • Provide technical support for any data issues and change requests; document all investigations, findings, recommendations and resolutions.

Apply to this Job