22 July 2020
We are seeking a highly self-motivated and experienced Data Architect to help our client architect, develop and maintain data pipelines, data warehouse systems and tools.
Requirements:
- Degree in Information Technology, Computer Engineering, Computer Science, Information Systems or a related field;
- At least 10 years of working experience in Data Architect roles;
- Experience developing data engineering solutions in Scala, Python, Go, Java or shell scripting;
- Familiar with Hadoop, Spark, Scala, Hive, Kafka;
- Knowledge of SQL and in-depth knowledge of AWS Cloud;
- Comfortable working in Linux and containerized environments;
- Expert-level understanding of data normalization techniques;
- Good understanding of modern software engineering best practices such as microservices, SOA and design patterns;
- Good technical and analytical skills;
- Excellent communication and leadership skills;
- A strong yet personable character that enriches relationships within the organization;
- TOGAF certified;
- Experience working in the Media domain;
- Familiar with Agile methodologies.
Responsibilities:
- Architect, develop and maintain data pipelines, highly scalable big data platforms, data warehouse systems and tools;
- Build APIs for receiving data from internal applications and create processes for loading data;
- Present architecture designs and ideas to technical and non-technical stakeholders through various written, verbal and visual communication methods;
- Troubleshoot complex issues and provide root cause analysis;
- Ensure smooth delivery of projects and products by working closely with the product, technical, business and analytics teams;
- Manage the storage of large volumes of data and optimize jobs and resource allocation;
- Evaluate data security best practices and how they can be implemented in the organization.