Job Description:
- Become an expert in cloud platform and on-premises data services related to data engineering
- Design, build, and maintain data architectures or data pipelines as the use case requires; work out solutions to support customers' data pipelines
- Enhance data quality and advise on data-processing concerns such as Change Data Capture (CDC)
- Create ETL scripts or stored procedures to deliver data as needed
Qualifications:
- Bachelor's or Master's Degree in Computer Science, Engineering, IT, or a similar field
- Understanding of data architecture principles or data pipeline design (Preferred)
- Experience working with large datasets and strong data modelling skills (data warehousing / data management) (Required)
- Experience with API-based interactions and bulk integration with cloud-based platforms (Preferred)
- Ability to develop ETL scripts using Python or SQL (Required)
- Linux systems administration skills
- Experience with ETL tools and big data tools, either on-premises or cloud-based (Preferred)
- Knowledge of container technologies such as Docker and Kubernetes
- Open-minded about taking on multi-purpose responsibilities
- Able to communicate in English