DevOps Engineer (Data)

Lalamove, Hong Kong

Skill Required: IT and ICT
Preferred Experience: 
2 years of relevant work experience
Closing Date for Applications: 
8th October, 2020

Job Description

Lalamove is disrupting the logistics industry by connecting customers and drivers directly through our technology. We offer customers a lightning-fast and convenient way to book delivery and moving services, whether they are at home, at work, or on the go. People talk about O2O; we live it!

Our vision is to bring communities closer and make city life easier by allowing fast and convenient circulation of goods. We realise this vision with a ‘glocal’ approach, building a robust operations team to adapt our product to local networks of businesses and delivery contractors. At the same time, we have the ambition to build an international brand by establishing an even more global presence.

As a DevOps Engineer in Data, you will contribute to the design, building, and running of our data infrastructure, increasing the flexibility, scalability, and efficiency of the Lalamove data platform, which is the foundation of business analytics, reporting, data engineering, and data science. You will build the CI/CD automation pipelines that streamline and optimize our development processes, deploy containerized applications to Kubernetes, and scale the components of our data platform on AWS.

What we seek:

  • Quick learner: you demonstrate the ability to learn new technology and frameworks quickly.
  • Problem solver: you have strong critical-thinking skills and are willing to find creative solutions to difficult problems.
  • High autonomy: you are self-organized, a self-starter, and passionate, with a can-do attitude and ownership of projects end to end. You can work independently while remaining team-oriented.

What you’ll need:

  • Experience in building CI/CD automation pipelines
  • Experience with Docker for containerization and Kubernetes for orchestration and production scaling
  • Experience deploying containerized applications with Helm charts
  • Experience with AWS architecture and administration in production environments

A plus, but not required:

  • Understanding of the data technologies that power data platforms (e.g. Spark, Kafka, Airflow, Avro, Redshift, ELK)
  • Experience managing Hadoop clusters in the cloud (e.g. Amazon EMR)
  • Programming experience in Python or Scala
  • Experience in managing and scaling Tableau Server

