- Work cross-functionally with business owners, product managers, engineers, and designers throughout the full lifecycle of data science projects
- Develop, deploy, and maintain scalable infrastructure for AI models and algorithms. Domains include, but are not limited to, Logistics and Supply Chain Management, Natural Language Processing, Large-scale Scheduling and Optimisation, and Geographic Solutions and Maps
- Extract insights from massive data sets and build data visualisations for business and technical audiences. Communicate solution approaches clearly to collaborate efficiently with partners and stakeholders
- Design and conduct data science experimental frameworks to enable organisation-wide data-driven practice
- Master's or PhD in Computer Science, Operations Research, Mathematics, Statistics, or a field related to data mining preferred, with a minimum of 2 years of working experience in a relevant industry
- Excellent communication skills, with the ability to identify and convey data-driven insights and solutions
- Deep understanding of the algorithmic foundations of optimisation, NLP, statistics, and probability
- Experience with at least one programming language (e.g., Golang, C++, Java, Python) and Unix/Linux systems
Having one or more of the following skills will be a big plus:
- Practical experience in the logistics and mapping industry (POI, map matching, geocoding, etc.)
- Practical development experience with real-time, large-scale optimisation algorithms (e.g., supply chain network optimisation, vehicle routing optimisation)
- Practical experience in Kubernetes/Docker