About Us

Careem is the leading technology platform for the greater Middle East. A pioneer of the region’s ride-hailing economy, Careem is expanding services across its platform to include payments, delivery and mass transportation. Careem’s mission is to simplify and improve the lives of people and build a lasting institution that inspires. Established in July 2012, Careem operates in more than 120 cities across 16 countries and has created more than one million job opportunities in the region 🌎.

About the role

As part of Careem's AI team, you'll be exposed to the following challenges:

  • Build ML and AI-based services and solutions that empower different Careem products, affecting the day-to-day lives of 25+ million users
  • Make impactful contributions to the different components of our state-of-the-art Machine Learning platform, fulfilling our mission to democratize ML across Careem
  • Work on complex time series forecasting techniques, operating on large-scale datasets
  • Continuously challenge the status quo and investigate new technologies, following industry best practices to ensure continuous growth

Requirements:

  • Strong software development skills in one or more programming languages (Python, Go, Java, Scala, C++)
  • Solid experience with classical ML techniques and algorithms (like Gradient Boosting), as well as the latest state-of-the-art Deep Learning approaches (LSTM, Attention Networks)
  • Ability to write well-abstracted, reusable, and clean code components, following industry best practices
  • Extensive experience with various Deep Learning frameworks and tools (PyTorch, TensorFlow, Keras)
  • Ability to research, develop, and validate complex time-series forecasting techniques needed to fulfil business needs
  • Strong English communication skills and the ability to multitask, meet deadlines, and drive key decisions within a team
  • Ability to learn fast, be responsive and team-oriented, and be comfortable working in a dynamic organization with minimal structure and process, collaborating with teams such as Product Management and Data Science

Bonus points if you have:

  • MSc or PhD in Mathematics, Physics, Statistics, Computer Science, or a related field
  • Hands-on experience developing scalable Data Science pipelines in a distributed environment using Apache Hadoop and/or Apache Spark

Apply for this Job
