At Careem we are led by a powerful purpose to simplify and improve lives in the Middle East, North Africa and Pakistan. We're pioneering the development of innovative services to aid the mobility of people, the mobility of things and the mobility of money.

We're in the driving seat as we help to define how technology will shape progress in some of the fastest-growing countries in the world. Our teams are building tech to meet the needs of the future in areas including data and AI, e-commerce, technology-enabled logistics, maps, identity, and fintech. 

We're well placed to solve complex and meaningful challenges at scale, with deep tech expertise, strong regulatory relationships, a local presence, and increasingly specialised global teams which are structured to operate as autonomous start-ups. Our team of over 400 engineers and developers is empowered to develop cutting-edge technology every day.

Careem was established in July 2012, became a wholly-owned subsidiary of Uber Technologies, Inc. in January 2020, and today operates in over 100 cities across 12 countries.

About the team

The Data Warehouse team at Careem builds and supports solutions to organize, process and visualize large amounts of data. You will be working with technologies such as Hive, Spark, Spark streaming, Kafka, Python, Redash, Presto, Tableau and many others to help Careem become a data-informed company. Some of the problems the team is working on are: Customer 360, ELT engineering, reporting infrastructure, data reliability, data discovery and access management.

What you'll do

  • Work with engineering and business stakeholders to understand data requirements
  • Build and refine data flow and ETL processes
  • Perform data cleansing and enhance data quality
  • Produce and maintain strong documentation
  • Take action with quality, performance, scalability, and maintainability in mind
  • Work in a collaborative, fast-paced Agile team environment

What you'll need

  • 6+ years of experience with designing, building and maintaining scalable ETL pipelines
  • Good understanding of data warehousing concepts and modeling techniques
  • Hands-on experience automating repetitive tasks in any scripting language, preferably Python
  • Hands-on experience satisfying data needs using Spark SQL
  • Ability to dig deep into ETL pipeline issues, understand the business logic, and provide permanent fixes
  • Desire to learn about new tools and technologies
  • Experience with CI/CD using Jenkins, Terraform or other related technologies
  • Familiarity with Docker and Kubernetes
  • Experience with real-time data processing using Kafka, Spark Streaming or similar technology
  • Experience with workflow processing engines such as Airflow or Luigi

What we'll provide you

In addition to competitive long-term total compensation with salary and equity, we have a reward philosophy that extends beyond this. As a Careem colleague you will be able to:

  • Be part of a hybrid working environment
  • Work from any country in the world for 30 days a year
  • Take unlimited vacation days throughout the year
  • Access fitness reimbursements for health activities including gym, health club and training classes
  • Work and learn from great minds 
  • Create impact in a region with untapped potential
  • Explore new opportunities to learn and grow every day
