Who are we?


Careem is the leading technology platform for the greater Middle East. A pioneer of the region's ride-hailing economy, Careem is expanding services across its platform to include payments, delivery and mass transportation. Careem's mission is to simplify and improve the lives of people and build a lasting institution that inspires. Established in July 2012, Careem operates in more than 130 cities across 16 countries and has created more than one million job opportunities in the region.

Who are you?


As a Data Warehouse Engineer at Careem, you'll be part of a team that builds and supports solutions to organize, process and visualize large amounts of data.

You will work with technologies such as Hadoop, Hive, Spark, Spark Streaming, Kafka, Python, Redash, Presto, Tableau and many others to help Careem become a data-driven company.

Some of the problems the team is working on include workflow processing, ETL engineering, reporting infrastructure and access management.

You have:


Good communication skills and proactivity

3+ years of experience managing data in relational databases and developing ETL pipelines

2+ years of experience using Spark SQL and Hive to write queries and scripts

2+ years of experience maintaining data visualization tools such as Redash and Tableau

Experience with workflow processing engines such as Airflow or Luigi

Experience with Bash

Ability to dig deep into ETL pipeline issues, understand the business logic and provide permanent fixes


It's desirable if you have:


Exposure to enterprise-level services such as Cloudera, Databricks, AWS, etc.

Experience with Python or Scala

Exposure to AWS data services and technologies (EC2, S3, EMR, Kinesis, Lambda, DynamoDB)
