Careem is the leading technology platform for the greater Middle East. A pioneer of the region’s ride-hailing economy, Careem is expanding services across its platform to include payments, delivery and mass transportation. Careem’s mission is to simplify and improve the lives of people and build a lasting institution that inspires. Established in July 2012, Careem operates in more than 120 cities across 15 countries and has created more than one million job opportunities in the region.

About the role

As a DataOps Engineer at Careem, you’ll be part of a team that builds solutions and tools to enable, organize and process large amounts of data. You will work with batch and real-time technologies such as Hadoop, Hive, Spark, Spark Streaming, Kafka, and cloud computing and storage to help Careem become a data-driven company.

Some of the problems the team is working on include building a tool that enables business areas to create real-time metrics, automating jobs and pipelines, and delivering fast, reliable software.

Requirements

  • 2+ years of hands-on experience building and managing scalable big data systems
  • 2+ years of experience working with big data technologies like Spark and/or Kafka
  • 1+ year of experience with Packer and/or Terraform
  • Proficiency in at least one of the following scripting languages: Python or Bash
  • Ability to debug critical issues in the big data ecosystem and propose clear solutions and fixes
  • Understanding of distributed processing tools such as Hadoop, Hive, ZooKeeper, Presto, Zeppelin, Airflow, etc.
  • Ability to dig deep into issues in production-critical systems and provide permanent fixes
  • Experience implementing CI/CD and maintaining big data ecosystems
  • Experience with one of the following automation tools: Chef, Ansible, or Puppet

Desirable:

  • Experience working with data science/analytics teams and building scalable, stable systems
  • Exposure to enterprise-level services such as Cloudera, Databricks, AWS, etc.
  • Knowledge of containerization (Docker), and supporting technologies
  • Exposure to AWS data services and technologies (EC2, S3, EMR, Kinesis, Lambda, Glue, Data Pipeline, DynamoDB)
  • Knowledge of relational and non-relational databases, such as MariaDB, MySQL, HBase or MongoDB
  • Understanding of Elasticsearch


What do we offer you?

Working in an international environment with colleagues from 70+ nationalities, an ownership culture, flexible working hours, unlimited (paid!) holidays, and the latest technologies.


Careem is an equal opportunity employer. All aspects of employment, including the decision to hire, promote, discipline, or discharge, will be based on merit, competence, performance, and business needs. We celebrate diversity and are committed to creating an inclusive environment for everyone.

Apply for this Job
