Senior Big Data Engineer - Careem PAY (Java / Kafka / Spark / Cassandra / MongoDB) 

Do you want to help build a world-class institution from the region, experience the thrill of being part of a high-growth technology company, and improve people's lives? 

Big Data Engineer

At Careem, our mission is to simplify the lives of people, initially through solutions that make transportation reliable, and over time, through disruptions in payments and logistics. In the process, we want to build an organization that inspires and becomes a world-class institution.

Founded in 2012 by former entrepreneurs and McKinsey alums, Careem is the MENA region's leading ride-hailing service and newest Tech Unicorn. With 30% monthly growth, we now operate in 100+ cities across 15 countries and serve over 15 million users. With our recent Series D funding success, we are positioned on the cusp of significant scale.

Key Qualifications

  • 3+ years of experience releasing and managing Hadoop, Hive, Spark, and Kafka

  • 2+ years of experience with AWS (EC2, S3, EMR, Kinesis, Lambda)

  • Linux system administration experience: ssh, monitoring processes, attaching storage, cleaning disk space, tailing logs, etc.

  • Hands-on Python, Java, Scala, and RDBMS (Oracle/MySQL) skills

  • Expert knowledge of and experience with software version control systems: Git (GitHub/GitLab), etc.

  • Knowledge of Java build systems and tools, including Maven and SBT

  • Working knowledge of containerization (Docker) and supporting technologies

  • Experience with Mesos or YARN

  • Experience with orchestration tools like Airflow

  • Experience working with server clusters consisting of 10s-1000s of machines, and deploying changes with zero downtime

  • Experience with CI tools, preferably Jenkins

  • Experience maintaining large clusters using configuration tools such as Puppet, Chef, Salt, Ansible, CloudFormation, etc.
