At Careem, our mission is to simplify and improve the lives of people and to build an awesome organization that inspires. Guided by this vast mission, we started by improving transportation and delivery in the region; we are now expanding into payments and launching a Super App that hosts multiple Careem and third-party services, to further simplify and improve people's everyday lives.
We built the first multi-billion dollar tech startup in the MENAP region. The first line of code was written in Pakistan, and we built on it further in Dubai and Berlin. We operate in 100+ cities across 11 countries, and we officially joined Uber in early 2020. Along the way we attracted top global talent and built a culture of bold ambitions, shooting for the moon, innovating within tight constraints, and being Careem/gracious.
The Careem AI team’s mission is to drive competitive value from data at scale by building AI models that optimize user experiences, decision making, and operational efficiency, and by leading the region’s AI ecosystem. As one of the technical leaders of this team, you will be at the forefront of fulfilling this mission. You will work with the region’s top data science talent to solve the region’s day-to-day problems through state-of-the-art, well-integrated, democratized experimentation and AI across our different products and platforms.
- Work on the development of our in-house Feature Store platform: low-latency storage of features for Machine Learning and Analytics purposes
- Build new capabilities and improve existing ones in the Feature Store platform using Kotlin
- Empower internal groups of Data Scientists by simplifying their day-to-day work of building and deploying Data Science models into production
- Write high-quality, performant, and reliable code that powers the production infrastructure of every Data Science project
- Work closely with other engineers in the team to build highly scalable and robust backend services
- Contribute to and maintain CI/CD pipelines for applications and frameworks
- Help grow, and contribute to, the region’s AI community.
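To give a flavor of the Feature Store work described above: at its core, a feature store maps an entity (say, a user) to its latest precomputed feature values so models can fetch them with low latency at inference time. The sketch below is purely illustrative, in Java (one of the languages named in the requirements); all class and feature names are hypothetical, not Careem's actual API.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Illustrative sketch of the feature-store idea: low-latency lookup of
// named feature values per entity. Names are hypothetical examples.
public class FeatureStoreSketch {
    // entityId -> (featureName -> value); concurrent maps allow
    // lock-free reads on the hot serving path.
    private final Map<String, Map<String, Double>> store = new ConcurrentHashMap<>();

    // Upsert a single feature value for an entity.
    public void put(String entityId, String feature, double value) {
        store.computeIfAbsent(entityId, k -> new ConcurrentHashMap<>())
             .put(feature, value);
    }

    // Fetch all current features for an entity (empty map if unseen).
    public Map<String, Double> get(String entityId) {
        return store.getOrDefault(entityId, Map.of());
    }

    public static void main(String[] args) {
        FeatureStoreSketch fs = new FeatureStoreSketch();
        // Hypothetical features a ride-hailing model might consume.
        fs.put("user-42", "avg_trip_distance_km", 7.5);
        fs.put("user-42", "trips_last_30d", 12.0);
        System.out.println(fs.get("user-42"));
    }
}
```

A production system would back this interface with a distributed low-latency store and an ingestion pipeline from batch and streaming sources (e.g. Spark and Kafka, per the requirements below), but the read path shown here is the essential contract.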
- 7+ years of professional experience in software development, building scalable microservices in Python, Java, Go, or Kotlin.
- In-depth knowledge of Big Data platforms and tools such as Hadoop, Spark, and Kafka.
- A degree in a quantitative field such as Computer Science, Computational Mathematics, Computer Engineering, or Software Engineering.
- Good understanding of Computer Science fundamentals, including data structures, algorithms, and complexity analysis.
- Experience with distributed systems at scale in a cloud-based environment.
- Experience in rapid prototyping and other fast iteration methods for product development.
- Knowledge of professional software engineering best practices across the full software development life cycle, including coding standards, code reviews, source control management, build processes, testing, and operations.
- A strong, innovative approach to problem-solving; a flexible, proactive, and self-motivated working style with strong personal ownership of problem resolution.
- Deep understanding of the relevant design patterns, software architecture approaches, and best coding practices.