About the position:
This is an excellent opportunity to work in a company with a highly technological product that generates hundreds of thousands of events per second. This vast sea of data is not only stored and organized but also consumed to improve every aspect of the operation: pricing, dispatching, marketing, governance, and many others.
In Data Engineering, we have dozens of services (Scala, Golang, Python), pipelines (Apache Beam, Airflow), and our in-house Machine Learning platform. We are a hands-on team: we manage our own infrastructure (GCE and AWS) and several Kubernetes clusters. Our ETL platform runs more than 300 processes (Python, Airflow, Redshift, S3, Spectrum, Glue, RDS) that we visualize with our Tableau Server cluster (Tableau Server Linux, EC2, Python).
Cabify is a global company with a very complex product, yet the right size for you to have a tangible impact on the final product. You will be able to build and improve the platform that provides trusted data at scale to the rest of the company, and you will do it as part of a team of experienced data engineers, helping each other grow technically and professionally.
You will:
- Design and develop end-to-end data solutions and modern data architectures for Cabify products and teams (streaming ingestion, data lake, data warehouse...).
- Evolve and maintain Lykeion, a Machine Learning platform developed along with the Data Science team, to take care of the whole lifecycle of models and features. It includes a feature store, which allows other groups inside Cabify to make better decisions based on data, and a prediction platform to serve ML models.
- Design and maintain complex APIs that expose data at scale and help other teams make better decisions.
- Provide the company with data discoverability and governance.
- Collaborate with other technical teams to define, execute and release new services and features.
- Manage and evolve our infrastructure. Continuously identify, evaluate, and implement new tools and approaches to maximize development speed and cost efficiency.
- Extract data from internal and external sources to empower our Analytics team.
Our ideal candidate has:
We are looking for experienced data engineers with excellent know-how in large-scale distributed systems:
- 5+ years of experience coding and delivering complex data engineering projects.
- Fluency in different programming languages (we work with Python, Scala, and Go; you don’t need to master all three of them).
- Deep understanding of:
  - Message delivery systems and stream processing (Kafka, RabbitMQ, Akka Streams, Apache Beam…)
  - Data processing technology stacks and distributed processing (Hadoop, Spark, Apache Beam, Apache Flink...)
  - Storage technologies (file-based, relational, columnar, document-based, key-value...)
  - Orchestration tools such as Airflow, Luigi, or Dagster.
  - Cloud infrastructures (GCP, AWS, Azure)
  - Automation/IaC tools (Terraform, Puppet, Ansible…)
  - MLOps
The good stuff:
We’re a company full of happy, motivated people and we never want that to change. Here are more reasons why it rocks to be part of our high-performance team.
⌚Remote position, or on-site/hybrid position at our Madrid HQ (depends on the group)
💶Excellent salary conditions: L5 up to €99k
🏝️Recharge day: the third Friday of every month off!
⌚Flexible work environment & hours.
🙌Regular team events.
🚗Cabify staff free rides.
🚀Personal development programs based on our career paths.
📚Annual budget for training
🧘♀️ iFeel: Free access to the iFeel platform, so you can take care of your emotional well-being through therapy sessions.
📐Coursera: your own Coursera license to take as many courses as you wish and keep developing your skills.
💳Flexible compensation plan: Restaurant tickets, transport tickets, healthcare and childcare
💻All the equipment you need (you only have to bring your talent).
🐱A pet room, so you don’t have to leave your furry friend at home
☕️And last but not least...free coffee and fruit!
Join us!