Do you want to change the world? At Cabify, that’s what we’re doing. We aim to make cities better places to live by improving mobility for the people living in them, connecting riders to drivers at the touch of a button. Maybe one day cities will be places where nobody needs a private car. But we’ve still got a long way to go... fancy joining us?
 
No matter how quickly we grow or how big we become, we’re determined to keep our ‘start-up’ spirit. We are a confident, professional, charismatic and focused team, young at heart and proud to do things the right way. We like to think of ourselves as an ethical and innovative tech company that has created a business where everyone is treated fairly and respectfully, from our engineers to the drivers who use our platform. Moreover, we always comply fully with government regulations and respect local laws.
 
It’s been quite a ride so far, but in reality, our journey has only just begun. If we’re going to turn our vision into a reality, we’re going to need plenty more bright, ambitious people to join us!
 

About the position:

We are looking for new members of our Data Engineering team (heavy users of Google Cloud, Python, Scala and Go). Here is how you will play an important part in helping us achieve our mission:

  • Designing and developing end-to-end data solutions and modern data architectures for Cabify products
  • Productionizing Machine Learning models
  • Collaborating with cross-functional teams to define, execute and release new services and features
  • Continuously identifying, evaluating and implementing new tools and approaches to maximise development efficiency

What we’re looking for:

  • At least 3 years of experience coding and delivering complex data projects
  • Proven track record in Data Engineering
  • Hands-on and continuous delivery attitude
  • Deep understanding and application of modern data processing technology stacks (Hadoop ecosystem, Google Cloud, AWS)
  • Deep understanding of streaming data architectures and technologies for real-time and low-latency data processing (Apache Spark/Beam)
  • Deep understanding of NoSQL technologies, including columnar, document and key-value data stores (experience with other non-relational databases will also be considered)
  • Understanding of how to develop and architect solutions for data science and analytics, such as productionizing machine learning models in collaboration with data scientists
  • Bonus points: Working knowledge of Google Cloud Big Data products (Pub/Sub, Dataproc, Dataflow…)

The good stuff:

We’re a company full of happy, motivated people and we never want that to change. Here are some more reasons why it rocks to be part of our family.

  • Salary bands: Senior Data Engineer €43,200 - €55,200; Principal Data Engineer €55,200 - €69,600
  • All the gear you need - just bring yourself 
  • Flexible working hours & environment
  • Monthly free rides
  • Up to 3 days per week of remote working
  • Budget for books, training and conferences
  • Flexible remuneration: subsidized restaurant tickets, transport tickets, healthcare and childcare
  • 24 days paid annual leave
  • A pet room so you don’t have to leave your furry friend at home
  • Relocation package for those coming from another country
  • And last but not least...free coffee and fruit!

Apply for this Job