TheFork, formerly LaFourchette and part of the TripAdvisor group, is a pure-player company and the European leader in online restaurant booking. We are an innovative, fast-growing company building a unique community of members to transform the way people dine out! TheFork's goal is to become the #1 app for online restaurant booking in the world.


Why join us?

Are you passionate about data and comfortable in an international, agile environment? Are you ready for the challenge of a SaaS service for several thousand customers and B2C interfaces for millions of users? Are you ready to invest in a company whose entire service is subject to heavy use and present in close to 20 countries, and to accompany its rapid growth? Do you want to secure the services that power the leader in online restaurant booking?


You will join a dynamic and agile team in a quickly changing environment. You will be part of an international organization that empowers its employees, cares about their careers, and values knowledge sharing.


Job description


As a Data Engineer, your mission will be to build and expand the TheFork Data Platform, used intensively by all TheFork internal teams and our products, and supporting multiple domains (BI, Data Insights, Product Analytics, Data Science).


You will report to the Data Product Manager and be part of the central data team of roughly 20 people, comprising:

  • the “Data Asset” departments, including the Data Engineering and Data Management teams
  • the “Data Activation” departments, including the Data Insights team, the Product Analytics team, a BI factory and a Data Science team


Your daily work will consist of:


  • Designing efficient patterns to store and analyze Terabytes of data
  • Implementing complex acquisition and transformation workflows
  • Building smart data models to serve our product teams and our BI, Insights and Data Science teams while minimizing cost
  • Developing tools to help our data scientists and industrializing machine learning projects
  • Working on data quality and reliability to ensure we provide trustworthy metrics to the whole company


The Data Platform tech stack is built on AWS (S3, EC2, EMR) and Snowflake, using open-source technologies such as Airflow, Spark, Sqoop, Jenkins, Elasticsearch, Docker, and Kubernetes.

With a minimum of 2 years of experience, you have a good knowledge of data engineering patterns and technologies, and you are proficient in Python. You have experience with software development practices such as CI/CD and unit testing.


We also appreciate a broad data culture and curiosity about new technologies.





  • Min 2 years of data engineering experience
  • Great team player, at ease interacting with technical and business stakeholders
  • Used to working in an agile environment and comfortable with changing priorities
  • Curious, humble, and balancing creativity with pragmatism
  • High willingness to learn and to teach others
  • Proficient in English (written and spoken)

Technical skills 

  • SQL and Python
  • Snowflake
  • Apache Airflow
  • Infrastructure knowledge; AWS, Docker, Jenkins and Kubernetes appreciated
  • Proficient with Git and Unix

Streaming technologies (CDC pipelines, stream processing, ...) as well as AWS SageMaker, Databricks, Kubernetes and Google BigQuery are a plus.

