Who we are

DoubleVerify is a big data and analytics company. We track and analyze tens of billions of ads every day for some of the world's biggest brands, including Apple, Nike, AT&T, Disney, Vodafone, and most of the Fortune 500. If you have ever seen an ad online, whether on the web, on mobile, or on a CTV device, chances are we analyzed and tracked it.
We operate at massive scale: our backend handles over 100 billion events per day. We analyze and process those events in real time, making decisions about the environment in which each ad runs and about every user interaction during the ad's display lifecycle. We verify that all ads are fraud-free, brand-safe, served in the right geography, and highly likely to be viewed and engaged with, all in a fraction of a second.
We are a global company, with R&D centers in Tel Aviv, New York, Finland, Belgium, and San Diego. We work in a fast-paced environment and have plenty of challenges to solve. If you enjoy solving big data challenges and want to help us build a better industry, then your place is with us.


What will you do

You will join a team of experienced engineers and help them develop our innovative measurement products.

You will lead projects by architecting, designing, and implementing solutions that impact the core components of our system.

You'll develop great new features on a cloud-native technology stack, continuously improve our development process by adopting new technologies and using them to solve product and engineering challenges, and raise the bar for code quality and standards.


Who you are

  • A versatile developer with a “getting-things-done” attitude.
  • 5+ years of experience with at least two of the following languages: Scala, Java, Python, Go, Node.js.
  • Deep understanding of computer science fundamentals: object-oriented design, functional programming, data structures, multi-threading, and distributed systems.
  • Experience with distributed in-memory caches such as Aerospike, and with messaging systems such as Apache Kafka, Amazon Kinesis, or RabbitMQ.
  • Experience working with Docker and Kubernetes, and designing scalable microservices architectures.
  • Experience working with SQL databases (MySQL, PostgreSQL) and columnar/NoSQL databases such as BigQuery, Vertica, Snowflake, Couchbase, Elasticsearch, MongoDB, or Cassandra.
  • Experience working in a big data environment and building scalable distributed systems with stream processing technologies such as Akka Streams, Kafka Streams, Spark, or Flink.
  • Proven experience working with cloud providers such as GCP or AWS.
  • BSc in Computer Science or equivalent experience.
  • Experience working in a Linux/Unix-based environment.
  • Experience with Agile development, CI/CD pipelines (Git, GitLab, or Jenkins), and coding for automated testing.


Nice to have

  • Previous experience with online advertising technologies is a big plus.
  • Familiarity with the Cloud Native Computing Foundation (CNCF) ecosystem.
