Who we are

DoubleVerify is a big data analytics company that went public in April 2021 (NYSE: DV). We track and analyze tens of billions of ads every day for the biggest brands in the world, including Nike, AT&T, Disney, Vodafone, and most of the Fortune 500. If you’ve seen an online ad on the web, on mobile, or on a CTV device, chances are it was measured by us.

We operate at massive scale: we handle over 100B events per day and over 1M RPS at peak, and we process events in real time at millisecond latencies to help our clients make decisions before, during, and after an ad is served. We verify that ads are fraud-free, appear next to appropriate content, and reach people in the right geography, and we measure viewability and user engagement throughout the ad’s lifecycle.

We are global, with R&D centers in Tel Aviv, New York, Finland, Berlin, Belgium, and San Diego. We work in a fast-paced environment and have plenty of challenges to solve. If you like working at huge scale and want to help us build products with a huge impact on the industry and the web, then your place is with us.

 

What will you do

You will process tens of billions of records a day, mainly using Python and SQL, with advanced technologies such as Databricks, Spark, BigQuery, Vertica, Kafka, Docker, Kubernetes, GitLab, and more.

You will build the next generation of digital advertising optimization tools used by the biggest advertisers in the world to help them understand and optimize their ROI. This is a true big data product at huge scale, with high impact for our clients and for the digital advertising industry as a whole.

You will maintain a very high engineering standard, including testing, automated deployment, infrastructure as code, and a high degree of monitoring and availability.

We believe in our people’s ability to take things end to end: working with product managers, designing solutions, and continuously building, deploying, and analyzing data - getting things done.

 

Who you are

  • A team player who can work independently and has good communication skills
  • A versatile developer with decision-making capability and a “getting things done” attitude
  • 2+ years of experience with one of the following languages: Python, Scala, Java, or similar
  • Experience working with SQL/NoSQL databases and data warehouses such as Databricks, BigQuery, Snowflake, Redshift, Vertica, etc.
  • Hands-on experience with streaming and batch technologies such as Kafka/Kafka Streams/Spark/Flink/Beam - a plus
  • Experience working with public cloud providers such as GCP/AWS/Azure - a plus
  • Interest in the ad tech industry and a general willingness to learn new things
