Who we are
DoubleVerify is a big data analytics company that went public in April 2021 (NYSE: DV).
We track and analyze tens of billions of ads every day for the world's biggest brands, including Nike, AT&T, Disney, Vodafone, and most of the Fortune 500. If you've seen an ad online on the Web, on Mobile, or on a CTV device, chances are it was measured by us.
We operate at massive scale: we handle over 100B events per day and over 1M RPS at peak, processing events in real time at millisecond latencies to help our clients make decisions before, during, and after an ad is served. We verify that ads are fraud-free, appear next to appropriate content, and reach people in the right geography, and we measure viewability and user engagement throughout the ad's lifecycle.
We are a global company, with R&D centers in Tel Aviv, New York, Finland, Berlin, Belgium, and San Diego. We work in a fast-paced environment with no shortage of challenges to solve. If you want to work at huge scale and help us build products with real impact on the industry and the web, then your place is with us.
What will you do
You will join a team of experienced engineers and help develop our innovative measurement products.
You will lead projects, architecting, designing, and implementing solutions that impact the core components of our system.
You'll develop great new features on a cloud-native technology stack and continuously improve our development process by adopting new technologies, using them to solve product and engineering challenges while raising the bar on code quality and standards.
Who you are
- 5+ years of experience coding in an industry-standard language such as Scala, Java, Python, or Go.
- Deep understanding of Computer Science fundamentals: object-oriented design, functional programming, data structures, multi-threading and distributed systems.
- Experience with in-memory distributed caches such as Aerospike or Redis and messaging systems such as Apache Kafka.
- Experience working with Docker and Kubernetes and designing scalable microservices architectures.
- Experience working with SQL databases (MySQL, PostgreSQL) and columnar/NoSQL databases such as BigQuery, Vertica, Snowflake, Couchbase, or Cassandra.
- Experience working in a big data environment and building scalable distributed systems with stream-processing technologies such as Akka Streams, Kafka Streams, Spark, or Flink.
- Experience working with cloud providers such as GCP or AWS.
- BSc in Computer Science or equivalent experience.
- Experience with Agile development, CI/CD pipelines (Git, GitLab, or Jenkins), and coding for automated testing.
- A versatile developer with a "getting things done" attitude.
Nice to have
- Previous experience with online advertising technologies is a big plus.
- Familiarity with the Cloud Native Computing Foundation (CNCF) ecosystem.