Who we are:
DoubleVerify is a big data analytics company that went public in April 2021 (NYSE: DV). We track and analyze tens of billions of ads every day for the world’s biggest brands, including Nike, AT&T, Disney, Vodafone, and most of the Fortune 500. If you’ve seen an online ad on the web, on mobile, or on a CTV device, there’s a good chance it was measured by us.
We operate at massive scale: we handle over 100B events per day and over 1M RPS at peak, and we process events in real time at millisecond latencies to help our clients make decisions before, during, and after an ad is served. We verify that ads are fraud-free, appear next to appropriate content, and are shown to people in the right geography, and we measure viewability and user engagement throughout the ad’s lifecycle.
We are global, with R&D centers in Tel Aviv, New York, Finland, Berlin, Belgium, and San Diego. We work in a fast-paced environment and have plenty of challenges to solve. If you like working at huge scale and want to help us build products that have a major impact on the industry and the web, then your place is with us.
What will you do:
As a Senior Big Data Engineer, you will take a central technical leadership role in designing and implementing our new big data Lakehouse infrastructure (petabytes of data) as part of our effort to migrate our current on-prem data solutions to Google Cloud (GCP).
You will own the data strategy. As such, you will learn how the data serves our goals; find ways to improve processes that handle tens of billions of events and terabytes of data while maintaining high data quality; and conduct proofs of concept with the latest data tools, helping our clients make smarter decisions that continuously improve their ad-impression quality.
You will work with a wide array of languages and technologies, such as GCP, Databricks, Spark, Python, Scala, SQL, BigQuery, Vertica, Kafka, Docker, Kubernetes, GitLab, and more.
We believe in our people’s ability to take things end to end: working with product managers, designing solutions, and continuously building, deploying, and analyzing data to get things done.
Who you are:
- A team player with good communication skills and the ability to work independently.
- Actively seeks ways to improve software processes and interactions.
- A versatile developer with decision-making capability and a “getting-things-done” attitude.
- 4+ years of experience with one of the following languages: Python, Scala or Java.
- Hands-on experience with one of the following: Kafka/Kafka Streams/Spark/Flink/Beam.
- Extensive experience working with SQL/NoSQL databases and data warehouses such as Databricks, BigQuery, Snowflake, Redshift, Vertica, etc.
- Experience working with public cloud providers such as GCP/AWS/Azure.
- Interest in learning about the ad tech industry and a general willingness to learn new things.