LiveRamp is the leading data connectivity platform. We are committed to connecting the world’s data safely and effectively, advancing innovation, and empowering people to do good. Our platform powers customer experiences centered around the needs and concerns of real people, keeping the Internet open for all. We enable individuals around the world to connect with the brands and products they love. LiveRampers thrive on solving challenging problems for the good of humanity—and we’re always looking for smart, kind, and creative people to help us get there.

Mission: LiveRamp makes it safe and easy for businesses to use data effectively.


The Identity team at LiveRamp has a burning passion for building the best Identity Resolution capability in the world. Our platform ingests trillions of records of data, applies data science techniques to create massive, heterogeneous graphs, and then renders those graphs as the backend for extensible, reliable, high-performance APIs.


  • Help your fellow engineers design and implement a scalable, cloud-agnostic SaaS platform. 
  • Gain experience working across multiple cloud and data platforms such as GCP, AWS, Azure, and Snowflake. 
  • Build customizable, high-performance matching engines that can be deployed in regions around the world to solve localized and client-hosted data challenges. 
  • Implement customizable frameworks for graph access. 
  • Treat infrastructure as code, leveraging tools like Docker, Kubernetes, Helm, and Terraform. 
  • Lead design for new product offerings in the Identity Resolution space.
  • Design with privacy and security-first principles.


  • Build first-of-its-kind global Identity Resolution capabilities, leveraging a variety of cloud technologies to meet demanding high-availability thresholds across a spectrum of use cases, from low latency to high throughput. 
  • Enable configuration-driven workflows that automate batch and streaming-based processing on top of Identity Resolution services and assets. 
  • Provide an orchestration platform for complex graph builds using the latest open source and cloud native technologies.


  • Have 7+ years of experience writing and deploying production code. 
  • Strong ability to break complex problems down into their essential components and to design and implement elegant solutions for them. 
  • Experience designing and implementing interfaces and infrastructure for large-volume services and APIs. 
  • Curious to learn and comfortable evaluating and adapting to the latest tools and industry best practices. 
  • Good understanding of distributed computing, graph algorithms, and distributed SQL. 
  • Proficient in Scala, Java, and Python. 
  • Experience with data processing platforms such as Hadoop, Spark, Apache Beam, and Kafka; proficient in programming with the map-reduce paradigm. 
  • Proficient in cloud technologies. 
  • Have a startup personality and enjoy working as part of a team: smart, ethical, friendly, hard-working, and productive.


  • Experience processing hundreds of terabytes or petabytes of data. 
  • Strong mathematical background, with knowledge of approximation algorithms and stream processing. 
  • Familiarity with programming languages such as Kotlin or Go. 
  • Experience with Data Analytics Platform technologies like CDAP. 
