About Pagaya

Pagaya is a global technology company making life-changing financial products and services available to more people nationwide, as it reshapes the financial services ecosystem. By using machine learning, a vast data network and a sophisticated AI-driven approach, Pagaya provides comprehensive consumer credit and residential real estate solutions for its partners, their customers, and investors. Its proprietary API and capital solutions integrate into its network of partners to deliver seamless user experiences and greater access to the mainstream economy. Pagaya has offices in New York and Tel Aviv. For more information, visit pagaya.com.

About the Role

Data is fundamental to everything we build. The Data Engineering group is a cross-functional team responsible for all data activities, including integration, monitoring, quality, and accessibility.

The Senior Data Platform Engineer will work on a variety of data projects: building data infrastructure with Big Data tools and architectures, and designing and engineering data pipelines on top of that infrastructure.

Key Responsibilities

  • Develop and operate cutting-edge infrastructure built on Spark, Iceberg, Nessie, and Kubernetes.
  • Build out and operate our foundational data infrastructure, including storage (cloud data warehouse, S3 data lake), orchestration (Airflow), and processing (Spark, DBT).
  • Create robust, automated pipelines that ingest and process structured data from source systems into analytical platforms, using batch and streaming mechanisms built on a cloud-native toolset.
  • Develop and maintain data lake and data warehouse schemas, layouts, architectures, and non-relational databases for data access and advanced analytics.
  • Use the right tool for the job to deliver testable, maintainable, and modern data solutions.
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, and re-designing infrastructure for greater scalability.
  • Work with other members of the data group, including data architects, data analysts, and data scientists.

Key Takeaways

  • Use the tools and languages best suited to the job: complete flexibility in problem-solving, with novelty and creativity encouraged.
  • Use of open-source projects and frameworks is encouraged.
  • Work with a team of highly motivated, bright, fun, and creative people.
  • Your intellectual curiosity and hard work will be welcomed in our culture of knowledge sharing, transparency, and shared fun and achievement.
  • Contribute to our software engineering culture of writing correct, maintainable, elegant, and testable code.

Requirements

  • At least 4 years of experience as a Python developer.
  • At least 4 years of experience developing and maintaining Spark-based applications.
  • At least 3 years of experience with data tools and frameworks such as Flink, Hadoop, Presto, Hive, or Kafka.
  • At least 2 years of experience with data warehousing (Snowflake, Redshift, BigQuery, etc.).
  • Experience with AWS cloud services: EMR, EKS, Kinesis, EventBridge, DynamoDB, Lambda.
  • Deep understanding of ETL/ELT, data ingestion, and data cleansing, with strong engineering skills.
  • Experience designing and building large-scale applications.
  • Undergraduate degree in Computer Science, Computer Engineering, or similar disciplines from rigorous academic institutions.

Any of the below would be an advantage:

  • Experience with data pipeline and workflow management tools: Airflow, Azkaban, Luigi, etc.
  • Experience with building, running, and testing DBT models and macros.
  • Experience with Apache Iceberg and project Nessie.
  • Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
  • Familiarity with operating systems, especially UNIX, Linux, and macOS.
  • Experience supporting and working with cross-functional teams in a dynamic environment.

Our Team

Pagaya was founded in 2016 by seasoned research, finance, and technology entrepreneurs, and we are now 500+ strong in New York, Los Angeles, and Tel Aviv.

We move fast and smart, identifying new opportunities and building end-to-end solutions from AI models and unique data sources. Every Pagaya team member is solving new and exciting challenges every day in a culture based on partnership, collaboration, and community.

Join a team of builders who are working every day to enable better outcomes for our partners and their customers.

Our Values

Our values are at the heart of everything we do. We believe great solutions are built through a great community.

  • Advance Inclusion - We create a world where everyone can win, designing systems that better represent people and generate sustainable value for our employees, partners, and investors.
  • Be Accountable Together - We proudly own our actions and our results, taking initiative to ensure our work gets over the finish line as a team.
  • Continuously Learn - We challenge ourselves for the sake of getting better as individuals, as teams, and as an organization to deliver for our partners.
  • Debate and Commit - We respectfully and openly debate to strengthen our ideas and build shared conviction - once we decide, we go all in, together.
  • Dream Big and Act - We boldly tackle complex problems, pressure-test solutions in real time, and adapt with speed and energy.

More than just a job

We believe health, happiness, and productivity go hand-in-hand. That's why we're continually looking to enhance the ways we support you with benefits programs and perks that allow every Pagayan to do the best work of their life. 


Pagaya is an equal opportunity employer. We encourage diversity and actively seek applicants from all backgrounds, and we are committed to building a diverse workforce and an inclusive environment for all. Employment is decided on the basis of qualifications, skills, and business needs.

Apply for this Job
