About PayPay India 

PayPay is a fintech company whose service has grown to over 63 million users (as of April 2024) in just five years since its 2018 launch in Japan. The company is now home to a very diverse team of members from more than 50 countries. We have grown to several thousand employees in Japan, but we are far from done. We are still on Day 1. Every day, new members join us from all over the world to create new value and deliver it to society.

Why India?

To build our payment service, we received technical cooperation from Paytm, a large payment service company in India, and on the foundation of their customer-first technology we created and expanded our smartphone payment service in Japan. We have therefore decided to establish a development base in India: it is a major IT country with many talented engineers, as its continued output of cutting-edge mobile payment technology shows.

OUR VISION IS UNLIMITED

We dare to believe that we do not need a clear vision to create a future beyond our imagination. PayPay will always stay true to its roots and realise a future that no one else can imagine by constantly taking risks and challenging ourselves. With this mindset, you will be presented with new and exciting opportunities daily, and you will grow and reach new dimensions that you could never have imagined.
 

Job Description

PayPay's growth is driving a rapid expansion of our product teams, and the need for a robust data engineering platform to support the growing business has never been more critical. The DaaS team designs, implements, and operates this platform using cutting-edge technologies such as Spark, Hudi, Delta Lake, Scala, and the AWS suite of data tools.

We are looking for talented Data Engineers to join our team and help us scale our platform across the organization.

 

Main Responsibilities

  • Design, develop, and maintain scalable data ingestion pipelines using AWS Glue, Step Functions, Lambda, and Terraform.
  • Optimize and manage large-scale data pipelines to ensure high performance, reliability, and efficiency.
  • Implement data processing workflows using Hudi, Delta Lake, Spark, and Scala.
  • Maintain and enhance Lake Formation and the Glue Data Catalog for effective data management and discovery.
  • Collaborate with cross-functional teams to ensure seamless data flow and integration across the organization.
  • Implement best practices for observability, data governance, security, and compliance.
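For a flavor of the "data processing workflows using Hudi, Delta Lake" responsibility above, the core upsert (merge-on-key) semantics those table formats provide can be sketched in plain Python. This is an illustrative toy, not the actual Hudi or Delta Lake APIs; the function and field names are assumptions made for the example:

```python
# Toy sketch of lakehouse-style upsert semantics: records are merged by a
# record key, so an incoming batch overwrites matching keys and appends
# new ones. Real table formats (Hudi, Delta Lake) add transaction logs,
# file management, and time travel on top of this idea.

def upsert(table: dict, batch: list, key: str = "id") -> dict:
    """Merge a batch of records into `table`, keyed on `key`."""
    merged = dict(table)
    for record in batch:
        merged[record[key]] = record  # newest record wins per key
    return merged

table = {1: {"id": 1, "amount": 100}}
batch = [
    {"id": 1, "amount": 150},  # update to an existing key
    {"id": 2, "amount": 200},  # brand-new key
]
table = upsert(table, batch)
# table now holds the updated record for id 1 and the new record for id 2
```

In Hudi this corresponds to an `upsert` write operation keyed on the record key field; in Delta Lake, to a `MERGE INTO` on the same key.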

Qualifications

  • 6+ years of experience as a Data Engineer or in a similar role.
  • Hands-on experience with Apache Hudi, Delta Lake, Spark, and Scala.
  • Experience designing, building, and operating a data lake or data warehouse.
  • Knowledge of data orchestration tools such as Airflow, Dagster, or Prefect.
  • Strong expertise in AWS services, including Glue, Step Functions, Lambda, and EMR.
  • Familiarity with change data capture tools like Canal, Debezium, and Maxwell.
  • Experience with data warehousing tools such as Amazon Athena, BigQuery, or Databricks.
  • Experience in at least one primary language (e.g., Scala, Python, Java) and SQL (any variant).
  • Experience with data cataloging and metadata management using the AWS Glue Data Catalog, Lake Formation, or Unity Catalog.
  • Proficiency in Terraform for infrastructure as code (IaC).
  • Strong problem-solving skills and ability to troubleshoot complex data issues.
  • Excellent communication and collaboration skills.
  • Ability to work in a fast-paced, dynamic environment and manage multiple tasks simultaneously.

 

 
