Flyhomes is a place where authenticity, equity, and innovation collide to build the world’s best homebuying experience. Innovation is in our DNA. What will you create here?

We offer brokerage, mortgage and closing services—all under one roof—to ensure our clients have an amazing experience from the moment they start to work with us, to the moment they move in, and beyond. 

Real estate, mortgage and technology are what we do, but people are at the core of our mission. That’s where you come in! Whether you're our employee or our client, we believe it’s about people, not properties. From client-facing roles to technology, and everywhere in between, you’ll work alongside a diverse team who loves to solve problems, think creatively, and fly the plane as we continue to build it. If you’re dedicated to creating an inclusive, equitable, and more enjoyable real estate experience with solutions for every homebuyer, then we want to talk to you!

JOB DESCRIPTION 

What You’ll Do 

  • Create and maintain optimal data pipeline architecture.
  • Assemble large, complex data sets that meet functional and non-functional business requirements.
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc. 
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS ‘big data’ technologies. 
  • Develop in object-oriented or functional scripting languages such as Python, Java, or Ruby.
  • Keep our data separated and secure across national boundaries through multiple data centers and AWS regions. 
  • Create data tools that help analytics and data science team members build and optimize our product into an innovative industry leader.

What You'll Need

  • Expert-level SQL skills 
  • 7-10 years of experience with database technologies (e.g., Postgres, MySQL, SQL Server, Oracle, Redshift)
  • Minimum 5 years of experience in Data Warehousing 
  • Experience creating and maintaining automated data pipelines 
  • Working knowledge of dimensional modeling techniques 
  • Working knowledge of data quality approaches and techniques 
  • Experience with AWS tools (S3/Redshift/DynamoDB/IAM) is highly desired 
  • Architectural insight on where to store data and modeling experience to recommend how it should be structured to make it accessible, performant, and resilient to change 
  • An entrepreneurial spirit, a drive to ship quickly, and familiarity with agile software development practices
  • The ability to deal with ambiguity, clear communication with both technical and non-technical partner teams, and strong empathy for the customer experience
  • Programming language experience (Python, Java, etc.) is a plus
  • API development experience is a plus 
  • The ability to work within an Agile/Scrum development process
