Flyhomes is reinventing the traditional real estate process to create a radically different experience. We level the playing field by giving any qualified buyer the chance to make a cash offer. We reduce risk and uncertainty for homeowners trying to buy a new home with our Trade Up program. We streamline the homebuying process by offering brokerage, financing, closing, and home services. Innovation is in our DNA. What will you create here?

When you join Flyhomes, you will have the ability to effect change not only across an industry, but on a human level. Real estate, mortgage, and technology are what we do, but people are at the core of our mission. From client-facing roles to data and technology, we start and end with the people we work with because they’re what matters. You’ll work alongside others who love to solve problems, think creatively, and fly the plane as we continue to build it! If this sounds appealing, we want to connect with you.


JOB DESCRIPTION 

What You’ll Do 

  • Create and maintain optimal data pipeline architecture.
  • Assemble large, complex data sets that meet functional / non-functional business requirements.
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc. 
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS ‘big data’ technologies. 
  • Develop and maintain data tooling in object-oriented scripting languages such as Python, Java, or Ruby.
  • Keep our data separated and secure across national boundaries through multiple data centers and AWS regions. 
  • Create data tools for analytics and data science team members that help them build and optimize our product into an innovative industry leader. 


What You’ll Need 

  • Expert-level SQL skills 
  • 7-10 years of experience with database technologies (e.g., Postgres, MySQL, SQL Server, Oracle, Redshift)
  • Minimum 5 years of experience in Data Warehousing 
  • Experience creating and maintaining automated data pipelines 
  • Working knowledge of dimensional modeling techniques 
  • Working knowledge of data quality approaches and techniques 
  • Experience with AWS services (S3, Redshift, DynamoDB, IAM) is highly desired 
  • Architectural insight into where data should be stored, and modeling experience to recommend how it should be structured so that it is accessible, performant, and resilient to change 
  • An entrepreneurial spirit, a drive to ship quickly, and familiarity with agile software development practices
  • The ability to deal with ambiguity, communicate well with both technical and non-technical partner teams, and bring strong empathy to the customer experience 
  • Programming language experience (Python, Java, etc.) is a plus
  • API development experience is a plus 
  • The ability to work within an Agile/Scrum development process

