Who We Are
HomeLight is a venture-backed technology startup revolutionizing the $1 trillion real estate industry. Our mission is simple – we empower people to make smarter decisions during one of life’s most important moments: buying or selling their home.
HomeLight’s technology analyzes millions of home transactions to determine which agent or cash buyer is right for you. We also offer innovative financing and closing solutions, creating an end-to-end real estate experience that's simple, certain, and satisfying.
We pride ourselves on our company culture – but don’t just take it from us. We’ve been recognized as a best place to work by Forbes, Inc. Magazine, and the San Francisco Business Times. Our team breaks barriers every day while staying committed to HomeLight's goals and core values, which are crucial to our shared success.
Who You Are
We are building our Data Engineering team to tackle HomeLight's diverse data challenges. This position is an excellent opportunity for an engineer who wants to own the development, optimization, and operation of our data pipeline, which collects, processes, and distributes data to a suite of HomeLight products and teams. You will provide mission-critical data to both our algorithms and internal users, refining our product and identifying new markets.
What You'll Do Here
Some projects you will work on:
- Execute and optimize requests to pull, analyze, interpret, and visualize data
- Partner with team leaders across the organization to build out and iterate on team and individual performance metrics
- Optimize our data release processes, and partner with team leads to improve existing data pipelines
- Design and develop systems that ingest and transform our data streams using the latest tools
- Design, build, and integrate cutting-edge databases and data warehouses; develop new data schemas and devise innovative ways of storing and representing our data
- Research, architect, build, and test robust, highly available, and massively scalable systems, software, and services
What You'll Need
- 3+ years of Python and ETL experience, preferably with Airflow
- Experience writing and executing complex SQL queries
- Experience designing, implementing, and maintaining data pipelines and ETL processes
- Experience with the Scrum/Agile software development process
Bonus points for
- Expertise with Ruby on Rails
- Familiarity with the AWS ecosystem, Elasticsearch, Django, and Heroku
- Experience setting up and managing internal API services
- Experience working on a small team, ideally at a startup