We're looking for a skilled Data Engineer to work on a modern property investment platform, using an all-cloud data stack that includes Airflow, Docker, DBT, Python, Snowflake, Tableau, Sigma, and SQL.
- Design, implement, and deploy scalable, fault-tolerant pipelines that ingest and refine large, diverse datasets (structured, semi-structured, and unstructured) into simplified, accessible data models in production
- Build departmental data marts to support analytics across the company
- Collaborate with cross-functional teams to understand data flows and processes, enabling the design and creation of the best possible solutions
- Deliver quality data solutions in a timely manner, and take ownership of data governance and integrity while meeting objectives and maintaining SLAs
- Build tools and foundational datasets that encourage self-service
- Improve and maintain the data infrastructure
- 3+ years professional experience in writing production Python code, shell scripts, and complex SQL
- 3+ years professional experience building robust data pipelines beyond simple API pulls
- 1+ years building and deploying data-related infrastructure (messaging, storage, compute, transformation, execution via Docker, and/or CI/CD pipelines across dev/stage/prod)
- Experience in data warehousing and dimensional data modeling
- Desirable: experience with Airflow (or similar tools), AWS, Azure, and DBT
- Payment in USD
- Free credentials for e-learning platforms
- Remote workshops & activities