About us:

Want to infuse a $25B sector of the insurance and real estate industry with predictive analytics and a tech-forward customer experience? States Title is intelligently transforming closings by applying machine intelligence to the age-old processes and procedures of the Title and Settlement industry. Our streamlined, efficient algorithms have revolutionized the title and escrow process and allowed us to scale rapidly. We are poised to transform this industry, repurposing the billions wasted on rote, manual tasks to make homeownership easier and less risky, and helping people invest their time and money in more meaningful parts of their lives.

You’re fired up to:

  • Design, build, and maintain data pipelines that extract, transform, and load data from a wide variety of sources into various data services.
  • Build a next-generation enterprise data lake with raw production data as the source of truth and always-on, versioned data pipelines.
  • Document data sources, data pipelines, and data infrastructure to share knowledge and understanding of the solutions being implemented.
  • Work with teams across the organization to assist with data-related technical issues and support their data infrastructure needs.

You definitely have:

  • Experience with scripting languages, particularly Python.
  • Experience with cloud services, particularly Azure: Blob Storage, VMs, Data Factory, SQL Data Warehouse, HDInsight, etc.
  • Ability to write complex SQL joins in your sleep and experience with stored procedures, triggers, etc.
  • Experience working at the command line on Linux systems (and preferably Windows as well).
  • Knowledge of database modeling and data warehousing concepts.

You might even have:

  • Experience with big data platforms and tools like Spark, Hive, Presto, Kafka, etc.
  • Experience with compiled languages like Java, Scala, C#, etc.
  • Familiarity with ML concepts (supervised learning, feature engineering, etc.) and experience with ML frameworks (Scikit-learn, TensorFlow, SparkML, etc.).
  • Familiarity with data lake architecture.
  • Knowledge of DevOps practices and experience with related tooling (containers, infrastructure-as-code, observability, etc).

We want the work you do here to be the best work of your life.

We believe the most valuable investment we can make - and the greatest boost we can give to your career - is to build an outstanding team of colleagues who are passionate about our mission.

We currently offer the following benefits and will continually evolve them with the goal of efficiently attracting, retaining, and leveraging the very highest-quality talent.

  • Our passionate, capable team will always be our #1 benefit
  • Learn something new every day
  • Get more done than you would anywhere else
  • Highly competitive salaries and stock option grants
  • Health, dental, and vision benefits for you and your family
  • Flexible work hours
  • Unlimited vacation policy
  • A modern, helpful 401(k) plan
  • Wellness and commuter benefits

Apply for this Job
