Envoy is transforming modern workplaces, challenging the status quo with products that make office life and work more meaningful. Envoy has redefined how offices interact with visitors and manage deliveries in over 13,000 locations around the globe while building products for a new era of workplace experience. Companies like Slack, Asana, Pinterest, and Warby Parker rely on Envoy to create an unrivaled first impression and keep their offices secure and compliant.
Our mission is to challenge the status quo of workplace technology. This idea started at the front desk, where we set a new standard for visitor sign-in. Now, we’re looking around the office—to the mailroom, meeting rooms, and beyond—and asking: how can we make this better, too? We envision a world where technology is woven through our workplaces, all of it working together to make our time there delightful.
Build out our data transformation pipeline (Airflow, dbt, Spark, Redshift), which powers all our analysis, research, and reporting, and ensure it is reliable, fast, and scalable
Build full-stack data products that power key tools and apps (Flask, React, Vega)
Drive key decisions on standards, frameworks, and tools across the data team
Serve as the domain expert on the technologies that power our data stack, and ensure that best practices are always being followed in both code and process
Learn and implement innovative technologies to continuously improve the value that our data platform can deliver
Explore data from all sides of the business (Sales, Success, Marketing, Finance) and drive insights through modeling (marketing attribution, churn prediction, account health, lead scoring, etc.)
5+ years of software engineering or coding experience
BS / MS in Computer Science or a related technical field
Mastery of SQL
Experience with Airflow (or another orchestration framework)
Experience with dbt, ideally on a large dbt project
Experience with Python
Have built ETL pipelines from scratch and scaled them up to process high volumes of data
Have built and maintained APIs
Have built real-time data pipelines using streaming technologies (Kafka, Kinesis)
Have deployed ML models in a production environment
Experience with AWS data services (Redshift, EMR, S3, etc.)
If this kind of work sounds interesting, we'd love to hear from you! We're open to all backgrounds and levels of experience, and we believe that great people can always find a place. People do their best work when they can be themselves, so we value uniqueness. We never discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status or disability status.