At Curology, we're revolutionizing dermatology by making effective skincare accessible to everyone.
We're part telemedicine startup, part skincare lab, and completely focused on helping hundreds of thousands of people get medical care that was previously available to only a tiny percentage of the population.
Our consumer-facing app powers 1:1 chats with real dermatology providers and lets users engage with a vibrant community, and our customized skincare formulations help people see life-changing improvements in their skin.
And it's working: as a business, we have grown more than 3x over the last year!
Responsibilities
Design, develop, and scale data pipelines (currently using Airflow and Redshift)
Work closely with the data science, product, and marketing teams to conceive and implement new data features
Develop internal tooling for automating the deployment of data infrastructure
Own monitoring for our applications
Research and implement improvements to how we manage our infrastructure and data pipelines
Skills and Experience
3+ years of experience working in a Software Engineering/DevOps/Data Engineering role
Experience with AWS, shell scripting/UNIX, and Python
Familiarity with an ETL tool like Airflow or Luigi
Experience managing and automating AWS deployments
Previous experience with distributed systems, data processing, and analytics is a plus
Strong foundation in programming, algorithms, and software application design
Passion for solving challenging problems and iterating quickly
Passion for learning and always improving yourself and the team around you
Curology is headquartered in the beautiful Hayes Valley neighborhood of San Francisco and has additional offices (and roots!) in San Diego.