Come help us build the world's most reliable on-demand logistics engine for delivery! We're bringing on talented engineers to help us create and maintain a 24x7, zero-downtime, global infrastructure system that powers DoorDash’s three-sided marketplace of consumers, merchants, and Dashers.
DoorDash’s Streaming & Realtime Data Platform team is responsible for the infrastructure that enables DoorDash’s on-demand logistics products and business to obtain data in near real time, reliably and at scale. The domain of work includes messaging via Apache Kafka, stream processing using Flink and Spark Streaming, OLAP infrastructure, and real-time analytics queries. The team is a mix of experienced stream processing veterans from prestigious companies and highly passionate up-and-coming streaming engineers.
If you want to solve some of the toughest engineering challenges in the world and learn from some of the smartest people in the industry, DoorDash’s Streaming & Realtime Data Platform team is the right place for you. Come join us and be part of the mission.
What You’ll Achieve
- You will help identify the right technical stack to build and adopt a real-time data ecosystem at DoorDash.
- You will work alongside our Data Analysts, Data Scientists, ML Engineers, and Data Infrastructure engineers on important projects that need real-time data.
- You will help build high-performance, flexible streaming pipelines that can rapidly evolve to handle new technologies, techniques, and modeling approaches.
About You
- High-energy and confident - you’ll do whatever it takes to win
- You’re an owner - driven, focused, and quick to take ownership of your work
- Humble - you’re willing to jump in and you’re open to feedback
- Adaptable, resilient, and able to thrive in ambiguity - things change quickly in our fast-paced startup and you’ll need to be able to keep up!
- Growth-minded - you’re eager to expand your skill set and excited to carve out your career path in a hyper-growth setting
- Desire for impact - ready to take on a lot of responsibility and work collaboratively with your team
Qualifications
- B.S., M.S., or Ph.D. in Computer Science or equivalent
- Exceptionally strong knowledge of CS fundamentals and object-oriented programming languages
- 5+ years of industry experience
- Prior experience working on stream processing systems in production, such as enabling real-time data analytics at scale using technologies like Apache Kafka, Flink, Spark, Druid, Cassandra, and Elasticsearch
- Systems Engineering - you've built meaningful pieces of infrastructure in a cloud computing environment. Bonus points if those were data processing or distributed systems
Nice To Haves
- Experience with real-time technology problems
- Familiar with Pandas / Python machine learning libraries
- Familiar with Spark, MLlib, Databricks MLflow, Apache Airflow, and similar technologies
- Familiar with a cloud-based environment such as AWS
Founded in 2013, DoorDash is a San Francisco-based technology company passionate about transforming local businesses and dedicated to enabling new ways of working, earning, and living. Today, DoorDash connects customers with their favorite local and national businesses in more than 850 cities across the United States and Canada. By building intelligent, last-mile delivery technology for local cities, DoorDash aims to connect people with the things they care about — one dash at a time.