Who We Are:
Materialize is the streaming SQL database company that makes it easy for any developer or analyst to understand streaming data, answer complex questions and build intelligent applications using standard technology. Whether it’s delivering personalized experiences, accurately identifying fraud, building predictive AI, or discovering new business opportunities, the ability to run complex queries on multiple streams of data and keep their answers up to date is critical to making better decisions about the changing world around us.
We are focused on bottom-up developer adoption: our core software is free to use and source available, and our business model is a cloud product that handles the management of Materialize so businesses can focus on building value. We have a vibrant developer community on Slack, and as much as possible we work in public on GitHub.
The Materialize team includes engineers who were early employees of Cockroach Labs, Ververica, and Stripe. Our work builds on top of Timely Dataflow and Differential Dataflow, both created by our co-founder Frank McSherry, a world-leading computer scientist with decades of award-winning research in all aspects of data.
Materialize is backed by Kleiner Perkins, Redpoint Ventures, and Lightspeed Venture Partners.
About the Role:
We are looking for seasoned engineers to join our Sources and Sinks team. This team owns the experience of getting data into and out of Materialize efficiently. It is performance-sensitive distributed systems work with an emphasis on correctness that encompasses integrations with many adjacent data infrastructure projects.
- Design, implement, ship, and maintain substantial parts of Materialize in Rust.
- Iterate on Materialize to discover and adapt to customer needs.
- Collaborate with other engineers and product management.
Requirements:
- 5+ years of software engineering experience focused on systems-level software.
- Solid programming fundamentals (e.g., in Java, Go, or C++) and an interest in learning Rust.
- Track record of learning new technologies and concepts quickly.
- Ability to work both autonomously and collaboratively, as needed.
- Strong written and verbal communication skills.
- Strong working knowledge of computer science fundamentals, equivalent to a B.S., M.S., or Ph.D. in Computer Science. Timely Dataflow and Differential Dataflow are built upon years of academic and industrial research, and you'll need to become familiar with the relevant research areas.
Nice to Have:
- Knowledge of stream processing.
- Familiarity with message brokers (Kafka, Kinesis, RabbitMQ) and change data capture (CDC) tools such as Debezium.
- Experience with Rust.
- Experience implementing data infrastructure.
- Experience with distributed systems or high-performance systems.
We understand it takes a diverse team of highly intelligent, passionate, curious, and creative people to develop the exceptional product we are building. Our dynamic team has incredible perspectives to share, just as we know you do, and we take great pride in being an equal opportunity employer.