About sweetgreen:

sweetgreen is on a mission to build healthier communities by connecting people to real food. We passionately believe that real food should be convenient and accessible to everyone. Every day in each sweetgreen, our workforce of more than 4,000 makes food from scratch, using fresh ingredients and produce delivered that morning. And in our local communities, we’re committed to leaving people better than we found them. We’re in the business of feeding people, and we’re out to change what that means.

We have a talented collective of people on our data, digital, and technology teams delivering first-class, data-driven products to our customers, fleet, and internal operations. If you want to work in a flat environment that prioritizes work product, shipping quality services, and accountability, then this is the job for you.

Ideally, we want someone passionate about and experienced with data: small, large, and BIG. You’ll be at the center of understanding, applying, and processing data for our thriving fast-casual marketplace, using it to connect our digital product seamlessly with our four-wall restaurants.

Tasks & Activities 

The Data Infrastructure Engineer is responsible for the design, development, and maintenance of our current data platform. This is not a Data Science role: you’ll focus on infrastructure, building, iterating on, and optimizing our ecosystem using an object-oriented framework. You’ll come to understand sweetgreen’s complex data ecosystem and build, deploy, and maintain scalable, reusable pipelines that are mission critical to our strategic decision making and that promote data adoption across the company. From our customer-facing digital team to our back-of-house operations team, you will create language-agnostic data solutions that enable actionable business insight and the creation of essential operational tools.

  • Design, model, architect, and integrate disparate data systems — relational, NoSQL, HDFS, and object stores — into our Data Lake
  • Develop integrations from all data sources (structured/unstructured) to refine, clean, transform, and apply data across various experience channels
  • Work with the data team to automate pipelines for analytics and deep business insights from available data sets
  • Define and apply best practices for building scalable and secure systems that are interoperable
  • Be a self-managed team player able to coordinate with partners to deliver on multiple concurrent projects
  • Apply working knowledge of cloud-based systems (GCP, Azure, AWS)
  • Performance tune and optimize data systems
  • Define specifications for functional and technical deliverables 

BASIC QUALIFICATIONS

  • Must have 5+ years of hands-on experience in one or more object-oriented languages (C++, Java, C#, or similar).
  • BS/BA in Computer Science, Information Systems, Engineering, or a related field, plus 5+ years of experience or equivalent training.
  • 5+ years of hands-on experience with open-source, enterprise, and cloud database systems such as MySQL, Postgres, Oracle, MSSQL, Redshift, and Netezza.
  • Applied experience programming in distributed environments and clusters.
  • Experience working in a highly agile and dynamic environment.
  • Experience designing, developing, and programming API solutions for integration with various client endpoints, including native applications.
  • Experience working with small, rapid development teams across project definition, scoping, estimating, planning, development, testing, and launch.
  • Experience with software security best practices is a plus.
  • Strong verbal and written communication skills, with an ability to express complex technical concepts in business terms. 

PREFERRED QUALIFICATIONS

  • 5+ years of applied experience with streaming producer/consumer systems such as Kafka, Kinesis, and RabbitMQ
  • Applied working knowledge of at least one scripting language (Python, Scala, or similar)
  • 1–4 years of hands-on experience with non-relational stores such as DynamoDB, Cassandra, Redis, and Memcached
  • Familiarity with CI/CD and infrastructure tools such as Jenkins, CircleCI, and CloudFormation
  • Experience with Airflow is a plus
  • Familiarity with containerization and container orchestration concepts (e.g., Docker and Kubernetes)

What you'll get:

  • Competitive pay + a performance-based bonus plan
  • Health, dental + vision insurance
  • Flexible PTO, because we respect the need for work/life harmony
  • An opportunity to make a real impact on the people around you, both by growing them and by connecting them to real food
  • To live the sweetlife and celebrate your passion + purpose
  • A collaborative family of people who live our core values and have your back
  • A clear career path with opportunities for development, both personally and professionally
  • Our annual Impact Retreat offsite
  • Free sweetgreen swag
  • Complimentary sweetgreen

Come join the sweetlife! 

sweetgreen provides equal opportunities for everyone that works for us and everyone that applies to join our team, without regard to sex or gender, gender identity, gender expression, age, race, religious creed, color, national origin, ancestry, pregnancy, physical or mental disability, medical condition, genetic information, marital status, sexual orientation, any service, past, present, or future, in the uniformed services of the United States (military or veteran status), or any other consideration protected by federal, state, or local law. 

sweetgreen participates in the federal government's E-Verify program to determine employment eligibility. To learn more about the E-Verify program, please click here.
