MatHem pioneered and leads online grocery shopping in Sweden. The Swedish grocery market is worth $30B annually, and we are going after a large slice of that pie - in comparison, the global music streaming market is worth $19B annually. We are dedicated to creating an outstanding experience for our customers, from planning purchases with your family until the milk is in your fridge. At MatHem you have a rare opportunity to use tech to put a positive dent in people's lives. The smile on people's faces when the groceries are handed over is priceless. We create more time for whatever people love - cooking, family time, exercise and work.
We look for strong people with the ambition to make a large impact on our customers and our company - and you will have every opportunity to do so. MatHem is backed by Kinnevik, which has a track record of backing companies that successfully transform industries, from media to telco.
We're looking for an experienced Data Engineer to join our tech department in our Stockholm office. As part of an innovative, agile, international team in one of Sweden's fastest-growing tech companies, you'll have a unique opportunity. We work in a multi-cloud environment (AWS and GCP), with the primary emphasis for data products and services on GCP. For further details on our setup, please read our very own Robert Sahlin's blog post: https://robertsahlin.com/fast-and-flexible-data-pipelines-with-protobuf-schema-registry/.
Responsibilities
- Work closely with data scientists and product teams to understand data and analysis requirements
- Design and build new dimensional data models and schema designs to improve accessibility, efficiency, and quality of data
- Drive the design, building, and launching of new data pipelines and data products in production
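To give a flavor of the dimensional modeling work described above, here is a minimal star-schema sketch in Python using the standard-library sqlite3 module. All table and column names are hypothetical illustrations, not MatHem's actual schema:

```python
import sqlite3

# Hypothetical star schema: one fact table referencing two dimension tables.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_product (
        product_key INTEGER PRIMARY KEY,
        name        TEXT,
        category    TEXT
    );
    CREATE TABLE dim_date (
        date_key INTEGER PRIMARY KEY,
        iso_date TEXT
    );
    CREATE TABLE fact_order_line (
        product_key INTEGER REFERENCES dim_product(product_key),
        date_key    INTEGER REFERENCES dim_date(date_key),
        quantity    INTEGER,
        amount_sek  REAL
    );
""")
conn.execute("INSERT INTO dim_product VALUES (1, 'Milk 1L', 'Dairy')")
conn.execute("INSERT INTO dim_date VALUES (20240101, '2024-01-01')")
conn.execute("INSERT INTO fact_order_line VALUES (1, 20240101, 2, 25.80)")

# A typical analytical query: revenue per category per day.
row = conn.execute("""
    SELECT p.category, d.iso_date, SUM(f.amount_sek)
    FROM fact_order_line f
    JOIN dim_product p ON p.product_key = f.product_key
    JOIN dim_date    d ON d.date_key    = f.date_key
    GROUP BY p.category, d.iso_date
""").fetchone()
print(row)  # ('Dairy', '2024-01-01', 25.8)
```

The same shape - narrow fact tables joined to descriptive dimensions - is what makes data accessible and efficient to query in warehouses like BigQuery.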
Skills & Wish list
- Bachelor's degree and/or MSc in Computer Science or equivalent experience
- 3+ years of experience working for a technology company
- Strong software engineering fundamentals, including data structures and complexity analysis
- Fluency in an OOP language, such as Python, Java, C#, or similar, as well as an ANSI-SQL variant
- Experience working with cloud-based data warehouses and data lakes (such as BigQuery, Redshift, or similar)
- Interest in AI/ML and serverless architectures is a plus, as is experience with workflow orchestration (e.g. Airflow, Cloud Composer) and streaming data processing technologies (such as Apache Beam, Spark Streaming, Kafka Streams, or similar)
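For candidates curious about the streaming side of the wish list, the core idea behind the windowed aggregations that Beam, Spark Streaming, and Kafka Streams provide can be sketched in plain Python. This is a conceptual illustration with no framework dependency; the event names and 60-second window are made up:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_seconds=60):
    """Group (timestamp, key) events into fixed, non-overlapping time
    windows and count events per key in each window - conceptually the
    same aggregation that Beam's FixedWindows or Spark's tumbling
    windows perform at scale over unbounded streams."""
    counts = defaultdict(int)
    for ts, key in events:
        # Align each event's timestamp to the start of its window.
        window_start = ts - (ts % window_seconds)
        counts[(window_start, key)] += 1
    return dict(counts)

# Illustrative click-stream events: (unix_timestamp, event_type)
events = [(0, "add_to_cart"), (30, "add_to_cart"), (75, "checkout")]
print(tumbling_window_counts(events))
# {(0, 'add_to_cart'): 2, (60, 'checkout'): 1}
```

Real streaming engines add the hard parts on top of this - out-of-order data, watermarks, and fault tolerance - which is exactly where experience with those technologies pays off.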