unybrands is an equal opportunity employer and considers all applicants for employment without regard to race, color, religion, gender identity, sexual orientation, age, disability, or any other protected class.
unybrands was founded in 2020 by a group of partners who shared a common vision to create the leading next-generation e-commerce platform for micro-brands. The company operates globally, with headquarters in Miami and additional teams based in Berlin, London, New York, Seattle, and Shanghai.
unybrands acquires e-commerce brands that operate on and off Amazon. unybrands integrates the brands into its platform, optimizes the business operations and economics, and expands to new product lines and geographies. With us, e-commerce brands reach new heights with expert operators and infrastructure.
About the role:
The Data Engineer will collaborate closely with other Data Engineers, Software Development Engineers (SDEs), Analytics Engineers (AEs), Data Scientists (DSs), and Infrastructure Engineers (IEs) on the unybrands technology team. This role reports directly to the company's Director of Data Engineering.
Responsibilities include, but are not limited to:
- Collaborate closely with other Data Engineers, Software Development Engineers (SDEs), Analytics Engineers (AEs), Data Scientists (DSs), and Infrastructure Engineers (IEs) on the unybrands technology team.
- Focus on building a modular, flexible data engineering infrastructure that we can continuously and iteratively improve. Your customers will include e-commerce, marketing, supply chain, finance, business development, and legal (i.e., essentially every part of unybrands).
- Treat our data as a strategic resource and build data products to support every aspect of our business.
- Build unybrands' data model(s) as well as our data infrastructure (in collaboration with the Director of Analytics and the Head of Infrastructure Engineering).
Requirements
Technical Expertise
Data Architecture and Modeling:
- Experience working with data architectures for scalable and performant systems.
- Experience with dimensional data modeling, OLAP systems, and data warehousing concepts (star/snowflake schema, etc.).
- Strong experience with GCP.
- Experience with BigQuery and Looker Studio.
- Experience with NoSQL databases (e.g., DynamoDB, Cassandra) and SQL databases (e.g., PostgreSQL, MySQL).
Data Pipelines and ETL:
- Experience designing, implementing, and optimizing ETL pipelines for efficient data movement and transformation.
- Hands-on experience with workflow management tools such as Apache Airflow.
- Proficiency in Python and other relevant programming and scripting languages for data processing.
- Strong SQL expertise for querying large datasets.
- Knowledge of API integration for extracting data from multiple sources (internal and external systems such as Amazon's API for FBA, sales data, etc.); see the illustrative pipeline sketch below.
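As a rough illustration of the kind of pipeline work this role involves, here is a minimal sketch of an Airflow DAG that pulls order data from an external API, applies a light transformation in Python, and loads the result into BigQuery. The endpoint URL, table name, schedule, and field names are illustrative assumptions, not a description of unybrands' actual systems.

```python
from datetime import datetime

import requests
from airflow.decorators import dag, task
from google.cloud import bigquery


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def daily_orders_ingest():
    @task
    def extract() -> list:
        # Hypothetical endpoint standing in for an external sales/FBA API.
        resp = requests.get("https://example.internal/api/orders", timeout=30)
        resp.raise_for_status()
        return resp.json()

    @task
    def transform(rows: list) -> list:
        # Keep only the fields the warehouse model needs.
        return [
            {"order_id": r["id"], "sku": r["sku"], "amount": float(r["amount"])}
            for r in rows
        ]

    @task
    def load(rows: list) -> None:
        client = bigquery.Client()
        # Hypothetical destination table (dataset.table) in the GCP project.
        job = client.load_table_from_json(rows, "analytics.daily_orders")
        job.result()  # Block until the load job finishes.

    load(transform(extract()))


daily_orders_ingest()
```

In practice, scheduling, retries, credentials, and destination tables would come from the deployment's Airflow configuration and GCP service accounts rather than being hard-coded as above.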
DevOps and Automation:
- Experience with CI/CD pipelines, automation scripts, and infrastructure-as-code tools (e.g., Terraform, CloudFormation).
- Ability to implement data versioning and data quality checks.
Data Privacy:
- Strong understanding of data privacy and compliance regulations like GDPR or CCPA.
- Ensure data governance best practices, including encryption, access controls, and audit trails.
Monitoring and Observability:
- Expertise in implementing data quality checks, monitoring data pipelines, and ensuring that systems are reliable and efficient.
- Proficiency with monitoring tools for real-time visibility into data flows.
Nice to Have:
- Experience with e-commerce, retail, or supply chain data modeling is a big plus.
- Experience with the Amazon seller ecosystem is a big plus.
Communication and Problem-Solving Skills
- Ability to solve complex data challenges, propose innovative solutions for scaling and optimizing data architecture, and proactively drive improvements.
- Ability to adapt to a fast-paced, constantly evolving scale-up environment.
- Capable of handling well-defined projects and prioritizing based on business impact.
- Proficient in English.