Sauce Labs provides the world's largest automation cloud for testing web and native/hybrid mobile applications. Founded by the original creator of Selenium, Sauce Labs helps companies accelerate software development cycles, improve application quality, and deploy with confidence across 500+ browser/OS platforms. Join us in making the world a better place for continuous integration and software development. We're building a next-generation infrastructure-as-a-service platform and are looking for a passionate Senior Data Engineer to join our Data Analytics team.

Our Data Analytics team consists of data scientists, data engineers, and data analysts working together to unlock the value of our data, uncover new revenue opportunities, and create a more compelling and differentiated user experience. As a Senior Data Engineer on the team, you will play a key role in architecting and building our next-generation data analytics platform.

Responsibilities:

  • Design, architect, and support new and existing data and ETL pipelines, and recommend improvements and modifications
  • Design, build, and support the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources
  • Responsible for ingesting data into our data lake and data warehouse and providing frameworks and services for operating on that data using various toolsets
  • Responsible for extending the data analytics platform to include real-time streaming analytics, big data analytics, and machine learning capabilities
  • Work with the team to define and build datasets for dashboard/reporting, and pre-processed datasets for advanced analytics and machine learning
  • Work with the security team to implement data privacy and data security requirements, ensuring solutions stay compliant with security standards and frameworks (such as ISO, SOC 2, etc.)
  • Contribute to the core design of data architecture, data models and schemas, and implementation plan
  • Apply best practices for testing, monitoring, alerting, auto-recovery, design patterns, security, etc.

Requirements:

  • BS or MS in Computer Science, Engineering, or a related technical discipline or equivalent experience
  • Experience implementing complex ETL pipelines, preferably in connection with MySQL, third-party systems (Salesforce, Mixpanel, …), AWS S3, and cloud data warehouses
  • Experience with cloud data warehouses such as Redshift, Athena, BigQuery, or Snowflake for ad-hoc and advanced analytics
  • Experience designing and implementing data lakes, and familiarity with dimensional data modeling and schema design in data warehouses
  • Experience with BI and data visualization tools such as Looker or Tableau
  • Proficient in Python programming
  • Familiarity with cloud platforms such as AWS or GCP
  • Experience with cloud ETL tools like Stitch or Alooma and workflow management tools like Airflow or Luigi
  • Familiarity with how MySQL replication and clustering work
  • Excellent communication skills, particularly translating between technical and non-technical stakeholders
