What is Teachable?

Teachable is trusted by creator-educators around the world to grow their impact and income. From online courses and communities to memberships and downloads, Teachable's digital learning products help creator-educators drive meaningful connection and sustainable revenue. With industry-best ecommerce tools, easily toggled on directly within the platform, creators can confidently maximize their earnings, while getting paid directly by their audience. Teachable's unmatched focus on the student learning experience also ensures creators can make a positive and influential impact on their communities—entirely on their own terms. Today, tens of thousands of creator-educators use Teachable to share their knowledge, reaching millions of students around the world. To learn more, visit teachable.com.

Are you ready to join a dynamic, cross-cultural team at an exciting turning point in our company’s journey? Now part of the global Hotmart Company portfolio, whose platforms have helped creators earn more than $10 billion, Teachable continues to take the creator economy by storm as a true industry leader. Together, Teachable and Hotmart are delivering market-leading products that prioritize creator control and flexibility, alongside meaningful partnership and support from our team. If you have big ideas, relish the chance to challenge convention, and deeply believe in the power of creators to shape the future, we want you on our team!

About You

We are seeking a skilled Analytics Engineer to join our dynamic Data Team. The ideal candidate will have a comprehensive understanding of the data lifecycle from ingestion to consumption, with a particular focus on data modeling. This role will support various business domains, predominantly Finance, by organizing and structuring data to support robust analytics and reporting.

This role will be part of a highly collaborative team made up of US and Brazil-based Teachable and Hotmart employees.

What You’ll Do

  • Data Ingestion to Consumption: Manage the flow of data from ingestion to final consumption. Organize data, understand modern data structures and file types, and ensure proper storage in data lakes and data warehouses.
  • Data Modeling: Develop and maintain entity-relationship models. Relate business and calculation rules to data models to ensure data integrity and relevance.
  • Pipeline Implementation: Design and implement data pipelines, preferably using SQL or Python, to ensure efficient data processing and transformation.
  • Reporting Support: Collaborate with business analysts and other stakeholders to understand reporting needs and ensure that data structures support these requirements.
  • Documentation: Maintain thorough documentation of data models, data flows, and data transformation processes.
  • Collaboration: Work closely with other members of the Data Team and cross-functional teams to support various data-related projects.
  • Quality Assurance: Implement and monitor data quality checks to ensure accuracy and reliability of data.
  • Cloud Technologies: While the focus is on data modeling, familiarity with cloud technologies and platforms (e.g., AWS) is a plus.

What You’ll Bring

  • 3+ years of experience in data engineering, analytics engineering, or similar functions.
  • Experience collaborating with business stakeholders to build and support data projects.
  • Experience with database languages, indexing, and partitioning to handle large volumes of data and create optimized queries and databases.
  • Experience manipulating and organizing data files in formats such as Parquet.
  • Experience with the "ETL/ELT as code" approach for building Data Marts and Data Warehouses.
  • Experience with cloud infrastructure and knowledge of solutions like Athena, Redshift Spectrum, and SageMaker.
  • Experience with Apache Airflow for creating and orchestrating DAGs.
  • Critical thinking for evaluating contexts and choosing delivery formats that meet the company’s needs (e.g., materialized views).
  • Knowledge of development languages and frameworks, preferably Python or Spark.
  • Knowledge of SQL.
  • Knowledge of S3, Redshift, and PostgreSQL.
  • Experience developing highly complex historical data transformations; experience working with event data is a plus.
  • Experience with ETL orchestration and updates.
  • Experience with error and inconsistency alerts, including detailed root cause analysis, correction, and improvement proposals.
  • Experience with documentation and process creation.
  • Knowledge of data pipeline and lakehouse technologies is a plus.

Where You’ll Work

While Teachable maintains our NY office for local employees to use, we operate as a remote-first culture in order to give our employees added flexibility. In order to maintain connection and create a community beyond the screen, Teachable holds in-person events throughout the year, where employees and teams can come together for bonding, strategic alignment, goal-setting, and celebrations!
Teachable encourages individuals from a broad diversity of backgrounds to apply for positions. We are an equal opportunity employer, meaning we're committed to a fair and consistent interview process. Please tell us in your application if you require an accommodation to apply for a job or to perform your job.
