Founded in 2015, Apollo is a leading sales intelligence and engagement platform trusted by over 15,000 paying customers, from rapidly growing startups to the largest global enterprises. Our platform unifies a database of 200 million business contacts with advanced intelligence and engagement tools, helping over 500,000 sales, marketing, and recruiting professionals connect with the right person at the right time with the right message, at speed and scale.
In the last year, we’ve grown ARR 3x, quadrupled our active users, been profitable in 18 of the past 20 months, and recently closed a $110M Series C led by Sequoia Capital to fuel the next phase of our growth.
Working at Apollo
We are a remote-first inclusive organization focused on operational excellence. Our way of working ensures clear expectations and an environment to do your best work with ample reward.
YOUR ROLE AND MISSION:
As a Staff Data Engineer, you will be responsible for maintaining and operating the data warehouse and connecting Apollo’s data sources into it.
• Develop and maintain scalable data pipelines and build new integrations to support continuing increases in data volume and complexity.
• Implement automated monitoring, alerting, and self-healing (restartable, gracefully failing) features while building consumption pipelines.
• Implement processes and systems to monitor data quality, ensuring production data is always accurate and available.
• Write unit/integration tests, contribute to the engineering wiki, and document your work.
• Define company data models and write jobs to populate them in our data warehouse.
• Work closely with all business units and engineering teams to develop a strategy for long-term data platform architecture.
COMPETENCIES:
• Excellent communication skills for working with engineering, product, and business owners to define key business questions and build data sets that answer them
• Self-motivated and self-directed
• Inquisitive; able to ask questions and dig deeper
• Organized and diligent, with great attention to detail
• Acts with the utmost integrity
• Genuinely curious and open; loves learning
• Critical thinking and proven problem-solving skills
SKILLS & EXPERIENCE:
• 8+ years of experience in data engineering or a data-facing role
• Experience in data modeling, data warehousing, and building ETL pipelines
• Deep knowledge of data warehousing and the ability to collaborate cross-functionally
• Bachelor's degree in a quantitative field (Physical Sciences, Computer Science, Engineering, Mathematics, or Statistics)
• Experience using the Python data stack
• Experience deploying and managing data pipelines in the cloud
• Experience working with technologies like Airflow, Hadoop, and Spark
• Understanding of streaming technologies such as Kafka and Spark Streaming
What You’ll Love About Apollo
Besides the great compensation package and a culture that thrives on openness and excellence, we invest tremendous effort into developing our remote employees’ careers. The team embraces a sole purpose: to help customers maximize their full revenue potential on the Apollo platform. This mindset opens us up to many creative approaches to making customers successful at scale. You’ll be a significant part of a lean, remote team, empowered to truly own your role as a proactive educator. We’re very collaborative at Apollo, so you’ll be able to lean on your teammates, even in adjacent departments, to help you achieve lofty goals. You’ll be supported and encouraged to experiment and take educated risks that lead to big wins. And you’ll have a whole team remotely by your side to help you do it!