Come Join the Wrike Family
At Wrike, we believe that work should be both challenging and fun. We're growing rapidly and providing excellent opportunities for professional growth. We owe our success to our talented and energetic team that's really fun to work with. We're smart, passionate, friendly, and professional, and we are looking for the same qualities in you.
Senior Data Engineer
We're looking for a Senior Data Engineer to build reliable and scalable infrastructure and to help design and maintain clean data sources for analyzing business operations.
You will be responsible for conceptualizing, designing, and building data pipelines and services, while contributing to the evolution of our data platform as a strategic asset for driving business decisions across Wrike. Depending on your experience and interests, you will work on data warehouse organization, SaaS integrations, tooling, or practical AI implementations. We value people who are interested in more than one of these areas, and we'll gladly help you pick up new ones.
Job Scope and Accountabilities:
- Own data assets and data pipelines that provide actionable insights into customer, product, GTM, and other key business functions
- Design, develop and maintain scalable data pipelines and transformations using data from a variety of engineering and business systems (e.g., Salesforce/CPQ, NetSuite, Marketo)
- Collaborate with analysts to improve the data models that feed business intelligence tools, increasing data accessibility and driving data adoption
- Deploy ML models together with Data Science teams according to best practices of ML life cycle, and improve our AI infrastructure
- Implement processes and systems to manage data quality, ensuring production data is always accurate and meets SLAs
Requirements:
- Experience building and maintaining data pipelines in data-heavy environments (Data Engineering, Backend with an emphasis on data processing, or Data Science with an emphasis on infrastructure)
- Strong knowledge of Python required
- Advanced working knowledge of SQL and experience with relational databases
- 7+ years experience with Data Warehousing Solutions (BigQuery, Redshift, Snowflake, Vertica, or similar)
- 7+ years experience with Data Pipeline Orchestration (Airflow, Dagster, Prefect, or similar)
- Confidence using Git, CI/CD, and containerization
- Experience with Google Cloud Platform, AWS, or Azure
- Database architecture experience
- Experience with major B2B vendor integrations (Salesforce/CPQ, NetSuite, Marketo, etc.)
- Good understanding of data modeling approaches (Kimball, Inmon, SCDs, Data Vault, Anchor)
- Knowledge of Python data libraries (Pandas, SciPy, NumPy, scikit-learn, TensorFlow, PyTorch)
- Experience with data quality tools, monitoring, and alerting
- Experience with enterprise data governance, master data management, and data privacy and security (GDPR, CCPA)
- Familiarity with Data Streaming and CDC (Google Pub/Sub, Google DataFlow, Apache Kafka, Kafka Streams, Apache Flink, Spark Streaming, or similar)
- Experience with building analytic solutions in a B2B SaaS environment
- Experience partnering with go-to-market, sales, customer success, and marketing teams
- Strong communication skills, a collaborative demeanor, and the ability to work in distributed, cross-functional, multinational teams while articulating a clear point of view
- A commitment to fostering a fun and productive team environment
Your recruitment buddy will be Pavel Kucera, Senior Tech Recruiter.
About Wrike and Our Culture
Wrike promotes a hybrid work model, and we meet in the office three times a week. This work model supports our culture of collaboration and of solving problems fast to deliver business outcomes and win together.
Our Culture and Values
📈 Deliver Business Outcomes
🥇 Be better than the competition
🚀 Move fast. Then, move faster
🤝 Know our customers
🏆 We win together
💪 Have courage