Here at Syndigo, we're enabling our clients to deliver better eCommerce experiences. We've mastered the right data, right now. From creation to sale, that's the value our partners get from us - a holistic, truly differentiated end-to-end solution that closes the loop while increasing sales.
Basically, we're the accurate data behind the confidence people feel when they shop online!
We cannot do all of this without our amazing people! Our employees make the magic happen here at Syndigo, and we're growing rapidly! We're ready for you to collaborate with us to challenge the status quo!
Sr. Data Integration Engineer (Onsite - Bangalore)
The Sr. Data Integration Engineer is responsible for architecting and implementing data ingestion, validation, and transformation pipelines at Syndigo. In collaboration with the Product, Development, and Enterprise Data teams, the Sr. Data Integration Engineer will design and maintain batch and streaming integrations across a variety of data domains and platforms. The ideal candidate is experienced in big data and cloud architecture and is excited to advance innovative analytics solutions! The Sr. Data Integration Engineer should be able to effectively communicate ideas and concepts to peers and have experience leading projects that support business objectives and goals.
ESSENTIAL DUTIES AND RESPONSIBILITIES:
- Take ownership of building solutions and proposing architectural designs for efficient and timely data ingestion and transformation processes geared toward analytics workloads
- Manage code deployment to various environments
- Provide constructive critique and suggest improvements via code reviews
- Work with stakeholders to define and develop data ingestion, validation, and transformation pipelines
- Troubleshoot data pipelines and resolve issues in alignment with the SDLC
- Diagnose and troubleshoot data issues, recognizing common data integration and transformation patterns
- Estimate, track, and communicate status of assigned items to a diverse group of stakeholders
REQUIREMENTS:
- 5+ years of experience developing and architecting large-scale data pipelines in a cloud environment
- Demonstrated expertise in Scala (object-oriented programming) or Python (Scala preferred) and Spark SQL
- Experience with Databricks, including Delta Lake
- Experience with Azure cloud environments, including Azure Data Lake Storage (Gen2), Azure Blob Storage, Azure Tables, Azure SQL Database, and Azure Data Factory
- Experience with ETL/ELT patterns, preferably using Azure Data Factory and Databricks jobs
- Fundamental knowledge of distributed data processing and storage
- Fundamental knowledge of working with structured, unstructured, and semi-structured data
- Excellent analytical and problem-solving skills
- Ability to effectively manage time and adjust to changing priorities
- Bachelor’s degree preferred, but not required
- Work location: Bangalore (Hybrid)
Diversity, Equity & Inclusion
Authenticity fuels our work. In fact, it’s one of our Syndigo Values. To achieve the best version of our organization, we know it takes new ideas, new approaches, new perspectives, and new ways of thinking. It’s a purpose we are 100% committed to cultivating.
Diversity is woven into our fabric at Syndigo, and it’s how we stay an industry leader, innovating technology solutions that equip our customers with everything they need to be successful!
All are welcome here and we invite you to join our team if you are ready to help us continue that growth!
GDPR/CCPA
To process applications, Syndigo holds onto applicant data for a "reasonable time" after applications are submitted. This data is stored for Syndigo's internal use by HR/Recruiting staff only. Verified requests for data deletion and export will be completed upon request.
Syndigo Job Applicant Privacy Notice
At Syndigo, we care about your privacy. As you go through our recruitment process, we are committed to being transparent about how we process your personal data. To learn more about how Syndigo processes your personal data, go to our Job Applicant Privacy Notice.