Location: HITEC City, Hyderabad

As a Technical Lead, you will be responsible for building a highly scalable and extensible big data platform that provides the foundation for collecting, storing, modeling, and analyzing massive data sets from multiple channels.

This position reports to the Engineering Manager.

Responsibilities: 

  • Align Sigmoid with key client initiatives
    • Interface daily with customers across leading Fortune 500 companies to understand strategic requirements
    • Connect with VP- and Director-level clients on a regular basis
    • Travel to client locations
    • Understand business requirements and tie them to technology solutions
  • Facilitate the technical aspects of delivery
    • Design, develop, and evolve highly scalable and fault-tolerant distributed components using big data technologies
    • Lead application development and support, integration development, and data management
  • Provide technical leadership and manage the team on a day-to-day basis
    • Guide developers in day-to-day design and coding tasks
    • Play a key role in hiring technical talent to build the future of Sigmoid
  • Stay up to date on the latest technology to ensure the greatest ROI for customers and Sigmoid
    • Hands-on coder with a good understanding of enterprise-level code
    • Design and implement APIs, abstractions, and integration patterns to solve challenging distributed computing problems
    • Define technical requirements, extract and transform data, automate and productionize jobs, and explore new big data technologies within a parallel processing environment
  • Culture
    • Must be a strategic thinker with the ability to think unconventionally / out of the box.
    • Analytical and data-driven orientation.
    • Raw intellect, talent, and energy are critical.
    • Entrepreneurial and agile: understands the demands of a private, high-growth company.
    • Ability to be both a leader and a hands-on "doer".

Qualifications:

  • 7+ years of relevant work experience and a degree in Computer Science or a related technical discipline are required
  • Experience in the architecture and delivery of enterprise-scale applications, with the ability to develop frameworks, design patterns, etc. Should be able to understand and tackle technical challenges, propose comprehensive solutions, and guide junior staff
  • Proven track record of building and shipping large-scale engineering products and/or knowledge of cloud infrastructure such as GCP/AWS preferred
  • Experience working with large, complex data sets from a variety of sources
  • Experience with Hadoop, Spark, or a similar stack is a must
  • Experience with functional and object-oriented programming; Python or Scala is a must
  • Effective communication skills (both written and verbal)
  • Ability to collaborate with a diverse set of engineers, data scientists and product managers
  • Technical knowledge of the Spark, Hadoop, and GCS stack
  • Comfort in a fast-paced start-up environment   

Preferred Qualifications:

  • Experience in agile methodology
  • Development and support experience in the big data domain
  • Architecting, developing, implementing, and maintaining big data solutions
  • Experience with database modeling and development, data mining, and data warehousing
  • Experience with the Hadoop ecosystem (HDFS, MapReduce, Oozie, Hive, Impala, Spark, Kerberos, Kafka, etc.)
