Location - HITEC City, Hyderabad
This position is part of a growing team building world-class, large-scale Big Data architectures. The individual should have a sound understanding of programming principles and experience programming in Java, Python, or similar languages, and can expect to spend the majority of their time coding.
Work Experience: 3-5 years
Responsibilities:
- Follow good development practices
- Hands-on coder with strong experience in programming languages such as Java, Python, C++, or Scala.
- Good understanding of programming principles and development practices such as check-in policies, unit testing, and code deployment
- Self-starter able to grasp new concepts and technologies and translate them into large-scale engineering developments
- Strong experience in application development and support, integration development, and data management.
- Align Sigmoid with key client initiatives
- Interface daily with customers across leading Fortune 500 companies to understand strategic requirements
- Stay up to date on the latest technology to ensure the greatest ROI for customers and Sigmoid
- Hands-on coder with a good understanding of enterprise-level code
- Design and implement APIs, abstractions and integration patterns to solve challenging distributed computing problems
- Experience in defining technical requirements, data extraction, data transformation, automating and productionizing jobs, and exploring new big data technologies within a parallel processing environment
Culture:
- Must be a strategic thinker with the ability to think unconventionally / outside the box.
- Analytical and data-driven orientation.
- Raw intellect, talent, and energy are critical.
- Entrepreneurial and agile: understands the demands of a private, high-growth company.
- Ability to be both a leader and a hands-on "doer".
Qualifications:
- A proven track record of relevant work experience and a degree in Computer Science or a related technical discipline are required
- Experience with functional and object-oriented programming; Python or Scala is a must
- Effective communication skills (both written and verbal)
- Ability to collaborate with a diverse set of engineers, data scientists and product managers
- Comfort in a fast-paced start-up environment.
Preferred Qualifications:
- Technical knowledge of the Spark, Hadoop, and GCS stack; Vertica, Snowflake, and Druid are a plus
- Development and support experience in the Big Data domain
- Experience with agile methodology
- Experience with database modeling and development, data mining and warehousing.
- Experience in the architecture and delivery of enterprise-scale applications, with the ability to develop frameworks, design patterns, etc. Should be able to understand and tackle technical challenges, propose comprehensive solutions, and guide junior staff
- Experience working with large, complex data sets from a variety of sources