Data Operations Engineer

Bottomline is seeking a Data Operations Engineer in Bangalore, India, to design, configure, monitor, and operate the data and analytics platform for its BTIQ ecosystem.

The ideal candidate will work closely with product development, internal functional managers, QA engineers, and central teams (networking, security, cloud ops, site reliability) to provision secure, scalable, and highly available deployment architectures that deliver 99.5%+ uptime. The role requires an eye for detail and strong problem-resolution skills.

The DataOps function is an extension of DevOps, focused on hardening the infrastructure of data systems and increasing the quality of, and trust in, the data that flows through those systems.

Primary responsibilities include:

  • Provision infrastructure and software using standard tools, e.g. Puppet, Terraform, and cloud-native scripting
  • Monitor infrastructure to log, store, and analyze performance and operational metrics for the overall data system
  • Set up or incorporate continuous deployment tools, e.g. Jenkins or GitLab, adhering to continuous integration standards
  • Create deployment pipelines for developer code, including automated tests in the CI/CD pipeline
  • Work with and support QA engineers on code quality
  • Ensure quality checks at each stage of data pipelines that back-validate against the previous stage, with reporting and alerting for those checks
  • Monitor latency at each stage of the pipeline
  • Add statistical reporting on aggregate data flowing through pipelines
  • Define data SLAs and monitor data arrival at ingestion
  • Track and report data-system latency and data availability against agreed SLAs
  • Add capabilities to track and monitor machine learning models and their production deployments

The ideal candidate will have the following: 

  • Experience operating enterprise-class or cloud-scale applications 24x7
  • Strong articulation skills when partnering with internal teams and external clients
  • Deep understanding of *nix operating systems, networking, and load balancers
  • Experience with Nginx, Tomcat, Docker, and Kubernetes
  • Strong scripting skills in Bash, Groovy, and Python or Ruby
  • Experience in monitoring, metrics collection, and reporting using open-source tools
  • Experience with automation and configuration management using Terraform and Puppet
  • Knowledge of IT operations best practices for highly available, multi-tenant, and secure systems
  • Experience with AWS or Azure

Required Skills/Experience: 

  • BE or Master’s degree in Computer Science, Information Technology, or a related field
  • 5 to 7 years of experience as a hands-on DevOps or CloudOps engineer
  • Experience deploying monitoring tools, and with Terraform or equivalent orchestration
  • Some experience integrating tools around GitOps, GitLab, Bitbucket, and bug tracking
  • Must possess strong communication and interpersonal skills

About CapitalCloud, A Bottomline Company:

Bottomline Technologies provides collaborative payment, invoice and document automation solutions to corporations, financial institutions and banks around the world. The company's solutions are used to streamline, automate and manage processes involving payments, invoicing, global cash management, supply chain finance and transactional documents. Organizations trust these solutions to meet their needs for cost reduction, competitive differentiation and optimization of working capital.

Serving industries such as financial services, insurance, health care, technology, communications, education, media, manufacturing and government, Bottomline provides products and services to approximately 80 of the Fortune 100 companies and 70 of the FTSE (Financial Times Stock Exchange) 100 companies.

Bottomline is a participating employer in the Employment Verification (E-Verify) program. EOE/AA/M/F/V/D/E-Verify Employer

Bottomline Technologies is an Equal Employment Opportunity and Affirmative Action Employer.

