The Lead Data Engineer is a hybrid technologist position that requires multiple skillsets, attention to detail, desire to learn, and the ability to work well with some of the coolest people in the world. Here at Javelin, we leverage a mix of open source and proprietary tools to ingest data of all types and from all sources (API, structured, unstructured, streaming, etc.), clean and transform this data into accessible data sets, implement QA/QC, and load data to multiple destinations (RDBMS, S3, etc.).

This process requires a collaborative team effort, with members able to flexibly float between skillsets. We are looking for candidates who demonstrate strong leadership and mentoring skills to lead this effort. Additionally, this candidate would be hands-on: able to utilize Pentaho/Kettle (or similar) for ETL, develop and implement scalable data architectures and processes, and ensure insights are accessible to our clients and analytic teams. Documentation, flexibility, and the ability to adapt to changing priorities are also key to success in this role.

If this interests you, and you’ve been working with data of all sizes (tens of thousands of rows, up to and above hundreds of millions) in all sorts of formats (CSV, JSON, XML, etc.) from a variety of sources (DB, API, S3, etc.), and you are interested in stretching your skills, then you may be the perfect person to join our team, so submit your resume and let’s chat. The details below are meant to capture keywords and provide additional information; ultimately, we’ll work together to tailor the job to ensure everyone is successful.

Job Responsibilities:

  • Leads development of sustainable, efficient, scalable, and adaptable data processes, loading cleansed data into production while working with stakeholders and project management teams.
  • Supports new business development by building sales material and project hours/cost estimation.
  • Helps manage and execute production and ad hoc ETL and data warehouse activities.
  • Provides strategic thought leadership on new ways of leveraging information to improve business processes.
  • Understands change and release management for production business applications.
  • Troubleshoots and resolves issues related to database performance, capacity, replication, and integrity.
  • Designs and executes strategies for data acquisition, warehouse implementation, and backup/archive recovery.
  • Provides architectural and administrative best-in-class support for SQL structures (query and data), Pentaho, Tableau, and other reporting solutions based on data universes.
  • Communicates analysis results and makes recommendations to management and clients.
  • Improves processes and recommends efficiencies through active and proactive engagement in peer design and code reviews.
  • Keeps current with the evolving technology landscape and trends, understanding the latest offerings for visualization, database, and ETL solutions.


Candidate Qualifications:

  • 5+ years of experience in data management, integration, and reporting working with relational databases; experience with PostgreSQL, MySQL, SQL Server Enterprise, Oracle, or another enterprise-grade RDBMS (including SQL coding/scripting experience) is required.
  • 5+ years of ETL experience working with Informatica, Talend, SSIS, Hitachi Vantara’s Pentaho suite, or similar required (Pentaho Data Integration, a.k.a. PDI/Kettle, highly preferred).
  • 2+ years of Linux experience.
  • Experience integrating with third-party APIs (e.g., via REST).
  • Experience leading a team of developers with varied skillsets.
  • Ability to review, validate, and optimize code and data produced by others.
  • Programming language experience a plus (Python, PHP, JavaScript).
  • Experience with Tableau and/or another BI tool a plus.
  • Proven ability to couple client-responsiveness / flexibility with a passion for automation.
  • Experience working with multiple groups to document data flow diagrams (DFD) and assist in evolving the data architecture in alignment with the vision for the Enterprise Data Warehouse.
  • Experience integrating marketing, sales (transactional) and operational data in a consolidated data warehouse.
  • Experience tuning data for analytic / reporting tools such as Tableau, Business Objects, SAS, etc. is a plus.
  • Proven self-starter who takes initiative and accountability for seeing a job to completion.
  • Experience with various SDLC (software development life cycle) methodologies, and ability to work effectively in an Agile development environment.
  • Understanding of Data Warehouse concepts such as star and snowflake schemas.
  • Responsible individual with strong analytical skills and judgment to independently handle assignments.
  • Strong attention to detail across all methods of communication, including but not limited to written, in-person, and telephone communication.
  • An understanding of the difference between accuracy and precision is a plus.
  • Must be able to help in proposal/SOW preparation and business development related activities.
  • Four-year college or university degree, OR equivalent training, education, and experience, highly preferred.


Apply for this Job
