Adaptive Management is a SaaS company building a unified ecosystem for leveraging data. Our cloud-based platform enables companies to interface with thousands of data providers to quickly find answers from data. We give clients the power to Discover, Combine, Visualize, Forecast and Analyze data in an efficient and user-friendly platform.
This is an exciting time to get in on the ground floor of a well-funded, fast-growing startup that is solving complex problems and setting the standard for how data is found, visualized, and ultimately used. We have revenue, a healthy sales pipeline, and a clear path to the next funding round. If you are looking to join a highly talented and passionate team of software engineers, data scientists, and business leaders, reach out to us at email@example.com or apply now.
Adaptive is developing a state-of-the-art technology stack to ingest, normalize, aggregate and distribute data to clients, and provide analytics back to data vendors.
As an ETL Data Analyst on the Ingest Team, you will build out our data onboarding capabilities and ETL pipelines and support our analysis efforts. You thrive in a rapidly evolving environment and enjoy solving user problems and discovering requirements. You will design and implement ETL processes according to industry standards and best practices.
RESPONSIBILITIES
- Learn key vendor data sets and build robust ETLs that address our clients’ needs.
- Work with and manipulate data, develop aggregations, understand biases in the data, and apply visualization techniques to better tell the stories found in the data.
- Collaborate with other data engineers on your team and with other engineering teams in the company.
- Serve internal stakeholders, such as the director of data partnerships, VP of engineering, data scientists, and the sales team
- Take on increasingly complex projects and maintain a “client first” attitude in all of your work.
- Produce high-quality results – you are responsible for quality assurance of your own work.
REQUIREMENTS
- Bachelor's degree or higher in a technical discipline
- 1+ years’ experience working with complex datasets
- Experience transforming data with SQL (we use Spark SQL)
- Experience working with RESTful APIs
- Experience working with a variety of data formats (Parquet, CSV, Avro, XLSX)
- Hands-on development mentality with a willingness to solve complex problems
- Strong written and verbal communication skills
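To give a flavor of the SQL transformation work described above, here is a minimal sketch of normalizing and aggregating a raw vendor feed. The table, columns, and data are hypothetical, and Python's built-in sqlite3 is used for portability; the same shape of query would run as Spark SQL against a Parquet-backed table.

```python
import sqlite3

# Hypothetical vendor feed: raw transaction rows with inconsistent casing.
# (Illustrative data only -- real feeds would arrive as Parquet/CSV/Avro files.)
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_txns (vendor TEXT, region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO raw_txns VALUES (?, ?, ?)",
    [("acme", "US", 120.0), ("ACME", "us", 80.0), ("Globex", "EU", 50.0)],
)

# Normalize casing, then aggregate per vendor and region.
rows = conn.execute("""
    SELECT UPPER(vendor) AS vendor,
           UPPER(region) AS region,
           SUM(amount)   AS total_amount,
           COUNT(*)      AS n_txns
    FROM raw_txns
    GROUP BY UPPER(vendor), UPPER(region)
    ORDER BY vendor
""").fetchall()

for row in rows:
    print(row)
# -> ('ACME', 'US', 200.0, 2)
# -> ('GLOBEX', 'EU', 50.0, 1)
```

The normalization step (folding `acme`/`ACME` into one key) is the kind of bias-aware cleanup the role calls for before any aggregation is trustworthy.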
NICE TO HAVE
- Experience working in cloud environments (AWS, GCP, Azure)
- Experience programming in Python or another high-level language
- Experience with PostgreSQL, Apache NiFi, Data Visualization, Jupyter/Zeppelin, Linux
- Experience working in a startup environment
- Experience working with financial datasets