About Messari.

Messari is the leading provider of crypto market intelligence products that help professionals navigate crypto/Web3 with confidence. We bring transparency and smarter qualitative and quantitative analytics to the industry by combining a global research database with a comprehensive suite of data visualization and asset discovery tools. We help drive smarter participation in crypto from individuals and institutions alike.
The name “Messari” comes from the Franciscan monks who declared "clean books" a moral imperative during the Renaissance and pushed merchants to adopt proper accounting methods. This enabled the flourishing of investment via "trust but verify" practices and industry growth throughout Europe. That's what we aim to do: provide participants, investors, builders, platforms, and everyone else with reliable information so they can better participate in the crypto ecosystem.
Our users range from some of the most prominent analysts, investors, and crypto professionals to top organizations including Coinbase, BitGo, Anchorage, 0x, Chainalysis, Ledger, Compound, MakerDAO, and many more.


The Role

Messari is seeking a skilled Data Platform Engineer to join our team. In this role, you will be responsible for developing and maintaining scalable data pipelines, designing and managing a robust data platform infrastructure, and implementing DevOps best practices. Your mission is to ensure our analysts can access data in a smooth and seamless way.

This position is a full-time role. We are currently a remote company and are open to any remote candidates. Our team is distributed around the world and has the processes in place for productive remote work, but we meet up in person as often as possible.

What you’ll do

  • Develop and maintain scalable data pipelines using tools like DBT and Airflow/Dagster.
  • Design and manage a robust data platform infrastructure that ensures high availability and performance.
  • Collaborate with data engineers and analytics engineers to improve data models and optimize data workflows.
  • Implement DevOps best practices, including CI/CD pipelines and automated testing for data solutions.
  • Utilize containerization technologies such as Docker and Kubernetes for efficient deployment and scaling of data platform components.
  • Monitor and troubleshoot platform issues, ensuring smooth data ingestion and processing.
  • Work closely with the data science and analytics teams to enable efficient data access and integration.

Who you are

  • Background in data engineering with proficiency in managing data platform architecture.
  • Expertise in using DBT and Apache Airflow for orchestration and data transformation.
  • Solid understanding of DevOps practices, particularly CI/CD, infrastructure as code, and observability.
  • Familiar with data warehousing, ETL frameworks, and data governance principles.
  • Strong problem-solving skills with a proactive approach to identifying and resolving platform issues.
  • Effective communicator who thrives in a collaborative, cross-functional environment.

Bonus points if you have experience with:

  • Snowflake data warehouse platform
  • Dagster for data orchestration
  • Pulumi for infrastructure as code
  • AWS services such as S3, SNS, SQS, Lambda

Tier 1 USA salary range: $120k-$180k
