We are Auros! 

Auros is a leading algorithmic trading and market-making firm specialising in digital asset liquidity provision. We trade across 10+ global locations, facilitating 3–4% of global daily volumes, with connectivity to over 50 venues.

We’re proud of the strong reputation we’ve built by combining our systematic approach, sophisticated pricing models, and state-of-the-art execution capabilities to provide robust, reliable trading performance and bring liquidity to crypto markets worldwide.

What sets us apart, though, is our culture. Our flat structure means you’ll have autonomy and plenty of opportunity to bring your ideas to life and help shape the systems that will power our business into the future.


The Role

This is a rare opportunity for an experienced Data Engineer to become both steward and champion of the firm's market and trading data archives and internal data products.

You will work with our existing data pipelines and databases while designing and implementing the next generation of Auros data and analytics capabilities. You’ll take on responsibilities where you’ll have the opportunity to make a substantial impact on business outcomes through the work you do every day.

You’ll learn from our experienced trading team and help develop and support systems that execute millions of trades on crypto exchanges across the globe.

What You'll Do

  • Develop, test and maintain high throughput, high volume distributed data architectures
  • Analyze, define and automate data quality improvements
  • Develop and maintain real-time data collectors for time series databases
  • Build and improve trading analytics systems
  • Create tools to automate the configuration, deployment and troubleshooting of the data pipeline
  • Develop strategies to make our data pipeline efficient, timely and robust in a 24/7 trading environment
  • Implement monitoring that measures the completeness and accuracy of captured data
  • Manage the impact that changes to trading systems and upstream protocols have on the data pipeline
  • Backfill and clean historical datasets
  • Collaborate with traders and trading system developers to understand our data analysis requirements, and to continue to improve the quality of our stored data
  • Develop tools, APIs and screens to provide easy access to the archived data


What You'll Bring

  • Extensive experience using Python for data analysis and ad hoc tooling to analyse time series and other large datasets
  • Ideally, experience developing real-time, large-scale data pipelines handling petabytes of data
  • Experience with distributed, high-performance SQL and NoSQL database systems
  • A bachelor's degree (or above) in Computer Science, Software Engineering or similar, with excellent results.

Highly Desirable Skills

  • Experience with data lakes, Amazon S3 or similar
  • Experience developing in C++ on Linux
  • Protocol-level network analysis experience
  • Experience with Terraform
  • Experience with ClickHouse
  • Experience with technologies such as Hive, Hadoop, Snowflake, Presto or similar.
