Dkatalis Labs is a financial technology company with multiple offices in the APAC region. In our quest to build a better financial world, one of our key goals is to create an ecosystem-linked financial services business.

Combining the best domain knowledge in financial services, data, artificial intelligence, and credit rating technology, Dkatalis brings the next generation of data-centric platforms to transform the financial services industry in Asia. We intend to progressively cover all areas of consumer finance.

Dkatalis Labs is built and backed by experienced and successful entrepreneurs, bankers and investors in Singapore and Indonesia, who come from top-tier schools such as Stanford, Cambridge, London Business School and JNU, and who bring more than 30 years of experience building financial services and banking businesses at Bank BTPN, Danamon, Citibank, McKinsey & Co, Northstar, Farallon Capital and HSBC.

We’re looking for a Senior Data Warehouse Engineer to join our high-performing team. If you’re looking to be part of a team that tackles real-world architectural problems, Dkatalis Labs might just be the place for you! Work alongside world-class talent and join us as we use sophisticated data and analytics to make a change in the financial world.

Job specification

We are seeking a hands-on senior data warehouse architect to help us build out and manage our data warehouse and surrounding systems, which will need to operate reliably at scale with a high degree of automation in setup and maintenance. The focus of the role will be on taking ownership of the data warehouse data model design and implementation, data ingestion, data lineage, data quality and data governance. This will be within the context of a cloud-native data platform consisting of both batch and streaming pipelines, but one which prioritises event-streaming-based data acquisition and processing.

The individual will also need to manage multiple stakeholders at an executive level and make well-informed architectural choices when required. A high degree of empathy is required for the needs of the downstream consumers of the data artefacts produced by the data engineering team (software engineers, data scientists, business intelligence analysts, etc.), and the individual needs to be able to produce transparent and easily navigable data pipelines. Value should be placed on consistently producing high-quality metadata to support discoverability and consistency of calculation and interpretation.

A solid understanding of the retail banking domain is highly desirable. 

Candidates should have a wide set of experience across the following systems and languages:

  • Ideally GCP, but strong experience with another cloud platform such as AWS or Azure will suffice
  • Cloud data warehouses such as BigQuery, Redshift or Snowflake
  • Data warehousing concepts such as star and snowflake schemas
  • Fluency in expressing data warehouse schemas and their relationships through ER diagrams
  • Good understanding of relational databases as well as NoSQL databases such as MongoDB
  • Highly proficient in SQL
  • Event streaming platforms such as Kafka
  • Workflow schedulers such as Apache Airflow
  • Familiarity with Kubernetes
  • Python and Java
  • Comfortable writing detailed design documents
