We are currently looking for a talented and highly motivated software engineer to develop data frameworks (data engineering, data architecture, data science) for next-generation web, mobile, and IoT applications.

This team combines leading open source data frameworks (Spark, Jupyter, TensorFlow, NiFi, Kafka, Cassandra, Elasticsearch, HDFS, and MPI) with D2iQ-developed frameworks to provide highly complex and robust solutions for our customers. The customers deploying these services include some of the most innovative names in tech, cloud, and financial services.

This position will give you the opportunity to collaborate with the brightest engineering minds in the big data and datacenter computing space. As a senior engineer, you should excel with minimal technical supervision, embrace time constraints, and work with team members to deliver high-quality products and features.

The ideal candidate will have spent some time in an HPC, scalable computing, IoT, research institute, or similar environment, using Docker, k8s, Mesos, DCOS, and similar tools on AWS, GCP, or Azure.

Responsibilities

  • Design and implement new Mesos frameworks
  • Enhance existing Mesos frameworks
  • Dive deep into data science/engineering technologies such as Spark, Jupyter, TensorFlow, NiFi, MPI, Cassandra, Kafka, Elasticsearch, and HDFS to integrate them into DCOS
  • Effectively estimate time to implement designs
  • Consistently make systems simpler

Basic Qualifications

  • BS or Master’s degree in Computer Science or a related field, or equivalent experience
  • 5+ years of experience with object-oriented programming, plus infrastructure design and coding skills
  • Self-driven and motivated, with a strong work ethic and a passion for problem solving
  • Experience with Java development, debugging, and multithreaded programming
  • Able to debug, troubleshoot, and resolve complex technical issues reported by customers
  • Currently residing in Europe

Preferred Qualifications

  • Data engineering and modeling with distributed and relational databases
  • Experience with three or more of the following:
    • Hive, Hadoop, MongoDB, TensorFlow, DataStax, Cloudera, Spark, Jupyter, SAS, Kafka, Cassandra, Elasticsearch, or HDFS
  • Expert-level proficiency in one or more of the following:
    • Scala, Java, Python, R, SQL
  • Experience designing, implementing, and operating large-scale stateful distributed systems
  • Experience with RDBMS internals, JDBC and SQL

D2iQ - Your Partner in the Cloud Native Journey

On your journey to the cloud, you need to make numerous choices—from the technologies you select, to the frameworks you decide on, to the management tools you’ll use. What you need is a trusted guide that’s been down this path before. That’s where D2iQ can help.

D2iQ eases these decisions and operational efforts. Rather than inhibiting your choices, we guide you with opinionated technologies, services, training, and support, so you can work smarter, not harder. No matter where you are in your journey, we’ll make sure you’re well equipped for the road ahead.

Backed by T. Rowe Price, Andreessen Horowitz, Khosla Ventures, Microsoft, HPE, Data Collective, and Fuel Capital, D2iQ is headquartered in San Francisco with offices in Hamburg, London, and Beijing.

Apply for this Job

* Required