Do you want to join a leading artificial intelligence solutions provider? Stradigi AI is looking for a Big Data Developer to join its growing team of data experts. You will work with a passionate team to solve challenging problems and ensure that we deliver the best solutions to our customers. The ideal candidate is ambitious, driven, and comfortable supporting our architects, data scientists, research scientists, analysts and software developers on data initiatives.

Responsibilities:

  • Designing, developing, testing, implementing and maintaining new and existing databases;
  • Designing back-end data structures using tables, views and stored procedures;
  • Identifying, designing, and implementing internal process improvements - automating manual processes, optimizing data delivery, etc.;
  • Building and operating scalable data pipelines that cleanse, structure and integrate disparate big data sets into a readable, accessible format for the Machine Learning pipelines used by data scientists and for end-user-facing reports and ad-hoc analyses;
  • Building processes supporting data transformation, data structures, metadata, dependency and workload management;
  • Working closely with the back-end developers, data scientists, research scientists and architects to assist with data-related technical issues and support data infrastructure needs.

Qualifications:

  • 5+ years of experience in a similar role;
  • Bachelor’s degree in Computer Science, Software Engineering or similar field;
  • Solid understanding of distributed computing principles;
  • Demonstrated proficiency with relational databases such as SQL Server, Oracle, MySQL and PostgreSQL;
  • Experience with data warehousing, ETL, reporting/analytics tools and scripting;
  • Familiarity with NoSQL databases such as MongoDB and Cassandra;
  • Knowledge of AWS cloud services such as S3, EC2, EMR, ECR;
  • Experience with the Java and Python programming languages;
  • Good knowledge of Big Data querying tools, such as Spark SQL, Presto, Impala;
  • Prior experience with data visualization tools such as Tableau, Amazon Athena or Kibana will be considered an asset;
  • Experience designing, building and maintaining big data pipelines in the cloud and on-premises will be considered an asset;
  • Knowledge of messaging systems such as Kafka, Redis or RabbitMQ will be considered an asset;
  • Comfortable supporting and working with cross-functional teams in a dynamic environment;
  • Excellent verbal and written communication skills;
  • Strong work ethic and team player.

What we offer:

  • Group Insurance and 3% company contribution to RRSP;
  • ½ paid day for your birthday;
  • Weekly team breakfast;
  • Monthly Lunch & Learn;
  • Referral Bonuses;
  • Company events organized by our social committee;
  • And much more!

 
