WayRay is a global deep-tech company leading the world in holographic AR, and we are looking to hire out-of-the-box thinkers with a deep-tech mindset.

As a company, we are deeply ambitious, and we want everyone who works with us to share that ambition. We are not an average startup, and if we are a match, that means you are not an average tech professional. If you are ready to face challenges head-on and push yourself beyond your limits, then you are right for us. It won't be easy, but we promise it will be awe-inspiring.

At present, we are looking for a Computer Vision Algorithm Developer to be a key member of the Software department that is developing an augmented reality platform for cars equipped with our holographic head-up displays.

As a Computer Vision Algorithm Developer with a strong understanding of visual localization algorithms and multiple view geometry, your primary goal will be enhancing our localization stack and reconstructing the surrounding environment for proper placement of AR objects. Our software team follows an agile methodology to stay flexible and results-oriented.

Primary tasks
  • Implement and integrate a Visual-Inertial Odometry / SLAM algorithm with our localization stack, based on GNSS, IMU, and vehicle speed data.
  • Benchmark and test existing Visual-Inertial Odometry / SLAM solutions.
Expected Results

Improve the existing Visual-Inertial Odometry algorithm:

  • Evaluate the current Visual-Inertial Odometry algorithm's performance on real data and compare it with its performance on synthetic data from a simulator.
  • Investigate critical performance issues and propose quality improvements.
  • Resolve critical issues in the 2D feature tracker, both in quality and in computational resource consumption.

Adapt the developed algorithm for production:

  • Fine-tune the Visual-Inertial Odometry algorithm implementation.
  • Improve computational performance using SIMD/GPU and efficient algorithms to reach real-time speed while preserving the quality achieved in the previous step.
  • Write unit and integration tests covering all computer vision components of the localization stack.
  • Improve the quality of the feature tracker, maintaining its mean tracking error across varied conditions (including bad weather).

Long-term tasks:

  • Real-time 3D reconstruction of the surrounding world with dense point clouds / depth maps.
  • Real-time tracking of dynamic objects (vehicles, pedestrians) as part of the SLAM algorithm.
Requirements
  • At least 3 years of R&D experience in computer vision, image processing, or sensor fusion.
  • Solid understanding of basic image processing algorithms: image filtering, image segmentation, camera calibration, object detection, and classification.
  • Good knowledge of Multiple View Geometry and state-of-the-art Visual SLAM approaches.
  • Solid math background, including probability theory, statistics, and numerical optimization.
  • Good knowledge of C++, including the STL, OpenCV, and Eigen.
  • Experience with Python, including numpy, scipy, pandas, and matplotlib.
  • Good knowledge of Computer Science: understanding of data structures, asymptotic complexity, and the design of basic algorithms.
As a plus
  • Experience in localization and attitude estimation for AR/VR applications.
  • Understanding of the Kalman filter algorithm.
  • Familiarity with dense image matching and point cloud processing.
  • Knowledge of optimization libraries such as Ceres, g2o, or GTSAM.
  • Familiarity with OpenCL/CUDA, Neon/SSE, CMake, ROS, and the Linux shell.
  • Familiarity with modern software development practices: git, TDD, CI/CD, etc.
  • Publications in top conferences and journals; a PhD degree.
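
To give a flavor of the estimation fundamentals listed above (Kalman filtering, Python/numpy), here is a minimal sketch of a linear Kalman filter for a hypothetical 1-D constant-velocity model; the model, noise parameters, and trajectory are illustrative only, not part of WayRay's actual localization stack.

```python
import numpy as np

def kalman_filter(zs, dt=0.1, q=1e-3, r=0.5):
    """Linear Kalman filter for a 1-D constant-velocity model.

    State x = [position, velocity]; zs are noisy position measurements.
    q and r are process/measurement noise variances (illustrative values).
    """
    F = np.array([[1.0, dt], [0.0, 1.0]])   # state transition model
    H = np.array([[1.0, 0.0]])              # we observe position only
    Q = q * np.eye(2)                       # process noise covariance
    R = np.array([[r]])                     # measurement noise covariance
    x = np.zeros(2)                         # initial state estimate
    P = np.eye(2)                           # initial state covariance
    estimates = []
    for z in zs:
        # Predict step: propagate state and covariance through the model.
        x = F @ x
        P = F @ P @ F.T + Q
        # Update step: correct the prediction with the measurement.
        y = z - H @ x                       # innovation
        S = H @ P @ H.T + R                 # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
        x = x + K @ y
        P = (np.eye(2) - K @ H) @ P
        estimates.append(x[0])
    return np.array(estimates)

# Hypothetical usage: filter noisy observations of a target moving at 1 m/s.
rng = np.random.default_rng(0)
t = np.arange(0, 10, 0.1)
true_pos = 1.0 * t
noisy = true_pos + rng.normal(0.0, 0.7, size=t.size)
filtered = kalman_filter(noisy)
```

In production VIO, the same predict/update structure appears in far richer form (6-DoF states, IMU preintegration, nonlinear variants such as the EKF), which is why comfort with this algorithm is a plus for the role.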
What we offer
  • Opportunity to work in a game-changing company alongside the best professionals from all over the world.
  • Transparency and openness on all levels.
  • Immersion in the Deep Tech culture, where real innovations are born.
  • Benefits package, including educational opportunities and stock option plan for all positions and grades.
  • Competitive salary and bonus for outstanding results.

Apply for this Job