At DeepMind, we value diversity of experience, knowledge, backgrounds and perspectives and harness these qualities to create extraordinary impact. We are committed to equal employment opportunity regardless of sex, race, religion or belief, ethnic or national origin, disability, age, citizenship, marital, domestic or civil partnership status, sexual orientation, gender identity, pregnancy, or related condition (including breastfeeding) or any other basis as protected by applicable law. If you have a disability or additional need that requires accommodation, please do not hesitate to let us know.

Snapshot

As DeepMind continues to grow, we are seeking Research Engineers to join our new security and privacy team. 

Research Engineers at DeepMind lead our efforts to develop scalable software infrastructure that supports novel algorithms and architectures, with the end goal of responsibly building Artificial General Intelligence. Research Engineers work closely with other engineers and scientists to advance the field, producing groundbreaking research artifacts.

About Us

DeepMind's mission is to "Solve intelligence to advance science and benefit humanity".

We are a growing global security team with a diverse set of skills supporting the organisation’s wider mission, objectives and priorities. The Security team plays a critical role in this mission in two main ways:

  • Protect our people, operations, and research
  • Develop our technology responsibly with a keen focus on safety, security and privacy

As an early member of the team, you will make decisions that will have a significant impact on the DeepMind mission now and into the future.

The Role

As a Research Engineer on the technical security and privacy team, you will collaborate with other DeepMind Engineers and Researchers to research and build security and privacy into AI systems at the algorithm, data, and infrastructure levels, minimising risks from emerging threats. You will work closely with stakeholders to deliver on the strategy for bringing safe and secure AI systems to the world. Your goal is to ensure DeepMind technology is built responsibly, with security and privacy at the forefront of every technical design discussion.

Key Responsibilities:

  • Devise novel methods, or implement existing ones, to assess and defend against emerging risks to AI systems and their associated data and infrastructure
  • Partner with DeepMind Research and Engineering teams to conduct research in the areas of safety, security, and privacy
  • Build tooling, infrastructure, and platforms that implement the latest research ideas in security and privacy of AI systems, at scale
  • Set priorities and vision for security and privacy research and engineering initiatives
  • Promote awareness of security and privacy best practices and research at DeepMind and across the wider Google research community
  • Establish thought leadership and drive community engagement through external engagements, publishing, and open sourcing

About You

To set you up for success as a Research Engineer at DeepMind, we look for at least the following skills and experience:

  • Master’s degree in Computer Science, Machine Learning, Computer Vision, or equivalent
  • Experience with at least one general-purpose programming language (e.g. Java, Python, C++) and relevant ML frameworks (JAX, TensorFlow, PyTorch, scikit-learn, etc.)
  • Experience leading research and tool development in one or more of the following: adversarial ML, differential privacy, confidential computing, vulnerability research, penetration testing (application, network), threat modelling, cryptographic systems, network security, identity and access management
  • Track record of publishing and presenting research on the security and privacy of machine learning systems at machine learning conferences (CAMLIS, ICML, NeurIPS, ICLR, etc.)
  • Passion for AI safety, security and privacy research (e.g. adversarial machine learning, privacy-preserving machine learning)

In addition, the following would be an advantage: 

  • Experience discovering and responsibly disclosing security vulnerabilities 
  • Experience presenting research at well-known security conferences (e.g. Black Hat, DEF CON, BSides)

Competitive salary applies.

 
