In June 2017, Sumo Logic announced another $75M funding round led by Sapphire Ventures, with participation from new and existing investors including DFJ Growth, Greylock Partners, Sequoia Capital, and others (https://www.sumologic.com/press/2017-06-27/75-million-funding-round/). This brings our total funding to $235.5M to date. Sumo Logic’s business has scaled significantly, and we have enjoyed consistent growth in recurring revenue and in our customer base, which now exceeds 1,600 customers across every major vertical and company size.
Who Are We?
We are a secure, cloud-native, machine data analytics service, delivering real-time, continuous intelligence from structured, semi-structured and unstructured data across the entire application lifecycle and stack. Our mission is to democratize analytics, making it accessible, simple and powerful for businesses of all sizes to build, run and secure their organizations. With Sumo Logic, customers can harness the power of machine data to gain operational, business, and customer insights that lead to competitive advantage and a differentiated customer experience.
What Do We Do?
Sumo Logic was founded in 2010 by experts in log management, scalable systems, big data, and security. We imagined a world of yottabyte-scale machine data, where machine learning algorithms and advanced analytics could make sense of it all. Today, our purpose-built, cloud-native service analyzes more than 100 petabytes of data, runs more than 16 million searches, and delivers tens of millions of insights daily – positioning Sumo Logic among the most powerful machine data analytics services in the world. Our customers around the globe rely on Sumo Logic for the analytics and insights to build, run and secure their modern applications and cloud infrastructures. With Sumo Logic, customers gain a service-model advantage to accelerate their shift to continuous innovation, increasing competitive advantage, business value, and growth.
Who Is Our SANA Team?
Sumo’s Security Analytics (SANA) team is building a world-class platform that will use bleeding-edge techniques to give our customers insight into the security of their cloud and on-prem operations. Our platform examines hundreds of billions of events each day, searching for threats by continuously identifying suspicious patterns of behavior. Our innovative investigation capabilities help our customers rapidly reach understanding so that they can react appropriately.
As part of the Security Analytics team, you’ll build the system that enables the identification and investigation of network security incidents faster and at greater volume than ever before achieved. You’ll work with emerging technologies including data mining, machine learning, and graph databases. And you’ll solve difficult problems in data representation, data exploration, and data discovery.
You are a strong software engineer who is passionate about large-scale systems. You care about producing clean, elegant, maintainable, robust, well-tested code; you do this as a member of a team, helping the group come up with a better solution than you would as individuals. Ideally, you have experience with performance, scalability, and reliability issues of 24x7 commercial services.
Design and implement extremely high-volume, fault-tolerant, scalable backend systems that process and manage petabytes of customer data.
Analyze and improve the efficiency, scalability, and reliability of our backend systems.
Write robust code; demonstrate its robustness through automated tests.
Work as a member of a team, helping the team respond quickly and effectively to business needs.
Push the envelope on deployment, management, and reliability of NoSQL databases.
Help manage exabytes of data using the latest and greatest technologies, such as Kafka, Kubernetes, and Docker!
B.S., M.S., or Ph.D. in Computer Science or a related discipline
5+ years of industry experience with a proven track record of ownership and delivery
2+ years of industry experience with Cassandra/NoSQL and/or code contributions to big data projects
Experience in multi-threaded programming and distributed systems.
Object-oriented programming experience, for example in Java, Scala or C#.
Understanding of the performance characteristics of commonly used data structures (maps, lists, trees, etc.).
Desire to learn Scala, an up-and-coming JVM language (scala-lang.org).
Experience in big data and/or 24x7 commercial service is highly desirable.
You should be happy working with Unix (Linux, OS X).
Agile software development experience (test-driven development, iterative and incremental development) is a plus.
Within 3 months you will:
- Complete QoS and reliability work for the big data repository
Within 6 months you will:
- Deliver successful preview of big data repository
Within 9 months you will:
- Deliver successful beta launch of big data repository
To apply, please submit your resume; no cover letter is necessary. We encourage you to include any work you have done in the open (open-source contributions, GitHub repos, etc.).